Over the past two years, Apple's Worldwide Developers Conference (WWDC) has become a showcase for new ARKit capabilities. This year, it could offer more information related to Apple's long-rumored augmented reality wearable.
According to sources speaking to 9to5Mac, Apple will introduce game development support for stereo AR headsets and touch controllers. Apple is also expected to reveal its latest update to ARKit, which will include human pose detection, as well as a new AR framework for Swift and a visual programming app for building AR experiences.
The addition of pose detection to ARKit is intriguing, as it could open up a whole new realm of AR experiences in apps and maybe even full-body Animojis.
Meanwhile, the expansion of the AR development ecosystem may pose a threat to Unity and Unreal Engine, the leading development environments for AR experiences.
But it's the support for AR headsets and peripherals that should really get AR enthusiasts excited. Why would Apple support this hardware if not for its own? Or would the company be open to tethering to something like an Nreal Light?
The inevitability of Apple smartglasses has been well-documented by insider reports and analysts. Some reports have Apple's AR wearables arriving as early as 2019 ahead of a 2020 launch, while others push their arrival back to 2021.
Could this be the lead-up to the big reveal the AR industry and tech early adopters have been waiting for? WWDC is only a couple of months away, so we won't have to wait too long to find out.