Despite hints hidden in internal iOS 13 code, Apple did not unveil its long-rumored smartglasses at its annual iPhone launch event on Tuesday.
But the company did continue to move in the direction of AR wearables. Here's how...
As expected, Apple used a fairly underwhelming presentation to make public the seventh-generation iPad, the newest Apple Watch, and its latest line-up of iPhones: the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max, which succeed the iPhone XR, XS, and XS Max, respectively.
The devices offer mostly iterative (and hardly ground-breaking) improvements: new color finishes, spatial audio, upgraded camera capabilities, a new A13 Bionic chip, and better battery life. Beyond the new chip, the iPhone 11 Pro and Pro Max add new matte finishes, a new OLED display, even longer battery life, and a new triple rear camera system.
But one of those features, left out of the keynote presentation itself, has implications for future AR hardware as well. The iPhone 11 Pro models include a new U1 chip that gives the devices spatial awareness, enabling "directionally-aware" AirDrop in iOS 13.1, which arrives Sept. 30.
Previous reports referred to the chip as the R1 (codenamed Rosie) and indicated that it would manage data collected from the iPhone's sensor array, including the inertial measurement unit (IMU) for motion tracking, Bluetooth 5.1 for direction tracking of paired devices, and camera data that supports ARKit features like People Occlusion.
While it is unclear whether the U1 chip carries all of those rumored capabilities, the addition of "spatial awareness" marks another step in Apple's hardware evolution toward AR wearables.
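People Occlusion, the ARKit 3 feature cited in those reports, is a good illustration of how much of Apple's AR stack is delivered in software: the A-series chip segments people in the camera feed and estimates their depth in real time, with no extra sensor on the back of the phone. Here is a minimal Swift sketch of how an app opts in (the view controller name is illustrative, not Apple's):

```swift
import UIKit
import ARKit
import RealityKit

// A minimal sketch of opting into ARKit 3's People Occlusion.
// The view controller name is illustrative, not Apple's.
class OcclusionViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let arView = ARView(frame: view.bounds)
        view.addSubview(arView)

        let config = ARWorldTrackingConfiguration()

        // People Occlusion requires an A12 Bionic or newer, so check
        // device support before opting in.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            // Segment people in the camera feed and estimate their depth
            // so real people can pass in front of virtual content.
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }

        arView.session.run(config)
    }
}
```

The capability check matters: Apple limits People Occlusion to devices with an A12 Bionic or newer, which underlines how much of the feature depends on raw on-device processing power rather than dedicated hardware.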
The TrueDepth depth-sensing camera (introduced with the iPhone X), the ARKit toolkit for developing AR apps without specialized sensors, and now the U1 chip lay the foundation for Apple's future smartglasses. Although previous reports have predicted that Apple will eventually bring depth sensors to the rear camera, the latest hardware addition supports the notion that Apple can continue with a more software-based approach to world sensing by beefing up the processing power instead.
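For a sense of what that software-based approach looks like in practice, here is a hedged sketch of the most basic ARKit world sensing: plane detection, which infers surfaces from nothing more than the standard rear camera and motion data (the class name is mine, for illustration):

```swift
import ARKit

// A minimal sketch of ARKit's software-based world sensing: plane
// detection driven by the standard rear camera and the IMU alone.
// The class name is illustrative, not Apple's.
final class WorldSensingDemo: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // ARKit fuses camera frames with motion data to infer surfaces;
        // no dedicated depth sensor is involved.
        config.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(config)
    }

    // Called as ARKit discovers new surfaces in the environment.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Detected \(plane.alignment) plane with extent \(plane.extent)")
        }
    }
}
```

Hold a reference to the demo object and call start() from an app with camera permission; ARKit begins reporting ARPlaneAnchor objects as it maps the room, all on standard camera hardware.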
Even the new Apple Watch, with its always-on display and built-in compass, hints at advances relevant to future AR wearables. The display runs at a dynamic refresh rate for better power efficiency, something that will come in handy for smartglasses displays that must stay on constantly. And by squeezing a compass into the Watch, Apple demonstrates the engineering capability to cram ever more sensors into a small package, which will also be useful in smartglasses design.
Perhaps Apple can sense the tech world's impatience for a new AR product. For instance, the new iPad creates a new entry point for those looking to experience AR apps, with a price tag of just $329 (lower than any iPhone Apple currently sells). The company is even pitching the tablet's 10.2-inch display as a "vivid canvas for creative expression and perfect for immersive augmented reality (AR) experiences."
So while we still have to wait a bit longer to see how Apple brings its long-awaited (yet still unconfirmed) smartglasses to the world, at least we can see the steps the company is taking to get there. Until then, perhaps Apple could forgo the usual product launch spectacle when it doesn't have something truly new and innovative to show.