As Apple prepares to potentially unveil its (mostly) secretive AR headset later this year ahead of a launch next year, the company has made a strategic investment to ensure its supply chain can support it.
On Wednesday, Apple announced an award of $410 million from its Advanced Manufacturing fund to II-VI, an optics manufacturer that supplies the company with iPhone and iPad components that enable advanced AR experiences.
"The expansion of the company's long-standing relationship with II-VI will create additional capacity and accelerate delivery of future components for iPhone, with 700 jobs in Sherman, Texas; Warren, New Jersey; Easton, Pennsylvania; and Champaign, Illinois," Apple said in a statement on its website.
II-VI supplies Apple with LiDAR sensors, which are embedded in its iPhone Pro models and iPad Pro devices. LiDAR enables fast depth sensing for more realistic AR experiences and is expected to fill the same role in Apple's AR headset.
In addition, II-VI makes the vertical-cavity surface-emitting lasers (VCSELs) found in the depth-sensing, front-facing TrueDepth camera of the iPhone X and newer devices. TrueDepth powers features like Animoji and Memoji, as well as more robust Snapchat AR Lenses and the virtual glasses try-on tool in Warby Parker's app.
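For developers, that TrueDepth hardware is surfaced through ARKit's face tracking API rather than accessed directly. Below is a minimal sketch, assuming a plain ARSession-based setup, of how a third-party app might check for TrueDepth-backed face tracking and read the blend-shape data that drives character animation; the class name and logging are illustrative, not Apple sample code.

```swift
import ARKit

// Minimal sketch: TrueDepth-backed face tracking via ARKit.
// The class name and print statements are illustrative only.
final class FaceTrackingStarter: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the supporting front-camera hardware
        // (TrueDepth on iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking is not supported on this device.")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Each ARFaceAnchor carries blend-shape coefficients (jaw open, eye blink,
    // and so on) that apps can map onto animated characters.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("jawOpen coefficient: \(jawOpen)")
        }
    }
}
```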
In 2017, Apple awarded $390 million from the same fund to Finisar, which produced VCSELs for the TrueDepth camera. II-VI went on to acquire Finisar in 2019.
The investment is just the tip of the iceberg for Apple's plans. The company has committed to spending $430 billion in the US over the next five years, adding 20,000 new jobs and expanding its research and development bench strength in silicon engineering, 5G, and manufacturing, which will, in turn, help it iterate on its AR hardware. That commitment includes a new campus in North Carolina, a hotbed for tech talent.
The company's smartphones and tablets have served as a proving ground for its future AR wearable aspirations. On the software side, Apple's ARKit toolkit enables developers to leverage device sensors to build AR apps, while Apple's own apps, like Clips with its new AR Spaces feature, demonstrate one way the company can make the most of its AR ecosystem. Meanwhile, Apple has added hardware components, like the LiDAR sensors and TrueDepth cameras for depth sensing and its U1 chip for spatial awareness, that expand the AR capabilities of its devices.
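To put those components in developer terms, here is a minimal ARKit sketch, assuming a standard world-tracking session, of how an app might opt into the LiDAR-gated depth features on supported iPhone and iPad Pro models; the helper function name is illustrative.

```swift
import ARKit

// Minimal sketch: opting into LiDAR-powered depth features in ARKit.
// The function name is illustrative, not part of Apple's API.
func makeWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction builds a live mesh of the surroundings for
    // occlusion and physics; it is only supported on LiDAR-equipped devices.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Per-frame depth maps are likewise gated on the LiDAR scanner.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    // Plane detection works on all ARKit-capable devices, LiDAR or not.
    configuration.planeDetection = [.horizontal, .vertical]
    return configuration
}

// Usage: run the configuration on an ARSession owned by an ARView or ARSCNView.
// arView.session.run(makeWorldTrackingConfiguration())
```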
These mobile AR efforts introduce Apple's customers to AR experiences while serving as real-world prototyping and testing for the next era of computing. And if the rumors and reports hold water, we can expect LiDAR to make the transition from smartphones and tablets to AR wearables in the not-too-distant future.
Cover image via Apple