Apple Pushes ARKit Closer to the AR Cloud with Location Anchors, Expands Face Tracking Support

Published Jun 9, 2021 04:13 PM · Updated Jun 10, 2021 07:08 PM

While Apple introduced new AR features for iOS 15 and Object Capture for RealityKit 2 during the WWDC 2021 keynote, updates for ARKit were curiously absent from the official presentation.

That doesn't mean nothing new is coming to Apple's toolkit for mobile AR apps. The next ARKit update nudges its AR experiences toward the sci-fi dream of the Metaverse, also known as the AR cloud.

The headlining feature of ARKit 5 is Location Anchors, an expansion of the persistent content functionality introduced in ARKit 2. Location Anchors enable apps to place AR experiences at precise real-world locations, from famous landmarks to friendly neighborhoods, based on latitude, longitude, and altitude coordinates.
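
In code, this maps to ARKit's ARGeoTrackingConfiguration and ARGeoAnchor APIs. Here's a minimal sketch of pinning an anchor to fixed coordinates; the latitude/longitude values are illustrative placeholders, and rendering content at the anchor (via RealityKit or SceneKit) is left out:

```swift
import ARKit
import CoreLocation

// Rough sketch: run a geo-tracking session, then pin an anchor to fixed
// real-world coordinates. The coordinates below are placeholders.
func startGeoTracking(with session: ARSession) {
    let configuration = ARGeoTrackingConfiguration()
    session.run(configuration)

    // ARKit estimates altitude when the altitude parameter is omitted.
    let coordinate = CLLocationCoordinate2D(latitude: 40.7484, longitude: -73.9857)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate)
    session.add(anchor: geoAnchor)
}
```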

However, Location Anchors will be limited to London and select US cities, including New York, at launch. You'll also need an iPhone XS, iPhone XS Max, iPhone XR, or a newer device to experience Location Anchors.
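
Apps can gate on both of those limits up front. A sketch of how that check might look: isSupported covers the hardware requirement, while checkAvailability asks whether Apple has localization data for the user's current area (the completion-handler wrapper is an assumption about how a host app would consume the result):

```swift
import ARKit

// Rough sketch: confirm both device support and local coverage
// before offering a geo-anchored experience.
func geoTrackingAvailable(completion: @escaping (Bool) -> Void) {
    // Hardware check: requires A12 Bionic or later with GPS.
    guard ARGeoTrackingConfiguration.isSupported else {
        completion(false)
        return
    }
    // Coverage check: is the user in a supported city?
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        if let error = error {
            print("Geo tracking unavailable: \(error.localizedDescription)")
        }
        completion(available)
    }
}
```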

Update: An on-demand session exploring ARKit 5, published on Thursday, reveals that the AR navigation mode coming to Apple Maps uses Location Anchors. The AR mode will launch in London, Los Angeles, New York, Philadelphia, San Diego, the San Francisco Bay Area, and Washington, D.C.

Images via Apple

Another new feature arriving in ARKit 5 is App Clip Codes, which enable developers to anchor AR content from ARKit apps or App Clips to a printed or digital marker code.
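
In practice, this means opting a world-tracking session into App Clip Code detection and reacting when ARKit resolves a scanned code. A rough sketch, assuming the surrounding class serves as the session's ARSessionDelegate:

```swift
import ARKit

// Rough sketch: enable App Clip Code tracking on supported devices.
func enableAppClipCodeTracking(on session: ARSession) {
    guard ARWorldTrackingConfiguration.supportsAppClipCodeTracking else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.appClipCodeTrackingEnabled = true
    session.run(configuration)
}

// ARSessionDelegate callback: act once a code's URL has been decoded.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let codeAnchor as ARAppClipCodeAnchor in anchors
    where codeAnchor.urlDecodingState == .decoded {
        // Place AR content at codeAnchor.transform, keyed to the decoded URL.
        print("App Clip Code resolved: \(codeAnchor.url?.absoluteString ?? "n/a")")
    }
}
```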

In addition, ARKit 5 will include improvements to Motion Tracking and extend Face Tracking support to the fifth-generation iPad Pro's Ultra Wide camera, as well as to the front-facing camera on devices with at least the A12 Bionic chip (iPhone SE and later). Devices with a TrueDepth front-facing camera will also be able to track up to three faces at once.
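
Requesting multi-face tracking is a one-line configuration change. A minimal sketch: supportedNumberOfTrackedFaces reports the device's ceiling (three on TrueDepth hardware, fewer elsewhere), so reading it keeps the configuration valid on single-face devices:

```swift
import ARKit

// Rough sketch: track as many faces as the current device allows.
func startFaceTracking(with session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    let configuration = ARFaceTrackingConfiguration()
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```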

But Location Anchors is the big deal here, as it represents Apple moving further in the direction of shared and persistent AR experiences that are the hallmark of the AR cloud.

Google has begun to tackle the concept with Cloud Anchors for ARCore, though that approach acts more as a real-world save state for user-generated content. Microsoft offers a similar solution in Azure Spatial Anchors for iOS, Android, and HoloLens.

A closer approximation to Apple's Location Anchors is the Landmark AR tech for Snapchat, available to creators through templates in Lens Studio. Landmark AR uses location plus visual positioning to anchor AR content to buildings and monuments.

More recently, Niantic began accepting applications for the private beta of its Lightship platform, which will eventually use a Visual Positioning System to enable developers to anchor content to real-world landmarks. Part of this effort involves crowdsourced 3D mapping, accomplished through in-game tasks in apps like Pokémon GO.

Facebook and Epic Games are also developing their own flavors of the Metaverse, along with startups like Ubiquity6.

All of this is to say that Apple isn't necessarily late to the AR cloud, just fashionably late. The segment is very much a future-oriented pursuit, and the companies already at the party are only getting started. But now that Apple is here, the fun can really begin.

Cover image via Apple
