In the arms race between ARKit and ARCore, Google scored a big win on Tuesday at Google I/O with the announcement of Cloud Anchors, its platform for shared AR experiences.
A developer session held on Wednesday offered more detail on how multi-user experiences built via Cloud Anchors will work on Android smartphones and iPhones.
The first app to take advantage of Cloud Anchors is Just a Line, which is available now for Android and will be released for iOS in the coming weeks, along with the update supporting multiple users. As part of the presentation, Google shared a teaser trailer before proceeding with a tutorial on Cloud Anchors.
What we've learned is that Cloud Anchors essentially takes reference points from the horizontal or vertical surfaces scanned by ARKit or ARCore and uses them to establish a common anchor shared between multiple devices.
"What the phone is going to extract from the environment is…what the phone sees as contrast points, where colors change, where the lighting changes," said James Birney, product manager for Cloud Anchors at Google, during the presentation. "Those are the visual features that get abstracted and get uploaded to the cloud."
The visual features are stored in the cloud under a Cloud Anchor ID, which apps share between users to establish a common reference frame across devices. As long as the users are viewing the same physical space, the apps can then match the visual features observed by each device to establish a mutual anchor point.
"Even though both devices are in different locations, we'll create Cloud Anchor in a consistent physical location. And that's the magic," said Birney. "Because they are in a consistent, physical location, you then have a shared reference frame."
After reviewing how the experience is coded, Birney and his colleague Eitan Marder-Eppstein, a software engineering manager at Google (who presented the Augmented Images portion of the session), demoed a sample game for the audience.
Birney also teased potential AR experiences from apps such as NASA's Spacecraft AR app, the forthcoming Bait! Under the Surface game, and the Jet.com app as use cases for education, creative expression, gaming, and shopping, respectively.
While Cloud Anchors promises to bring shared experiences to mobile AR, it stops short of persistent experiences, touted by AR cloud companies like 6D.ai, Niantic acquisition Escher Reality, and Google-backed companies Blue Vision and Ubiquity6. In persistent experiences, multiple users can observe the same AR content in the same place at different times. For instance, if I place a stormtrooper on a street corner, another user will be able to see that stormtrooper at that same street corner a week from now.
"The Anchor data can only be accessed within one day of creation. So if you create an anchor for you or others to use, when you come back to it one day after, it will be gone," wrote Alberto Taiuti, engineer and self-described enthusiast of machine learning and augmented reality, in a post on Medium."The raw data sent to the servers is deleted after seven days. Hence, not only your anchor transforms will disappear, you won't be able to retrieve even the raw data consistently."
Nonetheless, it's a start. While multiplayer experiences can already be achieved through marker-based AR, Cloud Anchors enables them in a markerless fashion. Suddenly, Google is further ahead than Apple (which has reportedly put off multiplayer in ARKit until next year) and the various other AR cloud upstarts in achieving shared experiences, with persistent experiences serving as the next milestone for the AR industry.