One of the more exciting augmented reality announcements Apple made during its WWDC keynote on Monday came in the form of Object Capture, a new 3D scanning feature coming to macOS Monterey.
That cry you just heard in the distance came from the chorus of companies that have published 3D scanning apps leveraging the LiDAR sensors of recent iPhones and iPads or built their business models around 3D capture services.
Meanwhile, Unity is sitting in the catbird seat of Apple's Object Capture rollout. As one of the early access partners working with Apple on Object Capture, Unity will bring the feature to AR development in Unity MARS by way of the iOS version of its AR Companion app.
Released as an open beta earlier this year, the AR Companion app enables developers to lay out scenes for an AR experience and test them in real-world environments.
While the production app, along with Object Capture integration, isn't slated to arrive until this fall, Unity has provided a sneak preview of how the feature will work.
Object Capture in the AR Companion app starts with an interactive interface that establishes a virtual shell around the object. As users take photos from various angles, the app places green pins on the corresponding polygons that make up the shell. To anyone who has captured a Photo Sphere with Google Camera, this process will feel very familiar. If the app detects a poor-quality photo, a red pin appears instead, marking a sector that should be reshot.
The minimum coverage required to proceed to model rendering is 70%. Once that threshold is reached, users can transfer the photos to Unity for processing via a new local wireless file transfer protocol or by other means. Rendering is a two-step process: a preview model, where parameters can be adjusted, followed by a full-quality model.
Unity developers won't have to use the AR Companion app for this process, though, as Object Capture supports photos taken with traditional cameras as well.
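For context, the macOS side of this pipeline is Apple's new PhotogrammetrySession API in RealityKit, announced alongside Object Capture, which turns a folder of photos into a 3D model. The sketch below shows roughly how the two-step preview-then-full-quality flow could look for a folder of photos from any camera; the file paths and the generateModels function name are placeholders, and the parameters Unity will expose in its own integration haven't been detailed.

```swift
import Foundation
import RealityKit

// Rough sketch for macOS Monterey: turn a folder of photos into USDZ models.
// All paths here are placeholders.
func generateModels() async throws {
    let photosFolder = URL(fileURLWithPath: "/path/to/captured-photos", isDirectory: true)

    // Create a photogrammetry session over the image folder.
    let session = try PhotogrammetrySession(input: photosFolder)

    // Two-step rendering: a quick preview model first, then a full-quality model.
    let requests: [PhotogrammetrySession.Request] = [
        .modelFile(url: URL(fileURLWithPath: "/path/to/object-preview.usdz"), detail: .preview),
        .modelFile(url: URL(fileURLWithPath: "/path/to/object-full.usdz"), detail: .full)
    ]

    try session.process(requests: requests)

    // Watch the session's output stream until both models are written.
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fractionComplete):
            print("Progress: \(Int(fractionComplete * 100))%")
        case .requestComplete(let request, _):
            print("Finished: \(request)")
        case .requestError(let request, let error):
            print("Request \(request) failed: \(error)")
        case .processingComplete:
            print("All models generated.")
            return
        default:
            break
        }
    }
}
```

Called from an async context in a command-line tool, the cheap preview pass lets you sanity-check the photo set before committing to the much slower full-quality pass, which mirrors the two-step flow Unity describes.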
According to a company spokesperson, Unity has been working with Object Capture for six weeks, which makes this much progress all the more impressive. Meanwhile, other companies focused on 3D scanning have some catching up to do.
The introduction of LiDAR to high-end iPhones and iPads revived the 3D scanning app segment, with Occipital Canvas, Polycam, and 3D Scanner App among the notable options. Now, what Apple has given with LiDAR, it has taken away with Object Capture.
Object Capture is even more problematic for companies that offer 3D scanning services. For instance, Jaunt pivoted from VR to 3D capture prior to its acquisition by Verizon. Perhaps it isn't a coincidence that Jaunt co-founder Arthur van Hoff joined Apple after the pivot away from VR.
It isn't the first time that Apple has introduced a new feature that renders whole apps or companies obsolete. However, when it comes to AR development tools like ARKit and Reality Composer, Unity is still along for the ride. If anything, Object Capture makes Unity's development environment, particularly its flexibility to create AR experiences across operating systems, that much more essential.