In a recent series of tweets, investor and entrepreneur (and NR50 member) Amitt Mahajan summarized the challenges and opportunities for iOS developers looking to leverage Apple's ARKit for augmented reality experiences.
"ARKit apps still have the same problem as every other mobile app: convincing users to install your app and make using it a daily habit," said Mahajan, managing partner with Presence Capital, a virtual and augmented reality venture fund with interests in Meta and ScopeAR.
In addition, Mahajan predicted that Apple would facilitate such features through a default OS app for visual search, like Google Lens or Samsung Bixby, that third-party apps could access. The approach would follow Apple's established pattern of building platform-level features into iOS and then exposing them to third-party apps, as it has already done with Spotlight and Siri.
"Because of download friction, most AR content added to real world places will likely be viewed in an OS-level app and exposed via hook. ... Thus it follows that most AR functionality will be an add-on to existing apps as a feature rather than a standalone app."
As an example, he pointed to Yelp, which could update its iOS app to display a restaurant's star rating based on GPS coordinates when a visual search is performed. Users would not need to install another app; the feature would arrive through an update, subject to the permissions they grant.
"The takeaway is that a standalone AR app needs to do something really special to justify its existence and stand out in the store," said Mahajan. "I'd start with something not possible before ARKit: apps that require the user to move around in 3D space or using camera feed as [input]."