With technology giants like Apple and Google finally entering the fray, the move toward mass adoption of augmented reality is ramping up. Apple's ARKit and Google's ARCore will allow entirely new categories of apps to be made. Unfortunately, in a world of heavy competition, getting these two frameworks to work together wasn't a priority for either company.
Recently, at Unite Austin 2017, during a session called "So, you think you can Augment Reality?", Unity developers Jimmy Alamparambil and Tim Mowrer showed off ARInterface. The API is designed to help developers create experiences that use both ARKit and ARCore at the same time.
As the two developers played a game, one on an iPad, the other on a Google Pixel, minds were blown. If the camera had been pointed at the audience, you would have seen a certain writer/developer (yours truly) bouncing up and down in his seat, enthusiastically clapping in response to the demonstration.
In a blog post on Wednesday, Jimmy Alamparambil, part of Unity's Emergent Technology team, finally released the experiment into the wild. Up until this point, getting ARCore and ARKit apps to work together was not an impossible feat, but it was a tough process. While the two AR platforms take similar approaches to the underlying technical problems, their implementations differ significantly, not least because they are written in completely different programming languages.
ARInterface, which can be downloaded from the Unity GitHub, handles the workload of dealing with ARKit and ARCore by abstracting their common functionality into a single layer, giving developers one interface to work with. Developers can still drop down to the lower level through ARKitInterface and ARCoreInterface when they need to, but that's only necessary when doing something unique to one platform or the other.
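To make the design concrete, here is a minimal sketch of that abstraction pattern. Unity's actual API is C#, and the method names below are illustrative assumptions modeled loosely on the class names mentioned above, not ARInterface's real signatures:

```python
from abc import ABC, abstractmethod

# Illustrative sketch of the pattern: one common layer, two platform
# backends. These names mirror the classes mentioned in the article,
# but the methods are hypothetical, not Unity's actual API.

class ARInterface(ABC):
    """The single layer that app code is written against."""

    @abstractmethod
    def start_service(self):
        """Begin an AR session on the underlying platform."""

    @abstractmethod
    def try_get_pose(self):
        """Return the device pose, or None if tracking isn't ready."""

class ARKitInterface(ARInterface):
    def start_service(self):
        return True  # would configure and run an ARKit session on iOS

    def try_get_pose(self):
        return (0.0, 1.5, 0.0)  # placeholder camera position

class ARCoreInterface(ARInterface):
    def start_service(self):
        return True  # would create and resume an ARCore session on Android

    def try_get_pose(self):
        return (0.0, 1.4, 0.0)  # placeholder camera position

def get_interface(platform):
    """Pick the right backend; app code never branches on platform again."""
    return ARKitInterface() if platform == "ios" else ARCoreInterface()

# Cross-platform app code is written once against the common layer:
ar = get_interface("ios")
if ar.start_service():
    pose = ar.try_get_pose()
```

The payoff of the pattern is in the last four lines: everything above them is platform plumbing, and everything below is written once.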
ARRemoteInterface is a tool in the API that allows users to test their app without compiling and deploying it to a device. This lets developers iterate faster, since they can quickly test changes without the sometimes lengthy build process. That's a major timesaver.
For anyone doing multi-platform AR development, whether it's a cross-platform app or an app dedicated to a single platform, the scale each platform uses can be a big factor. With the HoloLens' scale being "1 unit is 1 meter," one has to scale an object down so far that the Unity editor struggles to handle a close-up view of it. If that same object were then used on a different AR platform, it might appear microscopic or massive.
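The arithmetic behind that mismatch is simple. Here's a hedged sketch of it; the helper and the conversion factors are hypothetical illustrations, not part of ARInterface or any platform's specification:

```python
# Hypothetical illustration of the scale mismatch described above.
# The units-per-meter factors are made-up examples, not platform specs.

def rescale(size_in_units, units_per_meter_src, units_per_meter_dst):
    """Convert a size authored under one platform's unit convention to
    another's, by going through real-world meters as the common ground."""
    meters = size_in_units / units_per_meter_src
    return meters * units_per_meter_dst

# A 1.8-unit character authored where 1 unit == 1 meter (HoloLens-style),
# moved to a hypothetical platform that uses 100 units per meter:
print(rescale(1.8, 1, 100))  # 180.0 units: skip this step and it's tiny
```

Forgetting the conversion in either direction is exactly how the same asset ends up "microscopic or massive" from one platform to the next.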
During the talk at Unite, Alamparambil and Tim Mowrer, Senior XR Engineer at Unity, revealed two clever solutions they've developed to help alleviate this problem: one using a single camera and one using two cameras (for now, that's all the detail I can reveal). Apparently, only a version of the single-camera solution will be ready for developer use. In the Unity blog post, Alamparambil framed this as a major topic worthy of its own blog post, which will be coming soon.
As someone who deeply believes that the only way for AR to really take off in the mainstream is to make it as simple as possible to enjoy, having cross-platform options available out of the box is really important. Thanks to the work Unity has done, I've done quite a few cross-platform experiments in the last year or two, including having a shared experience running on a PC, HoloLens, Xbox One, and Android at the same time. It wasn't pretty, but it worked.
Expect to see some experiments implementing ARKit and ARCore together here on Next Reality soon. And just maybe we can get the HoloLens working, too. In the meantime, we have a new shiny toy to play with. Grab your iPhone and Android and make something cool.