
What Is Augmented Reality?

Apple CEO Tim Cook has said that augmented reality (or AR for short) will "change everything." But what, exactly, is augmented reality?

Generally, augmented reality refers to the process of presenting virtual objects and imagery — usually through a wearable lens or external display — layered over the real world. The level of immersion and degree of interactivity can vary, from 2D virtual images and interfaces (as seen in Google Glass), to fully 3D virtual objects that can be viewed from multiple angles and handled as though they had real form in the context of the world (as seen on the HoloLens and Magic Leap One). Although varying terms have been used to describe different methods of delivering this kind of computing experience, "AR" has become the most broadly used and easily understood term for the wide-ranging flavors of presenting virtual content over the real world.

And while the terms AR and VR (virtual reality) are often mentioned in the same breath, AR should not be confused with VR. VR is a computing experience that completely closes you off from the real world and presents you with an entirely computer-generated environment. Whatever you see in VR is not real, and your real world surroundings are obscured. On the other hand, with AR, you get to maintain full visual contact with the real world, and the virtual objects you see are only "added" to your view of the real world, rather than closing you off from reality. In Hollywood terms, VR is similar to the Star Trek holodeck, while AR, in its most simple form, is similar to the virtual objects and displays we've seen in Marvel's Iron Man movies when Tony Stark is in his lab, or flying in his suit.

The shortcut definition for most mainstream media outlets is to reference successful examples that consumers will recognize. Usually, this means name-dropping Pokémon GO, which revolutionized mobile gaming by turning neighborhoods into virtual playgrounds where players catch creatures they see through their smartphone cameras. Or, some may reference Snapchat's AR camera, which changed social media by layering AR imagery, effects, and masks over users' faces, adding another dimension of fun and creativity to photos and videos.

However, over the past few years, the augmented reality industry has grown to mean much more than the few high-profile successes. Augmented reality also means hardware, namely headsets like the Microsoft HoloLens and the Magic Leap One, which push the boundaries of spatial computing into deeper immersive territory. Along with these high-end smartglasses, which help enterprises become more efficient (and could eventually revolutionize computing for consumers, too), the AR evolution is also taking place in the realm of smartphones, primarily embodied by Apple's iPhone X series, and more recently Samsung's Galaxy Note 10+.

On the software side, augmented reality is defined by mobile toolkits, like Apple's ARKit and Google's ARCore, both of which enable mobile app developers to create augmented reality experiences, as well as the next-generation AR cloud platforms that promise to bring even more realistic augmented reality content to smartphones and beyond.

Moreover, based on its impact across multiple industries to date, and its future potential, augmented reality has captured the attention of investors. Even so, the technology is still in its infancy.

Definition of Augmented Reality & Its Attributes

While modern technology has brought about the "silver age" of augmented reality, the concept and even some of the implementations of the technology are not new.

In 1994, researchers Paul Milgram, Haruo Takemura, Akira Utsumi, and Fumio Kishino defined augmented and virtual reality as points on a spectrum, dubbed the Reality-Virtuality continuum. On one end of the spectrum is the real environment seen naturally by humans. On the opposite end of the spectrum lies virtual reality, where the real environment is replaced completely by a digital environment.

Points in between the real environment and virtual reality are occupied by augmented reality, where hardware and software supplement the natural environment with digital content.

The researchers also coined the term mixed reality as an overarching classification for technology that merges the real and virtual environments, with Microsoft co-opting the term and conflating it with its own Mixed Reality platform for VR (thus confusing some consumers in recent years).

Fast-forward to the modern era, and the textbook definition of augmented reality is realized through a computer's understanding of its environment, usually gathered through a connected camera, which allows virtual content to be placed within the user's field of view.

One way environmental understanding is achieved is via markers that enable the computer to track content within the environment. A marker can be something as simple as the equivalent of a QR code: an image that the computer's camera recognizes as an area for placing virtual content. Another means of establishing a marker is a beacon that communicates its physical location to the AR device.
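
To make the marker idea concrete, here is a minimal sketch of marker-style AR using ARKit's reference-image tracking, which treats a known 2D image much like a QR-style marker. This is only one possible implementation, not the only way markers work; the image group name "AR Markers" and the view setup are assumptions for illustration.

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch of marker-style AR with ARKit image tracking.
// Assumes reference images live in an asset catalog group named "AR Markers"
// and that sceneView is an ARSCNView already added to the view hierarchy.
class MarkerViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard let markers = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Markers", bundle: nil) else { return }

        let config = ARImageTrackingConfiguration()
        config.trackingImages = markers          // the "markers" the camera looks for
        sceneView.delegate = self
        sceneView.session.run(config)
    }

    // Called when ARKit recognizes one of the reference images in the camera feed.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }

        // Place a flat virtual panel directly on top of the detected marker.
        let size = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: size.width, height: size.height)
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.6)

        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2       // lay the plane flat on the image
        node.addChildNode(planeNode)
    }
}
```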

Conversely, environmental understanding without a marker entails building a 3D map of the environment. Early markerless augmented reality experiences required a camera that could sense depth within the environment. Without a depth sensor, computers can employ computer vision algorithms trained to estimate surfaces for anchoring virtual content in the environment.
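
As a rough illustration of the markerless approach, the sketch below uses ARKit's world tracking, which estimates flat surfaces from the camera feed and motion data without any printed marker. The function and class names here are illustrative, and the setup assumes an ARSession already exists (for example, from an ARSCNView or ARView).

```swift
import ARKit

// A minimal sketch of markerless AR: instead of a marker, ARKit builds an
// understanding of the scene and reports flat surfaces as ARPlaneAnchor objects.
func startMarkerlessTracking(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]   // estimate surfaces without markers
    session.run(config)
}

// An ARSessionDelegate can then watch for detected surfaces and use them
// as anchors for virtual content.
class PlaneWatcher: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Found a surface with extent \(plane.extent) at \(plane.center)")
        }
    }
}
```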

Another element of environmental understanding is occlusion, which refers to real-world objects blocking the view of virtual content from the perspective of the computer's camera and its user, thus enhancing the realism of the virtual content. Typically, this requires a depth sensor, but computer vision advancements have begun to demonstrate the ability to identify physical objects within the camera view without one.
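
One concrete (if narrow) example of occlusion is ARKit's people occlusion, sketched below. With the frame semantic enabled, a person standing in front of a virtual object hides it instead of the content being drawn on top of everything; this assumes a device that supports the feature and is meant only as an illustration of the general idea, not of object occlusion in general.

```swift
import ARKit

// A minimal sketch of enabling people occlusion in ARKit, assuming a
// supported device. The capability check keeps it from crashing elsewhere.
func enablePeopleOcclusion(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(config)
}
```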

Finally, realistic augmented reality experiences call for 3D content. Developers generally use the same game engines used to create virtual reality experiences, chiefly Unity and Unreal Engine, to create augmented reality content. Along with 3D engines, AR experiences need 3D models to display in real-world physical environments. Models can be created in 3D modeling programs or captured through photogrammetry of real-world objects.
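
Unity and Unreal are the engines named above; to stay consistent with the Swift sketches in this section, here is a hedged alternative using Apple's RealityKit to drop a bundled 3D model onto a detected surface. The "robot" model name is an assumption; in practice the model would come from a 3D modeling program or photogrammetry, as described above.

```swift
import RealityKit

// A minimal sketch of displaying a 3D model in the real world with RealityKit.
// Assumes a USDZ file named "robot.usdz" is bundled with the app and that
// arView is an ARView already on screen; the file name is illustrative.
func placeModel(in arView: ARView) {
    do {
        let model = try ModelEntity.loadModel(named: "robot")

        // Anchor the model to the first horizontal surface ARKit finds.
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(model)
        arView.scene.addAnchor(anchor)
    } catch {
        print("Could not load model: \(error)")
    }
}
```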

How Technology Delivers Augmented Reality Experiences

We've established that augmented reality experiences are delivered via computing devices. For the average consumer, this means smartphones and tablets, which have the cameras for reading markers or detecting surfaces and the mobility for users to orient the device to their field of view. Many current smartphones also include motion sensors (usually an accelerometer, a magnetometer, and a gyroscope) that enable AR apps to keep virtual content aligned with the user's environment.
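
AR frameworks consume those sensor readings internally, but a rough sketch of reading them directly with Apple's Core Motion shows what the accelerometer, gyroscope, and magnetometer contribute: a fused "attitude" describing how the phone is oriented. The function name and update rate are illustrative assumptions.

```swift
import CoreMotion

// A minimal sketch of reading the fused motion-sensor data (device attitude)
// that helps AR apps keep virtual content aligned with the world.
let motionManager = CMMotionManager()

func startOrientationUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                           to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Roll, pitch, and yaw describe how the phone is oriented in the world.
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```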

However, for a more natural experience, head-mounted displays or fully immersive augmented reality headsets and smartglasses fit the bill. In layman's terms, AR headsets are essentially mobile devices with miniature displays positioned in the user's line of sight, driven by a computer that is either embedded in the wearable itself or tethered to an external unit. In more advanced cases, AR headsets also include depth sensors for environmental mapping.

Augmented reality has also made its way into automobiles. Heads-up displays in modern car models bring instrument panels, infotainment, and navigation into the driver's windshield viewing area. When autonomous vehicles supplant manual models, augmented reality will likely play a role in sharing the vehicle's view of the world — such as its recognition of other cars, pedestrians, and road hazards — with its passengers.

"Altered Carbon" depicts a future of AR via contact lenses. Image via Netflix

There are other form factors for augmented reality devices as well. For example, Lampix is an augmented reality lamp that can project an interactive workspace onto any surface. Science fiction gives us examples of other forms that may arise in the near future as well: Minority Report and Iron Man provide influences for interactive interfaces beyond AR headsets, and Netflix original series and movies have been a treasure trove of science fiction examples illustrating our potential real AR future, such as the AR contact lenses in Altered Carbon and the neural implants in Anon and Black Mirror.

Modern Pioneers of Augmented Reality

Augmented reality is far from a new technology. As a military tool, rudimentary AR usage dates back to the 1960s in heads-up displays for fighter jets. And that yellow first down marker line in American football TV broadcasts? Yup, that's a form of augmented reality, too.

For all intents and purposes, though, the modern era of augmented reality tracks back to the 2010s. One of the earliest examples of mobile augmented reality was Layar, an augmented reality browser that displayed waypoints in its camera view and facilitated marker-based AR experiences. Another augmented reality startup, Blippar, bought Layar in 2014 to bolster its marker-based AR platform for advertisers.

But the one advanced AR device that the general public knows best is Google Glass, which made its public debut at Google I/O in 2012. Google made the wearable device available for purchase for $1,500 through an exclusive Explorer Program in 2013, which expanded to a wider audience in 2014. Unfortunately, the device, which used a not-too-subtle display and camera mounted into its frames for showing notifications and content in the user's field of view, faced a public backlash, with early adopters labeled as potentially privacy-invading "glassholes." Google shelved the product for mainstream consumers, but relaunched the device in 2017 for enterprise customers, a segment that has found the technology useful for improving the productivity of various kinds of workers.

Google took another shot at augmented reality hardware in 2014 with its Project Tango platform, a combination of depth sensors for manufacturers and a development kit for building apps that can take advantage of the hardware. The first commercially available Tango device was released in 2016 via the Lenovo Phab2 Pro, which was followed in 2017 by the Asus ZenFone AR. Google closed down the program in 2017 in favor of a toolkit designed to work without specialized hardware (more on this later).

The year 2016 ended up being a pivotal one for modern augmented reality. Microsoft made its HoloLens headset available for purchase that year after introducing it in 2015. The HoloLens set the standard for augmented reality wearables by employing a depth sensor (adapted from the Kinect camera accessory for the Xbox) to map physical environments, along with a gesture recognition system that has become a blueprint for other augmented reality headsets. However, at a price of about $3,000, the market for the HoloLens has been limited mostly to enterprise businesses and developers on the bleeding edge.

In the mobile AR ecosystem (that is, AR you use through a smartphone or tablet), Pokémon GO became the first blockbuster augmented reality app, turning neighborhoods and parks into virtual playgrounds for players to capture virtual creatures. In addition, Snapchat first added AR camera effects, or Lenses, to its app in 2016. Snapchat's flavor of AR has since kickstarted the marketing industry's adoption of the technology for a number of major brands and entertainment franchises. The two aforementioned mobile AR apps have become synonymous with augmented reality, particularly within mainstream media reports that attempt to explain AR to neophytes.

The Golden & Silver Ages of AR

For some AR industry watchers, it may seem premature to define what would be the silver or golden age of augmented reality. Nevertheless, considering the proliferation of new AR technology over the past two years alone, we may very well look back on this period as the silver age of AR.

As Snapchat has continued to build upon its AR platform, adding AR content for the rear camera and enabling creators and brands to develop their own AR experiences with the Lens Studio desktop tool, Facebook has mirrored its AR strategy with its own AR platform and development app, Spark AR.

Meanwhile, Apple and Google have made it easier for mobile app developers to integrate AR into their apps with ARKit for iOS and ARCore for Android. The development toolkits use the cameras and computer vision capabilities of compatible smartphones and tablets to detect surfaces for anchoring AR content, simulate environmental lighting, and provide other features that help display virtual content realistically in the real world.
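
As a small, hedged example of the environmental-lighting feature mentioned above, the sketch below reads ARKit's per-frame light estimate; an app could use these values to adjust the brightness and color temperature of virtual content so it matches the room. The function name is illustrative, and the estimate is only available when light estimation is enabled on the session (it is by default).

```swift
import ARKit

// A minimal sketch of reading ARKit's environmental light estimate
// from the current camera frame.
func readLightEstimate(from session: ARSession) {
    guard let estimate = session.currentFrame?.lightEstimate else { return }
    let intensity = estimate.ambientIntensity          // ~1000 represents neutral lighting
    let temperature = estimate.ambientColorTemperature // in kelvin; ~6500 K is daylight
    print("Ambient intensity: \(intensity), color temperature: \(temperature) K")
}
```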

While Google abandoned depth sensors for smartphones, Apple has begun to ship depth-sensing cameras in its iPhone X series. Apple's TrueDepth camera has enabled facial recognition experiences that bleed into the AR space through tools such as Animoji. The company is reportedly working on expanding the technology to the rear camera, which would undoubtedly lead other smartphone makers to follow suit.
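
For a sense of what the TrueDepth camera exposes to developers, here is a minimal sketch of ARKit face tracking, the same capability that powers Animoji-style experiences. It assumes a device with a front-facing TrueDepth camera, and the class name and the jaw-open example are illustrative.

```swift
import ARKit

// A minimal sketch of TrueDepth-based face tracking. Blend shapes report
// how far individual facial features have moved, on a 0.0 to 1.0 scale.
class FaceWatcher: NSObject, ARSessionDelegate {
    func start(on session: ARSession) {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // 0.0 = mouth closed, 1.0 = fully open; could drive an animated character's jaw.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            print("Jaw open: \(jawOpen)")
        }
    }
}
```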

The platforms for mobile AR are quickly evolving, with concepts like the AR cloud (a digital copy of the world that enables multiuser experiences and persistent content in the real world) and occlusion gradually taking hold. So far, the Niantic Real World Platform, 6D.ai, and Ubiquity6 are among the leading AR cloud platforms in beta testing.

And after years of hype, Magic Leap finally released its AR headset, the Magic Leap One, in 2018. The device marks the strongest challenge to the HoloLens yet, with similar spatial computing and user interface capabilities at a slightly lower price. While it's still priced well above the sweet spot for mainstream consumers, Magic Leap has begun to roll out a content line-up that appears to be drawing more interest from the consumer market, at least in terms of online chatter and general curiosity (it remains to be seen whether that interest will translate into sales).

The first wave of mainstream-style smartglasses targeted toward consumers is just now hitting the market, with North's Focals available for as little as $599 and the Vuzix Blade retailing for $999. Both act as heads-up displays, offering on-demand information, limited app interactions, and voice commands with Amazon Alexa. While its release date hasn't been announced, the Nreal Light takes smartglasses a step further with the ability to display 3D content via a tethered computer pack and fashionable, sunglasses-style frames.

These lesser-known companies might be the first ones out of the gate, but the big tech companies are preparing to enter the consumer smartglasses race as well, with Apple as the odds-on favorite to eventually dominate the space (as they typically do with consumer electronics). Also, Snap, Facebook, and Google are all reportedly developing their own smartglasses or AR headsets.

Challenges Facing the Augmented Reality Industry

As the augmented reality industry moves forward, it faces a range of challenges on the road to becoming a true mainstream technology. While the industry presents smartglasses as an always-on experience that brings constant information into the user's point of view, wearables also need to fall within a form factor that the average consumer would consider stylish enough for everyday use. Hardware makers also need to consider the everyday functionality of these devices, as well as the computing and battery power required to run content that merges seamlessly with reality (i.e., replicating what the HoloLens and Magic Leap One do best, but in a smaller form factor).

Hardware makers also need to be able to offer smartglasses at a price that reflects the value of the package. For instance, a pair of smartglasses that offers the same relative functionality as a smartwatch likely won't sell many units if priced like a smartphone. Conversely, in some cases, an AR headset maker may willingly accept an unwieldy form factor and a higher price in order to deliver top-of-the-line functionality.

There are issues the industry needs to solve on the software side as well. A new paradigm of computing calls for an evolution in user interface. Moreover, streaming 3D content requires more data bandwidth, as well as greater network speed to deal with latency issues in AR cloud situations (which is why so many AR companies are pinning their hopes on the speed of new 5G wireless networks).

Still, as long as processors continue to become more efficient, and display technology continues to advance at its current pace, these are all issues that the industry should be able to solve within the next five to 10 years.


Cover image via Microsoft/YouTube
