The experience of actually using the HoloLens 2 can be difficult to describe to anyone who hasn't had a chance to directly interact with the device in person and be blown away by its immersive capabilities.
That's why any new exploration into exposing the augmented reality magic made possible by the HoloLens 2 demands attention as we reveal how computing is about to change for the entire planet. This latest demo is no different.
This demonstration showcases the hand tracking and object interactions that allow the HoloLens 2 to simulate the experience of touching and holding objects. In this case, the object is a virtual cube being handled by a pair of virtual hands overlaid on the user's real hands. This virtual hands effect is useful because it allows the virtual light beaming from the cube to interact with the virtual hands, thus enhancing the realism.
In addition to showing off the realism that the occlusion mesh delivers during the interaction, the demo also introduces us to the virtual physics that make interacting with a virtual object seem real.
"HoloLens 2 does not simulate the sense of touch, a key part of hand interactions with objects," says Oscar Salandin, a Microsoft HoloLens designer based in London, in a blog post detailing his virtual design methodology. "We cannot make virtual objects physically influence your hand through touch, but we can use light to show a relationship between the object and your hand."
The absence of haptic feedback in augmented reality, as well as in virtual reality, can often be a stumbling block to conveying a sense of realism (assuming you aren't wearing a haptic vest or bodysuit). In VR, this haptic gap is often closed by delivering haptics through the game controllers. But in gesture-based AR, the challenge is steeper.
"Holding a bright object will cast light onto and through your hand, giving you more feedback about the interaction between your hand and the object," says Salandin. "Adding this subtle effect to the virtual hand has a surprisingly strong effect on the realism of the interaction, giving information about depth, proximity, and direction."
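The depth and proximity cues Salandin describes can be approximated with a simple distance-based falloff: the closer the hand gets to the glowing cube, the more brightly the cast light tints the virtual hand. Here's a minimal sketch of that idea in Python; the function name, parameters, and inverse-square falloff are illustrative assumptions, not Microsoft's actual implementation.

```python
import math

def hand_glow_intensity(hand_pos, object_pos, object_brightness, falloff=2.0):
    """Approximate how strongly a glowing object tints a nearby virtual hand.

    Uses a simple inverse-square falloff (falloff=2.0): as the hand nears
    the object, the cast light brightens, giving the user a depth cue.
    Illustrative sketch only -- not the actual HoloLens shader.
    """
    dx, dy, dz = (h - o for h, o in zip(hand_pos, object_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Clamp the distance to avoid a divide-by-zero when the hand "touches"
    # the object, and cap the result at the object's own brightness.
    distance = max(distance, 0.01)
    return min(object_brightness / (distance ** falloff), object_brightness)
```

A renderer would feed a value like this into the tint or emissive strength of the virtual hand's material each frame, so the glow fades smoothly as the hand pulls away.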
Indeed, this virtual realism effect is most often seen in VR, where the full immersion of the experience (being close to a virtual flame, or another person's avatar invading your personal space, for example) can help bridge the gap between the real and the virtual. But in AR, where the virtual is integrated with the real, it's a bit more difficult to trick the brain.
"This lighting effect blurs the line between digital and physical, as the hand you are now looking at is a composite of the lighting from both the user's real environment and virtual objects," says Salandin. "Some users described that holding a particularly bright red glowing hologram and seeing its effect on their skin made their hand feel warm, even though they knew that it could not really be heating up."
Another clever concept introduced in the demo video is the idea of a "telekinesis gesture," which effectively gives your virtual hands the power to move and control virtual objects in a realistic manner. "While telekinesis is not part of real physical interactions, it is a gesture many people have seen or imitated from popular media like Star Wars," says Salandin. "Here, we use a combination of eye-gaze and hand tracking to let the user confidently move an object without touching it."
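The combination Salandin describes can be sketched as two steps: use the eye-gaze ray to pick which object the user is looking at, then, while a pinch gesture is held, let that object follow the hand's motion at a distance. The Python below is a hypothetical per-frame sketch of that logic; all names and thresholds (the 5-degree gaze cone, the `telekinesis_step` function) are assumptions for illustration, not the HoloLens API.

```python
import math
from dataclasses import dataclass

@dataclass
class GazeRay:
    origin: tuple      # (x, y, z) of the user's eyes
    direction: tuple   # unit vector of where they are looking

def gaze_target(gaze, objects, max_angle_deg=5.0):
    """Return the id of the object nearest the gaze ray, within a small
    angular cone -- a stand-in for eye-tracking-based selection."""
    best, best_angle = None, max_angle_deg
    for obj_id, pos in objects.items():
        to_obj = tuple(p - o for p, o in zip(pos, gaze.origin))
        norm = math.sqrt(sum(c * c for c in to_obj))
        if norm == 0:
            continue
        cos_a = sum(d * t / norm for d, t in zip(gaze.direction, to_obj))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = obj_id, angle
    return best

def telekinesis_step(gaze, objects, pinch_active, hand_delta, held=None):
    """One frame of the 'telekinesis' interaction: while the user pinches,
    the gazed-at (or already-held) object mirrors the hand's movement."""
    if not pinch_active:
        return None, objects  # releasing the pinch drops the object
    target = held or gaze_target(gaze, objects)
    if target is not None:
        objects[target] = tuple(p + d for p, d in zip(objects[target], hand_delta))
    return target, objects
```

Passing the previously held id back in as `held` each frame keeps the object attached to the hand even if the user's gaze wanders mid-drag, which is what makes the gesture feel "confident" rather than twitchy.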
These references to telekinesis and Star Wars to explain full AR immersion illustrate that while many AR hardware and software providers are focused on enterprise, science fiction and gaming remain the best vectors for translating these cutting-edge interactions to the mainstream. That holds true even though the technology is still less than optimized for broad mainstream use on the high end, with devices like the HoloLens 2 and the Magic Leap 1.