One of the primary assumptions in the world of VR and augmented reality is that the user can "see" both virtual objects and the real-world structures around them. But what if the user doesn't have perfect eyesight, or any eyesight at all?
A new project from Google could eventually pave the way for the blind to participate more actively in the growing virtual worlds of AR.
Google's new Project Guideline gives blind runners the ability to detect the lines on a running track using image recognition (via an Android smartphone) and on-device machine learning that utilizes TensorFlow.
Using that software together with an Android smartphone camera strapped to the runner's waist and a pair of bone-conducting headphones, blind users have successfully run at full speed on tracks with painted guidelines, relying on Project Guideline to stay correctly positioned on the track just as any sighted runner would.
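Google hasn't published the internals of Project Guideline's model, but the general pipeline described here — an on-device TensorFlow model scanning camera frames to locate the painted line — can be sketched roughly as follows. The model file name, input size, and segmentation-mask output format are assumptions made for illustration, not Project Guideline's actual implementation.

```python
# Rough sketch of on-device line detection with TensorFlow Lite.
# The model file, input size, and output format are assumptions for
# illustration; Project Guideline's actual model is not public.
import numpy as np
import tensorflow as tf

# Hypothetical segmentation model that outputs a per-pixel "guideline" mask.
interpreter = tf.lite.Interpreter(model_path="guideline_segmentation.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def estimate_line_offset(frame_rgb: np.ndarray) -> float:
    """Return the guideline's horizontal offset from image center in [-1, 1].

    Negative means the line appears to the runner's left, positive to the right.
    """
    height, width = input_details[0]["shape"][1:3]
    # Resize and normalize the camera frame to the model's expected input.
    resized = tf.image.resize(frame_rgb, (height, width)).numpy()
    batch = np.expand_dims(resized.astype(np.float32) / 255.0, axis=0)

    interpreter.set_tensor(input_details[0]["index"], batch)
    interpreter.invoke()
    mask = interpreter.get_tensor(output_details[0]["index"])[0, :, :, 0]

    # Weight each image column by how strongly the model marks it as "line",
    # then compare the weighted center to the image midline.
    column_scores = mask.sum(axis=0)
    if column_scores.sum() < 1e-6:
        return 0.0  # No line detected; a real system would warn the runner.
    line_center = (column_scores * np.arange(width)).sum() / column_scores.sum()
    return float((line_center - width / 2) / (width / 2))
```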
Although this latest demo (bottom of this page) is based in Japan, the idea originated at a Google hackathon back in 2019, when Thomas Panek, the CEO of Guiding Eyes for the Blind, discussed some of his needs with developers at the event. Those developers quickly built a rough prototype in response, and in 2020 Google unveiled a polished version of Project Guideline to the public.
When blind runners wear the specially equipped smartphone and bone-conducting headphones, they hear a sound that gets louder as they veer farther away, left or right, from the painted track guideline. The result is a kind of virtual running assistant that frees a blind runner from needing a guide dog or a human running guide.
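The feedback described here — a tone that grows louder the farther the runner drifts from the line — can be sketched as a simple mapping from a normalized offset (like the one estimated above) to left/right audio gains. The dead zone, gain curve, and choice of which ear gets the cue are illustrative guesses, not Google's actual tuning.

```python
# Minimal sketch of the audio-feedback mapping described above: a tone that
# gets louder, in the ear on the side the runner has drifted toward, as the
# lateral offset from the guideline grows. Thresholds and curve are guesses.

def guidance_gains(offset: float, dead_zone: float = 0.1) -> tuple[float, float]:
    """Map a normalized offset in [-1, 1] to (left_gain, right_gain) in [0, 1].

    Inside the dead zone the runner is treated as on the line and hears nothing.
    Beyond it, volume rises with the size of the deviation.
    """
    magnitude = abs(offset)
    if magnitude <= dead_zone:
        return 0.0, 0.0
    # Scale the remaining range to [0, 1] so volume peaks at a full offset.
    volume = min((magnitude - dead_zone) / (1.0 - dead_zone), 1.0)
    # offset < 0: line is to the runner's left, so they drifted right -> right ear.
    return (0.0, volume) if offset < 0 else (volume, 0.0)


# Example: a runner who has drifted noticeably to the left of the line.
print(guidance_gains(0.6))    # (0.56, 0.0) -> louder tone in the left ear
print(guidance_gains(-0.05))  # (0.0, 0.0)  -> on the line, silence
```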
Experiments like Project Guideline matter to AR because they could serve as a gateway for blind users into the mostly visual world of the AR cloud. In recent years, we've criticized Bose for its broad use of the term "augmented reality" to describe its spatial audio glasses, largely because such marketing could confuse the still nascent AR consumer market by implying that the audio-only glasses have a visual component.
However, spatial audio is an integral part of AR. Initiatives like Project Guideline could take the tools of image recognition and spatial audio beyond the running track and into a fully fleshed-out virtual world for blind would-be AR users.
Blind users who can't see the virtual objects and real-world structures central to many immersive experiences could eventually be guided into the expanding AR cloud ecosystem via tools like Project Guideline. And since Google presides over one of the most powerful mapping systems on the planet, harnessing Google Maps and its growing use of AR also holds promise for assisting visually impaired users.
For now, Project Guideline is mostly an experiment, but it, and others like it, promise to throw open the doors to the metaverse for the blind, making it an overall richer experience for everyone.
Cover image via Google