While Magic Leap may have plans to introduce sign language translation for its smartglasses in the near future, students at New York University have demonstrated that such a feat is possible today with a smartphone and a prototype app.
Using computer vision and augmented reality, the ARSL app enables users to capture sign language with their smartphone camera and see a live translation into their native language. In turn, the app can translate spoken language into sign language.
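For a rough sense of what the computer vision side of an app like this involves, the hypothetical Python sketch below uses the open-source MediaPipe Hands and OpenCV libraries (an assumption for illustration, not necessarily what the ARSL team used) to pull hand-landmark coordinates from a live camera feed; the classify_sign function is a placeholder where a trained sign-recognition model would plug in.

# Minimal, hypothetical sketch: extract hand landmarks from a live camera feed,
# the kind of computer vision signal a sign-recognition prototype could build on.
# Assumes the mediapipe and opencv-python packages; classify_sign is a placeholder.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

def classify_sign(landmarks):
    """Placeholder: a trained model would map the 21 hand landmarks
    (x, y, z per joint) to a sign or fingerspelled letter."""
    return "?"

cap = cv2.VideoCapture(0)  # default camera
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
                label = classify_sign(hand.landmark)
                cv2.putText(frame, label, (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("sign capture", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()

A production app would need far more than this, notably temporal modeling, since many signs depend on motion across frames, along with the reverse spoken-language-to-sign path the students describe.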
The prototype was developed by Heng Li, Jacky Chen, and Mingfei Huang, computer science students at the NYU Tandon School of Engineering, through the Connected Futures Prototyping and Talent Development program, a partnership between the NYC Media Lab and Verizon created to invest in emerging technology, particularly augmented reality, virtual reality, and artificial intelligence.
"We make magic when we pair leading students with outstanding mentors in the Envrmnt team at our AR/VR lab," said Christian Egeler, director of XR product development for Envrmnt, Verizon's platform for extended reality solutions, in a statement. "We discover the next generation of talent when we engage them in leading edge projects in real time, building the technologies of tomorrow."
This year, the program will fund a dozen projects, including the sign language translation app. Other projects funded through the program include Impromptu, a multi-user AR experience that lets non-musicians perform music, and Dreamine, a clay set that imports creations into an AR game.
"NYC Media Lab is grateful for the opportunity to connect Verizon with technically and creatively talented faculty and students across NYC's universities" said Justin Hendrix, Executive Director of the NYC Media Lab. "We are thrilled to continue to advance prototyping in virtual and augmented reality and artificial intelligence. These themes continue to be key areas of focus for NYC Media Lab, especially with the development of the first publicly funded VR/AR Center, in which the Lab is developing in conjunction with NYU Tandon School of Engineering."
Whether any of these apps will find their way to the App Store or Google Play is uncertain. According to a Verizon spokesperson contacted by Next Reality, commercial app releases were not the goal of the program; however, some teams, potentially including ARSL, will seek commercial release in the future.
"Although we know that we just explored the tip of iceberg for this long time problem for the global sign community, we would like to continue to interview our end users for their insights," Li told Next Reality. "[We would also like to] interview experts in the field to discover what other emerging technologies and techniques can help on top of computer vision."