It may sound like déjà vu, but neural interface startup CTRL-labs has closed a $28 million funding round led by GV, Google's venture capital arm, for technology that reads users' nerve signals to interpret hand gestures.
That's because GV, along with Lux Capital, also led a $28 million funding round last May.
The New York-based startup is part of a growing field of user input technologies that includes gesture recognition, eye tracking, and brain control interfaces. These technologies cater to augmented reality platforms, along with other emerging fields like robotics, by offering more natural inputs for the paradigm of spatial computing.
- Don't Miss: Startup Asteroid Aims to Reinvent Computer Interface with Augmented Reality Software & Hardware Kit
"CTRL-labs' development of neural interfaces will empower developers to create novel experiences across a wide variety of applications," said Erik Nordlander, general partner at GV, in a statement. "The company has assembled a team of top neuroscientists, engineers, and developers with deep technology backgrounds, creating human-computer interactions unlike anything we have seen before."
The technology from CTRL-labs differs from other neural interfaces, such as brain control systems, that read user intent and emotion from brain activity. Instead, CTRL-labs aims to monitor the activity of individual neurons at the user's wrist to read intended hand gestures.
To bring its technology to end users, the company is developing CTRL-kit, a combination of hardware and software for integrating the system into apps. The hardware consists of a wearable electromyography (EMG) device that measures neuron signals through skin-contact sensors. On the software side, CTRL-kit provides developers with an SDK compatible with Unity and JavaScript, along with APIs for reconstructing hand poses and measuring the force of gestures.
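CTRL-labs has not published the details of the CTRL-kit API here, but as a rough illustration, the sketch below shows how an app might consume hand-pose and gesture-force events from a wrist-worn EMG device. Every name in it (the client class, event shapes, and fields) is a hypothetical assumption for illustration, not the actual SDK.

```typescript
// Hypothetical sketch only: the real CTRL-kit SDK surface is not documented here.
// This models an app consuming hand-pose and gesture-force events from a
// wrist-worn EMG device; all names and shapes below are illustrative assumptions.

// A reconstructed hand pose: one 3D rotation (Euler angles) per finger joint.
interface HandPose {
  timestamp: number;                                   // ms since session start
  jointAngles: Record<string, [number, number, number]>;
}

// A discrete gesture with an estimated contraction force, normalized to 0..1.
interface GestureEvent {
  name: string;                                        // e.g. "pinch", "fist"
  force: number;                                       // derived from EMG amplitude
}

type PoseListener = (pose: HandPose) => void;
type GestureListener = (gesture: GestureEvent) => void;

// Minimal stand-in for a device client that streams decoded EMG output.
class MockNeuralInterfaceClient {
  private poseListeners: PoseListener[] = [];
  private gestureListeners: GestureListener[] = [];

  onPose(listener: PoseListener): void {
    this.poseListeners.push(listener);
  }

  onGesture(listener: GestureListener): void {
    this.gestureListeners.push(listener);
  }

  // Emit one simulated reading instead of reading real sensors.
  start(): void {
    this.poseListeners.forEach((l) =>
      l({
        timestamp: Date.now(),
        jointAngles: { indexProximal: [12.5, 0.0, 3.2] },
      })
    );
    this.gestureListeners.forEach((l) => l({ name: "pinch", force: 0.7 }));
  }
}

// Example app wiring: log poses and react only to forceful gestures.
const client = new MockNeuralInterfaceClient();
client.onPose((pose) => console.log("pose at", pose.timestamp, pose.jointAngles));
client.onGesture((g) => {
  if (g.force > 0.5) {
    console.log(`strong ${g.name} detected (force=${g.force})`);
  }
});
client.start();
```

The force field in particular reflects the article's point that the system measures not just which gesture was made but how hard it was made, which is the kind of signal a developer could map to analog controls in an AR app.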
"Like the developers and creators we hear from, we feel fundamentally dissatisfied with the pervading technologies of the last century," said Thomas Reardon, CEO of CTRL-labs. "Our objective with CTRL-kit is to give the industry's most ambitious minds the tools they need to reimagine the relationship between humans and machines."
This second round of investment from Google into CTRL-labs compounds its interest (no pun intended) in next-generation user input technology. Google is also developing its own hand gesture recognition technology, dubbed Project Soli, which uses radar.
For a company that is both selling Google Glass to enterprises and working to bring a consumer-facing augmented reality headset to market, the investment in futuristic input technologies hints at what Google might be preparing for its AR wearables.
Cover image via CTRL-labs/YouTube