Neurable Planning to Bring Brainwave Controls to AR Headsets Later This Year

Apr 5, 2017 06:37 PM

Within the coming months, software startup Neurable plans to introduce the next paradigm in virtual and augmented reality: the brain–computer interface (BCI).

A BCI interprets brain signals so that devices can infer user intention from a set of predefined selections. By relying on subconscious brain activity, BCIs let users control devices with their thoughts alone.

While the underlying technology has been around for about 40 years, most companies working with BCIs rely on meditation- and concentration-based interaction methods. With that approach, the software can take up to a minute to understand the user's intention because of the difficulty of extracting usable data from recorded brain activity.

What sets Neurable apart? It can translate brain activity in real time.

Ramses Alcaide, CEO of Neurable, demonstrates his brain–control interface on a model car; the technology has also been tested on full-sized automobiles.

Neurable relies on a three-pronged artificial intelligence and machine learning approach to interpret intentions instantly, according to Ramses Alcaide, CEO of Neurable. First, the company applies filters to clear out noise in the brain activity. Second, the software predicts users' intentions based on observed similarities between individuals. Third, the platform takes a different approach to the time-series data produced by brain activity.

"A lot of people look at data as individual points, but we really look at shapes, and the way the time series data comes together," said Alcaide in a phone interview with NextReality.
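Neurable has not published its algorithms, but the pipeline Alcaide describes (filter out noise, then classify the overall shape of the time series rather than individual points) resembles a standard EEG classification workflow. The following is a minimal illustrative sketch of that general idea; all function names, templates, and parameters here are hypothetical, not Neurable's actual code:

```python
# Illustrative sketch of a shape-based EEG classification pipeline.
# This is NOT Neurable's implementation; names and values are made up.
import math

def moving_average(signal, window=3):
    """Step 1: crude noise filter -- smooth the raw EEG samples."""
    half = window // 2
    return [
        sum(signal[max(0, i - half):i + half + 1]) /
        len(signal[max(0, i - half):i + half + 1])
        for i in range(len(signal))
    ]

def shape_distance(a, b):
    """Step 3: compare whole time-series shapes, not individual points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(sample, templates):
    """Step 2: match the filtered sample against per-intent templates
    (in practice these would come from calibration data across users)."""
    smoothed = moving_average(sample)
    return min(templates, key=lambda label: shape_distance(smoothed, templates[label]))

# Toy example: two intent templates and one noisy observation.
templates = {
    "select": [0.0, 1.0, 2.0, 1.0, 0.0],
    "idle":   [0.0, 0.0, 0.0, 0.0, 0.0],
}
observation = [0.1, 0.9, 2.1, 1.1, -0.1]
print(classify(observation, templates))  # -> select
```

A real system would use proper band-pass filtering and a trained classifier rather than nearest-template matching, but the sketch shows why treating the waveform as a shape can separate intents that per-sample thresholds would miss.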

Neurable expects to debut its software development kit (SDK) in the second half of 2017 through a closed beta among select developers. According to Alcaide, the beta is closed to ensure quality feedback, though he notes that Neurable will keep an open mind toward developers interested in participating.

After completing the beta, Alcaide expects to start licensing the SDK to software developers and AR/VR headset manufacturers. The software is hardware-agnostic and compatible with the likes of the Oculus Rift, HTC Vive, and Microsoft HoloLens; Neurable requires only electroencephalogram (EEG) sensors to read brain activity. Alcaide notes that the company is talking to several headset makers about integrating the platform into their devices.


Neurable is hardware agnostic and requires only EEG sensors to read brain activity.

Initially, Neurable will focus on applying its BCI software to virtual and augmented reality applications for gaming and productivity. Use cases the team has already explored include navigating applications and casting spells in a video game with the mind alone.

"There is a desire to have brain control or other ways to control augmented reality and virtual reality. It's a big need right now, because interaction methods in those mediums haven't been solved yet," said Alcaide.

Alcaide notes that, from a development perspective, Neurable has been designed to be easy to program. The SDK includes plugin support for Unity, Unreal, and other game engines and UI platforms.

For the user, Neurable does require a short calibration period to confirm intentions. However, as more data becomes available from testing, Alcaide expects future versions to be accurate enough to remove the training altogether.

The Neurable SDK will include plug-in support for major game engines and UI platforms.

While Neurable will begin with mainstream applications, Alcaide expects to be able to apply the interface to other verticals, in particular Alcaide's original catalyst—helping those with disabilities.

Alcaide has spent much of his life pursuing technology to help people with physical impairments. His interest in BCI tech dates back to when he was eight years old and his uncle lost his legs in a car accident. That experience led him to focus on engineering. He earned his undergraduate degree in electrical engineering from the University of Washington, with a particular focus on control systems, machine learning, and prosthetic development.

I came to the realization that a lot of these engineering problems are going to be solved with time. Batteries are going to get denser, materials are going to get lighter; that's not where the true challenge is. The real challenge is connecting the brain with the actual mechanical and electrical systems.

— Ramses Alcaide, CEO of Neurable

He went on to earn his master's degree in 2012 and his doctorate in neuroscience in 2017 at the University of Michigan, studying under Dr. Jane Huggins, head of the Direct Brain Interface laboratory. It was there that he began working on the technology behind Neurable.

"Neurable's technology embodies a breakthrough in BCI function initially developed at the University of Michigan Direct Brain Interface laboratory and introduces a new generation of BCI capabilities," said Huggins in a press release.

The company already has mindshare among the financial community, raising a $2 million seed round in December 2016. After the beta release, Alcaide notes that the company will seek a Series A round of funding to bring the software platform to market.

"The team at Neurable believe that they can enable people to easily control devices and objects with their minds. The implications would be enormous," said Brian Shin, who led the Boston Syndicate investment in Neurable, in the press release. "They have a chance to completely alter the way humans interact with technology which is something that I had to be a part of."

What kind of AR applications would you be most excited about controlling with your brain? What other "thought experiments" are you thinking about? We aren't mind readers (yet), so let us know below.

Cover image via Neurable
