For patient T6, 2014 was a happy year.
That was the year she learned to control a Nexus tablet with her brain waves, taking her quality of life from the era of 1980s DOS to modern Android.
A brunette in her early 50s, patient T6 suffers from amyotrophic lateral sclerosis (also known as Lou Gehrig’s disease), which causes progressive motor neuron damage. Mostly paralyzed from the neck down, T6 retains her sharp wit, love of red lipstick and miraculous green thumb. What she lacked, until recently, was a way to communicate with the outside world.
Like T6, millions of people worldwide are severely paralyzed by spinal cord injury, stroke or neurodegenerative disease, leaving them unable to speak, write or otherwise communicate their thoughts and intentions to their loved ones.
The field of brain-machine interfaces blossomed nearly two decades ago in an effort to develop assistive devices for these “locked-in” people. And the results have been fantastic: eye- and head-tracking devices let users steer a mouse cursor across a computer screen with their gaze alone. In some systems, the user can even click by staring intently at a single spot, a technique known in the field as “dwell time.”
Yet despite a deluge of promising devices, eye-tracking remains imprecise and terribly tiring on the eyes. And because most systems require custom hardware, the price of admission is steep, putting current technology out of reach for all but a select few.
“We really wanted to move these assistive technologies towards clinical feasibility,” said Dr. Paul Nuyujukian, a neuroengineer and physician from Stanford University, in a talk at the 2015 Society for Neuroscience annual conference that took place this week in Chicago.
That’s where the idea of neural prostheses came in, Nuyujukian said.
In contrast to eye-trackers, neural prostheses directly interface the brain with computers, in essence cutting out the middleman — the sensory organs that we normally use to interact with our environment.
Instead, a baby-aspirin-sized electrode array is implanted directly into the brain; sophisticated algorithms then decode the neural signals associated with intent in real time and translate them into control of a mouse cursor.
It’s a technology leaps and bounds ahead of eye-trackers, but still prohibitively expensive and hard to use.
Nuyujukian’s team, together with patient T6, set out to tackle this problem.
A Nexus to Nexus 9
Two years ago, patient T6 volunteered for the BrainGate clinical trials and had a 100-channel electrode array implanted into the left side of her brain in regions responsible for movement.
At the time, the Stanford arm of the trial was working on a prototype prosthetic device to help paralyzed patients type out words on a custom-designed keyboard simply by thinking about the words they wanted to spell.
The prototype worked like this: the implanted electrodes recorded T6’s brain activity as she looked at a target letter on the screen and passed it to the neuroprosthesis, whose algorithms interpreted the signals and translated them into continuous control of cursor movements and clicks.
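The decoding step can be pictured as a mapping from per-channel firing rates to a cursor velocity that gets integrated over time. The sketch below is purely illustrative: the channel count, weights and update rate are invented for the example and are not the actual BrainGate decoder, which is far more sophisticated.

```python
# Toy linear decoder: firing rates -> 2-D cursor velocity -> new position.
# All numbers here are made up for demonstration purposes only.

def decode_velocity(rates, weights):
    """Map a firing-rate vector to an (vx, vy) cursor velocity."""
    vx = sum(w[0] * r for w, r in zip(weights, rates))
    vy = sum(w[1] * r for w, r in zip(weights, rates))
    return vx, vy

def step_cursor(pos, rates, weights, dt=0.02):
    """Integrate the decoded velocity over one (hypothetical) 20 ms tick."""
    vx, vy = decode_velocity(rates, weights)
    return (pos[0] + vx * dt, pos[1] + vy * dt)

# Two hypothetical channels, each with an (x, y) weight pair.
weights = [(0.5, 0.0), (0.0, 0.5)]
pos = (0.0, 0.0)
pos = step_cursor(pos, rates=[10.0, 4.0], weights=weights)
print(pos)  # -> (0.1, 0.04): cursor nudged right and slightly up
```

Running this update loop many times per second is what produces the “continuous control” the article describes, as opposed to discrete key presses.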
In this way, T6 could type out her thoughts using the interface, in a way similar to an elderly technophobe reluctantly tapping out messages with a single inflexible finger.
The black-and-white setup was state-of-the-art in terms of response and accuracy. But the process was painfully slow, and even with extensive training, T6 often had to move her eyes to the delete button to correct her errors.
What the field needed, according to Nuyujukian, was a flexible, customizable and affordable device that didn’t require a physical cable connection to a computer. The team also wanted a user interface that didn’t look like it was designed in the ’80s.
The team’s breakthrough moment came when they realized their point-and-click cursor system was similar to finger tapping on a touchscreen, something most of us do every day.
“We were going to design our own touchscreen hardware, but then realized the best ones were already on the market,” laughed Nuyujukian. “So we went on Amazon instead and bought a Nexus 9 tablet.”
The team took their existing setup and reworked it so that patient T6’s brain waves could control where she tapped on the Nexus touchscreen. It was a surprisingly easy modification: the neuroprosthetic communicated with the tablet through existing Bluetooth protocols, and the system was up and running in less than a year.
“Basically the tablet recognized the prosthetic as a wireless Bluetooth mouse,” explained Nuyujukian. “We pointed her to a web browser app and told her to have fun.”
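Why does this “just work”? A standard mouse, wired or Bluetooth, talks to its host in tiny fixed-format HID (Human Interface Device) reports; in the classic 3-byte boot-protocol format these are a button bitmask followed by signed X and Y displacements. The sketch below shows only that report packing — transport, pairing and the actual decoder are all omitted, and the function name is our own:

```python
# Pack a decoded cursor step into a 3-byte HID boot-protocol mouse report:
# [button bitmask, signed dx, signed dy]. Any host that speaks HID (like a
# stock Android tablet) can consume reports in this shape without drivers.

def hid_mouse_report(dx, dy, left_click=False):
    buttons = 0x01 if left_click else 0x00      # bit 0 = left button
    # Displacements are signed bytes, clamped to the HID range [-127, 127].
    dx = max(-127, min(127, dx))
    dy = max(-127, min(127, dy))
    return bytes([buttons, dx & 0xFF, dy & 0xFF])

move  = hid_mouse_report(5, -3)                 # right 5 counts, up 3
click = hid_mouse_report(0, 0, left_click=True) # left button pressed
print(move.hex())   # -> '0005fd'
print(click.hex())  # -> '010000'
```

Because the prosthesis emits reports the tablet already understands, no custom app or kernel driver is needed on the Android side — which is the point Nuyujukian makes above.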
In a series of short movie clips, the team demonstrated patient T6 Googling questions about gardening, taking full advantage of the autocompletion feature to speed up her research. T6 had no trouble navigating through tiny links and worked the standard QWERTY keyboard efficiently.
“Think about it,” said Nuyujukian, obviously excited. “It’s not just a prettier user interface; she now has access to the entire Android app store.”
According to previous studies, the implant can function for at least two years without hardware or software failures. The team is now working to make it sturdier still, extending its lifespan in the brain.
“We set out to utilize what’s already been perfected in terms of the hardware to make the experience more pleasant,” said Nuyujukian. “We’ve now showed that we can expand the scope of our system to a standard tablet.”
But the team isn’t satisfied. They are now working on click-and-drag and multi-touch maneuvers, support for other operating systems, and letting patients use the device around the clock without supervision. They also hope to bring the pilot program to more patients across all three BrainGate clinical sites.
“Our goal is to unlock the full user interface common to general-purpose computers and mobile devices,” said Nuyujukian. “This is a first step towards developing a fully capable brain-controlled communication and computer interface for restoring function for people with paralysis.”