Kusanagi Motoko, Johnny Mnemonic, Takeshi Kovacs, John Perry, Lenny Nero — the practice of melding biological minds with electronic hardware is a cornerstone technology of modern cyberpunk literature. And, if certain medical device startups are to be believed, accomplishing similar cybernetic feats — from downloadable memories to “Whoa, I Know Kung Fu”-style instantaneous learning — could become reality sooner than we think. However, a number of leading researchers in brain-computer interfaces (BCIs) and the broader field of neurology aren’t quite so bullish on the prospects of an inevitable cybernetic future.
BCIs are, essentially, devices that read the electrochemical firing of the brain’s myriad synapses, then interpret and translate those signals into a digital format that computers can understand. Research on the technology began in the 1970s at the Brain Research Institute of the University of California, Los Angeles under the watch of pioneering researcher Dr. Jacques J. Vidal. It took researchers more than two decades to lay the basic technological groundwork needed to progress beyond animal models, but by the mid-1990s the very first BCI prototypes were being installed in human craniums.
“People have tried regenerative medicine, stem cells and that’s been a really hot area for years, to try to inject biological payloads to repair injury,” Dr. Charles Liu, Professor of Clinical Neurological Surgery and Director of the USC Neurorestoration Center, explained to Engadget. “But for people with very severe injuries, like the ones that we deal with, we envision a situation where we can essentially create a replacement technology for the natural way that humans do things.”
BCIs are generally categorized based on whether they collect electrical information from inside or outside the patient’s head, with each method having its own unique qualities and characteristics. “The skull is a big insulator,” Dr. Liu noted. “It attenuates all the information. I mean, if the brain wasn’t inside the skull, it wouldn’t be so mysterious.”
For example, in a 2016 study, Dr. Liu coordinated with a team of neurologists from the University of California, Irvine (Zot! Zot! Zot!), led by Dr. An Do, Assistant Professor in the Department of Neurology and member of UCI’s Brain Computer Interface Lab, to help partially restore a paraplegic man’s ability to walk using an external, noninvasive BCI that relied on electroencephalography (EEG). The patient, 27-year-old Adam Fritz, a Southern California insurance adjuster who had been paralyzed in a 2008 traffic accident, first had to relearn how to walk — but only inside his head.
As part of the therapy, Fritz spent countless hours trying to cajole a video game avatar into walking from one side of a computer screen to the other, using only the power of his mind, while an EEG cap monitored and collected his mental output. Those signals, especially after he figured out how to consistently will his character across the screen, were then fed into a signal processing algorithm that translated them into something a computer could understand. Those commands were then used to control a device, affixed below the break in his spinal column, which fired electrical impulses into his legs, allowing him to walk the length of a 12-foot course.
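Stripped to its essentials, that pipeline has a few stages: slice the EEG into short windows, reduce each window to a handful of features, classify the user’s intent, and forward the result to the stimulator. The toy Python below illustrates only that loop; the band-power features, the logistic regression classifier, the sampling rate and the stimulator interface are simplified, hypothetical stand-ins, not the algorithm Dr. Do’s team actually used.

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 256  # assumed EEG sampling rate in Hz (illustrative, not the study's hardware)

def band_power(windows, fs=FS, band=(8, 30)):
    """Average mu/beta-band power per EEG channel for each window."""
    freqs, psd = welch(windows, fs=fs, nperseg=fs, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

# Offline training on labeled practice data (e.g., the on-screen avatar task).
# X: (n_windows, n_channels, n_samples) of EEG; y: 0 = idle, 1 = intent to walk.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8, FS))   # stand-in for recorded EEG windows
y = rng.integers(0, 2, size=200)    # stand-in for intent labels
decoder = LogisticRegression().fit(band_power(X), y)

# Online loop: classify each incoming window and issue a command.
def decode_and_command(eeg_window, stimulator):
    """Map one (n_channels, n_samples) EEG window to a step/stop command
    for a hypothetical stimulator object."""
    intent = decoder.predict(band_power(eeg_window[None, ...]))[0]
    stimulator.send("STEP" if intent == 1 else "STOP")  # 'send' is an assumed interface
```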
At the other end of the invasiveness scale, you’ve got devices like the almost-fully implantable BrainGate2 system. Installing this BCI requires surgery, as a baby aspirin-sized array of microelectrodes is implanted directly onto the surface of the brain itself. By decoding the collected neuroelectrical signals, patients using the BrainGate system have shown the ability to control on-screen cursors with relative ease.
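None of this requires exotic math at its core. BrainGate’s published decoders are more sophisticated (Kalman filters calibrated on short practice blocks, for instance), but the basic idea of mapping a vector of per-electrode firing rates to a two-dimensional cursor velocity can be sketched with a simple linear fit. Everything in the Python below (the simulated firing rates, the 96-channel shape, the least-squares weights) is an invented stand-in for illustration, not the actual BrainGate pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Calibration phase: the user imagines cursor movements while the array records.
# rates: spike counts per time bin for each electrode; velocity: intended (vx, vy).
rates = rng.poisson(lam=5.0, size=(1000, 96)).astype(float)  # stand-in for a 96-channel array
velocity = rng.normal(size=(1000, 2))                        # stand-in for intended cursor velocity

# Fit a linear decoder: velocity ~= rates @ W, solved by least squares.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

def decode_velocity(rate_bin):
    """Map one bin of per-electrode firing rates to a 2D cursor velocity update."""
    return rate_bin @ W
```

In practice the weights are recalibrated regularly and the output is smoothed over time, but the core transform from neural activity to cursor motion is essentially this kind of mapping.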
In a groundbreaking 2012 study, a team of researchers from Brown University implanted 96-pin BrainGate arrays into the motor cortices of two quadriplegic patients, one of whom used a DLR robotic arm to pick up and serve herself a cup of coffee unassisted for the first time in 15 years (and could still do so five years after the device had initially been implanted). Earlier this year, Brown University researchers made an even more monumental technical advancement — they built a system that works wirelessly. The team replaced the conventional Medusa’s scalp of wires and cables that used to run from the implant site to nearby computer arrays with a 2-inch-long, 1.5-ounce wireless transceiver.
“We’ve demonstrated that this wireless system is functionally equivalent to the wired systems that have been the gold standard in BCI performance for years,” John Simeral, assistant professor of engineering at Brown University and lead author of the study, told Brown University News. “The signals are recorded and transmitted with appropriately similar fidelity, which means we can use the same decoding algorithms we used with wired equipment. The only difference is that people no longer need to be physically tethered to our equipment, which opens up new possibilities in terms of how the system can be used.”
This advance could, for example, help accelerate the development of a new generation of mind-controlled prosthetics, like the new bidirectional-sensing LUKE hand developed at the University of Michigan. “This is the biggest advance in motor control for people with amputations in many years,” Paul Cederna, Professor of Plastic Surgery at the U-M Medical School, told UMich News.
Noninvasive BCIs, like the EEG-based skull cap Dr. Do’s team used in the 2016 study, are still widely used in research given their convenience and minimal chance of causing complications. However, that convenience comes at the cost of lower spatial resolution and spectral bandwidth compared to invasive systems, which can monitor the activation state of individual neurons. That is, the system is easier to get on and off, but it doesn’t generally produce data of the same quality as a more invasive subdural array would. In a few circumstances, however, that low-fidelity data can still be useful and generalizable beyond its primary function.
“At an EEG level, because everything is so mushed together and the resolution is so low,” Dr. Do explained to Engadget, “the algorithm we use to decode, as the person is thinking about moving or not moving, can be used for both upper and lower extremities.”
“Now, when it comes to the lower extremities and upper extremities within the invasive domain,” Dr. Do continued, “when we get into the extracellular potentials and single-neuron activity, that may not hold true.”
Dr. Payam Heydari, a professor at UCI’s Henry Samueli School of Engineering, elaborated: “In the upper extremity, the essence of the algorithm perhaps remains the same, but the learning-based model is going to change because the rotation, the movements and everything else for the upper and lower extremities are going to be different, but the essence is going to be the same.”
Dr. Do notes that the added algorithmic and signal processing layers needed to establish command and control over individual fingers, as opposed to toes, could prevent some systems built to interpret the signals destined for legs and feet from translating directly to arms and hands.
However, as signal fidelity increases through the use of implanted BCIs capable of recording the activity of single neurons, more generalized sets of actions may be possible, at least for the upper extremities. “Those signals contain so much information about intended movement, it should allow intuitive control over complex devices, including multi-dimensional reach and grasp of robotic prosthetic limbs,” Dr. Leigh Hochberg, Director of the Center for Neurotechnology and Neurorecovery, Neurocritical Care and Acute Stroke Services, Department of Neurology at Massachusetts General Hospital and Professor of Engineering at Brown University, told Engadget. “Those same signals can be used for controlling a computer cursor on screen and, as shown recently, those same signals can be used for decoding intended handwriting.”
“That flexibility is what we want to harness — much like somebody who’s able-bodied may, at one moment, use their hand to write with a pen, and the next moment, use their hands to control a computer mouse, and a few moments after that, reach out to pick up a cup of coffee,” he continued. “Part of the reason to be recording from the brain is to use the signals that allow for flexible control over multiple useful devices.”
Despite numerous advances over the past decade, BCI technology still faces significant roadblocks in getting out of the lab and (literally) into the public’s consciousness. USC’s Dr. Liu points to “miniaturization of the electronics, design of a fully implantable device, better signal processing algorithms and creating the bidirectionality in the sensation and control signals,” as laudable goals but notes that much of their potential success is predicated on having a sufficiently robust power supply.
“The power requirements are critical,” he said. “If you have something that requires a lot of power, then the person has to carry a battery pack and, if you want to implant it fully, then how big can the battery be? And then if you want to change the battery out periodically, how often do you need to do that? Our goal would be to put in something very small that achieves everything.”
“This technology can be useful,” said Dr. Karunesh Ganguly, Associate Professor of Neurology at the UCSF Weill Institute for Neurosciences. “But the question of ‘how to make it useful day-to-day for patients’ is the challenge and… requires customization to an individual’s needs.”
Ganguly sees future applications for the technology expanding to help stroke patients as well as people with severe paralysis. “That’s probably the one that would make the biggest impact, hopefully in the next five or ten years. People who are quadriplegic or paraplegic, upper spinal cord injuries, stroke, ALS, muscular dystrophy and so forth. They’re paralyzed, but their brain is intact. That phenotype, I think, has the greatest chance for this version to work.”
Regardless of the myriad treatments BCIs could one day be used for, the technology remains in its infancy and exists almost exclusively in research labs. But that hasn’t stopped startups like Elon Musk’s Neuralink company from making a number of fantastic claims about what their devices might someday do — from monkeys playing Pong to being able “to rewind memories or download them into robots” a la Altered Carbon.
Such claims have been met with a range of reactions by the neurology research community, from huffed eye rolls to outright hostility. And as for whether we’ll be able to instantly learn anything, a la The Matrix, well, try not to hold your breath.
“The guy is a master of selling things that may never work,” Dr. Miguel Nicolelis, pioneering neurologist and Principal Investigator of the Nicolelis Lab at Duke University, recently told Inverse. “They will never make people download their emotions or their deep cognitive functions, and they’ll never make people learn French by uploading French grammar to a brain-machine interface. You will never reproduce it. For a science fiction movie, that’s fine, but for Elon Musk to come out and say exactly the same thing is bogus — totally bogus.”
“[Musk] sells things that have been invented before and he tries to say that he’s done some amazing thing,” he continued, arguing that much of what Neuralink has touted as advancements is actually old hat for the academic research community.
Dr. Do concedes that “we still have a lot of challenges that are necessary to overcome” before we’re downloading languages at will, but also points out that private companies — and the hefty R&D budgets that they command — are able to iterate on ideas more quickly and “make things happen on a timescale that’s way faster than anything happens in academia.” So maybe our Kung Fu fighting futures aren’t that far off after all.
However, the prospect of using BCIs not only to heal injuries back to full functionality but to actively enhance human performance and cognition leads — as nearly all cutting-edge technologies do — to a morass of ethical issues. Who gets access to the technology and when? Will having computers in our heads exacerbate existing societal inequality? How do we keep these machines secure from hacking attempts, and build public trust in a radically new form of biotechnology such as this? These are questions without easy answers, but as Dr. Hochberg points out, despite their world-changing potential, BCIs are still just medical devices, ones which, in the US at least, are subject to massive amounts of testing, validation and oversight by the FDA and other federal regulatory agencies. And just like previous revolutionary technological advancements — looking at you, implantable pacemakers, deep brain stimulation systems and vagus nerve stimulators — any BCI device that does make it to market, whether it came out of a university lab or Elon Musk’s latest fever dream, will have been put through its paces.