Thought-Controlled Technology Improves Thanks to Motor Cortex Findings

A new understanding of how motor neurons fire should lead to a dramatic increase in performance of brain-machine interfaces

READING THOUGHTS: A sensor-equipped cap is one way to detect brain signals. Credit: Hank Morgan - Rainbow/Getty

A technology that would enable paralyzed patients to move prosthetic limbs merely by thought has taken a major step forward. A team led by investigators from the Simons Collaboration on the Global Brain (SCGB) has found a way to apply new advances in the neuroscience of movement to dramatically improve the response and accuracy of brain-machine interfaces (BMIs).

These interfaces are devices that measure brain activity and, using sophisticated computer algorithms, translate the activity into actions in the real world. BMIs can be as simple as a cap outfitted with sensors that measure brainwaves; users wearing such a cap have flown small drones around a college gymnasium with the power of their own thoughts. For some paralyzed patients, BMIs surgically implanted directly on the brain are better, because they are closer to the active neurons and receive clearer signals. In 2012, a team led by Brown University researchers reported that they had used an implant to enable a paralyzed woman to control a robotic arm and take a sip of coffee. And in a high-profile demonstration, a thought-controlled robotic 'exoskeleton' designed by the neuroscientist Miguel Nicolelis of Duke University enabled a paralyzed patient to deliver the opening kickoff at the 2014 World Cup in Brazil.

But despite these impressive demonstrations, thought-controlled prosthetics are still in their infancy. They suffer in accuracy and speed, in part because scientists do not know enough about how the brain initiates and controls movements.

These barriers could soon fall thanks to a team led by SCGB investigator Krishna Shenoy that included SCGB colleagues Mark M. Churchland and John P. Cunningham, both of Columbia University. In an experiment with monkeys, the team surgically implanted small silicon computer chips, about the size of an adult pinkie nail, into the brain areas responsible for movement. The chips' electrodes can pick up the activity of hundreds of individual neurons.

The monkeys then sat in a room, looking at a computer screen. Wires transmitted signals directly from their brains to the computer. On the screen was a grid of targets. When one of the targets lit up at random, the monkey had to hover a cursor over it to earn a juice reward. But there was a twist: the monkey couldn't use a computer mouse, or anything else, to physically control the cursor. Instead, in a matter of minutes, it learned to use the BMI to move the cursor just by thinking about it.

Shenoy's advance, published on July 29 in Nature Communications, builds on recent findings about how the motor cortex controls movements. Scientists had discovered that when we move, certain patterns of neural activity are more likely than others. In fact, viewed mathematically, the seemingly random electrical activity of neurons in the motor cortex organizes itself into simple, predictable patterns.
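
To give a flavor of what "simple, predictable patterns" means in practice, the short Python sketch below simulates a population of noisy neurons driven by a two-dimensional rotating pattern and then checks how much of the population's activity a few components explain. It is an illustration only: the dynamics, neuron count and noise level are made-up assumptions, not the team's data or analysis.

```python
# Illustrative sketch only: all numbers here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a simple 2-D rotational "pattern" of the kind often reported in
# motor cortex during reaching (hypothetical parameters).
T = 200                                   # time steps
theta = 0.1                               # rotation per step, in radians
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = np.zeros((T, 2))
x[0] = [1.0, 0.0]
for t in range(1, T):
    x[t] = A @ x[t - 1]

# Project the 2-D pattern onto 100 simulated neurons and add noise.
n_neurons = 100
C = rng.normal(size=(n_neurons, 2))       # each neuron mixes the two patterns
firing = x @ C.T + 0.1 * rng.normal(size=(T, n_neurons))

# Principal-component-style check: how much of the population activity is
# captured by just a few dimensions?
firing_centered = firing - firing.mean(axis=0)
_, s, _ = np.linalg.svd(firing_centered, full_matrices=False)
variance_explained = (s ** 2) / np.sum(s ** 2)
print("Variance explained by top 3 components:", variance_explained[:3].sum())
# Despite 100 noisy neurons, a couple of components explain nearly all of the
# activity, mirroring the "simple, predictable patterns" described above.
```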

Shenoy's team then built a computer model that takes advantage of this new predictive power, a crucial advance for BMIs. The better a BMI can predict which movement the brain intends to make, and how it intends to make it, the better the subject will be able to control a cursor or a robotic prosthetic.

Their model, called a neural dynamical filter, or NDF, infers the ‘laws’ by which the neurons will behave, just as the law of gravity describes how quickly an apple will fall from a tree, how far a home run will fly, or how a spacecraft will rendezvous with a planet. The NDF keeps the laws up to date and accurate by constantly comparing their predictions to the actual observed activity.
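
The article does not spell out the NDF's equations, but the predict-then-correct logic it describes is the same logic used by a standard Kalman-style filter, sketched below in Python. All of the matrices and dimensions here are hypothetical placeholders chosen for illustration, not the parameters of the published decoder.

```python
# Minimal predict-then-correct sketch, written as a standard Kalman-style filter.
# A encodes the inferred "laws" of how the neural state evolves; each new
# recording is used to correct the prediction. All values are hypothetical.
import numpy as np

def dynamical_filter_step(x_prev, P_prev, y, A, C, Q, R):
    """One step: x is the latent neural state, y the recorded activity."""
    # Predict: apply the learned dynamics ("laws") to the previous estimate.
    x_pred = A @ x_prev
    P_pred = A @ P_prev @ A.T + Q
    # Correct: compare the prediction with the actual observed activity y.
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)          # how much to trust the new data
    x_new = x_pred + K @ (y - C @ x_pred)        # nudge the state toward the observation
    P_new = (np.eye(len(x_prev)) - K @ C) @ P_pred
    return x_new, P_new

# Toy usage with made-up dimensions: a 2-D latent state driving 50 recorded neurons.
rng = np.random.default_rng(1)
A = np.array([[0.99, -0.1], [0.1, 0.99]])        # hypothetical smooth dynamics
C = rng.normal(size=(50, 2))                     # mapping from latent state to neurons
Q, R = 0.01 * np.eye(2), 0.1 * np.eye(50)

x_true = np.array([1.0, 0.0])                    # simulated "true" neural state
x_est, P = np.zeros(2), np.eye(2)
for _ in range(100):
    x_true = A @ x_true
    y = C @ x_true + rng.normal(scale=0.3, size=50)   # noisy neural recording
    x_est, P = dynamical_filter_step(x_est, P, y, A, C, Q, R)
# The filtered estimate x_est (not the raw, noisy recordings) would then drive
# the cursor or prosthetic.
```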

The improvement in BMI performance was "jaw-dropping," says Shenoy, who also directs the Neural Prosthetic Systems Lab at Stanford University. The monkeys reached the targets, on average, 30 to 40 percent faster, and more accurately, when the NDF decoded their brain activity than when the same activity was decoded without the inferred dynamics.

“It’s a very elegant study,” says Richard Andersen, a professor of neuroscience at the California Institute of Technology in Pasadena, who was not involved in the research. “This lab has been a pioneer in adding to the understanding of the motor cortex by examining the dynamics of a neural population as it evolves through a movement. They’ve now applied these insights to BMIs.”

Because Shenoy’s method does not require new hardware, just a tweak in software, it may find immediate clinical applications. In a previous study, his team showed that similar ‘laws’ are followed by neurons in the human motor cortex. He plans to incorporate the laws into an ongoing clinical trial in humans, in collaboration with Massachusetts General Hospital and other institutions. In this trial, called BrainGate, paralyzed participants will use BMIs to control robotic prosthetics.

Accuracy in neural prosthetics has been a challenge, Andersen says, adding that Shenoy’s work “should greatly improve the estimation of the intended movement.” And with such improvements, paralyzed patients can, researchers hope, move beyond the few shaky steps current BMIs allow, to make the smooth, fluid movements they commanded before they became paralyzed. “We see no reason why this shouldn’t work,” Shenoy says.
