Unraveling Neural Pathways to Enhance Brain-Controlled Prosthetics

Picking up a cup of coffee, flipping a light switch or grabbing a door handle don’t require much apparent thought. But behind the curtain, the brain performs feats to coordinate these seemingly simple hand-to-object motions.

Using functional MRI brain imaging, or fMRI, University of Oregon researchers have unraveled some of the neural circuitry behind these kinds of actions. Their insights, described in a paper published in the journal eNeuro, can potentially be used to improve the design of brain-computer interface technologies, including brain-controlled prosthetic arms that aim to restore movement in people who have lost it.

“While there are already robotic arms or exoskeletons that take signals from brain activity to mimic the motions of a human arm, they haven’t quite met the gracefulness of how our arms actually move,” said study lead author Alejandra Harris Caceres, an undergraduate senior majoring in neuroscience and human physiology. “We’re hoping to figure out how and when the brain integrates different kinds of sensory information to help make this technology better for patients.”

Michelle Marneweck (left) is an assistant professor in human physiology, studying the neural processes that allow humans to dexterously interact with their environment. Alejandra Harris Caceres is a fourth-year undergraduate student majoring in neuroscience and human physiology. (Photos courtesy of Michelle Marneweck)

For the brain to plan even a simple, goal-directed action, like reaching out and grabbing a cup of coffee on a table, it needs to make lots of calculations, including the direction and distance from one's hand to the object. To get even more accurate, the brain uses multiple "reference frames" to make calculations from multiple perspectives — for example, from the hand to the cup versus from the eyes to the cup.

In this study, the researchers wanted to understand how the brain combines those different bits of sensory information to plan and generate a goal-directed action. While that has been investigated extensively in nonhuman primates such as monkeys, this is the first evidence in humans of neural representations of direction and distance in multiple reference frames during reach planning, said study senior author Michelle Marneweck, an assistant professor in human physiology at the UO’s College of Arts and Sciences.

“Just like how we use reference frames for navigation, like moving through the woods or driving through fog, we also use it to make the most elementary, everyday actions, such as reaching for a bottle of water,” Marneweck said. “Having a system rich in redundancies allows us to flexibly switch between reference frames, using the most reliable one in the moment, and be good at goal-directed actions, no matter what the environment throws at us.”


How the brain plans a reach

As part of the study, human subjects lay flat on a screening table inside an fMRI machine. Over their lap was a table with an electronic task board of lights and buttons. Because the participants couldn't sit up during the scan, they wore a helmet-like setup with a mirror attached, allowing them to see and interact with the button-pushing task board.

The fMRI scan measured blood flow in the brain to highlight active areas when subjects were prompted to reach for a button and when they pressed it. By analyzing neural activity patterns across key sensory and motor areas, the researchers showed that the human brain prepares for a reaching action by encoding the target's direction first, then its distance. They also found that direction and distance are represented in multiple reference frames, rather than a single one.

That can help improve the design of tools like bionic arms that use brain signals, Harris Caceres said. Neural interface technology that considers the serial processing of direction into distance, and that incorporates multiple reference frames along with the multitude of regions encoding this information, could add an extra layer of sophistication to the modeling, she suggested.

“It is a really exciting time for brain-computer interface technology, which has advanced rapidly in recent years with the integration of sensory information. One of the overarching goals of our lab is to further optimize these technologies, by figuring out when, where and how the brain integrates sensory information for skilled actions. This experiment was a step in that direction.”

— Michelle Marneweck, UO assistant professor

For Harris Caceres, the experiment also was a step toward her career goal of becoming a physician. She said she initially joined the project with nervous excitement, coming in with just the basics of what’s taught in her undergraduate courses. But under Marneweck’s mentorship, Harris Caceres’ nerves evolved into confidence as she led the study’s data collection, which she learned is her favorite part of the research process.

“I feel like fMRI is one of those techniques that you always marvel at in class, so to be a part of this research and get familiar with fMRI has been valuable to have in my skill set,” she said.


Impacts across the lifespan

The researchers are now performing the same set of experiments with a skilled performance model: human athletes. In a study funded by the Wu Tsai Human Performance Alliance, Marneweck and her team recruited athletes to test whether the neural mechanisms of peak physical performers differ from those of nonathletes. Their preliminary results suggest athletes require less brain power to complete reaching tasks.

In the future, the researchers also plan to investigate how aging and neurodegenerative disorders, like Alzheimer’s disease and Parkinson’s disease, affect sensory and motor control.

“Topographical disorientation, which colloquially is known as ‘getting lost,’ is one of the earliest markers of Alzheimer’s disease,” Marneweck said. “We and others in the scientific community are interested in determining a causal link between cognitive features of Alzheimer’s and sensory and motor changes that show up as early as 10 to 12 years before the hallmark clinical signs.”

By studying across the spectrum of human health, Marneweck’s lab is shedding light on the complex sensory and motor mechanisms that go on in the background unnoticed.

“Reaching and interacting with objects may seem elementary due to the seamless ease with which we perform these actions,” Marneweck said, “but they are among the most complex and least understood feats of human behavior.”
