Patterns of brain activity accurately predict tongue shape while feeding

Neuroscientists have learned a great deal about how the brain interprets and controls the movements that make up everyday behaviors like walking, reaching, and grasping objects. But the mechanics of fundamental behaviors like eating, drinking, and speaking have been more difficult to measure, largely because a crucial component—the tongue—is mostly hidden from view.

New research from the University of Chicago takes up that challenge by using 3D X-ray videography and machine learning to record intricate movements of the tongue in non-human primates while they are feeding.

When combined with simultaneous recordings of neural activity from the brain's sensorimotor cortex, the study, published in Nature Communications, shows that the 3D shape of the tongue can be accurately decoded from brain activity, opening possibilities for brain-computer interface-based prosthetics to restore lost functions of feeding and speech.

Infinite degrees of freedom

In addition to being tucked away inside the mouth, the tongue presents another biomechanical challenge. Movements of the arms or legs are constrained by bones and joints of the skeleton, giving them a certain amount of predictability. Since the tongue is made entirely of muscle and other soft tissue, its freedom of movement is almost limitless (except for the unlucky few who can’t roll their tongue into a U-shape).

“When we think of the brain controlling muscles, we almost invariably think about it like actuating an arm or a leg, which has rigid bones moving about a joint,” said J.D. Laurence-Chasen, Ph.D., the study’s lead author and a former postdoctoral scholar at UChicago who now works as a researcher at the National Renewable Energy Laboratory in Golden, Colorado.

“The tongue has a totally different anatomy. There are no rigid internal structures. There’s a ton of different muscles with overlapping functions, and so, it has functionally infinite degrees of freedom.”

As a postdoc and Ph.D. student, Laurence-Chasen used data analytics and machine learning tools to study how the brain controls the dynamic tongue and jaw movements that are crucial for feeding and speech. In the latest study, he worked with Nicho Hatsopoulos, Ph.D., and Callum Ross, Ph.D., both Professors in Organismal Biology and Anatomy, to capture the tongue movements of two male Rhesus macaque monkeys while they were feeding on grapes.

The monkeys each had a set of seven markers attached to their tongues. These markers could be detected by two X-ray video cameras, recording the movement and shape of the tongue while it was still inside the mouth, much like the motion capture technology used for special effects in movies and video games.
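For readers curious about the underlying geometry, a marker seen by two calibrated cameras can be triangulated into a 3D position. Below is a minimal sketch in Python of the standard linear triangulation step; the function name, the projection matrices P1 and P2, and the pixel coordinates are hypothetical stand-ins, not the study's actual XROMM pipeline.

```python
# Sketch: recovering one marker's 3D position from two calibrated X-ray views
# via linear triangulation (direct linear transform). All inputs are
# hypothetical placeholders for illustration.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """P1, P2: (3, 4) camera projection matrices (assumed calibrated).
    uv1, uv2: (2,) pixel coordinates of the marker in each view.
    Returns the marker's 3D position in world coordinates."""
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The best solution is the right singular vector with the smallest
    # singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

Repeating this for all seven markers in every video frame yields the tongue's 3D shape over time.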

The monkeys eat fast, chewing two to three times a second, so the researchers used a novel 3D imaging technology called X-ray Reconstruction of Moving Morphology (XROMM) to capture and process high-speed data on the tongue's movements, shape changes, and deformations.

At the same time, microelectrode arrays implanted in the motor cortex recorded neural activity while the monkeys were feeding. Laurence-Chasen and the team employed deep neural networks, a form of machine learning, to learn the relationship between this neural activity and the tongue's motion.
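As a rough illustration of what such a decoder might look like, the sketch below defines a small recurrent network that maps binned firing rates to the 21 coordinates of seven 3D markers. The architecture, array sizes, and training details are illustrative assumptions for this example, not the model described in the paper.

```python
# Sketch: a recurrent decoder from binned cortical spike counts to
# tongue-marker coordinates. Shapes and hyperparameters are assumptions.
import torch
import torch.nn as nn

N_UNITS = 96        # recorded neurons (hypothetical array size)
N_COORDS = 7 * 3    # 7 tongue markers x (x, y, z)

class TongueDecoder(nn.Module):
    def __init__(self, hidden=128):
        super().__init__()
        # The LSTM integrates recent spiking history across time bins.
        self.lstm = nn.LSTM(N_UNITS, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, N_COORDS)

    def forward(self, spikes):
        # spikes: (batch, time_bins, N_UNITS) binned firing rates
        h, _ = self.lstm(spikes)
        return self.readout(h)  # (batch, time_bins, N_COORDS)

model = TongueDecoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One training step on a hypothetical batch of time-aligned data:
spikes = torch.randn(8, 200, N_UNITS)    # stand-in for neural recordings
markers = torch.randn(8, 200, N_COORDS)  # stand-in for XROMM marker tracks
optimizer.zero_grad()
loss = loss_fn(model(spikes), markers)
loss.backward()
optimizer.step()
```

In practice the stand-in tensors would be replaced by real spike counts and the triangulated marker positions, aligned bin by bin.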

When they matched the neural recordings with the actual movements captured by the X-ray cameras, the researchers found that information about the 3D shape and movement of the tongue is present in the motor cortex. That data allowed them to accurately decode and predict the shape of the tongue from the neural activity alone.
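One simple way to quantify such agreement is to correlate the decoded trajectories with the X-ray ground truth, coordinate by coordinate. The snippet below sketches that comparison with stand-in data; the paper's own accuracy metrics may differ.

```python
# Sketch: scoring decoded tongue trajectories against ground truth with a
# per-coordinate Pearson correlation. Data here are synthetic stand-ins.
import numpy as np

def decoding_accuracy(predicted, actual):
    """predicted, actual: (time_bins, n_coords) trajectories.
    Returns one Pearson r per marker coordinate."""
    return np.array([
        np.corrcoef(predicted[:, i], actual[:, i])[0, 1]
        for i in range(predicted.shape[1])
    ])

# Example with synthetic trajectories (21 coordinates over 500 time bins):
t = np.linspace(0, 10, 500)
actual = np.column_stack([np.sin(t + k) for k in range(21)])
predicted = actual + 0.1 * np.random.randn(*actual.shape)
print(decoding_accuracy(predicted, actual).mean())  # close to 1.0
```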

“We knew from some earlier research that basic movements of the tongue involved the cortex, but we were surprised by the extent and resolution of information about the tongue shape that we could extract so readily,” Laurence-Chasen said.

A future for soft prosthetics

Intriguingly, this information is represented in the brain in much the same way as arm movements and the 3D position of the hand. Hatsopoulos and Sliman Bensmaia, Ph.D., James and Karen Frank Family Professor of Organismal Biology and Anatomy at UChicago, have already used that body of research to translate brain signals into software algorithms that drive robotic prosthetic limbs, which amputees and quadriplegics can move with their minds while receiving natural sensations of touch in return.

While the technology for tongue-based applications isn't nearly as far along, a similar approach could help patients who have lost functions of feeding and speech.

“Dysphagia, or difficulty swallowing, is a big problem, especially among the elderly,” Hatsopoulos said. “If we could use this information about the tongue and its shape to decode when a swallow is about to happen, then you could connect that to a device that could stimulate the right set of muscles to help them swallow.”

“What J.D. has been able to do here to decode the shapes of soft tissue, not a skeletal system, is novel,” he said. “I think it’s super exciting.”

More information:
Jeffrey D. Laurence-Chasen et al., Robust cortical encoding of 3D tongue shape during feeding in macaques, Nature Communications (2023). DOI: 10.1038/s41467-023-38586-3

