Decoding Body Language Reveals How the Brain Organizes Behavior

Tracking behavior at fine timescales explains how specific activity patterns in the basal ganglia define actions

When tennis star Venus Williams serves a ball, she launches a complex series of movements — she coils her body, bends her knees and swings the racket back, all in preparation for tossing the ball in the air. How does the brain choose which movements to perform and in what sequence? New research in mice shows that individual actions within this type of sequence are defined by the precise ratio of neural activity in two cell types within a brain region called the striatum.

The discovery was made possible by finely tracking both behavior and neural activity. “It’s a tour de force in bringing together large-scale recording technology with unsupervised methods for analyzing behavior,” says Jonathan Pillow, a neuroscientist at Princeton University and an investigator with the Simons Collaboration on the Global Brain (SCGB), who was not involved in the study.

The research demonstrates how new techniques for tracking behavior at fine timescales can yield new insights into the brain. “If you look at behavior in high resolution and holistically, characterizing everything a mouse does, then you can learn something new about how the brain generates behavior,” says Jeffrey Markowitz, first author on the new paper and a postdoctoral researcher in the labs of Sandeep Robert Datta and Bernardo Sabatini at Harvard University. Datta and Sabatini are both SCGB investigators.

MoSeq identifies different syllables of mouse behavior. Each section shows individual animals performing the same syllable, indicated by a dot on the animal’s body. Credit: Datta lab.

Datta and his collaborators previously developed an automated system called MoSeq that employs machine vision and machine learning to finely parse the structure of mouse behavior. Researchers use a Microsoft Kinect — a motion-sensing device used in video games — to capture video of mice and build a 3-D model of the animal’s body as it moves. Those data are then fed to algorithms originally designed to analyze language; the algorithms search for the underlying architecture of behavior.
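
For a concrete picture of that pipeline, the sketch below compresses fake depth-video frames with PCA and then fits a hidden Markov model whose discrete states stand in for behavioral syllables. This is an illustration only: the state count and frame dimensions are arbitrary, and hmmlearn’s plain Gaussian HMM is a simplified stand-in for the more structured model MoSeq actually fits.

```python
# A minimal sketch of a MoSeq-style pipeline on fake data: compress depth
# frames with PCA, then fit an HMM whose hidden states play the role of
# behavioral syllables. hmmlearn's Gaussian HMM is a simplified stand-in
# for MoSeq's actual model; all numbers here are arbitrary.
import numpy as np
from sklearn.decomposition import PCA
from hmmlearn import hmm

rng = np.random.default_rng(0)
frames = rng.normal(size=(5_000, 80 * 80))  # fake flattened depth images

# Step 1: reduce each frame to a handful of pose principal components.
pose_pcs = PCA(n_components=10).fit_transform(frames)

# Step 2: fit an HMM; each hidden state is a candidate syllable.
model = hmm.GaussianHMM(n_components=20, covariance_type="diag", n_iter=10)
model.fit(pose_pcs)

# Step 3: label every frame with its most likely syllable.
syllables = model.predict(pose_pcs)
print(syllables[:20])
```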

Previously, most researchers scored mouse behavior by hand, noting when the animal rears its body, for example. But this type of analysis is inherently biased because it relies on the observer’s designation of the components of mouse behavior. MoSeq generates an unbiased account of behavior and can detect components of behavior too fast for human observation.

The original MoSeq study uncovered a novel structure in mouse behavior, revealing that mice change their behavior roughly every 300 milliseconds. The resulting architecture shares some features with language — the behavior is made up of a defined series of ‘syllables,’ such as running, pausing and rearing, that are strung together according to certain patterns. Mice follow a predictable sequence of syllables when they smell fox urine, for example. “The ordering between the syllables is probabilistic,” Datta says. “If I know the mouse is expressing syllable A right now, I know in the future he is very likely to express B, somewhat likely to express C, and not at all likely to express D.”
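
Datta’s quote describes what amounts to a matrix of transition probabilities between syllables. A toy example, with invented numbers for four hypothetical syllables A through D, might look like this:

```python
# Toy illustration of probabilistic syllable ordering, mirroring the quote
# above: after syllable A, the mouse is very likely to emit B, somewhat
# likely to emit C, and (here) never D. All numbers are invented.
import numpy as np

syllables = ["A", "B", "C", "D"]
# transitions[i, j] = P(next syllable is j | current syllable is i)
transitions = np.array([
    [0.05, 0.70, 0.25, 0.00],  # from A: mostly B, sometimes C, never D
    [0.30, 0.05, 0.40, 0.25],  # from B
    [0.25, 0.45, 0.05, 0.25],  # from C
    [0.40, 0.20, 0.35, 0.05],  # from D
])

rng = np.random.default_rng(1)
state = 0  # start at A
sequence = [syllables[state]]
for _ in range(15):
    state = rng.choice(4, p=transitions[state])
    sequence.append(syllables[state])
print(" -> ".join(sequence))
```

Sampling from such a matrix reproduces the kind of probabilistic grammar the quote describes; fitting one to real syllable labels is how that grammar would be estimated.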

To date, the bulk of behavioral neuroscience research has focused on highly overtrained behaviors, such as when an animal, in response to a cue, receives a reward for correctly pulling on a lever, licking a tube or poking its nose into a port. Techniques such as MoSeq allow neuroscientists to better study natural, untrained behaviors. “Most of what animals do in the real world is about using their bodies to explore the world naturally, and until recently we’ve lacked the technology to understand the structure of behavior in these sorts of contexts,” Datta says. “In a way, MoSeq opens a window into how the brain might work under these more natural conditions.”

In the new study, published in Cell in June, Datta and his collaborators moved from simply characterizing behavior to studying the neural activity underlying it. To do this, they first had to figure out how to track behavior and neural activity simultaneously, itself a technical challenge. Neural recordings require a device on top of the animal’s head, which obstructs the Kinect camera’s view of the mouse. The researchers solved the problem by developing a probabilistic technique to infer the animal’s 3-D position beneath the recording device.
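
The paper’s actual approach is probabilistic; as a rough illustration of the problem, the sketch below simply masks out a patch of a synthetic depth frame where a headstage might sit and fills it in by interpolating from the surrounding pixels. Every value here is made up, and linear interpolation is a crude stand-in for the real inference.

```python
# A crude stand-in for the occlusion problem: a head-mounted device blocks
# part of the depth image, and the body underneath must be inferred. Here
# we just interpolate the masked pixels from their surroundings, on
# synthetic data throughout.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(2)
depth = rng.normal(loc=50.0, scale=2.0, size=(80, 80))  # fake depth frame (mm)

# Suppose the headstage occludes a 15 x 15 patch near the head.
mask = np.zeros_like(depth, dtype=bool)
mask[20:35, 30:45] = True

rows, cols = np.indices(depth.shape)
known = ~mask
filled = depth.copy()
filled[mask] = griddata(
    points=np.column_stack([rows[known], cols[known]]),
    values=depth[known],
    xi=np.column_stack([rows[mask], cols[mask]]),
    method="linear",
)
print("occluded pixels filled:", np.isfinite(filled[mask]).all())
```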


Simultaneous imaging of both behavior and neural activity in the striatum. Left: 3-D cameras image mice as they explore an open field. The top image shows the mouse digitally aligned along the axis of its spine, with its nose to the right. The bottom image shows the animal’s position. Right: The MoSeq algorithm uses these data to determine behavioral syllables (top), represented as colored blocks, and aligns them to neural activity (bottom). Each trace represents responses from a single medium spiny neuron in the dorsolateral striatum. Credit: Datta lab.

They then analyzed neural activity while simultaneously running MoSeq, revealing that the structure of neural activity correlates with the structure of behavior — both share a similar rhythm, changing on a timescale of 200 to 300 milliseconds. “That helped convince us we were on track in understanding something about how the brain creates the structure of behavior,” Datta says.
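
One simple way to compare the two rhythms, sketched below on synthetic data, is to measure the typical duration of behavioral syllables and the decay time of the neural population’s autocorrelation. The frame rate, syllable durations and neural timescale are all invented for the example; the study’s actual analyses were more involved.

```python
# Compare two rhythms on synthetic data: the typical syllable duration and
# the decay time of neural autocorrelation. Both are built to land in the
# 200-300 ms range the article describes.
import numpy as np

rng = np.random.default_rng(3)
dt_ms = 33  # roughly a 30 Hz depth-camera frame rate

# Fake syllable labels: runs of random labels with ~300 ms average duration.
run_lengths = rng.poisson(300 // dt_ms, size=500).clip(min=1)
labels = np.repeat(rng.integers(0, 40, size=500), run_lengths)

change_points = np.flatnonzero(np.diff(labels) != 0) + 1
bounds = np.concatenate([[0], change_points, [len(labels)]])
durations = np.diff(bounds) * dt_ms
print(f"median syllable duration: ~{np.median(durations):.0f} ms")

# Fake population activity whose autocorrelation decays over ~250 ms.
tau_frames = 250 / dt_ms
x = np.zeros(len(labels))
for t in range(1, len(x)):
    x[t] = x[t - 1] * np.exp(-1 / tau_frames) + rng.normal()
acf = [np.corrcoef(x[:-lag], x[lag:])[0, 1] for lag in range(1, 30)]
decay = next(lag for lag, r in enumerate(acf, start=1) if r < np.exp(-1))
print(f"neural autocorrelation decay time: ~{decay * dt_ms} ms")
```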

The researchers also wanted to better understand the role the striatum plays in planning movements and actions. The striatum contains two basic components — the direct and indirect pathways. Some research suggests the two pathways act like an on-off switch for actions: the direct pathway triggers a movement, and the indirect pathway inhibits it. But some experiments contradict this antagonistic model by showing that the two pathways are often active at the same time, suggesting that they collaborate to regulate behavior.

Datta’s team also found that both pathways are active during movement. Analyzing the two pathways during individual syllables of behavior revealed a more precise pattern — each syllable has its own unique combination of direct and indirect activity. “It’s not that one is telling you to go and one to stop,” Datta says. “Both contribute to the encoding of behavior.” A second paper published in the same issue of Cell found similar results — that the direct and indirect pathways serve complementary roles with respect to behavior.
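
The idea that each syllable carries its own mix of pathway activity can be illustrated by averaging each pathway’s signal within each syllable, yielding a per-syllable ‘fingerprint.’ The sketch below does this on synthetic data; the activity values and syllable counts are invented, not taken from the paper.

```python
# Sketch of the per-syllable "fingerprint" idea: average direct- and
# indirect-pathway activity within each behavioral syllable, giving each
# syllable its own combination of the two. All values are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n_frames, n_syllables = 20_000, 40
labels = rng.integers(0, n_syllables, size=n_frames)  # frame-wise syllable ID

# Fake pathway activity: each syllable gets its own mean level per pathway.
direct = rng.normal(size=n_frames) + 0.1 * labels
indirect = rng.normal(size=n_frames) + 0.1 * (n_syllables - labels)

fingerprints = np.zeros((n_syllables, 2))
for s in range(n_syllables):
    in_syllable = labels == s
    fingerprints[s] = [direct[in_syllable].mean(), indirect[in_syllable].mean()]

# Each row is one syllable's (direct, indirect) activity combination.
print(fingerprints[:5].round(2))
```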

Studying behavior at a finer temporal resolution — knowing moment by moment what the animal is doing — may help reconcile the two models, Datta says. When looking at activity across several hundred milliseconds, both pathways appear active. But on shorter timescales, activity in the two pathways appears to be very different. Under those conditions, “we see rich decorrelation, consistent with two pathways playing distinct roles in action selection,” Datta says. “We’re able to take two seemingly contradictory results and reconcile them because MoSeq allows us to understand in fine detail what the animal is doing with its body.”
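
The timescale argument can be made concrete with a small simulation: two signals that share slow co-modulation but have independent fast structure look correlated once smoothed over hundreds of milliseconds, and decorrelated at fine resolution. The signals below are synthetic and deliberately constructed to behave this way; they are not recordings.

```python
# Two signals sharing slow co-modulation (both pathways are "on" during
# movement) but with independent fast structure: correlated when smoothed
# coarsely, decorrelated at fine resolution. Synthetic by construction.
import numpy as np

rng = np.random.default_rng(5)
dt_ms, n = 10, 60_000
slow = 3 * np.convolve(rng.normal(size=n), np.ones(100) / 100, mode="same")
direct = slow + rng.normal(size=n)    # shared slow drive + private fast part
indirect = slow + rng.normal(size=n)

for window_ms in (10, 50, 200, 800):
    w = max(1, window_ms // dt_ms)
    kernel = np.ones(w) / w
    r = np.corrcoef(np.convolve(direct, kernel, mode="same"),
                    np.convolve(indirect, kernel, mode="same"))[0, 1]
    print(f"{window_ms:>4} ms window: r = {r:.2f}")
```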

Disrupting both of these pathways impairs the animal’s ability to string together behavioral sequences the way normal mice do, though the animals can still perform individual syllables of behavior. “Mice can still run around, rear up and groom themselves, but the order is all wrong,” Markowitz says. “It destroys the statistical structure of behavior but not the execution of behavior itself.”

As part of Datta and Sabatini’s ongoing SCGB project, the team is working with SCGB fellow Scott Linderman, an expert in machine learning, to develop new methods to detect shared structure in the brain and behavior. Both datasets are incredibly complex — high dimensional and operating on multiple timescales simultaneously — making it particularly challenging to understand the relationship between them. “We need statistical tools that allow us to relate complex data to complex data,” Datta says. “We are trying to advance the machine-learning techniques we’ve developed to analyze behavior to look at the shared structure in behavior and neural activity.”
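
As one illustration of what relating complex data to complex data can mean, canonical correlation analysis (CCA) finds paired low-dimensional projections of two datasets that covary with each other. The sketch below is an off-the-shelf example on synthetic data, not the method the team is developing.

```python
# Illustration only: CCA finds paired projections of neural and behavioral
# features that covary. Both datasets here are synthetic, built around a
# shared low-dimensional latent signal.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(6)
n_frames = 5_000
shared = rng.normal(size=(n_frames, 3))  # latent structure present in both

neural = shared @ rng.normal(size=(3, 100)) + rng.normal(size=(n_frames, 100))
behavior = shared @ rng.normal(size=(3, 10)) + rng.normal(size=(n_frames, 10))

cca = CCA(n_components=3)
neural_proj, behavior_proj = cca.fit_transform(neural, behavior)

# Correlation between each pair of canonical variates.
for k in range(3):
    r = np.corrcoef(neural_proj[:, k], behavior_proj[:, k])[0, 1]
    print(f"canonical pair {k}: r = {r:.2f}")
```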

Datta says these types of tools will be essential moving forward, as researchers collect ever more detailed data on neural activity and behavior. “You might imagine that there are ensembles of neurons that correlate with the expression of particular syllables,” Datta says. “It could be that if the model has access to information about what is going on in the brain, it will have a more precise ability to identify a particular syllable as being expressed, even under conditions where the behavioral data itself is noisy.”

In the meantime, others are starting to use MoSeq. “People are very interested in using MoSeq to explore richer behavior, such as sensory-driven and social behavior,” Pillow says. The software is free and publicly available to academics. (See the Datta lab website for more information.) Datta says that in its current state, the software is best suited to those who have experience with machine learning and behavior. But the researchers are working on making it more accessible. “We want this kind of analysis to spread throughout the community,” he says.
