Textbooks have long divided the brain into dozens of discrete areas, each with its own set of functions. Students learn that the hippocampus is crucial for learning and memory. The cerebellum coordinates movements. The prefrontal cortex is the seat of higher cognition, handling processes such as decision-making.
This tidy picture might work well for introductory classes, but in reality, even the simplest thought or action involves many brain regions, each containing thousands or millions of neurons. Surprisingly little is known about how brain areas interact and cooperate to interpret sensations and generate behaviors. To tackle this puzzle, the Simons Collaboration on the Global Brain (SCGB) held its first Multiregional Models of Population Coding workshop on January 19, bringing together leading experimental and theoretical neuroscientists from around the world to discuss how brain regions work together. Increasing investment in neurotechnology — including that by the U.S. BRAIN Initiative — has finally given scientists the tools they need to observe large swaths of the brain in living animals in unprecedented detail.
Having the tools is one thing. The challenge now is to apply these tools in ways that will lead to a better understanding of the brain. “We are still in the dark about basic facts concerning multiregional interactions and function,” said workshop co-organizer Larry Abbott, an SCGB investigator and co-director of Columbia University’s Center for Theoretical Neuroscience. “How do multiple brain regions divide up the components of a task? What assures that the proper information is conveyed from one region to another?” Fortunately, he said, “we are finally getting the appropriate data to begin to figure this out.”
Misha Ahrens, an SCGB investigator and a group leader at the Howard Hughes Medical Institute’s Janelia Research Campus, presented a spectacular example of what the new technology can accomplish. On a monitor screen, thousands of tiny specks of colored light twinkled in bursts and waves, circumscribed by the faint outline of the body of a larval zebrafish. Over the past several years, Ahrens and his colleagues have pioneered a technique that monitors the activity of almost all the neurons in the larval zebrafish brain, and this display showed what tens of thousands of neurons were doing.
Ahrens and his collaborators genetically modified the animal so that the fluorescence of molecules in its neurons changes whenever calcium ions enter them, a process that serves as a proxy for electrochemical activity. Because the animal is naturally transparent, the researchers were able to apply sophisticated microscopy techniques to see the neurons. Doing so enabled Ahrens to observe interactions among brain regions in unusual detail. With this setup, he can ask more in-depth questions. “Simple analyses are dominated by large populations of neurons doing similar things,” he said. His approach can capture “more complex, nonlinear interactions among neurons across the brain, or interactions depending on just a few neurons.” In new work he presented at the workshop, he showed that a small number of neurons that release serotonin react only when sensory information coincides with motor information. Ahrens surmised that these neurons could detect when the fish’s own movements alter its sensory input — an important ability for distinguishing self-generated sensory stimulation from environmental input.
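The computation these serotonergic neurons appear to perform — responding only when sensory and motor signals arrive together — can be illustrated with a toy model. This is a hypothetical sketch for intuition only, not Ahrens’ actual analysis; the function name, threshold, and 0/1 event coding are all illustrative assumptions:

```python
import numpy as np

def coincidence_detector(sensory, motor, threshold=1.5):
    """Toy unit that fires only when sensory and motor events co-occur.

    `sensory` and `motor` are arrays of 0/1 events per time step. The
    unit's drive is their sum, so it crosses the threshold only when
    both inputs are active in the same time step.
    """
    drive = np.asarray(sensory) + np.asarray(motor)
    return (drive > threshold).astype(int)

# Sensory input alone, motor command alone, then both together:
sensory = np.array([1, 0, 1, 1])
motor   = np.array([0, 1, 0, 1])
print(coincidence_detector(sensory, motor))  # → [0 0 0 1]
```

In this caricature, neither a sensory event nor a motor command alone drives the unit; only their coincidence does — the signature that would let a fish flag sensory input caused by its own movement.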
Other presenters at the meeting described work based on data similarly recorded from many neurons at once, spanning organisms from the tiny worm C. elegans, with its 302 neurons, to the human brain, with its nearly 100 billion. David Van Essen, a professor of neuroscience at Washington University in St. Louis, Mo., presented some of the early discoveries from the Human Connectome Project, a five-year, $30-million effort by the National Institutes of Health to map large-scale connections in the human brain. A consortium of over 100 researchers at 10 institutions is employing cutting-edge brain imaging techniques — including functional magnetic resonance imaging — to observe activity and trace the pathways of fiber bundles. In his presentation, Van Essen described the analysis of over 1,000 brains. Researchers used a combination of information from many imaging modalities to generate a much-improved map of the cerebral cortex, which will enable neuroscientists to better navigate the human brain.
The kind of data collected by Ahrens and Van Essen, said Xiao-Jing Wang, a workshop co-organizer and SCGB investigator who co-directs New York University’s Sloan-Swartz Center for Theoretical Visual Neuroscience, will allow neuroscience to move into new territory. “Neuroscientists have been mostly focused on local circuits,” he said, “but really important questions depend on interactions between brain areas.” Wang has personally invested in this approach, as his lab recently shifted focus from local cortical circuits in the prefrontal cortex to large-scale circuits across cortical brain regions. “A really important, fundamental issue,” he said, “is how information flows from area to area.”
Cora Ames, a postdoctoral researcher at Columbia University and an SCGB fellow studying with SCGB investigators Mark Churchland and Larry Abbott, is addressing that very issue using a combined experimental and theoretical approach. Her question is simple: What happens to the motor cortices on the two sides of our brains when we move our hands independently, as opposed to when we move them in sync? “Sometimes we want our hands coordinated, but not all the time,” Ames said. “It’s the same problem as rubbing your tummy and patting your head at the same time.” Because each arm is controlled by the motor cortex in the opposite hemisphere of the brain, these two brain regions must communicate differently depending on how coordinated the movements need to be. “Yet no one really knows much about how that information flows or is gated,” Ames said.
Ames is currently training monkeys to turn pedals with their hands, either in or out of sync. She then records the electrical activity of several neurons at once in the motor cortex on each side of the brain. She has also built a computer model of those regions to simulate the coordinated brain activity. The model generates a sinusoidal output that is analogous to signals in the monkey’s muscles when it turns the pedals. She can vary the strength and complexity of the connections between the two regions in the model and determine how strongly they coordinate the output. The modeling will ultimately enable her to predict what she will observe in the monkeys’ actual brains during the task.
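The flavor of such a model can be sketched in a few lines. This is a generic illustration, not Ames’ actual model: it treats each motor cortex as a simple 2-D oscillator whose readout is sinusoidal (loosely analogous to the rhythmic muscle signals during pedaling), with a single coupling parameter standing in for the inter-hemispheric connections; all names and parameter values here are assumptions:

```python
import numpy as np

def simulate_two_cortices(coupling=0.1, steps=2000, dt=0.01):
    """Two oscillatory 'hemispheres', each a 2-D linear rotation with a
    slightly different intrinsic frequency, coupled with strength
    `coupling`. Returns the sinusoidal readout of each region over time.
    """
    def rot(omega):
        return np.array([[0.0, -omega], [omega, 0.0]])
    A1, A2 = rot(2.0), rot(2.3)       # distinct intrinsic frequencies
    x1, x2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    out = np.zeros((steps, 2))
    for t in range(steps):
        # Each region follows its own dynamics plus input from the other
        dx1 = A1 @ x1 + coupling * (x2 - x1)
        dx2 = A2 @ x2 + coupling * (x1 - x2)
        x1, x2 = x1 + dt * dx1, x2 + dt * dx2
        out[t] = x1[0], x2[0]
    return out

# Stronger inter-region coupling pulls the two outputs into sync:
weak = simulate_two_cortices(coupling=0.0)
strong = simulate_two_cortices(coupling=1.0)
sync = lambda o: abs(np.corrcoef(o[1000:, 0], o[1000:, 1])[0, 1])
print(sync(weak) < sync(strong))  # → True
```

Sweeping the coupling strength in a model like this — and comparing the resulting degree of output coordination against recordings — is the general logic behind using such simulations to predict what the two motor cortices should do during synchronized versus independent movements.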
Neuroscience, according to Larry Abbott, is long overdue for research questions that get scientists to think outside of their comfort zone. “When asked what we study,” he said, “most of us answer by identifying a region of the brain. We’ve compartmentalized the brain, but that’s only a crude approximation of the truth.” He hopes this workshop will prompt researchers to “go back to their labs, and think about the data a little differently, build models a little differently.” It could also result in more direct cross-pollination. “I would be surprised if new collaborations did not come out of this meeting.”
Such collaborations will be necessary for researchers to take the next big steps in brain research. Most neuroscientists agree that conceptual breakthroughs have not kept pace with the explosion of technical advances in the past decade. That’s why Cori Bargmann, a neuroscientist at Rockefeller University and co-chair of the committee whose report led to the creation of the BRAIN Initiative, was quick to point out that conferences such as this one are a crucial way of complementing worldwide efforts to develop neurotechnology. “There is a huge need for science to use the technologies being created by the BRAIN Initiative intelligently,” she said. The work presented at this conference, she believes, fits the bill. “This is not about technology, it’s about science.”