Dora Angelaki, Ph.D. Baylor College of Medicine
Xaq Pitkow, Ph.D. Baylor College of Medicine
Imagine catching a firefly. The task involves multiple steps: seeing the firefly, estimating its location between flashes, deciding when to move to catch it, and then moving. Each step engages different brain areas (for example, seeing the firefly activates visual processing regions, whereas moving activates motor systems), yet how these areas interact to turn sensory experience into action remains a mystery.

We have designed an experiment to probe these interactions using, fittingly, virtual fireflies. We trained monkeys to forage through a virtual environment for flashing specks of light. As in real life, this virtual task engages a variety of brain regions, including those involved in sensory and perceptual processing, navigation, decision-making, and movement. Whereas previous work has studied each of these brain regions individually, we take the research one step further by studying many regions at the same time. Using sophisticated recording technology, we simultaneously monitor the electrical activity of neurons in each brain region, allowing us to observe the flow of information from region to region as the animal performs the foraging task. Because even the simplest tasks require precise coordination among neural networks distributed across many different brain areas, this approach will be broadly applicable to studying many tasks the brain performs.