Normalization as a Canonical Neural Computation

David Heeger

It is hypothesized that the computations performed by the brain are modular, and are repeated across brain regions and modalities to apply similar operations to different problems. A candidate for such a canonical neural computation is normalization, whereby the responses of a neuron are divided by a common factor, which typically includes the summed activity of the local population of neurons. Normalization was developed to explain responses in primary visual cortex, and it is now thought to operate throughout the visual system and in multiple other sensory modalities and brain regions. Normalization may underlie operations as diverse as the deployment of visual attention, the encoding of value in parietal cortex, and the integration of multisensory information. It is present not only in mammals but also in the neural systems of invertebrates, suggesting that it is a computation that developed at an early stage in evolution. I will present the normalization model of neural computation, describe some empirical tests of the model, and elaborate the hypothesis that dysfunctions of normalization may be associated with schizophrenia, amblyopia, epilepsy, and autism spectrum disorders.
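The division by a common factor described above is often written as R_i = γ·D_i^n / (σ^n + Σ_j D_j^n), where D_i is a neuron's driving input, σ is a semi-saturation constant, and the sum runs over the local population. The sketch below illustrates that form; the parameter names and default values are illustrative, not taken from the talk.

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0, gamma=1.0):
    """Divide each neuron's (exponentiated) driving input by a common
    factor: the summed activity of the local population plus a
    semi-saturation constant sigma raised to the same power n."""
    drive = np.asarray(drive, dtype=float)
    pooled = sigma**n + np.sum(drive**n)
    return gamma * drive**n / pooled

# Example: three neurons with different driving inputs.
responses = divisive_normalization([1.0, 2.0, 4.0])
```

One consequence of this form is that, at high drive, the ratio of responses across neurons is preserved even as absolute responses saturate, which is one way the model accounts for contrast-invariant tuning in visual cortex.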

About the Speaker:

David Heeger is a Professor of Psychology and Neural Science at New York University, where he is a member of the Center for Brain Imaging. His research focuses on biological and artificial vision. He has made a number of influential contributions, including a model of how motion can be measured from optic flow, a nonlinear model of responses in the visual cortex called the "normalization model", and a method for texture synthesis. Since the late 1990s he has been at the forefront of the field of functional magnetic resonance imaging (fMRI).

