Each year, the Simons Foundation requests nominations from a targeted list of institutions in the United States, Canada, the United Kingdom and Ireland for the Simons Investigator programs. Simons Investigators are outstanding theoretical scientists who receive a stable base of research support from the foundation, enabling them to undertake the long-term study of fundamental questions.
Only nominations from institutions that receive the request will be accepted. The Math+X and MMLS programs have been discontinued and the foundation will not be requesting future nominations. Please contact firstname.lastname@example.org for more information.
Simons Investigators in Mathematics, Physics, Astrophysics and Computer Science
The intent of the Simons Investigators in Mathematics, Physics, Astrophysics and Computer Science programs is to support outstanding theoretical scientists in their most productive years, when they are establishing creative new research directions, providing leadership to the field and effectively mentoring junior scientists. Starting in 2020, up to two Simons Investigator in Physics awards will be granted to well-established researchers who develop and apply advanced theoretical physics ideas and methods in the life sciences.
A Simons Investigator is appointed for an initial period of five years. Renewal for an additional five years is contingent upon an evaluation of the Investigator's scientific impact. An Investigator receives research support of $100,000 per year. An additional $10,000 per year is provided to the Investigator's department. The Investigator's institution receives an additional 20 percent in indirect costs.
To be an Investigator, a scientist must be engaged in theoretical research in mathematics, physics, astrophysics or computer science and must not previously have been a Simons Investigator. The scientist must have a primary appointment as a tenured faculty member at an educational institution in the United States, Canada, the United Kingdom or Ireland, on a campus within these countries, and the primary department affiliation must have a Ph.D. program.
Simons Investigators in Mathematical Modeling of Living Systems (MMLS)
This program aims to support the research careers of outstanding scientists working on mathematical and theoretical approaches to topics in the life sciences. A Simons Investigator in MMLS is appointed for five years.
Math+X Investigators
This program encourages novel collaborations between mathematics and other fields in science or engineering by providing funds to professors to establish programs at the interface between mathematics and other fields of science or engineering. A Math+X Investigator is appointed for an initial period of five years. Renewal for an additional five years is contingent upon an evaluation of the Investigator's scientific impact.
Ivan Corwin works at the interface of probability and mathematical physics, with a particular interest in exactly solvable probabilistic models and stochastic partial differential equations. Much of his work has centered on the Kardar-Parisi-Zhang equation and its universality class.
Nick Sheridan's work centers on Kontsevich's homological mirror symmetry conjecture, which posits a deep relationship between symplectic topology and algebraic geometry. Working mainly on the symplectic side, he has developed tools for proving the conjecture and applied them to prove it in a number of cases, most notably the quintic threefold. In cases where the conjecture is established, Sheridan has given applications to enumerative geometry and symplectic topology.
Wei Zhang is a number theorist working on automorphic representations and arithmetic geometry. He studies special values of zeta and L-functions and their relation to periods and heights of algebraic cycles on moduli spaces over global fields. A focus of his research is various high-dimensional generalizations of the Gross-Zagier formula. In particular, Zhang proposed the relative trace formula approach, which has led to many new deep questions relating the harmonic analysis on spaces with a group action over local fields to the arithmetic intersection theory on local Shimura varieties.
Chenyang Xu works in algebraic geometry, with an emphasis on understanding the structure of higher dimensional algebraic varieties. With his collaborators, Xu led the establishment of a rich algebraic K-stability theory for Fano varieties, crowned by the novel construction of projective K-moduli spaces parametrizing Fano varieties. The new algebraic method invented by Xu and his collaborators, largely built on the minimal model program in birational geometry, also provides a solution to the algebraic Yau-Tian-Donaldson conjecture for all Fano varieties, and a radically new singularity theory.
Ehud Altman studies quantum many-body phenomena in condensed matter and quantum information systems. He is known for introducing dynamical renormalization group methods to describe many-body localization and for establishing the nature of large-scale fluctuations in exciton-polariton condensates. Altman’s recent work has contributed to elucidating the role of quantum information, entanglement and chaos in the dynamics of many-body systems. With collaborators, he introduced the notion of Krylov complexity as a holographic-like principle for quantum operator growth in generic systems and helped to understand information phase transitions in quantum circuits using tools from field theory and statistical mechanics.
Michael Levin’s research combines ideas from condensed matter physics, quantum information and mathematics with a focus on the theory of topological phases of matter. In one line of research, he has constructed exactly solvable lattice models that realize a general class of two-dimensional strongly interacting topological phases. These lattice models have become a useful theoretical tool for studying anyons. He has also introduced a way to probe topological phases using entanglement entropy, which has given rise to new numerical methods. Recently, Levin has made contributions to the theory of the bulk-boundary correspondence for topological matter, both in equilibrium and in periodically driven systems.
Mariangela Lisanti is a theoretical astroparticle physicist studying the nature of dark matter. Her research is interdisciplinary and often involves application of novel data science techniques or collaborations with experimentalists and observers. Lisanti helped to pioneer the use of simplified models in Large Hadron Collider searches, proposed new directions for dark matter experiments and developed original analysis methods for studying dark matter annihilation in gamma rays. Most recently, Lisanti has been harnessing data from astrophysical surveys to probe the fundamental nature of dark matter and map its distribution in the Milky Way.
Douglas Stanford works on quantum gravity and its connection to quantum mechanics, mainly in the context of simple toy models of black holes. He and his collaborators showed that black holes exhibit the butterfly effect and used this as a starting point to explore chaos in quantum many-body systems. Stanford’s recent work has focused on aspects of black holes that appear in tension with the rules of ordinary quantum mechanics and, in particular, on the search for spacetime effects that can resolve the tension.
Jesse Thaler is a theoretical particle physicist who fuses techniques from quantum field theory and machine learning to address questions in fundamental physics. His current research focuses on maximizing the discovery potential of the Large Hadron Collider through new theoretical frameworks and novel data analysis techniques. Thaler is an expert in jets, which are collimated sprays of particles copiously produced at colliders, and he studies the substructure of jets to enhance the search for new phenomena and illuminate the dynamics of gauge theories.
Olga Dudko is a theoretical physicist who explores the phenomena of the living world. Her research is driven by the notion that deep, physics-based conceptual approaches can encompass living-system complexity. The theory of single-molecule force spectroscopy developed by Dudko and collaborators has been widely used for extracting activation energies and rate constants from experiments on conformational transitions in biological macromolecules. Recent areas of research include the spatiotemporal organization of chromosomes, virus-host cell interactions and neuronal communication. Dudko’s research strives for a unifying understanding of disparate biological processes through analytically tractable theories that reveal unifying principles and are predictive in experiments.
Joshua Weitz explores how viruses transform the fate of cells, individuals, populations and ecosystems. In collaboration with experimentalists, his research in viral ecology has revealed latent structures of virus-host infection networks, identified principles underlying the therapeutic use of viruses against bacterial pathogens and shown how repeated infections catalyze the diversification of complex virus-microbe communities. Weitz's recent and ongoing work on pandemic dynamics examines how behavior change and asymptomatic spread shape outbreaks and can be used to inform data-driven interventions.
Alex Schekochihin is interested in the fundamental nature and practical implications of turbulence in plasmas. His main contributions have concerned the physics of the turbulent dynamo (believed responsible for much of the observed cosmic magnetism), the free-energy cascade and the interplay between microscale instabilities and mesoscale dynamics in astrophysical plasmas from galaxy clusters to the solar wind, and mechanisms for “phase transitions” between low- and high-transport states in fusion devices. Recently, Schekochihin proposed a theory of the fluidization of collisionless plasma turbulence, owing to the suppression of Landau damping by stochastic echoes. This has led to his ongoing interest in the general problem of turbulent relaxation and universal equilibria in collisionless plasmas.
Tracy Slatyer is a theoretical physicist working at the interface of particle physics, cosmology and astrophysics, seeking clues to the mystery of dark matter in astrophysical and cosmological data. She was a co-discoverer of the giant gamma-ray structures known as the “Fermi Bubbles” and has done influential work on new theories of dark matter and possible effects of dark matter interactions from the early universe to the present day.
Shayan Oveis Gharan’s research exploits deep tools from mathematics, such as the theory of real stable and log-concave polynomials, spectral graph theory and high-dimensional simplicial complexes, to design and analyze algorithms for discrete objects. He is known for his improved approximation algorithms for classical optimization problems such as the traveling salesperson problem, as well as his analysis of the mixing time of Markov chains to efficiently sample from complex probability distributions, such as the uniform distribution over the bases of a matroid.
Shachar Lovett works broadly in theoretical computer science and related mathematics. He focuses on the study of structure and randomness, and how they are pivotal to our understanding of efficient computation. One facet of Lovett’s work is discovering the mathematical structures that underlie efficient computation, such as combinatorial structure in complexity theory, geometric structure in machine learning and algebraic structure in coding theory. Another facet is understanding the power of randomness in computation. Structure and randomness can be seen as complementary, and Lovett’s work aims to identify the fracture lines between structure and randomness, towards a better understanding of computation.
Gregory Valiant works at the intersection of algorithms, information theory, learning and statistics to understand how to extract as much information as possible from data in various fundamental settings. What information can be inferred if the amount of data is sublinear in the support size or dimensionality of the distribution in question? How do restrictions on the amount of available memory affect the time or amount of data required to learn or optimize? Can we make learning and estimation algorithms robust to outlying training data or settings where the test and training sets are dissimilar?