Determining the quantum mechanical behavior of a large number of interacting electrons is one of the grand challenges of modern science. The solution of this ‘many electron problem’ is important because electrons determine the physical properties of materials and molecules: whether, for example, they are hard or soft, reactive or inert, conducting or insulating, superconducting or magnetic, good at converting solar radiation to more useful forms of energy or not.
Recent years have seen spectacular progress in unraveling the subtleties of quantum mechanics, improvements in ideas, algorithms and hardware that have transformed our ability to perform numerical computations, and impressive advances in our understanding of the physics and chemistry of interacting electrons in molecules and bulk materials. The Simons Collaboration on the Many Electron Problem aims to take advantage of these developments. It brings an internationally acclaimed group of scientists together to refine these new approaches and to combine the different strands of progress into a new set of tools for predicting the structure and electronic properties of the molecules and materials important to all of us.
The physical properties of materials and molecules (whether, for example, they are hard or soft, reactive or inert, conducting or insulating, superconducting or magnetic) are determined by the quantum mechanical behavior of electrons subject to the forces created by the nuclear charges of the constituent atoms and the forces due to the other electrons. The equations governing the many electron problem were formulated by the British physicist P.A.M. Dirac more than 80 years ago but have proven extraordinarily difficult to solve. Methods developed from the 1950s to the 1970s provided vital information, and improvement of these methods continues, but the problem is not yet solved.
Recent progress is changing the situation. Research into quantum computing and quantum information theory has transformed our understanding of quantum mechanical entanglement. Vast improvements in computers and computational methods have expanded our capabilities. Experimentalists have uncovered new physics that is simply not understandable in terms of the approximate methods developed over the years (e.g., the high transition temperature superconductivity observed in compounds of copper and oxygen and also in compounds of iron and arsenic). This has forced the community to acknowledge the limits of existing methods and has driven a search for new concepts.
Exciting new approaches have emerged, including:
- Self-consistent embedding (dynamical mean-field) methods that isolate a relatively small subsystem, treat it in full detail, and embed it self-consistently into a wider (but approximately treated) electronic structure.
- ‘Matrix product state’ and ‘tensor network’ computational methods derived from our newly improved understanding of quantum mechanical entanglement.
- New classes of Monte Carlo methods for stochastic exploration of abstract spaces, such as the space of Feynman diagrams or the space of Slater determinants.
- Efficient methods for performing electronic structure calculations.
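The "stochastic exploration of abstract spaces" mentioned above can be illustrated on a deliberately trivial example. The sketch below is a hypothetical toy, not any group's production algorithm: a Metropolis walk over the orders of a geometric series, using the standard diagrammatic Monte Carlo trick of recovering the overall normalization from the visit frequency of a single reference term whose weight is known.

```python
import random

def sample_series_sum(weight, n_steps, seed=0):
    """Estimate S = sum_n weight(n) over n = 0, 1, 2, ... by a Metropolis
    walk whose stationary distribution is P(n) = weight(n) / S.
    Since P(0) = weight(0) / S, the sum is recovered as
    S = weight(0) * n_steps / (number of visits to n = 0)."""
    rng = random.Random(seed)
    n, visits_0 = 0, 0
    for _ in range(n_steps):
        proposal = n + rng.choice((-1, 1))   # propose a neighboring order
        if proposal >= 0 and rng.random() < min(1.0, weight(proposal) / weight(n)):
            n = proposal                      # accept; otherwise stay put
        if n == 0:
            visits_0 += 1
    return weight(0) * n_steps / visits_0

# Geometric series: sum_n 0.5**n = 2, so the estimate should land near 2.
estimate = sample_series_sum(lambda n: 0.5 ** n, n_steps=200_000)
```

In a real diagrammatic Monte Carlo calculation the integer order is replaced by a diagram topology together with its internal variables, and the weights can change sign, but the normalization idea is the same.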
Although the different approaches have been developed by different groups, working in different subfields and with different motivations, they are complementary and together define a coherent basis for a new attack on the many electron problem.
The Simons Collaboration on the Many Electron Problem aims to bring together key scientists to develop, implement and test these new ideas, with the ultimate goal of producing a set of concepts, methods and codes widely useful in physics, materials science and chemistry. The collaboration will provide research and travel funding for a core group of the leading scientists working on new approaches to the many electron problem, with additional support to involve others as appropriate. The collaboration will also provide support for algorithm and software development and distribution and will run a summer school to train junior scientists in the new ideas and techniques. Via meetings at the foundation and elsewhere, the collaboration will foster scientific interactions and build the scientific community needed to devise, develop, evaluate and apply the new approaches.
The four initial research areas, key participants, and research activities are:
Cluster Embedding and Diagrammatic Monte Carlo Methods:
Director: A. Georges (Collège de France and École Polytechnique)
Participants: M. Ferrero (École Polytechnique), E. Kozik (King’s College London; jointly with the Real Materials Group), D. Zgid (University of Michigan)
The goal is to develop and implement new, unbiased approaches to many electron systems by constructively combining cluster embedding approaches and diagrammatic Monte Carlo (DiagMC) techniques. The initial research activity will focus on understanding the analytic and convergence properties of Feynman diagrammatic perturbation series. Recent progress allows these questions to be addressed at a quantitative level using DiagMC codes. The main question is how the convergence of the series can be controlled and improved. The next step is to use cluster embedding techniques as a powerful tool to sum large classes of diagrams (with the use of state-of-the-art continuous-time quantum Monte Carlo algorithms) and further improve the convergence of the original perturbation series. The developments will be benchmarked on the Hubbard model, aiming at a description of the pseudogap region of its phase diagram, where the effect of inter-site correlations is crucial.
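For orientation, the Hubbard model that serves as the benchmark here can be solved exactly on very small clusters. The sketch below is an illustrative toy (the fermionic sign conventions are absorbed into the matrix): it diagonalizes the two-site Hubbard model at half filling in the Sz = 0 sector, where the ground-state energy is known in closed form, E0 = (U - sqrt(U^2 + 16 t^2)) / 2.

```python
import numpy as np

def two_site_hubbard_ground_energy(t, U):
    """Exact ground-state energy of the two-site Hubbard model at half
    filling, in the four-dimensional Sz = 0 sector spanned by
    {|updown, 0>, |0, updown>, |up, down>, |down, up>}."""
    H = np.array([
        [U,   0.0, -t,  -t ],   # both electrons on site 1 (costs U)
        [0.0, U,   -t,  -t ],   # both electrons on site 2 (costs U)
        [-t,  -t,  0.0, 0.0],   # one electron per site
        [-t,  -t,  0.0, 0.0],
    ])
    return np.linalg.eigvalsh(H)[0]   # lowest eigenvalue

# Closed form check: E0 = (U - sqrt(U**2 + 16 * t**2)) / 2
e0 = two_site_hubbard_ground_energy(t=1.0, U=4.0)
```

Exact results like this on small clusters are one way benchmarks for the approximate methods are anchored; the challenge addressed by the collaboration is reaching comparable reliability in the thermodynamic limit.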
On the algorithmic and computational side, this project will require improving DiagMC codes and building a flexible library framework for combining diagrammatic techniques with existing cluster embedding codes.
Monte Carlo Methods:
Director: E. Gull (University of Michigan)
Participants: N. Prokof’ev (U. Mass Amherst), B. Svistunov (U. Mass Amherst)
The goals are to develop the methods needed to extend the dynamical mean-field methodology to the treatment of realistic orbital and interaction structures and to extend the diagrammatic Monte Carlo methods to the treatment of strong interactions.
The initial research activity will focus on obtaining a comprehensive solution of the Hubbard model: in particular, comparing results obtained by dynamical mean-field, diagrammatic Monte Carlo and tensor network methods (Gull, Prokof’ev, Svistunov; interaction with White, Vidal, Chan); extending the diagrammatic Monte Carlo methods to the uniform electron gas (Prokof’ev, Svistunov, Kozik); and developing the ‘vertex function’ methods needed to extend the dynamical mean-field method to longer-ranged interactions and to the calculation of physically important response functions (Gull, interaction with real materials thrust).
Important computational needs include developing the infrastructure of tools and libraries needed for wider application of the diagrammatic Monte Carlo methods, as well as improvement and systematization of the vertex function calculations needed for extensions of the dynamical mean-field methods.
Real Materials:
Director: M. van Schilfgaarde (King’s College London)
Participants: G. Kotliar (Rutgers), K. Haule (Rutgers), E. Kozik (King’s College London)
The goal is to integrate modern many body techniques into chemically realistic first-principles electronic structure formalisms.
The initial research activity will focus on two directions for improving the ‘GW’ approach to electronic structure. The GW approach descends from a formally exact treatment of the quantum many body problem in the presence of nuclear potentials, but all existing implementations use simple and uncontrolled low-order diagrammatic approximations. One direction will be to integrate the new diagrammatic Monte Carlo methods with GW (van Schilfgaarde, Kozik, Troyer; interaction with the Monte Carlo thrust). The initial target will be the application of the methodologies to the calculation of energetics and excitation spectra of simple elemental solids such as silicon. A second direction will be to combine the GW method with modern embedding (dynamical mean-field) methods to obtain a realistic description of strongly correlated compounds (van Schilfgaarde, Haule, Kotliar; interaction with the Monte Carlo and wave function thrusts).
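For reference, the exact framework alluded to here is the set of Hedin equations; in a common schematic notation (where numerals abbreviate combined space-time-spin arguments and repeated arguments are integrated over), the GW approximation corresponds to dropping the vertex corrections:

```latex
% Hedin's coupled equations (schematic)
\Sigma(1,2) = i\, G(1,3)\, W(1^{+},4)\, \Gamma(3,2;4)   % exact self-energy
W(1,2)      = v(1,2) + v(1,3)\, P(3,4)\, W(4,2)          % screened interaction
P(1,2)      = -i\, G(1,3)\, G(4,1)\, \Gamma(3,4;2)       % polarization
% The GW approximation sets \Gamma(1,2;3) = \delta(1,2)\,\delta(1,3), giving
\Sigma \approx i\, G W , \qquad P \approx -i\, G G .
```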
Important computational needs include optimizing the GW codes and improving the interface between many body and real materials calculations, for example, by devising more efficient methods for generating orbitals and interactions and developing useful and generally accepted standards for storing and using this information.
Tensor Network and Wave Function Methods:
Director: S. White (U. C. Irvine)
Participants: G. Chan (Princeton), G. Vidal (Perimeter Institute)
The goals are to develop tensor network methods to the point where accurate phase diagrams can be obtained for model systems in two dimensions, to devise and implement a systematic procedure for going from orbital-level descriptions of solids to simulations using tensor network methods, and to improve the computational scaling of tensor network methods.
The initial research activities will include: use of DMRG and tensor network methods to solve the two dimensional Hubbard model (White, Chan; important interactions with Gull, Prokof’ev, Svistunov, Kozik); development of canonical transformation methods to define a chemically realistic few-orbital model of high-Tc copper-oxide superconductors (interactions with Haule, Kotliar, van Schilfgaarde) and the solution of the resulting model via tensor networks (Chan, White). The group will also develop tensor network/renormalization group approaches to improve the scaling of the calculations with system size (Vidal, White); devise and implement tensor networks for highly entangled states (Vidal); and investigate stochastic (Monte Carlo) approaches for estimating properties of tensor networks (Chan, Vidal; interactions with Gull).
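As a deliberately small illustration of the matrix product state idea underlying these methods (the function names below are ours, not from any collaboration code), a state vector can be split into an MPS by repeated singular value decompositions, with the bond dimension controlling how much entanglement is retained:

```python
import numpy as np

def to_mps(psi, n_sites, chi_max=None):
    """Decompose a state vector of n_sites spin-1/2 sites into an MPS
    (a list of rank-3 tensors) by sweeping left to right with SVDs.
    chi_max, if given, truncates each bond to at most chi_max singular values."""
    tensors = []
    rest = psi.reshape(1, -1)                 # (left bond, remaining physical dims)
    for _ in range(n_sites - 1):
        chi, dim = rest.shape
        rest = rest.reshape(chi * 2, dim // 2)  # split off one physical index
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        if chi_max is not None:               # discard the smallest singular values
            u, s, vh = u[:, :chi_max], s[:chi_max], vh[:chi_max]
        tensors.append(u.reshape(chi, 2, -1))
        rest = s[:, None] * vh                # carry the rest of the state along
    tensors.append(rest.reshape(-1, 2, 1))
    return tensors

def from_mps(tensors):
    """Contract an MPS back into a full state vector."""
    psi = tensors[0]
    for t in tensors[1:]:
        psi = np.tensordot(psi, t, axes=([-1], [0]))
    return psi.reshape(-1)

rng = np.random.default_rng(0)
psi = rng.normal(size=2 ** 6)                 # random 6-site state, dimension 64
psi /= np.linalg.norm(psi)
mps = to_mps(psi, n_sites=6)                  # untruncated: exact decomposition
```

Without truncation the decomposition is exact; setting `chi_max` gives the controlled compression that makes these methods efficient for states of limited entanglement.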
The group will also develop and make available the tensor libraries needed to write efficient codes.
The science to be pursued in this collaboration is enabled in part by new algorithms that reduce computation times by many orders of magnitude. Taking full advantage of the algorithmic revolution requires sophisticated software platforms, modern object-oriented libraries and higher-level programming tools, as well as access to state-of-the-art numerical mathematics. The collaboration, together with the Simons Center for Data Analysis, will support the development of the necessary computational infrastructure and the training of scientists in state-of-the-art computational methods.