- Organized by
Patrick Hayden, Ph.D., Stanford University
Matthew Headrick, Ph.D., Brandeis University
The annual meeting assembled 16 principal investigators, 21 It from Qubit (IFQ) Fellows, and 15 guests for two days at the Simons Foundation headquarters in New York. The meeting featured seven hour-long talks by PIs, ranging from pedagogical overviews to workshop-style talks on ongoing research and covering in a broad sense the range of topics studied by the Collaboration (though not by a long shot all of the progress made on them over the past year). Three of the talks were given by our new PIs — Daniel Harlow, Alexei Kitaev and Brian Swingle — as a way of introducing them to the rest of the Collaboration. There were also two poster sessions; the first day’s session was reserved for our seven new IFQ Fellows so they could also introduce themselves, while the second day’s was open, with about a dozen participants presenting posters. Ample time for informal discussion was also built into the schedule.
Two of the talks were reviews aimed at the high-energy physics side of the Collaboration. John Preskill gave a preview of Quantum Computing in the NISQ Era and Beyond, where NISQ stands for Noisy Intermediate-Scale Quantum devices. The talk was based on suggestions from members of the Collaboration, many of whom are curious about the rapidly evolving technology and some of whom are actively thinking about ways to engage with the experiments. The short story is that devices of about 50 qubits capable of supporting low-noise computations now exist. Those numbers will improve quite rapidly, into the hundreds of qubits, but new ideas will be required to scale up significantly beyond that. Scott Aaronson delivered An Armory of Assumptions, a pedagogical overview of aspects of quantum computational complexity that could be relevant to issues in quantum gravity, such as the black-hole information problem and the Complexity = Action conjecture. His emphasis was on definitions and models more than on theorems, giving the audience a sense of how to formulate their questions and what they can reasonably expect to be provable. The lively back-and-forth with the audience suggested that the talk served its purpose.
The remaining talks described recent or ongoing research. Patrick Hayden spoke on Approximate Quantum Error Correction Revisited. Thanks in large part to the efforts of IFQ members over the past several years, the theory of quantum error correction has been firmly established as a key feature of the AdS/CFT correspondence. Mapping observables from the quantum gravitational bulk theory to the boundary conformal field theory is mathematically a problem of decoding quantum error-correcting codes. This insight has led to new formulas for the mapping in cases where it was previously not understood how to accomplish it, and has resolved conceptual puzzles such as how entropy, a nonlinear function of the state, can be identified with area, an observable. Hayden’s talk was devoted to developing a new theory of approximate quantum error correction appropriate for mapping observables behind a black hole horizon to the boundary. That theory is full of surprises from the quantum information point of view, including a way to squeeze two qubits into one asymptotically.
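A concrete toy model of the “bulk reconstruction as decoding” idea is the well-known three-qutrit erasure code of Almheiri, Dong and Harlow (not discussed in the talk itself; the sketch below is a minimal illustration). One logical qutrit is encoded into three physical qutrits so that any single qutrit can be erased and the logical information recovered from the other two, just as a bulk operator can be reconstructed from a sufficiently large boundary region:

```python
import numpy as np

# Three-qutrit erasure code: one logical qutrit in three physical qutrits.
# Logical basis |i~> = (1/sqrt(3)) sum_j |j, j+i, j+2i>  (arithmetic mod 3).
d = 3

def ket(a, b, c):
    """Computational basis state |abc> of three qutrits."""
    v = np.zeros(d**3, dtype=complex)
    v[(a * d + b) * d + c] = 1.0
    return v

codewords = [
    sum(ket(j, (j + i) % d, (j + 2 * i) % d) for j in range(d)) / np.sqrt(d)
    for i in range(d)
]

# Encode a random logical state.
rng = np.random.default_rng(0)
c = rng.normal(size=d) + 1j * rng.normal(size=d)
c /= np.linalg.norm(c)
psi = sum(ci * w for ci, w in zip(c, codewords))

# Reduced density matrix of each single qutrit, via partial trace.
rho = np.outer(psi, psi.conj()).reshape(d, d, d, d, d, d)
m0 = np.einsum('abcxbc->ax', rho)  # keep qutrit 1, trace out 2 and 3
m1 = np.einsum('abcayc->by', rho)  # keep qutrit 2
m2 = np.einsum('abcabz->cz', rho)  # keep qutrit 3

# Every single-qutrit marginal is maximally mixed: no one qutrit carries
# any logical information, so erasing any one of them is correctable.
for m in (m0, m1, m2):
    assert np.allclose(m, np.eye(d) / d)
```

The maximally mixed marginals are exactly the statement that the code corrects the erasure of any one “boundary region,” the simplest analogue of reconstructing a bulk observable from part of the boundary.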
Brian Swingle gave a beautiful talk on his joint work with IFQ Fellow Isaac Kim about “Tensor Networks and Early Quantum Devices.” Swingle’s work combines quantum gravity, condensed-matter physics, and quantum information; he was the first to propose that tensor-network representations of quantum states might be related to emergent geometry. One of the highlights of his talk was a new hybrid algorithm for finding ground states of many-body systems, in which a small quantum computer of the type likely to be built in the next few years would be used to evaluate the objective function in a classical optimization loop. Remarkably, the algorithm is provably noise-resilient and so should perform well even though near-term quantum computers will not have enough qubits to support quantum error correction. The second half of his talk was about new tensor-network algorithms for simulating the dynamics of many-body quantum systems; for specific problems, his methods can already handle twice as many spins as the previous state of the art, revealing surprising behavior in diagnostics of scrambling such as out-of-time-order correlators.
Alexei Kitaev’s talk, “Efficient Decoding for the Hayden-Preskill Protocol,” was devoted to a very surprising result found with former IFQ Fellow Beni Yoshida. He presented a scenario in which, under plausible assumptions, information hidden in Hawking radiation could be decoded by a polynomial-time quantum algorithm. Nearly everyone’s expectation was that the decoding task would instead take exponential time. The information-loss problem in black hole evaporation was the first significant point of contact between the quantum information and high energy/quantum gravity communities. Questions about the complexity of decoding ultimately helped inspire the Complexity = Action conjecture that many members are currently working on. The conjecture further motivates a better understanding of the complexity of quantum states and processes that appear in quantum theories of gravity, especially when the answer is as unexpected as in this case.
Two of the talks concerned black holes in the 1+1 dimensional Jackiw-Teitelboim (JT) theory of gravity. This simple theory describes the low-energy dynamics of the gravitational dual of the Sachdev-Ye-Kitaev (SYK) model, and so has been the focus of renewed attention over the past two years. (The pair of theories provide a ‘Goldilocks’ gauge-gravity correspondence in which both sides of the correspondence can be solved, allowing for very detailed comparisons.) In his talk, “Lorentzian vs. Euclidean Quantum Gravity in 1+1 Dimensions,” Daniel Harlow described an approach to quantizing this theory directly. Interestingly, however, the partition function obtained by that Lorentzian method fails to match the one obtained by the standard Euclidean method, which reproduces the black hole entropy. Finally, Juan Maldacena explained “How to Make a Wormhole.” The eternal two-sided black hole in anti-de Sitter space and its CFT dual, the thermofield double state, are almost the Drosophila of quantum black holes: a simple model system that seems to provide an inexhaustible supply of remarkable insights. This spacetime contains a wormhole connecting the two boundaries. The wormhole is not traversable, but recent work by Gao, Jafferis and Wall showed how it can be made traversable by coupling the two copies of the CFT. Using a similar method but working in the simpler JT theory, dual to two copies of the SYK model, Maldacena explained how to make a traversable wormhole with a global time-translation symmetry. Turning off the coupling then leads the system to evolve into an eternal black hole. Both Harlow’s and Maldacena’s talks illustrate the ongoing effort to use the SYK and JT theories to distill the mysteries of quantum gravity in the simplest possible setting.
Thursday, December 7
8:00 AM Check-in & breakfast
9:00 AM Patrick Hayden | Approximate Quantum Error Correction Revisited
10:00 AM Posters | New IFQ Fellows
11:30 AM Brian Swingle | Tensor Networks and Early Quantum Devices
12:30 PM Lunch
2:00 PM Daniel Harlow | Lorentzian vs. Euclidean Quantum Gravity in 1+1 Dimensions
3:00 PM Break
3:30 PM John Preskill | Quantum Computing in the NISQ Era and Beyond
4:30 PM Day one concludes
Friday, December 8
8:00 AM Check-in & breakfast
9:00 AM Alexei Kitaev | Efficient Decoding for the Hayden-Preskill Protocol
10:00 AM Posters | General Session
11:30 AM Scott Aaronson | An Armory of Assumptions
12:30 PM Lunch
1:30 PM Juan Maldacena | How to Make a Wormhole
2:30 PM Meeting concludes
Approximate Quantum Error Correction Revisited
Patrick Hayden, Stanford University
The theory of quantum error correction has been firmly established as a key feature of the AdS/CFT correspondence. Mapping observables from the quantum gravitational bulk theory to the boundary conformal field theory is mathematically a problem of decoding quantum error-correcting codes. This insight has led to new formulas for the mapping in cases where it was previously not understood how to accomplish it, and has resolved conceptual puzzles such as how entropy, a nonlinear function of the state, can be identified with area, an observable. Hayden will describe a new theory of approximate quantum error correction appropriate for mapping observables behind a black hole horizon to the boundary. That theory is full of new surprises from the quantum information point of view, including a way to squeeze two qubits into one asymptotically.
Tensor Networks and Early Quantum Devices
Brian Swingle, University of Maryland
It is interesting to ask how “it from qubit” differs from “it from bit”, for example, from the point of view of computational hardness. Swingle will present two tensor network results, one in equilibrium and one out of equilibrium, that bear on the question of classical versus quantum computational hardness in physics. He will then discuss a result with Isaac Kim showing that certain renormalization group inspired tensor networks can be efficiently contracted on a small quantum device even in the presence of noise. Finally, Swingle will discuss work with Shenglong Xu showing how quantum scrambling physics can be computed using classical tensor network methods.
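As background for the scrambling diagnostics mentioned above, the following is a minimal brute-force sketch of an out-of-time-order correlator, F(t) = ⟨W(t) V W(t) V⟩ at infinite temperature, computed by exact diagonalization for a small chaotic spin chain. This is for illustration only: the Hamiltonian and operator choices are assumptions of this sketch, and the exact method here scales exponentially, unlike the tensor-network techniques of the talk.

```python
import numpy as np
from functools import reduce

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_on(site, op, n):
    """Single-site operator `op` acting on `site` of an n-qubit chain."""
    mats = [I2] * n
    mats[site] = op
    return reduce(np.kron, mats)

# Mixed-field Ising chain (a standard chaotic model):
# H = sum_i Z_i Z_{i+1} + g X_i + h Z_i, with illustrative g, h.
n, g, h = 6, 1.05, 0.5
H = sum(op_on(i, Z, n) @ op_on(i + 1, Z, n) for i in range(n - 1))
H += sum(g * op_on(i, X, n) + h * op_on(i, Z, n) for i in range(n))

evals, evecs = np.linalg.eigh(H)

def otoc(t, W, V):
    """Infinite-temperature OTOC Tr[W(t) V W(t) V] / 2^n."""
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    Wt = U.conj().T @ W @ U  # Heisenberg-picture operator
    return np.trace(Wt @ V @ Wt @ V).real / 2**n

W = op_on(0, Z, n)      # operator at one end of the chain
V = op_on(n - 1, Z, n)  # operator at the other end

print(otoc(0.0, W, V))  # 1.0: W and V commute at t = 0
```

The decay of F(t) away from 1 once the “butterfly effect” propagates the support of W(t) across the chain is the signature of scrambling that the tensor-network methods track for much larger systems.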
Lorentzian vs. Euclidean Quantum Gravity in 1+1 Dimensions
Daniel Harlow, Massachusetts Institute of Technology
Daniel Harlow will describe the canonical quantization of Jackiw-Teitelboim gravity in 1+1 dimensions, taking care to extract all physical degrees of freedom. He will then use the resulting quantum gravity theory to re-interpret some existing Euclidean results in a novel way. There will be nary a Schwarzian in sight.
Quantum Computing in the NISQ Era and Beyond
John Preskill, California Institute of Technology
Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today’s classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications, but the 100-qubit quantum computer will not change the world right away — we should regard it as a significant step toward the more powerful quantum technologies of the future. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing.
Efficient Decoding for the Hayden-Preskill Protocol
Alexei Kitaev, California Institute of Technology
We present two procedures for reconstructing a quantum state from the Hawking radiation in the Hayden-Preskill thought experiment. We work in an idealized setting and represent the black hole and its entangled partner by n EPR pairs. The first procedure teleports the state thrown into the black hole to an outside observer by post-selecting on the condition that a sufficient number of EPR pairs remain undisturbed. The probability of this favorable event scales as 1/dA², where dA is the Hilbert space dimension for the input state. The second procedure is deterministic and combines the previous idea with Grover’s search. The decoding complexity is O(dA C) where C is the size of the quantum circuit implementing the unitary evolution operator U of the black hole. (Based on joint work with Beni Yoshida.)
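The two scalings quoted above fit together via standard amplitude amplification (this accounting is a gloss on the abstract, not part of it): boosting a success probability $p$ to order one requires $O(1/\sqrt{p})$ Grover iterations, each of which involves running the black-hole circuit of size $C$:

```latex
p_{\mathrm{succ}} \sim \frac{1}{d_A^{2}}
\;\Longrightarrow\;
O\!\left(\frac{1}{\sqrt{p_{\mathrm{succ}}}}\right) = O(d_A)\ \text{iterations},
\qquad
\text{total cost} = O(d_A\, C).
```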
An Armory of Assumptions
Scott Aaronson, University of Texas, Austin
Scott Aaronson will provide an overview of assumptions about computational hardness and what kinds of assumptions are necessary and sufficient to derive various conclusions of interest for quantum gravity.
Aaronson will focus on the relationships among the following:
1. “Traditional” hardness assumptions (like PSPACE not in PP/poly, or the existence of one-way functions)
2. Assumptions about the hardness of preparing certain states (like the end result of applying a CFT Hamiltonian for exponential time)
3. Assumptions about the hardness of applying certain unitary transformations (like that needed to decode the Hawking radiation in the firewall problem)
Aaronson will also discuss the “Unitary Synthesis Problem,” a problem posed by Greg Kuperberg and Aaronson a decade ago that asks whether there is any hope of justifying assumptions of type (3) without needing to justify (1) or (2). He will also discuss connections to other topics, including the security of quantum money schemes. The goal of this talk is to help It from Qubit participants who might need to deploy complexity assumptions in their own work.
How to Make a Wormhole
Juan Maldacena, Institute for Advanced Study
Juan Maldacena will review aspects of wormholes, thermofield double states and traversable wormholes in nearly AdS₂ spacetimes. He will then discuss a way to produce a state close to the thermofield double state in the SYK model.