The fifth Annual Meeting was the largest ever, with nearly 170 participants. It featured eight talks by leading scientists, covering a wide spectrum of topics in mathematics, physics and theoretical computer science. Attendees had lively discussions between the talks and during the conference dinner.
The event started with a talk by Eve Ostriker on the interstellar medium (ISM). The talk described feedback interactions between the ISM and stars. The ISM comes in several forms, including atomic gas, very cold molecular gas, and warm, ionized gas produced by new stars. Dust grains make up about 1% of the ISM mass. The ISM is, in general, far more diffuse than any laboratory vacuum. The ISM collapses under gravitational forces to form stars; roughly two to three new stars form in our galaxy every year. One might expect the collapse to proceed more quickly; however, stars drive away and disperse the material around them by emitting radiation that ionizes the gas. This forms a feedback loop between the ISM and stars that self-regulates the star-formation process. A combination of theoretical modeling and innovations in the numerical solution of PDEs has, in recent years, allowed a much more detailed and accurate understanding of these interactions, which shape the evolution of galaxies.
The next talk, by Ashvin Vishwanath, discussed new forms of matter that display spatially extended quantum entanglement. Usually, quantum effects are localized and apparent only at very small scales, and in many cases quantum states are also highly unstable. In classical physics, phases such as liquid and gas can be connected smoothly in the high-temperature, high-pressure regime. Phases can be shown to be truly distinct if they admit discrete invariants that cannot be perturbed. A classic example in mathematics is the Euler characteristic, which distinguishes surfaces with different numbers of holes: a sphere, which has no holes, differs from a bagel, which has one hole, which in turn differs from a pretzel. The quantum Hall effect provides a physical example, where a certain physically measurable quantity is described essentially by the number of holes in a surface or, more generally, by discrete properties of some related phase space. There are quantum systems with fractional charge that behave differently from either bosons or fermions. The talk described how long-range quantum entanglement distinguishes different systems, in particular the fractional from the integer quantum Hall effect. The relation to novel dualities between quantum field theories was also described.
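The invariants mentioned above can be written down explicitly. As an illustration (the formulas below are standard facts, not taken from the talk), the Euler characteristic of a closed orientable surface with g holes, and the quantized Hall conductance of the integer quantum Hall effect, take the form:

```latex
% Euler characteristic of a closed orientable surface of genus g
% (sphere: g = 0, bagel: g = 1, pretzel: g = 2):
\chi(S_g) = 2 - 2g

% Integer quantum Hall effect: the Hall conductance is quantized,
% with \nu an integer-valued topological invariant (a Chern number):
\sigma_{xy} = \nu \, \frac{e^2}{h}, \qquad \nu \in \mathbb{Z}
```

Because these quantities take discrete values, they cannot change under small, continuous perturbations of the system, which is what makes the corresponding phases truly distinct.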
Oskar Hallatschek described how randomness can play an important role in evolutionary processes. In particular, having better genes is not enough; one also needs a little (or a lot of) luck to succeed. The talk discussed the effects of randomness both at the microscopic level of processes within individual cells and at the global epidemiological scale of disease spread. Sophisticated stochastic modeling allows a better understanding of such phenomena at all of these levels.
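A minimal sketch of this "luck matters" point (an illustrative textbook model, not code from the talk): in a Galton–Watson branching process where each individual leaves two offspring with probability p = 0.55 and none otherwise, the lineage has a clear fitness advantage (mean offspring 1.1 > 1), yet most lineages still die out purely by chance in the early generations.

```python
import random

def lineage_survives(p_repro, generations, rng, cap=200):
    """Simulate one Galton-Watson branching process.

    Each individual independently leaves 2 offspring with probability
    p_repro and 0 otherwise, so the mean offspring number is 2*p_repro.
    The population is capped at `cap` individuals per generation to keep
    the demo fast; this does not affect early chance extinctions.
    """
    n = 1
    for _ in range(generations):
        if n == 0:
            return False  # the lineage has died out
        n = sum(2 for _ in range(min(n, cap)) if rng.random() < p_repro)
    return n > 0

# Mean offspring 1.1 > 1: a genuine fitness advantage, yet...
rng = random.Random(0)
trials = 500
extinct = sum(not lineage_survives(0.55, 20, rng) for _ in range(trials))
print(extinct / trials)  # a large majority of lineages die out anyway
```

The theoretical ultimate extinction probability here is about 0.82, so roughly four out of five advantaged lineages fail, which is the sense in which better genes alone are not enough.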
Bjorn Poonen discussed the classical problem of the solvability of multivariable polynomial equations. The great mathematician David Hilbert asked whether there exists a general procedure (a computer program) that, given a multivariable polynomial with integer coefficients, determines whether some assignment of integer values to the variables makes the polynomial equal to 0 (or to a given target value). After much work spanning several decades, this problem was given a negative answer in 1970. In fact, a precise description was obtained of the possible sets of values that a polynomial can attain: it is the family of so-called ‘listable’ (or recursively enumerable) sets, and it was already known that membership in some of these sets cannot be determined by any computer program. Poonen discussed this result and its generalizations to number systems more general than the integers, such as the rationals, where such questions are open or only partially resolved.
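The ‘listable’ property has a simple computational meaning: one can systematically list every value the polynomial attains by scanning integer points in growing boxes, even though no program can decide in general whether a given target value ever appears. A minimal sketch (the Pell-equation example below is illustrative, not from the talk):

```python
from itertools import product

def values_up_to(p, nvars, bound):
    """Values the polynomial p attains on integer points with |x_i| <= bound.

    Letting `bound` grow without limit enumerates every value p ever
    takes; this is exactly why the value set is 'listable' (recursively
    enumerable). Deciding whether a *specific* target, such as 0, ever
    appears is what Hilbert asked for and what is impossible in general.
    """
    points = product(range(-bound, bound + 1), repeat=nvars)
    return {p(*xs) for xs in points}

# Illustrative example: does x^2 - 2y^2 = 1 have an integer solution?
# Enumeration finds one (x = 3, y = 2: 9 - 8 = 1) within a small box.
pell = lambda x, y: x * x - 2 * y * y
print(1 in values_up_to(pell, 2, 5))  # True
```

The asymmetry is the key point: if a solution exists, this search will eventually find it, but if none exists, the search runs forever, and the negative answer to Hilbert's question says no cleverer program can always detect that.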
Christopher G. Tully gave a talk on PTOLEMY (the Princeton Tritium Observatory for Light, Early-universe, Massive-neutrino Yield), the instrument built for direct detection of the present-day relic neutrinos predicted by the big bang theory. Light neutrinos thermally decoupled from other forms of matter approximately one second after the big bang and have since cooled to a temperature of about 1.9 K. A method for direct detection of the relic neutrinos is based on the process of neutrino capture on tritium, with two particles in the final state: an electron and a helium-3 nucleus. PTOLEMY is developing several crucial technologies, including large-scale nano-fabrication of a graphene tritium cell and massively multiplexed SQUID readout of ultra-high-precision micro-calorimetry. It utilizes the tritium-handling capabilities of the Princeton Plasma Physics Laboratory.
Leonardo Rastelli described the bootstrap method for analyzing quantum field theories with a rich set of symmetries. Such quantum field theories can be described by algebras of operators. It turns out that in some cases the underlying symmetries of the theory, together with the requirement of self-consistency (a calculation that can be done in two ways must yield the same result), are sufficient to pinpoint the structure of the algebra. They produce an infinite set of linear inequalities that provide tighter and tighter control over the structure of the field theory. This method yields the most accurate determination of the critical exponents of the three-dimensional Ising model, a very well-studied model in statistical mechanics.
Piotr Indyk described a relation between the exact degree of difficulty of computationally intractable problems and the exact computational efficiency of tractable ones. Some problems in computer science have the property that, given a proposed solution, we can efficiently verify that it is indeed a solution; this does not mean that we can efficiently find a solution in the first place. Among this class of problems are the ‘NP-complete’ problems: if we could efficiently find their solutions, we could efficiently find solutions to all other problems in the class. The widely accepted ‘P ≠ NP’ conjecture states that it is hard to find solutions to these problems. There is a finer conjecture, the Strong Exponential Time Hypothesis, stating that, in fact, one cannot do substantially better than a brute-force search for solutions. The talk showed that if this conjecture is true, then existing algorithms for certain other, tractable problems are essentially optimal, a surprising connection. Several examples of solvable problems where this connection applies were presented and explained.
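For concreteness, here is a sketch (an illustration, not code from the talk) of what "brute-force search" means for the canonical NP-complete problem, Boolean satisfiability: try all 2^n truth assignments. The finer conjecture asserts, roughly, that no algorithm can improve substantially on this exponential behavior.

```python
from itertools import product

def brute_force_sat(n_vars, clauses):
    """Decide CNF satisfiability by trying all 2^n truth assignments.

    A clause is a list of literals: +i means variable i is true,
    -i means variable i is false (variables numbered from 1). The
    formula is satisfied if every clause contains a true literal.
    """
    for bits in product([False, True], repeat=n_vars):
        def holds(lit):
            return bits[abs(lit) - 1] == (lit > 0)
        if all(any(holds(lit) for lit in clause) for clause in clauses):
            return True  # found a satisfying assignment
    return False  # all 2^n assignments failed

# (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
print(brute_force_sat(3, [[1, 2], [-1, 3], [-2, -3]]))  # True
```

Verifying a proposed assignment takes only one pass over the clauses, while finding one this way takes up to 2^n tries; that gap between easy verification and (conjecturally) hard search is exactly the phenomenon the talk built on.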
In a literally beautiful talk that ended the meeting, Ingrid Daubechies demonstrated how sophisticated data-analysis methods can be used to analyze and reconstruct classical artwork. In one example, a missing panel was conjecturally reconstructed using an analysis of the style, poses, and colors of the other panels, together with information about its subject matter. In another example, colors were restored using image-analysis techniques.