Machine Learning at the Flatiron Institute Seminar: Ben Wandelt

Title: Neural Computation in Bayesian Inference and Applications to Cosmological Data Science

Abstract: Many interesting Bayesian inference problems seem intractable or computationally challenging. Such problems include first-level inference (computing posterior pdfs) and second-level inference (computing the Bayesian evidence) for non-linear, hierarchical models with large numbers of parameters. I will give examples of practical solutions that operate by recasting computational statistics problems as optimization problems and using neural, differentiable function representations. In many cases, the key ingredient is the ability to forward-generate data from the model, e.g. through numerical simulations. When numerical simulations are expensive, we can train generative models instead. For cosmological applications, it is often possible to exploit the translational and rotational symmetries of the problem to learn neural generative and conditional models from a single (pair of) training example(s). As we solve these computational problems, new conceptual challenges come into view: how do we deal with imperfections in our physical models in an observational science such as cosmology? Can we combine ab initio with data-driven approaches? What will be the ultimate limit to cosmological knowledge?
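To illustrate the abstract's central ingredient, the ability to forward-generate data from the model, here is a minimal sketch of simulation-based inference via rejection ABC. The toy model (a Gaussian with unknown mean), the prior range, and all tolerances are illustrative assumptions, not details from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(mu, n=50):
    """Forward model (assumed for illustration): n noisy observations around mu."""
    return mu + rng.normal(size=n)

# "Observed" data generated at a known ground-truth mean.
mu_true = 1.5
observed = simulate(mu_true)
obs_summary = observed.mean()  # summary statistic: the sample mean

# Rejection ABC: draw parameters from the prior, forward-simulate,
# and keep the draws whose simulated summary lands near the observed one.
prior_draws = rng.uniform(-5.0, 5.0, size=20000)
accepted = [mu for mu in prior_draws
            if abs(simulate(mu).mean() - obs_summary) < 0.1]

# The accepted draws approximate samples from the posterior p(mu | data).
posterior_mean = float(np.mean(accepted))
```

In a cosmological setting the forward model would be an expensive numerical simulation, which is what motivates the neural surrogates and optimization-based reformulations described in the abstract.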
