Mathematics of Deep Learning Seminar: Grant Rotskoff
Title: Sampling with neural networks: prospects and perils
Abstract: In many applications in computational physics and chemistry, we seek to estimate expectation values of observables that yield mechanistic insight about reactions, transitions, and other “rare” events. These problems are often plagued by metastability: slow relaxation between metastable basins leads to slow convergence of estimators of such expectations. In this talk, I will focus on efforts to exploit developments in generative modeling to sample distributions that are challenging for local dynamics (e.g., MCMC or molecular dynamics) due to metastability. I will review some of the progress over the last several years in this area and then discuss a problem of particular relevance to biophysics: sampling when there is no large, pre-existing data set on which to train. By simultaneously sampling with traditional methods and training a sampler, we assess the prospects of neural-network-driven samplers for accelerating convergence and aiding exploration of high-dimensional distributions.
This is joint work with Marylou Gabrié and Eric Vanden-Eijnden.
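The abstract's central idea, alternating local MCMC moves with global proposals drawn from a model trained on the chain's own samples, can be illustrated with a minimal sketch. This is not the speakers' method: here a simple two-component Gaussian fit stands in for the neural generative model, the target is a toy one-dimensional double well, and all names (`fit_proposal`, `log_target`, the 50/50 move schedule) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy metastable target: mixture of Gaussians at +/-3 (log-density up to a constant).
def log_target(x):
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

# Stand-in for a trained generative model: fit a two-component Gaussian
# mixture to the samples collected so far (split crudely at zero).
def fit_proposal(samples):
    comps, counts = [], []
    for s in (samples[samples < 0], samples[samples >= 0]):
        if len(s) > 1:
            comps.append((s.mean(), max(s.std(), 0.5)))
            counts.append(len(s))
    w = np.array(counts, float)
    return comps, w / w.sum()

def proposal_logpdf(x, comps, w):
    terms = [np.log(wi) - 0.5 * ((x - m) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))
             for (m, sd), wi in zip(comps, w)]
    return np.logaddexp.reduce(terms)

def sample_proposal(comps, w):
    m, sd = comps[rng.choice(len(comps), p=w)]
    return rng.normal(m, sd)

x, chain = 3.0, [3.0]
for t in range(5000):
    if t < 500 or t % 2 == 0:
        # Local move: random-walk Metropolis, slow to cross between basins.
        y = x + rng.normal(0.0, 0.5)
        log_alpha = log_target(y) - log_target(x)
    else:
        # Global move: independence Metropolis-Hastings using the model
        # refit to the chain; acceptance corrects for the proposal density.
        comps, w = fit_proposal(np.array(chain))
        y = sample_proposal(comps, w)
        log_alpha = (log_target(y) - log_target(x)
                     + proposal_logpdf(x, comps, w) - proposal_logpdf(y, comps, w))
    if np.log(rng.uniform()) < log_alpha:
        x = y
    chain.append(x)

chain = np.array(chain)
```

The Metropolis–Hastings correction keeps the chain exact even while the proposal adapts, but the sketch also exposes the peril the abstract alludes to: a model fit only to visited samples cannot propose moves into a basin the chain has never reached, which is why the interplay between exploration and training matters.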