Seminar and Problem Sessions on the Mathematics of Deep Learning

This seminar and working session is held every Tuesday from 11 a.m. to 12 p.m. ET. It is associated with the project on Mathematics for Deep Learning. The goal of each session is to present a topic and open research questions in this area. The series will serve as a backbone for the formation of spontaneous working groups around some of these questions.

Each session is either a 40-minute presentation followed by a 20-minute discussion, or two 20-minute presentations each followed by a 10-minute discussion. The slides and a bibliography of a few papers are provided.


November 10, 2020 — David Donoho (Stanford), Chair: Stephane Mallat (CCM). "Prevalence of Neural Collapse during the terminal phase of deep learning training." To be added.

November 17, 2020 — Joan Bruna (Flatiron CCM, NYU), Chair: TBA. "On depth separation for neural networks."

November 24, 2020 — Stefanie Jegelka (MIT), Chair: Joan Bruna. "Learning in Graph Neural Networks."

December 1, 2020 — Eric Vanden-Eijnden (NYU), Chair: TBA.

December 8, 2020 — Lexing Ying (Stanford), Chair: TBA.

December 15, 2020 — Eero Simoncelli (Flatiron CCN, NYU), Chair: Stephane Mallat.