Seminar and Problem Sessions on Mathematics of Deep Learning

This seminar and working session is held every Tuesday from 11 a.m. to 12 p.m. ET. It is associated with the project on Mathematics for Deep Learning. The goal of each session is to present a topic and open research questions in this area, and to serve as a backbone for the formation of spontaneous working groups around some of these questions.

Each session consists of either one 40-minute presentation followed by a 20-minute discussion, or two 20-minute presentations each followed by a 10-minute discussion. Slides and a short bibliography of related papers are provided for each talk.

Spring 2021 Schedule

January 19, 2021: Francis Bach, "Towards a quantitative analysis of gradient descent for infinitely wide two-layer neural networks"
January 26, 2021: Gérard Ben Arous, "Some naive open problems about optimization in high dimensions: how much data is needed to tame topological complexity and entropy with simple algorithms?"
February 2, 2021: Florent Krzakala, "Teacher-student models: exactly solvable models for statistical learning"
February 9, 2021: Lenka Zdeborová, "Gradient-based algorithms in high dimensions under limited sample complexity"
February 16, 2021: Stéphane Mallat, "Harmonic Analysis View of Deep Network Concentration" (Talk Slides)
February 23, 2021: Jianfeng Lu, "Towards numerical analysis for solving high dimensional PDEs based on neural networks"
March 2, 2021: Risi Kondor, "The representation theory of equivariant neural networks and graph neural nets"
March 9, 2021: Eric Darve, TBD
March 16, 2021: Guillermo Sapiro, "Creating ML that is blindly fair while avoiding unnecessary harm"
March 30, 2021: Grant Rotskoff, TBD
April 6, 2021: Sébastien Bubeck, "A law of robustness for two-layers neural networks"
April 27, 2021: Rachel Ward, TBD

Fall 2020 Schedule

November 10, 2020: David Donoho (Stanford), chaired by Stéphane Mallat (CCM), "Prevalence of Neural Collapse during the terminal phase of deep learning training"
November 17, 2020: Joan Bruna (Flatiron CCM, NYU), chair TBA, "On depth separation for neural networks"
November 24, 2020: Stefanie Jegelka (MIT), chaired by Joan Bruna, "Learning in Graph Neural Networks"
December 1, 2020: Eric Vanden-Eijnden (NYU)
December 8, 2020: Eero Simoncelli (Flatiron CCN, NYU), chaired by Stéphane Mallat, "Solving Linear Inverse Problems Using the Prior Implicit in a Denoiser" (Talk Slides)
December 15, 2020: Lexing Ying (Stanford), chair TBA, "Solving Inverse Problems with Deep Learning"