This seminar and working session is held every Tuesday from 11 a.m. to 12 p.m. ET. It is associated with the project on Mathematics for Deep Learning. The goal of each session is to present a topic and open research questions in this area, serving as a backbone for the formation of spontaneous working groups around some of these questions.
Each session consists of either a 40-minute presentation followed by a 20-minute discussion, or two 20-minute presentations each followed by a 10-minute discussion. Slides and a short bibliography of a few papers are provided for each session.
Spring 2021 Schedule
| Date | Speaker | Title | Slides |
| --- | --- | --- | --- |
| January 19, 2021 | Francis Bach | Towards a quantitative analysis of gradient descent for infinitely wide two-layer neural networks | |
| January 26, 2021 | Gerard Ben Arous | Some naive open problems about optimization in high dimensions: how much data is needed to tame topological complexity and entropy with simple algorithms? | |
| February 2, 2021 | Florent Krzakala | Teacher-student models: exactly solvable models for statistical learning | |
| February 9, 2021 | Lenka Zdeborova | Gradient-based algorithms in high-dimensions under limited sample complexity | |
| February 16, 2021 | Stephane Mallat | Harmonic Analysis View of Deep Network Concentration | Talk Slides |
| February 23, 2021 | Jianfeng Lu | Towards numerical analysis for solving high dimensional PDEs based on neural networks | |
| March 2, 2021 | Risi Kondor | The representation theory of equivariant neural networks and graph neural nets | |
| March 9, 2021 | Eric Darve | TBD | |
| March 16, 2021 | Guillermo Sapiro | Creating ML that is blindly fair while avoiding unnecessary harm | |
| March 30, 2021 | Grant Rotskoff | TBD | |
| April 6, 2021 | Sebastien Bubeck | A law of robustness for two-layers neural networks | |
| April 27, 2021 | Rachel Ward | TBD | |
Fall 2020 Schedule
| Date | Speaker | Title | Slides |
| --- | --- | --- | --- |
| November 10, 2020 | David Donoho (Stanford), Chair Stephane Mallat (CCM) | Prevalence of Neural Collapse during the terminal phase of deep learning training | |
| November 17, 2020 | Joan Bruna (Flatiron CCM, NYU), Chair TBA | On depth separation for neural networks | |
| November 24, 2020 | Stefanie Jegelka (MIT), Chair Joan Bruna | Learning in Graph Neural Networks | |
| December 1, 2020 | Eric Vanden-Eijnden (NYU) | | |
| December 8, 2020 | Eero Simoncelli (Flatiron CCN, NYU), Chair Stephane Mallat | Solving Linear Inverse Problems Using the Prior Implicit in a Denoiser | Talk Slides |
| December 15, 2020 | Lexing Ying (Stanford), Chair TBA | Solving Inverse Problems with Deep Learning | |