Seminar and Problem Sessions on Mathematics of Deep Learning

This seminar and working session is held every Tuesday from 11 a.m. to 12 p.m. ET. It is associated with the project on Mathematics for Deep Learning. The goal of each session is to present a topic and open research questions in this area, and to serve as a backbone for the formation of spontaneous working groups around some of these questions.

Each session consists of either a 40-minute presentation followed by a 20-minute discussion, or two 20-minute presentations, each followed by a 10-minute discussion. Slides and a bibliography of a few papers are provided.

Fall 2021 Schedule

Date | Speaker | Title | Slides
September 14, 2021 | Emmanuel Candès | Reliable Predictions? Counterfactual Predictions? Equitable Treatment? Some Recent Progress in Predictive Inference |
September 21, 2021 | Alberto Bietti | On the Sample Complexity of Learning under Invariance and Geometric Stability |
October 19, 2021 | Michael Lindsey | Tools for multimodal sampling |
October 26, 2021 | René Vidal | Learning dynamics and implicit bias of gradient flow in overparameterized linear models |
November 2, 2021 | Jiequn Han | Perturbational Complexity by Distribution Mismatch: A Systematic Analysis of Reinforcement Learning in Reproducing Kernel Hilbert Space |
November 9, 2021 | Mikhail Belkin | The Polyak-Łojasiewicz condition as a framework for over-parameterized optimization and its application to deep learning |
November 23, 2021 | Andrea Montanari | Tractability, generalization and overparametrization in linear and nonlinear models |
Spring 2021 Schedule
Date | Speaker | Title | Slides
January 19, 2021 | Francis Bach | Towards a quantitative analysis of gradient descent for infinitely wide two-layer neural networks |
January 26, 2021 | Gérard Ben Arous | Some naive open problems about optimization in high dimensions: how much data is needed to tame topological complexity and entropy with simple algorithms? |
February 2, 2021 | Florent Krzakala | Teacher-student models: exactly solvable models for statistical learning |
February 9, 2021 | Lenka Zdeborová | Gradient-based algorithms in high-dimensions under limited sample complexity |
February 16, 2021 | Stéphane Mallat | Harmonic Analysis View of Deep Network Concentration | Talk Slides
February 23, 2021 | Jianfeng Lu | Towards numerical analysis for solving high dimensional PDEs based on neural networks |
March 2, 2021 | Risi Kondor | The representation theory of equivariant neural networks and graph neural nets |
March 9, 2021 | Eric Darve | Physics-informed machine learning: open mathematical questions |
March 16, 2021 | Guillermo Sapiro | Is Deep Learning getting better? |
March 23, 2021 | | Spring Summit |
March 30, 2021 | Grant Rotskoff | Sampling with neural networks: prospects and perils |
April 6, 2021 | Sébastien Bubeck | A law of robustness for two-layers neural networks |
April 20, 2021 | Rong Ge | A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Network |
April 27, 2021 | Rachel Ward | Generalization bounds for sparse random feature expansions |
May 4, 2021 | Julien Mairal | Trainable Algorithms for Inverse Imaging Problems | Talk Slides
May 11, 2021 | Ohad Shamir | Elephant in the Room: Non-Smooth Non-Convex Optimization |
May 18, 2021 | Bin Yu | Disentangled interpretations for deep learning with ACD |
May 25, 2021 | Tom Goldstein | Breaking Machine Learning Systems at the Industrial Scale |
June 1, 2021 | Léon Bottou | Learning Representations Using Causal Invariance |
June 8, 2021 | Sanjeev Arora | Trajectory, Trajectory, Trajectory | Talk Slides
June 15, 2021 | Yi Ma | Deep (Convolution) Networks from First Principles | Talk Slides