Mathematics of Deep Learning Seminar: Joan Bruna Estrach

Speaker: Joan Bruna Estrach
Courant Institute of Mathematical Sciences, New York University

On depth separation for neural networks

Abstract: Neural networks define functional spaces with appealing behavior in the high dimensional regime, experimentally validated across many applications. However, these spaces are poorly understood from the approximation and optimization point of view, especially as these networks become deeper.

In this talk, I will review existing results on the approximation and optimization properties of deep networks, highlighting the techniques used to establish so-called ‘depth separation’ results, which show exponential approximation lower bounds for the class of shallow networks together with polynomial upper bounds for deeper architectures.

I will also present recent extensions that highlight the role of input geometry, such as homogeneity and symmetry, and conclude with a collection of open questions that relate depth separation with scale separation, as well as optimization on deeper models.

