Geometric and Multiscale Methods for Statistical Learning

About Presidential Lectures

Presidential Lectures are free public colloquia centered on four main themes: Biology, Physics, Mathematics and Computer Science, and Neuroscience and Autism Science. These curated, high-level scientific talks feature leading scientists and mathematicians and are intended to foster discourse and drive discovery among the broader NYC-area research community. We invite those interested in the topic to join us for this weekly lecture series.
Large, high-dimensional datasets appear in a wide variety of applications. Extracting information from these datasets and performing machine-learning tasks on them can be challenging for both fundamental statistical reasons and because of computational barriers.

In this lecture, Mauro Maggioni will discuss a family of ideas, algorithms and results for learning from high-dimensional data. These methods rely on the idea that complex, high-dimensional data has geometric structures that, once discovered, assist in a variety of tasks, including statistical learning and data visualization. He will focus on multiscale decompositions that can be used to solve problems such as dictionary learning, classification and regression. These decompositions lead to novel probabilistic models for data, as well as new notions of learning and of approximation for high-dimensional stochastic systems.
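To make the central idea concrete, here is a minimal, hypothetical sketch (not Maggioni's actual algorithms) of how multiscale geometric analysis can reveal low-dimensional structure: data sampled near a 2-dimensional plane embedded in 50-dimensional space is examined with local PCA at several scales, and the singular-value energy recovers the intrinsic dimension. All names and parameters here are illustrative choices, using only NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample N points near a 2-D plane embedded in 50-D ambient space, plus small noise.
N, d, D = 2000, 2, 50
basis = np.linalg.qr(rng.standard_normal((D, d)))[0]       # orthonormal 2-D frame
X = rng.standard_normal((N, d)) @ basis.T + 0.01 * rng.standard_normal((N, D))

# Local PCA around one point at several scales (radii chosen as distance quantiles):
# count how many singular values are needed to capture 95% of the local variance.
center = X[0]
dists = np.linalg.norm(X - center, axis=1)
est_dim = {}
for q in (0.05, 0.2, 0.5):
    r = np.quantile(dists[1:], q)
    nbrs = X[dists <= r]
    s = np.linalg.svd(nbrs - nbrs.mean(axis=0), compute_uv=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(energy, 0.95)) + 1
    est_dim[q] = k
    print(f"scale quantile {q}: {len(nbrs)} points, estimated dimension {k}")
```

At every scale the estimate is 2, the true intrinsic dimension; on real data, comparing such estimates across scales is one way multiscale analysis separates genuine geometric structure from noise.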

If this lecture is videotaped, it will be posted here after production.

About the Speaker

Mauro Maggioni works at the intersection of harmonic analysis, approximation theory, probability, machine learning, spectral graph theory and statistical signal processing. He received his B.S. in mathematics from the Università degli Studi in Milan, Italy, and his Ph.D. in mathematics from Washington University in St. Louis. He was a Gibbs Assistant Professor of Mathematics at Yale University, and is now professor of mathematics, electrical and computer engineering, and computer science at Duke University. He received the Popov Prize in Approximation Theory in 2007, a National Science Foundation CAREER Award and Sloan Fellowship in 2008, was named a Fellow of the American Mathematical Society in 2013, and is a member of the American Mathematical Society and the Society for Industrial and Applied Mathematics.