Learning Fast Requires Good Memory: Time-Space Tradeoff Lower Bounds for Learning

  • Speaker
  • Ran Raz, Ph.D., Professor of Theoretical Computer Science, Princeton University; Weizmann Institute of Science
About Mathematics and Physical Sciences

Mathematics and Physical Sciences lectures are open to the public and are held at the Gerald D. Fischbach Auditorium at the Simons Foundation headquarters in New York City. Tea is served prior to each lecture.

Can one prove unconditional lower bounds on the number of samples needed for learning under memory constraints? A recent line of work shows that for a large class of learning problems, any learning algorithm requires either a memory of super-linear size or a super-polynomial number of samples. For example, any algorithm for learning parities of size n from a stream of samples requires either a memory of size quadratic in n or an exponential number of samples. (In the parity-learning problem, each sample is a random n-bit vector together with its inner product modulo 2 with a fixed hidden n-bit vector, and the goal is to recover the hidden vector.)
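To see why quadratic memory makes parity learning easy, here is a minimal sketch (not from the lecture; function names and parameters are illustrative). With roughly n² bits of memory, an algorithm can store about n linearly independent equations and solve the resulting linear system over GF(2) with Gaussian elimination, recovering the hidden vector from only O(n) samples. The lower bound says that with substantially less memory, no strategy can do much better than exponentially many samples.

```python
import random

def parity_oracle(x):
    """Yield random samples (a, <a, x> mod 2) for a hidden parity vector x."""
    n = len(x)
    while True:
        a = [random.randrange(2) for _ in range(n)]
        yield a, sum(ai * xi for ai, xi in zip(a, x)) % 2

def learn_parity(samples, n):
    """Recover x using O(n) samples and ~n^2 bits of memory:
    maintain a reduced set of linearly independent equations over GF(2)."""
    rows = []    # stored equations (a, b), kept in reduced form
    pivots = {}  # pivot column -> index into rows
    for a, b in samples:
        a = list(a)
        # Reduce the new equation against the stored ones.
        for col, idx in pivots.items():
            if a[col]:
                ra, rb = rows[idx]
                a = [u ^ v for u, v in zip(a, ra)]
                b ^= rb
        p = next((i for i, v in enumerate(a) if v), None)
        if p is None:
            continue  # linearly dependent sample: discard, memory stays ~n^2
        # Eliminate the new pivot column from the stored equations.
        for i, (ra, rb) in enumerate(rows):
            if ra[p]:
                rows[i] = ([u ^ v for u, v in zip(ra, a)], rb ^ b)
        pivots[p] = len(rows)
        rows.append((a, b))
        if len(rows) == n:
            break
    # In fully reduced form, each stored equation reads x[p] = b.
    return [rows[pivots[p]][1] for p in range(n)]

random.seed(1)
secret = [random.randrange(2) for _ in range(16)]
assert learn_parity(parity_oracle(secret), 16) == secret
```

The memory-hungry step is storing the n×n system of equations; the lower bound in the abstract shows this cost is, in a precise sense, unavoidable for any learner using polynomially many samples.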

A main message of these works is that for some learning problems, access to a relatively large memory is crucial. Ran Raz will describe some of these works and discuss their relations to computational complexity and applications in bounded-storage cryptography.

About the Speaker

Raz is a professor of theoretical computer science at Princeton University. He received his B.Sc. in mathematics and physics in 1987 and his Ph.D. in mathematics in 1992, both from the Hebrew University of Jerusalem. After two years as a postdoctoral researcher at Princeton University, he joined the Weizmann Institute of Science in Israel in 1994. He was a visiting professor at Microsoft Research (2006, 2009) and the Institute for Advanced Study (2012–2016). Raz’s main research area is computational complexity theory, with a focus on proving lower bounds for computational models.
