Machine Learning at the Flatiron Institute Seminar: Andrea Liu

Title: Overparameterization is everywhere

Abstract: Neural networks can learn complex tasks easily when they are overparameterized, with the number of parameters that describe interactions between neurons dwarfing the number of constraints imposed by the task. I will introduce a large class of physical many-body systems, which I call “adaptable matter,” with the same property of individually adjustable interactions. Such systems have an extensive number of “adaptive degrees of freedom,” or parameters that characterize the individual interactions. This large number of parameters enables systems to develop complex collective behavior. I argue that this is a powerful way to think about how biological function emerges as a collective phenomenon in many biological systems, and discuss non-living adaptable matter systems that can learn how to perform machine-learning tasks without using a processor.
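As a rough illustration of the abstract's central claim (my own sketch, not from the talk): in even a modest fully connected network, the number of adjustable parameters can dwarf the number of constraints imposed by the task, here taken crudely as one constraint per training example. The layer sizes and example count below are hypothetical, chosen only to make the counting concrete.

```python
# Hypothetical illustration of overparameterization: count the adjustable
# parameters (weights and biases) of a small fully connected network and
# compare against a crude estimate of task constraints (one per example).

def mlp_param_count(layer_sizes):
    """Total weights and biases in a fully connected network."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

layers = [784, 512, 512, 10]   # an MNIST-scale classifier (assumed sizes)
n_params = mlp_param_count(layers)

n_examples = 60_000            # assumed training-set size
print(f"parameters:  {n_params:,}")    # 669,706
print(f"constraints: {n_examples:,}")  # 60,000
```

With these (assumed) numbers the network has more than ten times as many parameters as training examples, which is the regime the abstract calls overparameterized.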

