Machine Learning at the Flatiron Institute Seminar: Andrea Liu

Title: Overparameterization is everywhere

Abstract: Neural networks can learn complex tasks easily when they are overparameterized, with the number of parameters that describe interactions between neurons dwarfing the number of constraints imposed by the task. I will introduce a large class of physical many-body systems, which I call “adaptable matter,” that share this property: their interactions are individually adjustable. Such systems have an extensive number of “adaptive degrees of freedom,” or parameters that characterize the individual interactions, and this large number of parameters enables them to develop complex collective behavior. I argue that this is a powerful way to think about how biological function emerges as a collective phenomenon in many biological systems, and discuss non-living adaptable matter systems that can learn to perform machine-learning tasks without using a processor.
