Learning Fast Requires Good Memory: Time-Space Tradeoff Lower Bounds for Learning
Speaker: Ran Raz, Ph.D.
Professor, Theoretical Computer Science, Princeton University
Weizmann Institute of Science
Presidential Lectures are a series of free public colloquia spotlighting groundbreaking research across four themes: neuroscience and autism science, physics, biology, and mathematics and computer science. These curated, high-level scientific talks feature leading scientists and mathematicians and are designed to foster discussion and drive discovery within the New York City research community. We invite those interested in these topics to join us for this weekly lecture series.
Can one prove unconditional lower bounds on the number of samples needed for learning, under memory constraints? A recent line of works shows that for a large class of learning problems, any learning algorithm requires either a memory of super-linear size or a super-polynomial number of samples. For example, any algorithm for learning parities of size n, from a stream of samples, requires either a memory of quadratic size or an exponential number of samples.
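To make the tradeoff concrete, here is an illustrative sketch (not from the talk) of the high-memory side: with roughly n² bits of memory, a learner can recover a parity of size n from only O(n) samples by running online Gaussian elimination over GF(2). The function names and sample interface are hypothetical.

```python
import random

def learn_parity(n, sample, max_samples=10000):
    """Recover a secret x in {0,1}^n from samples (a, <a,x> mod 2)
    via online Gaussian elimination over GF(2).

    Stores at most n reduced rows of n+1 bits each: ~n^2 bits of
    memory, matching the quadratic-memory regime of the lower bound.
    """
    # pivot[i] holds a row whose lowest set bit is i, or None.
    # Each row is an (n+1)-bit integer: bits 0..n-1 are the vector a,
    # bit n is the observed label b.
    pivot = [None] * n
    rank = 0
    for _ in range(max_samples):
        a, b = sample()
        row = a | (b << n)
        # Reduce the incoming row against the stored pivots.
        for i in range(n):
            if (row >> i) & 1 and pivot[i] is not None:
                row ^= pivot[i]
        if row & ((1 << n) - 1):  # a-part is still nonzero
            lead = (row & -row).bit_length() - 1
            pivot[lead] = row
            rank += 1
            if rank == n:
                break
    if rank < n:
        return None  # not enough independent samples seen
    # Back-substitute so each pivot row becomes e_i with label x_i.
    x = 0
    for i in reversed(range(n)):
        row = pivot[i]
        for j in range(i + 1, n):
            if (row >> j) & 1:
                row ^= pivot[j]  # pivot[j] is already fully reduced
        pivot[i] = row
        x |= ((row >> n) & 1) << i
    return x
```

Since each random sample is linearly independent of the previous ones with probability at least 1/2, the learner collects n independent equations after O(n) samples in expectation. The lower bound says that a learner denied this quadratic memory cannot compensate with polynomially more samples.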
A main message of these works is that for some learning problems, access to a relatively large memory is crucial. Ran Raz will describe some of these works and discuss connections to computational complexity and applications in bounded-storage cryptography.