How does the brain predict the future? Large-scale neural calcium imaging, two-photon optogenetic and deep neural network studies of future prediction across 11 neocortical areas in behaving mice
Awardees
- Mark Schnitzer, Ph.D., Howard Hughes Medical Institute
- Surya Ganguli, Ph.D., Stanford University
Prediction of the future and mental simulation of experiences are key to many forms of cognition and sensory processing. Humans, and likely other animals as well, use mental simulation to evaluate candidate behavioral choices. Predictions of upcoming sensory events, and of how those events relate to planned motor actions, appear vital to motor choreography and to the proper interpretation of sensory data during active movement. Our work is motivated by the idea that the cortex likely uses a core set of computational approaches and mechanisms that are conserved across multiple forms of future prediction. We conjecture that, in many contexts, the cortex uses a general strategy for jointly encoding and processing sensory and motor data in order to predict future outcomes, decipher sensory evidence in light of its own actions, and navigate the world intelligently.
To gain insight into these fundamental processes, we are studying a basic form of future prediction, namely the prediction of upcoming sensory stimuli during active navigation, using groundbreaking new tools for large-scale neural imaging and two-photon optogenetics across multiple brain areas, together with cutting-edge analyses based on deep neural networks. Specifically, we can image and analyze the concurrent activity of 5,000 cells across 11 neocortical areas of a mouse navigating a virtual-reality environment. Our recordings typically span nearly all of the primary and higher-order visual areas and parts of the somatosensory, auditory, posterior parietal, motor and retrosplenial cortices. As the mouse navigates, we record its eye and body movements. We also deliver expected and unexpected visual and motor stimuli, as well as unexpected changes in the coupling between locomotion and forward movement through the virtual environment. Together, these unprecedented neural imaging and high-resolution behavioral data will propel our study of how the cortex encodes and predicts upcoming stimuli and actions.
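To make the recorded data streams concrete, the following is a minimal sketch, assuming a NumPy-based organization of one session's time-aligned measurements. The SessionData container, its field names, and all shapes are hypothetical illustrations, not the project's actual data format.

```python
# Illustrative sketch (hypothetical names and fields): one way to organize the
# time-aligned data streams described above for a single virtual-reality session.
from dataclasses import dataclass
import numpy as np

@dataclass
class SessionData:
    calcium: np.ndarray          # (n_frames, n_cells) calcium traces across the 11 areas
    area_labels: np.ndarray      # (n_cells,) cortical-area assignment for each cell
    eye_position: np.ndarray     # (n_frames, 2) tracked eye position
    body_motion: np.ndarray      # (n_frames, k) locomotion and body-movement variables
    stimulus_events: np.ndarray  # (n_frames,) stimulus identity per frame (0 = none)
    vr_gain: np.ndarray          # (n_frames,) locomotion-to-VR coupling, perturbed on some frames

    def window(self, start: int, length: int) -> "SessionData":
        """Return a time slice of all per-frame streams, e.g., the frames preceding a stimulus."""
        sl = slice(start, start + length)
        return SessionData(self.calcium[sl], self.area_labels,
                           self.eye_position[sl], self.body_motion[sl],
                           self.stimulus_events[sl], self.vr_gain[sl])

# Toy example with random data standing in for a short session.
session = SessionData(
    calcium=np.random.randn(1000, 5000),
    area_labels=np.repeat(np.arange(11), [455] * 10 + [450]),
    eye_position=np.random.randn(1000, 2),
    body_motion=np.random.randn(1000, 3),
    stimulus_events=np.zeros(1000, dtype=int),
    vr_gain=np.ones(1000),
)
pre_stimulus = session.window(start=100, length=10)
```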
To examine relationships between cortical activity and mouse behavior in the recent past, the present and the immediate future, we use decoding and encoding analyses. Both rely on deep neural networks, configured in several distinct ways to address a range of questions. Decoding analyses will evaluate how well single brain areas, and groups of areas, predict upcoming actions and stimuli. We will also assess how, and to what extent, each brain area or set of areas takes body motions into account when anticipating upcoming stimuli, and how it encodes unexpected perturbations. Encoding analyses will identify cells that encode visual information, motor information, or a mixture of the two. They will also reveal the logic by which cells transfer stimulus and motor information across areas and pinpoint specific cells that strongly influence other cells. Finally, we will use two-photon optogenetics to test these analytic findings causally. Overall, our project combines multiple new approaches and seeks fundamental insights into what is likely a highly conserved mode of cortical computation.
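To illustrate the decoding idea, here is a minimal sketch, assuming a PyTorch classifier that predicts the category of an upcoming stimulus from a short window of population activity pooled across areas. The StimulusDecoder class, its architecture, and all hyperparameters are illustrative assumptions rather than the project's actual models.

```python
# Illustrative sketch (not the project's actual pipeline): a small PyTorch
# decoder that predicts the upcoming stimulus category from a short window
# of population calcium activity. All names, shapes, and hyperparameters
# here are hypothetical.
import torch
import torch.nn as nn

N_CELLS = 5000   # neurons imaged across the 11 cortical areas
WINDOW = 10      # number of imaging frames preceding the prediction
N_STIMULI = 4    # hypothetical number of upcoming-stimulus categories

class StimulusDecoder(nn.Module):
    """Maps a window of population activity to logits over upcoming stimuli."""
    def __init__(self, n_cells: int, window: int, n_stimuli: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                       # (batch, window, cells) -> (batch, window*cells)
            nn.Linear(window * n_cells, 256),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(256, n_stimuli),          # logits over stimulus categories
        )

    def forward(self, activity: torch.Tensor) -> torch.Tensor:
        return self.net(activity)

# Toy training step with random data standing in for calcium traces and labels.
decoder = StimulusDecoder(N_CELLS, WINDOW, N_STIMULI)
optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

activity = torch.randn(32, WINDOW, N_CELLS)    # batch of activity windows
labels = torch.randint(0, N_STIMULI, (32,))    # upcoming-stimulus labels

optimizer.zero_grad()
logits = decoder(activity)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
print(f"toy decoding loss: {loss.item():.3f}")
```

In this framing, decoding accuracy achieved from one area versus another, or from activity with and without body-motion regressors, would provide the kind of area-by-area comparison described above; an encoding analysis would invert the mapping, predicting each cell's activity from stimulus and motor variables.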