Liang Liang, Ph.D.
Assistant Professor, Yale University

Liang Liang is an assistant professor in the Department of Neuroscience at Yale University. Her laboratory studies the organizational and computational principles underlying visual signal selectivity along the visual hierarchy, with a primary focus on visual thalamic circuitry, using a combination of in vivo imaging, genetic, behavioral, and computational approaches.
Liang Liang received her B.S. in Mathematics and Physics from Tsinghua University in China. She then moved to the United States and completed her M.S. and Ph.D. in Applied Physics at Stanford University under the supervision of Drs. Liqun Luo and Mark Schnitzer. During her graduate work, she used two-photon imaging, laser dissection, and optogenetics to identify a novel circuit motif that recruits excitatory and inhibitory channels in parallel to shape odor processing in the fruit fly. She was supported by a Stanford Graduate Fellowship and a Lubert Stryer Stanford Interdisciplinary Graduate Fellowship. For her postdoctoral training, Liang joined the groups of Drs. Chinfei Chen and Mark Andermann at Harvard Medical School, where she studied the fine-scale functional organization and state-dependent modulation of retinal axons in the early visual system of awake, behaving mice. She was supported by a postdoctoral fellowship from the Simons Collaboration on the Global Brain (SCGB). Liang joined the Department of Neuroscience at Yale School of Medicine as an assistant professor in 2020.
Project
“State-dependent influences on retinogeniculate ensemble activity”
It is intuitive to think of our sensory systems as simply relaying information about the world to the brain: the eye acts like a camera, the ear like a microphone. In reality, sensing is not a passive process; the brain actively modulates incoming sensory information depending on its own ongoing activity. This overall pattern of ongoing activity is termed the brain’s “state,” and how sensory information is processed depends on that state. For example, we process visual information about an object differently when we are paying attention to it than when it merely sits in our peripheral vision. Most research on how brain state affects sensory processing has been performed in the cortex, but such modulation probably occurs in many other brain areas as well.

Working in mice, we plan to investigate the thalamus, a region that receives visual information before the cortex does and relays it onward. We have developed sophisticated new optical techniques for simultaneously measuring the activity of many neurons in the thalamus, and we plan to use this technology to understand how visual information is processed differently depending on the brain’s state. A mouse will view visual stimuli while either walking on a treadmill or staying still. The stimuli will remain the same, but when the mouse is walking, its brain will presumably be in a different state than when it is sitting still. We can then compare thalamic activity across the two states; because the visual stimulus is identical, any difference in activity can be attributed to brain state. With this experimental setup, we will be able, for the first time, to determine which neural networks are responsible for modulating visual information at this early processing stage, with broad implications for how sensory processing occurs in any animal, including humans.