Title: Concepts and technologies for studying neural circuits in behaving mice
Every time we execute a behavior - walking, talking, playing guitar - we change the world around us. For example, walking causes the visual scene around us to change (e.g. via optic flow) while also producing sounds that reach our ears (e.g. footsteps). The brain must distinguish sensory stimuli produced by our own actions from those that arise from objects in the environment. It does this by forming an internal (or forward) model of how the sensory world will change given the particular action an animal is about to make.
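As a toy illustration of this idea, the sketch below implements a minimal linear forward model in Python: a matrix maps a motor command to a predicted sensory outcome, and the mismatch between prediction and observation yields an error signal that could drive learning. All names, dimensions, and values here are hypothetical and chosen purely for illustration; this is not the model used in our work.

```python
import numpy as np

# Minimal sketch of a forward model: predict the sensory consequence of a
# motor command, compare against the actual input, and use the prediction
# error to update the model. Entirely illustrative values and shapes.
rng = np.random.default_rng(0)

n_motor, n_sensory = 4, 8
W = rng.normal(size=(n_sensory, n_motor))    # learned motor-to-sensory mapping

motor_command = rng.normal(size=n_motor)     # e.g. an intended footstep
predicted = W @ motor_command                # internal model's expectation
actual = predicted + rng.normal(scale=0.1, size=n_sensory)  # observed sound

prediction_error = actual - predicted        # residual: self vs. environment
W += 0.1 * np.outer(prediction_error, motor_command)  # simple error-driven update

print(f"mean |prediction error| = {np.mean(np.abs(prediction_error)):.3f}")
```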
We study how the brain learns, stores, and recalls the memories that comprise an internal model of the world, specifically as it relates to self-generated sounds. Using the mouse as a model organism, we have developed an acoustic augmented reality (aAR) platform to study how the brain learns to predict the sounds associated with a movement. We combine aAR with in vivo strategies for monitoring and manipulating neural activity, including multi-electrode array recordings, 2-photon calcium imaging, and optogenetic circuit perturbations. Using these techniques, we have identified a putative synaptic locus for storing memories about self-generated sounds and a motor-auditory circuit that learns and recalls these memories.
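To make the closed-loop logic of such a platform concrete, here is a schematic Python sketch of how movement might be coupled to sound in an aAR experiment: locomotion is sampled continuously, and an experimenter-controlled sound is delivered whenever the animal moves, so the movement-sound contingency can be set (and later violated) to probe the learned internal model. The hardware-facing functions (read_locomotion_speed, play_tone) are hypothetical stand-ins; the actual platform is not described at this level in the talk.

```python
import time

def read_locomotion_speed() -> float:
    """Hypothetical stand-in for a treadmill / rotary-encoder reading (cm/s)."""
    return 0.0  # replace with a real sensor read

def play_tone(freq_hz: float, duration_s: float) -> None:
    """Hypothetical stand-in for low-latency audio output."""
    pass  # replace with a real audio backend

# Schematic closed loop: pair each bout of movement with a sound.
SPEED_THRESHOLD = 2.0   # cm/s; illustrative value
TONE_FREQ = 8000.0      # Hz; illustrative value

for _ in range(10_000):                     # bounded loop for the sketch
    if read_locomotion_speed() > SPEED_THRESHOLD:
        play_tone(TONE_FREQ, duration_s=0.05)
    time.sleep(0.005)                       # ~200 Hz polling; illustrative
```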
In this talk, I will provide a general scientific overview of our work. I will also discuss some of the major goals and challenges we face in analyzing the data that these experiments produce.