A Cortical Circuit for Audio-Visual Predictions
Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli (McGurk and MacDonald, 1976). During audio-visual associative learning, auditory cortex has been shown to underlie multi-modal plasticity in visual cortex (McIntosh et al., 1998; Zangenehpour and Zatorre, 2010). However, how processing in visual cortex is altered when an auditory stimulus signals a visual event, and what neural mechanisms mediate such experience-dependent audio-visual associations, remain poorly understood. Here we describe a neural mechanism that contributes to shaping visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices. We show that auditory association with a visual stimulus leads to an experience-dependent suppression of visual responses in visual cortex. This suppression of the response to the predictable visual stimulus is driven in part by input from auditory cortex. By recording from auditory cortex axons in visual cortex, we find that these axons carry a mixture of auditory and retinotopically matched visual input. Moreover, optogenetic stimulation of auditory cortex axons in visual cortex selectively suppresses the neurons responsive to the associated visual stimulus after, but not before, learning. Our results are consistent with the interpretation that cross-modal associations can be stored in long-range cortical connections and that, with learning, these cross-modal connections function to suppress the responses to predictable input.
Dr. Garner is an Assistant Professor at Harvard Medical School
NACS Seminars are free and open to the public.