RT Journal Article
SR Electronic
T1 Dual Neural Routing of Visual Facilitation in Speech Processing
JF The Journal of Neuroscience
JO J. Neurosci.
FD Society for Neuroscience
SP 13445
OP 13453
DO 10.1523/JNEUROSCI.3194-09.2009
VO 29
IS 43
A1 Luc H. Arnal
A1 Benjamin Morillon
A1 Christian A. Kell
A1 Anne-Lise Giraud
YR 2009
UL http://www.jneurosci.org/content/29/43/13445.abstract
AB Viewing our interlocutor facilitates speech perception, unlike, for instance, when we telephone. Several neural routes and mechanisms could account for this phenomenon. Using magnetoencephalography, we show that when seeing the interlocutor, latencies of auditory responses (M100) shorten as speech becomes more predictable from visual input, whether or not the auditory signal is congruent. Incongruence of auditory and visual input affected auditory responses ∼20 ms after latency shortening was detected, indicating that initial content-dependent auditory facilitation by vision is followed by a feedback signal that reflects the error between expected and received auditory input (prediction error). We then used functional magnetic resonance imaging and confirmed that distinct routes of visual information to auditory processing underlie these two functional mechanisms. Functional connectivity between visual motion and auditory areas depended on the degree of visual predictability, whereas connectivity between the superior temporal sulcus and both auditory and visual motion areas was driven by audiovisual (AV) incongruence. These results establish two distinct mechanisms by which the brain uses potentially predictive visual information to improve auditory perception: a fast, direct corticocortical pathway conveys visual motion parameters to auditory cortex, and a slower, indirect feedback pathway signals the error between the visual prediction and the auditory input.