TY - JOUR
T1 - Perceptual Fusion and Stimulus Coincidence in the Cross-Modal Integration of Speech
JF - The Journal of Neuroscience
JO - J. Neurosci.
SP - 5884
LP - 5893
DO - 10.1523/JNEUROSCI.0896-05.2005
VL - 25
IS - 25
AU - Lee M. Miller
AU - Mark D'Esposito
Y1 - 2005/06/22
UR - http://www.jneurosci.org/content/25/25/5884.abstract
N2 - Human speech perception is profoundly influenced by vision. Watching a speaker's mouth movements significantly improves comprehension, both for normal listeners in noisy environments and especially for the hearing impaired. A number of brain regions have been implicated in audiovisual speech tasks, but little evidence distinguishes them functionally. In an event-related functional magnetic resonance imaging study, we differentiate neural systems that evaluate cross-modal coincidence of the physical stimuli from those that mediate perceptual binding. Regions consistently involved in perceptual fusion per se included Heschl's gyrus, superior temporal sulcus, middle intraparietal sulcus, and inferior frontal gyrus. Successful fusion elicited activity biased toward the left hemisphere, whereas failed cross-modal binding recruited regions in both hemispheres. A broad network of other areas, including the superior colliculus, anterior insula, and anterior intraparietal sulcus, was more involved with evaluating the spatiotemporal correspondence of speech stimuli, regardless of a subject's perception. All of these showed greater activity to temporally offset stimuli than to audiovisually synchronous stimuli. Our results demonstrate how elements of the cross-modal speech integration network differ in their sensitivity to physical reality versus perceptual experience.
ER -