RT Journal Article
SR Electronic
T1 Auditory–Visual Multisensory Interactions in Humans: Timing, Topography, Directionality, and Sources
A1 Céline Cappe
A1 Gregor Thut
A1 Vincenzo Romei
A1 Micah M. Murray
JF The Journal of Neuroscience
JO J. Neurosci.
FD Society for Neuroscience
YR 2010
VO 30
IS 38
SP 12572
OP 12580
DO 10.1523/JNEUROSCI.1099-10.2010
UL http://www.jneurosci.org/content/30/38/12572.abstract
AB Current models of brain organization include multisensory interactions at early processing stages and within low-level, including primary, cortices. Embracing this model with regard to auditory–visual (AV) interactions in humans remains problematic. Controversy surrounds the application of an additive model to the analysis of event-related potentials (ERPs), and conventional ERP analysis methods have yielded discordant latencies of effects and permitted limited neurophysiologic interpretability. While hemodynamic imaging and transcranial magnetic stimulation studies provide general support for the above model, the precise timing, superadditive/subadditive directionality, topographic stability, and sources remain unresolved. We recorded ERPs in humans to attended, but task-irrelevant, stimuli that did not require an overt motor response, thereby circumventing paradigmatic caveats. We applied novel ERP signal analysis methods to provide details concerning the likely bases of AV interactions. First, nonlinear interactions occur at 60–95 ms after stimulus and are the consequence of topographic, rather than pure strength, modulations in the ERP: AV stimuli engage distinct configurations of intracranial generators, rather than simply modulating the amplitude of unisensory responses. Second, source estimations (and statistical analyses thereof) identified primary visual, primary auditory, and posterior superior temporal regions as mediating these effects. Finally, scalar values of current densities in all of these regions exhibited functionally coupled, subadditive nonlinear effects, a pattern increasingly consistent with the mounting evidence in nonhuman primates. In these ways, we demonstrate how the neurophysiologic bases of multisensory interactions can be noninvasively identified in humans, allowing for a synthesis across imaging methods on the one hand and species on the other.