Neuroscience Letters

Volume 363, Issue 2, 10 June 2004, Pages 112-115
Time course of multisensory interactions during audiovisual speech perception in humans: a magnetoencephalographic study

https://doi.org/10.1016/j.neulet.2004.03.076

Abstract

During social interaction speech is perceived simultaneously by audition and vision. We studied interactions in the processing of auditory (A) and visual (V) speech signals in the human brain by comparing neuromagnetic responses to phonetically congruent audiovisual (AV) syllables with the arithmetic sum of responses to A and V syllables. Differences between AV and A+V responses were found bilaterally in the auditory cortices 150–200 ms and in the right superior temporal sulcus (STS) 250–600 ms after stimulus onset, showing that both sensory-specific and multisensory regions of the human temporal cortices are involved in AV speech processing. Importantly, our results suggest that AV interaction in the auditory cortex precedes that in the multisensory STS region.

Acknowledgments

We thank Professor Riitta Hari and Dr Iiro Jääskeläinen for comments on the manuscript and Mika Seppä for help with data analysis. The MRIs were obtained at the Department of Radiology, Helsinki University Central Hospital. This study was supported by the Academy of Finland and the Sigrid Jusélius Foundation.

References (20)

  • G.A. Calvert et al., Response amplification in sensory-specific cortices during crossmodal binding, NeuroReport (1999)
  • G.A. Calvert et al., Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex, Curr. Biol. (2000)
  • G.A. Calvert et al., Detection of audio-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect, NeuroImage (2001)
  • M.H. Giard et al., Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study, J. Cogn. Neurosci. (1999)
  • V. Klucharev et al., Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception, Cogn. Brain Res. (2003)
  • K. Matsuura et al., Selective minimum-norm solution of the biomagnetic inverse problem, IEEE Trans. Biomed. Eng. (1995)
  • S. Molholm et al., Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study, Cogn. Brain Res. (2002)
  • R. Möttönen et al., Processing of changes in visual speech in the human auditory cortex, Cogn. Brain Res. (2002)
  • J.-L. Olivès et al., Audiovisual speech synthesis for Finnish
  • T. Raij et al., Audiovisual integration of letters in the human brain, Neuron (2000)

Cited by (65)

  • Somatosensory contribution to audio-visual speech processing

    2021, Cortex
    Citation Excerpt:

Previous studies using magnetoencephalography have detailed the time course of audio-visual speech processing for the McGurk effect (Hertrich et al., 2007; Möttönen et al., 2004). The early component (<200 msec) is processed in the sensory-specific areas and the later component (>250 msec) is processed in multisensory regions of the human temporal cortex (Möttönen et al., 2004). A similar time course can be seen in our results.

  • Adult dyslexic readers benefit less from visual input during audiovisual speech processing: fMRI evidence

    2018, Neuropsychologia
    Citation Excerpt:

    In typical readers, a number of brain regions have been repeatedly implicated in audiovisual speech processing. These regions include high-level associative areas such as the superior temporal gyrus (including the planum temporale) (e.g., Beauchamp et al., 2004a; Beauchamp et al., 2010; Calvert et al., 2000; Stevenson et al., 2010; Stevenson et al., 2011) and the supramarginal gyrus (e.g., Skipper et al., 2005), as well as other more primary sensory regions such as Heschl's gyrus (e.g., Calvert et al., 1999; Callan et al., 2003; Möttönen et al., 2004; Pekkola et al., 2005) and the superior temporal gyrus (e.g., Beauchamp et al., 2004a, 2010; Calvert et al., 2000) for auditory information, and the middle temporal gyrus (e.g., Callan et al., 2003; Calvert et al., 1999; Calvert and Campbell, 2003) and the fusiform gyrus (e.g., Calvert and Campbell, 2003; Macaluso et al., 2004; Stevenson et al., 2010; Wyk et al., 2010) for visual information processing. Some studies have shown that the regions involved in multisensory processing exhibit enhanced responses to audiovisual stimuli, when compared to the sum of the responses to unisensory auditory and visual stimuli (Giard and Peronnet, 1999; Calvert et al., 2000, 2001; Macaluso et al., 2000; Bushara et al., 2001; Klucharev et al., 2003; Wright et al., 2003; Molholm et al., 2004).
