NeuroImage

Volume 56, Issue 2, 15 May 2011, Pages 681-687

Decoding the direction of auditory motion in blind humans

https://doi.org/10.1016/j.neuroimage.2010.04.266

Abstract

Accurate processing of nonvisual stimuli is fundamental to humans with visual impairments. In this population, moving sounds activate an occipito-temporal region thought to encompass the equivalent of monkey area MT+, but it remains unclear whether the signal carries information beyond the mere presence of motion. To address this important question, we tested whether the processing in this region retains functional properties that are critical for accurate motion processing and that are well established in the visual modality. Specifically, we focussed on the property of ‘directional selectivity’, because MT+ neurons in non-human primates fire preferentially to specific directions of visual motion. Recent neuroimaging studies have revealed similar properties in sighted humans by successfully decoding different directions of visual motion from fMRI activation patterns.

Here we used fMRI and multivariate pattern classification to demonstrate that the direction in which a sound is moving can be reliably decoded from dorsal occipito-temporal activation in the blind. We also show that classification performance is at chance (i) in a control region in posterior parietal cortex and (ii) when motion information is removed and subjects only hear a sequence of static sounds presented at the same start and end positions. These findings reveal that information about the direction of auditory motion is present in dorsal occipito-temporal responses of blind humans. As such, this area, which appears consistent with the hMT+ complex in the sighted, provides crucial information for the generation of a veridical percept of moving non-visual stimuli.

Introduction

For humans with visual impairments, independence and efficient interaction with the environment critically depend on accurate processing of nonvisual stimuli. For example, moving sounds produced by vehicles can provide essential and reliable information about street layout, approaching vehicles, and traffic cycles. To compensate for the lack of visual information, blind observers often show exceptional performance in other modalities, e.g., when localizing a sound source (Lessard et al., 1998, Röder et al., 1999). This enhanced performance has been linked to cross-modal plasticity, because many nonvisual tasks reliably activate occipital areas that are recruited for processing visual stimuli in the sighted but are deprived of this visual input in the blind (Theoret et al., 2004).

Given that non-visual motion processing is crucial to visually impaired travellers, it is important to understand how their brains process such cues. Two studies have directly addressed this issue: Poirier et al. (2006) compared moving and static auditory stimuli and found that the presence of motion induced stronger activation in a large network of regions, including an occipito-temporal region assumed to encompass the motion-sensitive hMT+ complex. Similar results have been reported for tactile motion (Ricciardi et al., 2007), suggesting that regions implicated in visual motion processing in the sighted are recruited for non-visual motion processing in the blind.

One important unresolved question is whether the motion-related activation observed in blind humans plays a functional role in nonvisual motion processing or whether it merely reflects unspecific coactivation (Collignon et al., 2009). Specifically, in order to generate a veridical percept of a moving stimulus, the brain needs to extract crucial properties such as speed and direction. In the visual modality, MT+ neurons in non-human primates respond preferentially to specific directions and speeds of visual motion, thus conveying information about the characteristics of the stimulus (Born and Bradley, 2005). Neuroimaging studies have revealed similar functional properties in the human homologue (hMT+), as evidenced by different directions of linear and rotational optic flow being decoded from multivariate fMRI activation patterns in this area (Kamitani and Tong, 2006, Seymour et al., 2009). These findings indicate that neuronal populations in hMT+ have properties similar to those of their monkey counterparts.

Here we used fMRI to determine the functional significance of auditory motion processing in dorsal occipito-temporal cortex of blind humans. Four blind participants underwent fMRI scanning while listening to blocks of leftward or rightward moving broadband noise sources that were realistically simulated using binaural recording techniques. Using multivariate pattern classification, we then tested whether the direction of motion could be decoded from ensemble responses in dorsal occipito-temporal cortex and in a posterior parietal control region.
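
To illustrate the kind of analysis described here, the sketch below sets up an ROI-based decoding of motion direction with a linear support vector machine and leave-one-run-out cross-validation, run on simulated voxel patterns. The excerpt does not specify the classifier, the cross-validation scheme, or the design parameters, so all of these (and every variable name) are illustrative assumptions rather than the authors' actual pipeline.

    # Minimal sketch of an ROI-based direction-decoding analysis, assuming a
    # linear SVM and leave-one-run-out cross-validation (the excerpt does not
    # state the actual classifier or scheme). Voxel patterns are simulated;
    # in practice they would be block-wise response estimates from the dorsal
    # occipito-temporal ROI and the posterior parietal control region.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)

    n_runs, blocks_per_run, n_voxels = 8, 8, 200   # hypothetical design
    n_blocks = n_runs * blocks_per_run

    # Labels: 0 = leftward motion, 1 = rightward motion (balanced within run)
    y = np.tile([0, 1], n_blocks // 2)
    runs = np.repeat(np.arange(n_runs), blocks_per_run)

    # Simulated ROI in which each voxel carries a weak direction-dependent bias
    bias = rng.normal(0, 0.3, n_voxels)
    X_roi = rng.normal(0, 1, (n_blocks, n_voxels)) + np.outer(2 * y - 1, bias)

    # Simulated control region carrying no direction information
    X_ctrl = rng.normal(0, 1, (n_blocks, n_voxels))

    clf = SVC(kernel="linear", C=1.0)
    cv = LeaveOneGroupOut()   # each fold leaves one scanning run out

    for name, X in [("dorsal occipito-temporal ROI", X_roi),
                    ("posterior parietal control", X_ctrl)]:
        acc = cross_val_score(clf, X, y, cv=cv, groups=runs).mean()
        print(f"{name}: decoding accuracy = {acc:.2f} (chance = 0.50)")

In this simulation, only the region whose voxels carry a directional bias should be decodable above the 50% chance level, mirroring the logic of comparing the occipito-temporal ROI with the parietal control region.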

Section snippets

Subjects

The study involved four blind male volunteers, aged 35–62 years (mean duration of blindness: 44.3 years). The experiment was approved by the local ethics committee, and informed consent was obtained from all participants. The etiology of blindness was Retinopathy of Prematurity for the two congenitally, totally blind participants, and Leber's Congenital Amaurosis and Retinitis Pigmentosa (onset age: 24 years, duration of blindness: 25 years) for the other two, who had only minimal light perception. All subjects …

Results

Deviant stimuli were detected with near-perfect accuracy (> 95%) in all four conditions, and a repeated measures ANOVA did not reveal any differences between moving and static stimuli (accuracy: p = 0.49, reaction times: p = 0.13). In addition, stimulus direction did not affect behavioural performance (accuracy: p = 0.49, reaction times: p = 0.38), and there was no evidence for an interaction between stimulus direction and the presence or absence of motion (accuracy: p = 0.09, reaction times: p = 0.39).
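
For illustration only, the repeated-measures analysis described above (within-subject factors: moving vs. static and leftward vs. rightward) could be set up as in the sketch below. The data frame, factor names, and values are hypothetical placeholders; the excerpt does not report the software used or the single-subject data.

    # Hedged sketch of the behavioural analysis: a 2 x 2 repeated-measures
    # ANOVA on deviant-detection accuracy (and, analogously, on reaction
    # times) with within-subject factors motion (moving/static) and
    # direction (left/right). The values are placeholders, not the
    # published data.
    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(1)
    rows = []
    for subj in range(1, 5):                      # four participants
        for motion in ("moving", "static"):
            for direction in ("left", "right"):
                rows.append({"subject": subj,
                             "motion": motion,
                             "direction": direction,
                             "accuracy": 0.95 + rng.uniform(0, 0.05)})
    df = pd.DataFrame(rows)

    # Main effects of motion and direction plus their interaction
    res = AnovaRM(df, depvar="accuracy", subject="subject",
                  within=["motion", "direction"]).fit()
    print(res)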

We …

Discussion

Using multivariate pattern classification, we have shown that the direction in which an auditory stimulus is moving can be decoded from ensemble responses in dorsal occipito-temporal cortex. These results were not driven by the specific sequence of start and end positions, since decoding accuracy was at chance for the corresponding static stimuli that lacked the intervening motion component. Furthermore, we did not observe evidence for specific patterns of head motion to accompany the motion …

Acknowledgments

We would like to thank Magdalena Wutte for help with data collection. This work was supported by NSF grant BCS-0745328 and by a study grant from the Brain Imaging Center at the University of California Santa Barbara.

References (31)

  • K. Seymour et al., The coding of color, motion, and their conjunction in the human visual cortex, Curr. Biol. (2009)
  • H. Theoret et al., Behavioral and neuroplastic changes in the blind: evidence for functionally relevant cross-modal interactions, J. Physiol. Paris (2004)
  • A. Amedi et al., Shape conveyed by visual-to-auditory sensory substitution activates the lateral occipital complex, Nat. Neurosci. (2007)
  • O. Baumann et al., Neural correlates of coherent audiovisual motion perception, Cereb. Cortex (2007)
  • D. Bavelier et al., Cross-modal plasticity: where and how?, Nat. Rev. Neurosci. (2002)