Bayesian integration of visual and auditory signals for spatial localization

J Opt Soc Am A Opt Image Sci Vis. 2003 Jul;20(7):1391-7. doi: 10.1364/josaa.20.001391.

Abstract

Human observers localize events in the world by using sensory signals from multiple modalities. We evaluated two theories of spatial localization that predict how visual and auditory information are weighted when these signals specify different locations in space. According to one theory (visual capture), the signal that is typically most reliable dominates in a winner-take-all competition, whereas the other theory (maximum-likelihood estimation) proposes that perceptual judgments are based on a weighted average of the sensory signals in proportion to each signal's relative reliability. Our results indicate that both theories are partially correct, in that relative signal reliability significantly altered judgments of spatial location, but these judgments were also characterized by an overall bias to rely on visual over auditory information. These results have important implications for the development of cue integration and for neural plasticity in the adult brain that enables humans to optimally integrate multimodal information.
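
As a concrete illustration of the maximum-likelihood weighting rule described in the abstract, the short Python sketch below combines a visual and an auditory location estimate in proportion to their reliabilities (taken here as inverse variances), so the more reliable cue pulls the combined estimate toward itself without fully capturing it. The function name, stimulus locations, and noise levels are illustrative assumptions and are not data or parameters from the study.

    import math

    def mle_combined_estimate(x_visual, sigma_visual, x_auditory, sigma_auditory):
        """Reliability-weighted (maximum-likelihood) combination of two noisy
        location estimates; reliability is defined as inverse variance."""
        r_v = 1.0 / sigma_visual ** 2    # visual reliability
        r_a = 1.0 / sigma_auditory ** 2  # auditory reliability
        w_v = r_v / (r_v + r_a)          # visual weight
        w_a = r_a / (r_v + r_a)          # auditory weight
        location = w_v * x_visual + w_a * x_auditory
        sigma = math.sqrt(1.0 / (r_v + r_a))  # combined estimate is less variable than either cue alone
        return location, sigma

    # Illustrative values (not from the study): a visual signal at +2 deg with
    # sigma = 1 deg and an auditory signal at -4 deg with sigma = 3 deg. The more
    # reliable visual cue dominates (weight 0.9), but the auditory cue still
    # shifts the estimate, unlike strict winner-take-all visual capture.
    loc, sd = mle_combined_estimate(2.0, 1.0, -4.0, 3.0)
    print(f"combined estimate: {loc:.2f} deg (sd {sd:.2f} deg)")  # 1.40 deg (sd 0.95)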

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.
  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Bayes Theorem
  • Hearing / physiology*
  • Humans
  • Models, Neurological*
  • Sound Localization / physiology
  • Space Perception / physiology*
  • Vision, Ocular / physiology*