A Bayesian model unifies multisensory spatial localization with the physiological properties of the superior colliculus

Exp Brain Res. 2007 Jun;180(1):153-61. doi: 10.1007/s00221-006-0847-2. Epub 2007 Feb 14.

Abstract

"Multisensory integration" refers to the phenomenon by which information from different senses is integrated in order to interpret and guide responses to external events. Here, we advance two specific hypotheses: (1) the process of multisensory integration in spatial localization is statistically optimal, and (2) the optimality of the processes guiding this localization results from the implementation of Bayes' rule. We explicitly test the predictions of an optimal (Bayesian) model for the behavior of animals trained and tested in a spatial localization task, and find that the model correctly predicts behavioral patterns which are at times counterintuitive. The model also predicts the receptive field properties of superior colliculus neurons that are involved in these behaviors, and sheds new light on the computational responsibilities different circuits have in effecting these behaviors. Thus, the Bayesian model appears to represent not only a yardstick for the optimality of a behavior, but also a descriptor of the underlying neural processes.
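The abstract does not spell out the model's equations, but "statistically optimal" multisensory localization is commonly formalized as reliability-weighted Gaussian cue combination under Bayes' rule. The sketch below is an illustration under that standard assumption (independent Gaussian likelihoods, flat prior), not the paper's exact implementation:

```python
# Hedged illustration: standard Gaussian cue-combination model often used to
# formalize "statistically optimal" multisensory localization. The specific
# likelihood forms are an assumption for illustration, not from the abstract.

def bayes_combine(mu_v, sigma_v, mu_a, sigma_a):
    """Fuse visual and auditory location estimates (e.g., in degrees azimuth)
    under independent Gaussian likelihoods with a flat prior: the posterior
    mean is an inverse-variance-weighted average of the two cues."""
    w_v = 1.0 / sigma_v ** 2   # reliability (precision) of the visual cue
    w_a = 1.0 / sigma_a ** 2   # reliability (precision) of the auditory cue
    mu = (w_v * mu_v + w_a * mu_a) / (w_v + w_a)
    sigma = (w_v + w_a) ** -0.5  # fused estimate is tighter than either cue alone
    return mu, sigma

# Visual cue at 10 deg (sd 2), auditory cue at 0 deg (sd 4):
mu, sigma = bayes_combine(mu_v=10.0, sigma_v=2.0, mu_a=0.0, sigma_a=4.0)
# -> mu = 8.0, sigma ~ 1.79: the estimate is pulled toward the more
#    reliable (visual) cue, and its uncertainty is below either input's.
```

This weighting predicts behaviors of the kind the abstract calls counterintuitive: the less reliable cue is not ignored but contributes in proportion to its precision, and the fused estimate is more precise than the best single cue.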

MeSH terms

  • Acoustic Stimulation / methods
  • Animals
  • Auditory Perception
  • Bayes Theorem*
  • Behavior, Animal
  • Cats
  • Photic Stimulation / methods
  • Reaction Time / physiology
  • Sensation / physiology*
  • Space Perception / physiology*
  • Superior Colliculi / physiology*
  • Visual Perception