Information theory in auditory research

Hear Res. 2007 Jul;229(1-2):94-105. doi: 10.1016/j.heares.2007.01.012. Epub 2007 Jan 16.

Abstract

Mutual information (MI) is increasingly used to quantify neural responses. However, many researchers still regard it with some skepticism, because it is not always clear what MI actually measures, and because MI is hard to calculate in practice. This paper aims to clarify these issues. First, it provides an interpretation of mutual information as a variability decomposition, analogous to the standard variance decomposition routinely used in statistical analyses of neural data, except that the measure of variability is entropy rather than variance. Second, it discusses the aspects of MI that make its calculation difficult. The goal of this paper is to clarify when and how information theory can be used informatively and reliably in auditory neuroscience.
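The decomposition described in the abstract can be made concrete with a small sketch (not from the paper; the count table and function names are hypothetical): MI is the total response entropy H(R) minus the stimulus-conditional "noise" entropy H(R|S), the entropy analogue of splitting total variance into explained and residual parts. The naive plug-in estimator below also hints at the practical difficulty the abstract mentions, since it is biased upward for small samples.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(counts):
    """
    Plug-in estimate of I(S; R) from a stimulus-by-response count table.

    I(S; R) = H(R) - H(R|S): total response entropy minus the
    stimulus-conditional ("noise") entropy.
    """
    counts = np.asarray(counts, dtype=float)
    joint = counts / counts.sum()          # joint distribution p(s, r)
    p_s = joint.sum(axis=1)                # marginal over stimuli
    p_r = joint.sum(axis=0)                # marginal over responses
    h_r = entropy(p_r)                     # total response entropy H(R)
    # Conditional entropy H(R|S) = sum_s p(s) * H(R | S = s)
    h_r_given_s = sum(
        p_s[i] * entropy(joint[i] / p_s[i])
        for i in range(len(p_s)) if p_s[i] > 0
    )
    return h_r - h_r_given_s

# Hypothetical example: rows are stimuli, columns are binned spike-count responses.
table = [[30, 10, 2],
         [5, 25, 12],
         [1,  8, 33]]
print(f"I(S; R) ~= {mutual_information(table):.3f} bits")
```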

Publication types

  • Research Support, Non-U.S. Gov't
  • Review

MeSH terms

  • Acoustic Stimulation
  • Animals
  • Auditory Cortex / physiology*
  • Auditory Pathways / physiology
  • Auditory Perception / physiology
  • Information Theory*
  • Models, Neurological