Temporal dynamics of contrast gain in single cells of the cat striate cortex

Vis Neurosci. 1991 Mar;6(3):239-55. doi: 10.1017/s0952523800006258.

Abstract

The response amplitude of cat striate cortical cells is usually reduced after exposure to high-contrast stimuli. The temporal characteristics and contrast sensitivity of this phenomenon were explored by stimulating cortical cells with drifting gratings whose contrast was sequentially incremented and then decremented in stepwise fashion over time. All responses showed a clear hysteresis, in which contrast gain dropped on average by 0.36 log unit and then returned to baseline values within 60 s. Noticeable gain adjustments were seen in as little as 3 s and with peak contrasts as low as 3%. Contrast adaptation was absent in responses from lateral geniculate nucleus (LGN) cells. Adaptation was found to depend on the temporal frequency of stimulation, with greater and more rapid adaptation at higher temporal frequencies. Two different tests showed that the mechanism controlling response reduction was influenced primarily by stimulus contrast rather than by response amplitude. These results support the existence of a rapid and sensitive cortically based system that normalizes the output of cortical cells as a function of local mean contrast. Control of the adaptation appears to arise at least in part across a population of cells, which is consistent with the idea that the gain control serves to limit the information converging from many cells onto subsequent processing areas.
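To make the described hysteresis concrete, the sketch below simulates a stepwise contrast staircase (ascending then descending) driving a toy divisive gain-control model. The step values, 15 s adaptation time constant, and adaptation strength are illustrative assumptions, not parameters or a model taken from the paper; the point is only that a slowly adapting divisive gain yields smaller responses to a given contrast on the descending limb than on the ascending limb.

```python
import numpy as np

# --- Stimulus: stepwise contrast staircase (ascending then descending) ---
# Hypothetical parameters: contrast steps from 3% to 40% and back, 10 s per step.
contrast_steps = [0.03, 0.06, 0.11, 0.21, 0.40, 0.21, 0.11, 0.06, 0.03]
step_duration_s = 10.0
dt = 0.1                                    # simulation time step (s)
samples_per_step = int(step_duration_s / dt)
contrast = np.repeat(contrast_steps, samples_per_step)

# --- Toy adaptation model (an assumption, not the paper's mechanism) ---
# The adaptation state g low-pass filters recent stimulus contrast and
# divides the instantaneous drive, so the same contrast evokes a weaker
# response after prolonged high-contrast exposure.
tau_adapt_s = 15.0                          # adaptation time constant (assumed)
k = 4.0                                     # strength of divisive adaptation (assumed)
g = 0.0                                     # adaptation state, starts unadapted
response = np.empty_like(contrast)

for i, c in enumerate(contrast):
    g += (dt / tau_adapt_s) * (c - g)       # first-order low-pass filter of contrast
    response[i] = c / (1.0 + k * g)         # divisive gain control

# Hysteresis check: mean response at 11% contrast on the ascending vs.
# descending limb of the staircase; the descending value should be lower.
idx_up = np.arange(2 * samples_per_step, 3 * samples_per_step)
idx_down = np.arange(6 * samples_per_step, 7 * samples_per_step)
print("response at 11% contrast, ascending :", response[idx_up].mean())
print("response at 11% contrast, descending:", response[idx_down].mean())
```

Running this prints a lower mean response for the descending 11% step than for the ascending one, the same qualitative signature of contrast-gain hysteresis reported in the abstract.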

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Adaptation, Ocular / physiology
  • Animals
  • Cats
  • Contrast Sensitivity / physiology*
  • Visual Cortex / cytology
  • Visual Cortex / physiology*