Computing with continuous attractors: stability and online aspects

Neural Comput. 2005 Oct;17(10):2215-39. doi: 10.1162/0899766054615626.

Abstract

Two issues concerning the application of continuous attractors in neural systems are investigated: the computational robustness of continuous attractors with respect to input noise, and the implementation of Bayesian online decoding. In an idealized mathematical model of a continuous attractor, the decoded stimulus is highly sensitive to input noise, and this sensitivity is an inevitable consequence of the system's neutral stability. To overcome this shortcoming, we modify the conventional network model by including extra dynamical interactions between neurons. These interactions evolve according to a biologically plausible Hebbian learning rule and serve the computational role of memorizing and propagating stimulus information accumulated over time. As a result, the new network model responds to the history of external inputs over a period of time and hence becomes insensitive to short-term fluctuations. Moreover, since the dynamical interactions provide a mechanism for conveying prior knowledge of the stimulus, that is, information about stimuli presented previously, the network effectively implements online Bayesian inference. This study also reveals interesting behaviors of neural population coding, such as the trade-off between decoding stability and the speed of tracking time-varying stimuli, and the relationship between neural tuning width and tracking speed.
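To make the neutral-stability argument concrete, the sketch below simulates a one-dimensional ring network driven by a noisy population input. It is a minimal illustration under stated assumptions, not the paper's model: the rates here are simply rectified, and the history-dependent drive is approximated by a low-pass filter of the input, a crude stand-in for Hebbian dynamical interactions that accumulate stimulus information over time. All parameter values (N, a, tau_mem, sigma) are illustrative. The filtered trace loosely echoes the online Bayesian recursion in which the posterior given inputs up to time t is proportional to the likelihood of the current input times the posterior carried over from earlier inputs, which serves as the prior.

    import numpy as np

    rng = np.random.default_rng(0)

    # Ring of N neurons with preferred stimuli on [-pi, pi).
    N = 128
    x = np.linspace(-np.pi, np.pi, N, endpoint=False)

    def wrap(d):
        # Wrap angular differences onto [-pi, pi).
        return (d + np.pi) % (2 * np.pi) - np.pi

    # Translation-invariant Gaussian recurrent weights: the connection
    # depends only on the difference of preferred stimuli, which is what
    # makes every bump position an (approximately) equally good state.
    a = 0.5   # interaction/tuning width (illustrative)
    J = 0.5 * np.exp(-wrap(x[:, None] - x[None, :]) ** 2 / (2 * a ** 2)) / N

    def decode(r):
        # Population-vector estimate of the bump position.
        return np.angle(np.sum(r * np.exp(1j * x)))

    def simulate(tau_mem=None, steps=2000, dt=0.1, tau=1.0, sigma=0.5, z0=0.0):
        # Euler integration of tau du/dt = -u + J r + 0.1 * drive, with
        # r = [u]_+ . If tau_mem is set, the drive is a low-pass-filtered
        # (running-average) version of the noisy input -- a crude stand-in
        # for dynamical interactions that accumulate stimulus information.
        u = np.exp(-wrap(x - z0) ** 2 / (2 * a ** 2))
        i_mem = np.zeros(N)
        estimates = []
        for _ in range(steps):
            i_ext = (np.exp(-wrap(x - z0) ** 2 / (2 * a ** 2))
                     + sigma * rng.standard_normal(N))
            if tau_mem is None:
                drive = i_ext                      # instantaneous input only
            else:
                i_mem += (dt / tau_mem) * (i_ext - i_mem)
                drive = i_mem                      # averaged input history
            r = np.maximum(u, 0.0)
            u += (dt / tau) * (-u + J @ r + 0.1 * drive)
            estimates.append(decode(np.maximum(u, 0.0)))
        return np.array(estimates)

    plain = simulate()               # estimate jitters with the noise
    steady = simulate(tau_mem=50.0)  # history-averaged drive: steadier
    print("decoded-position std, instantaneous input:", plain[-1000:].std())
    print("decoded-position std, filtered input     :", steady[-1000:].std())

Increasing tau_mem makes the decoded position steadier but slows the network's response when z0 changes, which is the stability-versus-tracking-speed trade-off described in the abstract; the reported relationship between tuning width and tracking speed can be probed in the same way by varying a.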

Publication types

  • Comparative Study

MeSH terms

  • Humans
  • Learning / physiology*
  • Models, Neurological*
  • Nerve Net / physiology*
  • Neural Networks, Computer*
  • Neurons / physiology*
  • Nonlinear Dynamics
  • Synapses
  • Time Factors