Learning to control a brain-machine interface for reaching and grasping by primates

PLoS Biol. 2003 Nov;1(2):E42. doi: 10.1371/journal.pbio.0000042. Epub 2003 Oct 13.

Abstract

Reaching and grasping in primates depend on the coordination of neural activity in large frontoparietal ensembles. Here we demonstrate that primates can learn to reach and grasp virtual objects by controlling a robot arm through a closed-loop brain-machine interface (BMIc) that uses multiple mathematical models to extract several motor parameters (i.e., hand position, velocity, gripping force, and the EMGs of multiple arm muscles) from the electrical activity of frontoparietal neuronal ensembles. As single neurons typically contribute to the encoding of several motor parameters, we observed that high BMIc accuracy required recording from large neuronal ensembles. Continuous BMIc operation by monkeys led to significant improvements in both model predictions and behavioral performance. Using visual feedback, monkeys succeeded in producing robot reach-and-grasp movements even when their arms did not move. Learning to operate the BMIc was paralleled by functional reorganization in multiple cortical areas, suggesting that the dynamic properties of the BMIc were incorporated into motor and sensory cortical representations.
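To make the decoding approach concrete, below is a minimal sketch (Python/NumPy) of a multi-output linear decoder of the general kind the abstract describes: lagged ensemble firing rates mapped simultaneously to hand position, velocity, and gripping force. All data, dimensions, and variable names here are illustrative assumptions for a simulation, not the authors' actual models or recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated ensemble activity: spike counts for 64 neurons in 100 ms bins.
# Neuron count, lag depth, and bin size are illustrative assumptions.
n_bins, n_neurons, n_lags = 2000, 64, 10
rates = rng.poisson(5.0, size=(n_bins, n_neurons)).astype(float)

def lagged_design(X, n_lags):
    """Stack the most recent n_lags bins of every neuron into one row per bin."""
    T = X.shape[0]
    return np.asarray([X[t - n_lags + 1 : t + 1].ravel()
                       for t in range(n_lags - 1, T)])

Z = lagged_design(rates, n_lags)   # (n_bins - n_lags + 1, n_neurons * n_lags)

# Synthetic motor parameters generated from a hidden linear map, standing in
# for recorded hand position (x, y), velocity (x, y), and gripping force.
hidden_w = rng.normal(size=(n_neurons * n_lags, 5))
motor = Z @ hidden_w + rng.normal(scale=0.1, size=(Z.shape[0], 5))

# Fit one least-squares weight matrix predicting all five parameters at once,
# analogous to extracting several motor parameters from the same ensemble.
train, test = slice(0, 1500), slice(1500, None)
W, *_ = np.linalg.lstsq(Z[train], motor[train], rcond=None)
pred = Z[test] @ W

# Report per-parameter correlation between predicted and actual signals.
for i, name in enumerate(["pos_x", "pos_y", "vel_x", "vel_y", "grip_force"]):
    r = np.corrcoef(pred[:, i], motor[test][:, i])[0, 1]
    print(f"{name}: r = {r:.2f}")
```

Linear decoders of this form can be fit and evaluated quickly enough to run in real time, which is what makes closed-loop operation, and hence the learning effects reported here, feasible.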

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Animals
  • Arm
  • Artificial Intelligence
  • Behavior, Animal
  • Biomechanical Phenomena*
  • Biophysical Phenomena
  • Biophysics*
  • Brain / physiology*
  • Brain Mapping
  • Electromyography / methods
  • Electrophysiology
  • Female
  • Hand
  • Hand Strength*
  • Learning
  • Macaca
  • Models, Neurological
  • Models, Statistical
  • Models, Theoretical
  • Motor Activity
  • Motor Cortex / physiology
  • Movement
  • Neurons / physiology
  • Primates
  • Psychomotor Performance / physiology*
  • Robotics
  • Somatosensory Cortex / physiology
  • Space Perception
  • Time Factors