Investigating the brain basis of facial expression perception using multi-voxel pattern analysis

Cortex. 2015 Aug;69:131-40. doi: 10.1016/j.cortex.2015.05.003. Epub 2015 May 14.

Abstract

Humans can readily decode emotion expressions from faces and perceive them in a categorical manner. The model by Haxby and colleagues proposes a number of different brain regions, each playing a specific role in face processing. One key question is how these regions compare directly to one another in discriminating between various emotional facial expressions. To address this issue, we compared the predictive accuracy of all key regions from the Haxby model using multi-voxel pattern analysis (MVPA) of functional magnetic resonance imaging (fMRI) data. Regions of interest were extracted using independent meta-analytical data. Participants viewed four classes of facial expressions (happy, angry, fearful and neutral) in an event-related fMRI design while performing an orthogonal gender recognition task. Activity in all regions allowed for robust above-chance predictions. When the regions were compared directly to one another, the fusiform gyrus and the superior temporal sulcus (STS) showed the highest accuracies. These results underscore the role of the fusiform gyrus as a key region in the perception of facial expressions, alongside the STS. The study suggests the need for further specification of the relative roles of the various brain areas involved in the perception of facial expressions. Face processing appears to rely on more interactive and functionally overlapping neural mechanisms than previously conceptualised.
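To illustrate the kind of analysis the abstract describes, the sketch below shows a minimal MVPA-style decoding loop on simulated data: four expression conditions, voxel patterns per scanning run, a nearest-centroid classifier, and leave-one-run-out cross-validation against chance level (1/4). All data here are synthetic and the classifier is a simplification for illustration; the study's actual pipeline, classifier, and ROI definitions are not specified in this record.

```python
# Minimal MVPA decoding sketch (hypothetical data, not the authors' pipeline).
# Four expression conditions are decoded from simulated voxel patterns with a
# nearest-centroid classifier and leave-one-run-out cross-validation.
import numpy as np

rng = np.random.default_rng(0)
conditions = ["happy", "angry", "fearful", "neutral"]
n_runs, n_voxels = 6, 50

# Simulate one pattern per condition per run: a condition-specific
# "signature" plus noise (a stand-in for per-run beta estimates in an ROI).
signatures = rng.normal(0.0, 1.0, (len(conditions), n_voxels))
X = np.array([[sig + rng.normal(0.0, 1.5, n_voxels) for sig in signatures]
              for _ in range(n_runs)])          # shape: (runs, conditions, voxels)

correct = 0
for test_run in range(n_runs):
    train = np.delete(X, test_run, axis=0)      # hold out one run
    centroids = train.mean(axis=0)              # (conditions, voxels)
    for c in range(len(conditions)):
        # Predict the condition whose training centroid is nearest.
        d = np.linalg.norm(centroids - X[test_run, c], axis=1)
        correct += int(np.argmin(d) == c)

accuracy = correct / (n_runs * len(conditions))
chance = 1 / len(conditions)
print(f"decoding accuracy: {accuracy:.2f} (chance = {chance:.2f})")
```

Comparing such cross-validated accuracies across independently defined ROIs is, in outline, how regions like the fusiform gyrus and STS can be ranked against one another.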

Keywords: Emotion; Face perception; Facial expressions; MVPA; fMRI.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Brain / physiology*
  • Brain Mapping
  • Emotions / physiology*
  • Facial Expression*
  • Female
  • Humans
  • Image Processing, Computer-Assisted
  • Magnetic Resonance Imaging
  • Male
  • Visual Perception / physiology*
  • Young Adult