Journal Club

Perception of Facial Expression in Somatosensory Cortex Supports Simulationist Models

Elizabeth Hussey and Ashley Safford
Journal of Neuroscience 14 January 2009, 29 (2) 301-302; https://doi.org/10.1523/JNEUROSCI.5205-08.2009

Understanding the emotional states of others is thought to involve simulating the same state in one's own mind. Simulationist models of embodied emotion argue that expression recognition cannot be performed as a disembodied cognitive process involving the amodal matching of physical properties with abstract concepts; rather, perception of this biologically significant stimulus class relies on the activation of a distributed sensorimotor network that facilitates emotion recognition. The somatosensory and perceptual elements that are encoded when we experience an emotion are reactivated when we see the facial expression associated with that emotion (Goldman and Sripada, 2005; Niedenthal, 2007). Several lines of research support these theories: patients with deficits in the production of fear, disgust, or anger tend also to be impaired in face-based recognition of the same emotion; neuroimaging studies have shown that similar brain regions are active when a participant observes an emotional expression and when they imitate that same expression; and behavioral studies have shown that mimicking facial expressions facilitates perception of the related emotion, and that this mimicry reflects the internal simulation of the perceived emotion (for review, see Goldman and Sripada, 2005). Missing from these data, however, has been direct evidence of the selective involvement of somatosensory cortical regions in emotion recognition.

In a study recently published in The Journal of Neuroscience involving repetitive transcranial magnetic stimulation (rTMS), Pitcher et al. (2008) aimed to test the hypothesis that facial expression recognition depends on somatovisceral responses associated with the perceived expression. rTMS was delivered to the right occipital face area (rOFA) and the face area of the right somatosensory cortex (rSC) during either a facial expression or a facial identity discrimination task. On each trial, subjects were presented with pairs of visual face stimuli (sample and target pictures) separated by a 1000 ms interval; rTMS was delivered at 10 Hz for 500 ms during presentation of the target stimulus. rTMS was also delivered to the vertex to control for nonspecific effects of TMS, and a no-TMS condition was included as a behavioral baseline. To examine whether neural involvement dissociates across emotions, facial stimuli expressed one of six possible emotions: happiness, sadness, surprise, fear, disgust, or anger. Pitcher et al. (2008) found that stimulation of both rOFA and rSC reduced accuracy on the expression task but had no effect on the identity task. A clever additional experiment stimulated both the face and the finger regions of rSC, demonstrating that the expression impairment was specific to the face region rather than a general effect of rTMS on somatosensory cortex.

The finding that rTMS to the rOFA impaired expression recognition but left identity recognition intact was interpreted as evidence for greater configural processing of facial identity and more part-based processing of facial expression. Previous work by these authors indicated that rTMS to the rOFA impaired part-based processing of faces but had no effect on configural processing (Pitcher et al., 2007). Although part-based processing plays an important role in the recognition of certain emotions (Smith et al., 2005), there is also evidence for configural processing of emotional expressions (Calder et al., 2000). An alternative explanation is that identity processing can draw on bilateral neural mechanisms that compensate for disruption of the rOFA.

The involvement of rSC in processing expression (rather than identity) in the study by Pitcher et al. (2008) can be explained by the emotional content that is critical to the expression task but irrelevant to the identity discrimination task. It has been suggested that embodiment occurs only when information processing relies on emotional associations; when the decision can be made on the basis of perceptual features alone, simulation is unnecessary (Niedenthal, 2007). This phenomenon is consistent with findings from other domains, indicating that it is not specific to perception of facial expressions. Instead, simulation appears to be involved whenever the emotional aspect of information processing is manipulated, for example, in judgments of the emotional content of language and in recall of emotional memories (Niedenthal, 2007).

An important issue in the study of emotion is whether specific emotions are processed by distinct mechanisms. In a study using single-pulse TMS, Pourtois et al. (2004) found that TMS to the right somatosensory cortex selectively interfered with perception of fearful, but not happy, faces. In contrast, when comparing six different emotions, Pitcher et al. (2008) found no preferential impairment of individual expressions. The authors attributed this result to a lack of statistical power because of the restricted number of trials per expression. An alternative explanation, however, is that emotions vary in the level of internal somatic representation required for their recognition. The rSC excitation threshold may be emotion-dependent, so that different levels of TMS interference are required to influence performance for different emotions. Because fear is a biologically relevant and salient emotion, it plausibly involves a stronger somatovisceral representation that is more easily disrupted. Pitcher et al. (2008) used higher-intensity repetitive TMS than Pourtois et al. (2004), which could be sufficient to interfere with the representation of a wider range of expressions, including happiness. It is also possible that happy faces simply depend less on somatosensory representation than fearful faces.

Distinct processing of individual emotions is also supported by the findings of Oberman et al. (2007). These authors found that blocking facial mimicry by holding a pen between the teeth interfered with perception of faces expressing happiness, but had no comparable influence on the more inwardly expressed, facially neutral emotions of disgust, fear, and sadness. The manipulation engaged facial musculature involved in emotional expression generally, but most closely resembled the musculature used to express happiness. Importantly, in that study, performance on a facial expression discrimination task was influenced by manipulating the motor representation of the associated emotion, whereas in the study by Pitcher et al. (2008), TMS interfered with the somatosensory representation of emotion. The seemingly inconsistent results can therefore be reconciled by attributing them to different stages of the same action-perception network as modeled by simulationist theories (Goldman and Sripada, 2005).

The study by Pitcher et al. (2008) provides strong evidence for simulationist models of emotion recognition and is also relevant to the perception of emotional meaning, which is particularly important for successful social interaction. Empathy, the ability to understand and resonate with another's emotional state, shows substantial interindividual differences, and this emotional reactivity appears to predict the degree to which an individual neurally simulates another's emotions. Evidence suggests that highly empathic individuals nonconsciously imitate others' facial expressions and body language (the chameleon effect) to a greater degree than less empathic people (Chartrand and Bargh, 1999). There also appears to be a positive correlation between the ability to empathize and the ability to visually recognize emotions conveyed by another: a recent neuroimaging study (Jabbi et al., 2007) showed that activity in the insula and frontal operculum correlated with individual ratings of empathy for both negative and positive emotions.

Furthermore, a recent functional magnetic resonance imaging study suggests that different brain regions mediate the empathic response to different emotions (Chakrabarti et al., 2006). Ventral striatal activation to happy faces correlated positively with empathy measures, whereas activation in this region to sad faces showed a negative correlation. Consistent with embodied simulation, premotor cortical activity was positively correlated with empathy across all four tested emotions. These results suggest both common and divergent neural mechanisms underpinning expression production and observation: premotor mirror neuron activity underlies the action representation associated with emotional expression in general (and thus, not surprisingly, predicts an individual's ability to "feel" or empathize with another regardless of the specific action or expression), whereas the specificity of empathic responses suggests that distinct neural subsystems, particularly for the viscerosensory aspects of the internal simulation of a viewed emotion, are unique to each expression.

The association between emotional imitation and empathy suggests that a breakdown in the simulation network could produce significant social deficits; indeed, evidence suggests that such mechanisms may be involved in disorders such as autism. Future neuroimaging and TMS studies will help determine whether emotional analysis through facial expression perception is organized modularly, with a discrete neural pathway underlying each affective state.

Footnotes

  • We thank James C. Thompson for his valuable comments.

  • Editor's Note: These short, critical reviews of recent papers in the Journal, written exclusively by graduate students or postdoctoral fellows, are intended to summarize the important findings of the paper and provide additional insight and commentary. For more information on the format and purpose of the Journal Club, please see http://www.jneurosci.org/misc/ifa_features.shtml.

  • Correspondence should be addressed to Ashley Safford, Krasnow Institute for Advanced Study, George Mason University, 4400 University Drive MS 2A1, Fairfax, VA 22030. ahamlin2@gmu.edu

References

  1. Calder AJ, Young AW, Keane J, Dean M (2000) Configural information in facial expression perception. J Exp Psychol Hum Percept Perform 26:527–551.
  2. Chakrabarti B, Bullmore E, Baron-Cohen S (2006) Empathizing with basic emotions: common and discrete neural substrates. Soc Neurosci 1:364–384.
  3. Chartrand TL, Bargh JA (1999) The chameleon effect: the perception-behavior link and social interaction. J Pers Soc Psychol 76:893–910.
  4. Goldman AI, Sripada CS (2005) Simulationist models of face-based emotion recognition. Cognition 94:193–213.
  5. Jabbi M, Swart M, Keysers C (2007) Empathy for positive and negative emotions in the gustatory cortex. Neuroimage 34:1744–1753.
  6. Niedenthal PM (2007) Embodying emotion. Science 316:1002–1005.
  7. Oberman LM, Winkielman P, Ramachandran VS (2007) Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions. Soc Neurosci 2:167–178.
  8. Pitcher D, Walsh V, Yovel G, Duchaine BC (2007) TMS evidence for the involvement of the right occipital face area in early face processing. Curr Biol 17:1568–1573.
  9. Pitcher D, Garrido L, Walsh V, Duchaine BC (2008) Transcranial magnetic stimulation disrupts the perception and embodiment of facial expressions. J Neurosci 28:8929–8933.
  10. Pourtois G, Sander D, Andres M, Grandjean D, Peveret L, Olivier E, Vuilleumier P (2004) Dissociable roles of the human somatosensory and superior temporal cortices for processing social face signals. Eur J Neurosci 20:3507–3515.
  11. Smith ML, Cottrell GW, Gosselin F, Schyns PG (2005) Transmitting and decoding facial expressions. Psychol Sci 16:184–189.