NeuroImage

Volume 49, Issue 2, 15 January 2010, Pages 1857–1867

Three stages of facial expression processing: ERP study with rapid serial visual presentation

https://doi.org/10.1016/j.neuroimage.2009.09.018

Abstract

Electrophysiological correlates of the processing of facial expressions were investigated in subjects performing a rapid serial visual presentation (RSVP) task. The peak latencies of the event-related potential (ERP) components P1, vertex positive potential (VPP), and N170 were 165, 240, and 240 ms, respectively. The early anterior N100 and posterior P1 amplitudes elicited by fearful faces were larger than those elicited by happy or neutral faces, consistent with the presence of a ‘negativity bias.’ The amplitude of the anterior VPP was larger when subjects were processing fearful and happy faces than when they were processing neutral faces, and did not differ between fearful and happy faces. The late N300 and P300 not only distinguished emotional faces from neutral faces but also differentiated between fearful and happy expressions at lag2. The amplitudes of the N100, VPP, N170, N300, and P300 components and the latency of the P1 component were modulated by attentional resources: deficient attentional resources resulted in decreased amplitudes and increased latencies of these ERP components. In light of these results, we present a hypothetical model involving three stages of facial expression processing.

Introduction

As a fundamental emotional stimulus, facial expression conveys important information in social exchanges. Numerous event-related potential (ERP) and magnetoencephalography (MEG) studies (Bentin et al., 1996, Jeffreys, 1996, Eimer and McCarthy, 1999, Eimer, 2000, Liu et al., 2000, Liu et al., 2002, Itier and Taylor, 2004, Xu et al., 2005, Itier et al., 2006) have investigated the time course of visual face processing. In these studies, the face-sensitive P1, N170 (and their magnetic equivalents, the M100 and M170), and vertex positive potential (VPP) components have displayed differential activity depending on the facial expression being viewed, especially for fearful faces.

P1 is a positive-going component detected at parieto–occipital electrodes with an onset latency between 60 and 80 ms, peaking at around 100–130 ms post-stimulus. It is generally thought to reflect processing of the low-level features of stimuli, although MEG and electrophysiological studies (Linkenkaer-Hansen et al., 1998, Liu et al., 2000, Liu et al., 2002, Pizzagalli et al., 2002) have indicated that it also correlates with face categorization. While effects of facial expression on P1 have been reported previously (Halgren et al., 2000, Batty and Taylor, 2003, Eger et al., 2003), differences between distinct emotions have not been described (Batty and Taylor, 2003, Esslen et al., 2004). However, Pourtois et al. (2004) demonstrated that the lateral occipital P1 was selectively increased when a bar probe replaced a fearful face compared with when the same bar replaced a neutral face. Vuilleumier and Pourtois (2007) have thus suggested that there may be a rapid extraction of emotion- or salience-related information that occurs before more fine-grained perceptual processes are complete.

N170 is a negative-going component detected at lateral occipito–temporal electrodes, observed in the 120–220 ms range and peaking around 170 ms post-stimulus. This component clearly distinguishes faces from non-face visual stimuli and is therefore considered an index of configural face processing (Bentin et al., 1996, Rossion et al., 1999, Bentin and Deouell, 2000, Rossion et al., 2003, Itier and Taylor, 2004). N170 tends to have a shorter latency and larger amplitude in the right hemisphere than in the left (Bentin et al., 1996). Source localization studies have localized the generators of the N170 to the fusiform gyrus and the superior temporal sulcus; the implicated region of the fusiform gyrus has been termed the “fusiform face area” (Kanwisher et al., 1997, Itier and Taylor, 2002, Itier and Taylor, 2004, Schweinberger et al., 2002a, Schweinberger et al., 2002b, Caldara et al., 2003). There is conflicting evidence regarding whether N170 is responsive to emotional expression. Some researchers have found that N170 does not discriminate between emotional expressions (Münte et al., 1998, Herrmann et al., 2002, Eimer et al., 2003), while others have found that expression modulates N170 amplitude (Batty and Taylor, 2003, Miyoshi et al., 2004, Caharel et al., 2005), noting a larger amplitude for fearful relative to neutral faces (Batty and Taylor, 2003, Stekelenburg and de Gelder, 2004, Pourtois et al., 2005, Leppänen et al., 2007).
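In practice, component measures like these are obtained by searching a pre-defined time window at the relevant electrodes for the most extreme deflection of the appropriate polarity. The following is a minimal Python/NumPy sketch of that procedure on a simulated waveform; the sampling rate, window, and amplitude are illustrative assumptions, not values from this study. Setting polarity="pos" would measure positive-going components such as P1 or the VPP in the same way.

    import numpy as np

    def peak_in_window(erp, times, tmin, tmax, polarity="neg"):
        """Return (latency, amplitude) of the extreme point in [tmin, tmax].

        `erp` is a 1-D average waveform (volts) from one electrode and
        `times` its time axis in seconds. Negative-going components such
        as the N170 use the minimum; positive-going ones the maximum.
        """
        mask = (times >= tmin) & (times <= tmax)
        seg, seg_t = erp[mask], times[mask]
        i = seg.argmin() if polarity == "neg" else seg.argmax()
        return seg_t[i], seg[i]

    # Simulated occipito-temporal average with an N170-like dip at 170 ms.
    times = np.arange(-0.1, 0.5, 0.002)                  # 500-Hz sampling
    erp = -4e-6 * np.exp(-((times - 0.17) / 0.02) ** 2)  # Gaussian deflection
    lat, amp = peak_in_window(erp, times, 0.12, 0.22, polarity="neg")
    print(f"N170 peak: {lat * 1000:.0f} ms, {amp * 1e6:.1f} µV")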

The VPP is a positive deflection detected at fronto–central electrodes with a latency similar to that of the N170. Previous research has shown that, like the N170, the VPP is sensitive to configural processing of the face (Rossion et al., 1999, Eimer, 2000, Jemel et al., 2003). Some source localization studies concluded that the VPP and N170 components derive from the same neural dipole located in or near the fusiform gyrus (Rossion et al., 1999, Itier and Taylor, 2002, Joyce and Rossion, 2005), while other studies suggested that there are two independent neural generators (Botzel et al., 1995, Bentin et al., 1996, Eimer, 2000, George et al., 2005). Recent findings indicate that facial expression modulates VPP amplitude, with larger amplitudes in response to fearful faces than to happy or neutral faces. This effect may be attributable, at least in part, to differences in recognition accuracy across expressions: planned contrasts showed lower accuracy for fearful relative to neutral faces but higher accuracy for happy relative to neutral and fearful faces (Williams et al., 2006).

Previous studies examining the effect of attention on ERP correlates of emotional perception have shown that facial expression processing is modulated by spatial attention (Pessoa et al., 2002, Eimer et al., 2003, Holmes et al., 2003). In this study, we investigate how the temporal allocation of attention and the availability of attentional resources modulate facial expression processing. If the processing of fearful facial expressions is a spontaneous process, not significantly affected by a competing stimulus and not requiring attentional resources, then it should not be hindered by an experimentally controlled temporary deficit in attention, referred to as an ‘attentional blink’ (Raymond et al., 1992). Different facial expressions may be subject to attentional blink effects of differing magnitude, such that the processing of emotional (fearful and happy) facial expressions would be less affected by an attentional blink than that of neutral facial expressions, and the detection of fearful expressions may be less impaired than that of happy ones.
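The lag manipulation that produces the attentional blink can be made concrete with a short sketch. The Python snippet below builds a single RSVP stream with the second target placed either two or six items after the first; the stream length, target positions, item names, and the ~100-ms SOA are illustrative assumptions, not the study's exact parameters.

    import random

    def build_rsvp_trial(t1, t2, distractors, lag, stream_len=14, t1_pos=4):
        """Build one RSVP stream: T1 at a fixed position, T2 `lag` items later.

        With a ~100-ms SOA, lag 2 places T2 about 200 ms after T1, inside
        the attentional blink; lag 6 places it about 600 ms after T1,
        outside the blink. All parameters here are illustrative.
        """
        stream = random.choices(distractors, k=stream_len)
        stream[t1_pos] = t1             # first target
        stream[t1_pos + lag] = t2       # second target (the face picture)
        return stream

    # A lag2 trial: the face appears two items (~200 ms) after T1.
    trial = build_rsvp_trial("house_01", "fearful_face_07",
                             [f"distractor_{i}" for i in range(20)], lag=2)
    print(trial)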

Previous studies have demonstrated that N100, P1, VPP, and N170 are significantly modulated by emotional content during the early phases of perceptual and attentional processing (Eimer and Holmes, 2002, Batty and Taylor, 2003, Eger et al., 2003, Ashley et al., 2004, Carretié et al., 2004, Miyoshi et al., 2004, Bar-Haim et al., 2005, Caharel et al., 2005, Huang and Luo, 2006). The N300 largely reflects the valence dimension of affect (Carretié et al., 2001), as does the P300 during higher-level phases of cognitive processing, such as stimulus evaluation and selection (Campanella et al., 2002, Miltner et al., 2005). The present study hypothesizes that ERP components such as N100, P1, VPP, N170, N300, and P300 will be sensitive to facial expression in the rapid serial visual presentation (RSVP) paradigm.

According to Lang's theory of emotional dimensions, valence and arousal are the two primary dimensions of emotion (Lang, 1995). Accordingly, arousal should be controlled for when studying the valence of emotion. To control for the effects of arousal in this study, standardized facial expression pictures whose valence and arousal had been rated in a standardized appraisal were adopted. To remove the potential for a race effect, pictures of facial expressions from the Chinese Facial Affective Picture System (CFAPS) were used.
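As a minimal sketch of what such arousal control amounts to, the snippet below tests whether two sets of emotional pictures differ in their normative arousal ratings; the rating values are invented for illustration and are not the CFAPS norms.

    import numpy as np
    from scipy.stats import ttest_ind

    # Hypothetical 9-point arousal ratings for the selected fearful and
    # happy pictures (illustrative values, not the actual CFAPS norms).
    fearful_arousal = np.array([6.1, 5.8, 6.3, 5.9, 6.0, 6.2])
    happy_arousal = np.array([6.0, 5.9, 6.1, 6.2, 5.7, 6.0])

    # A non-significant difference (p > .05) indicates the two emotional
    # categories are matched on arousal, so ERP differences between them
    # can more safely be attributed to valence.
    t, p = ttest_ind(fearful_arousal, happy_arousal)
    print(f"fearful vs. happy arousal: t = {t:.2f}, p = {p:.3f}")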

Section snippets

Subjects

Fifteen junior undergraduates (8 women, 7 men; aged 19–26 years, mean age 22.6 years) from Southwest University in China participated in the experiment as paid volunteers. All participants were healthy, right-handed, and had normal or corrected-to-normal vision.

Stimuli

Materials consisted of 30 face pictures and 3 upright house stimuli. Face pictures from the native Chinese Facial Affective Picture System (CFAPS) were used to elicit emotional responses, including 18 pictures of an upright face and 12 of …

Behavioral performance

ANOVAs revealed that response accuracy was significantly affected by task, lag, and facial expression (F(1,14) = 15.93, P < 0.001; F(1,14) = 26.70, P < 0.001; F(2,28) = 54.79, P < 0.001). Subjects performed at a higher correct rate in the single task condition (94.93 ± 2.84%) than in the dual task (92.43 ± 4.76%), and performed better in lag6 (94.64 ± 3.06%) than in lag2 (92.72 ± 4.75%). Pairwise comparison of the main effect of facial expression showed that the accuracy for fearful faces (96.31 ± 2.40%) was higher than …
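For readers who want to reproduce this style of analysis, the sketch below runs a 2 (task) × 2 (lag) × 3 (expression) repeated-measures ANOVA on long-format accuracy data using statsmodels; the simulated data are placeholders, not the study's results, and the factor labels are assumptions.

    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(0)

    # Simulated accuracy (%): 15 subjects x 2 tasks x 2 lags x 3 expressions,
    # one cell mean per subject and condition (AnovaRM requires exactly one
    # observation per cell). Values are placeholders, not the study's data.
    rows = []
    for subj in range(15):
        for task in ("single", "dual"):
            for lag in ("lag2", "lag6"):
                for expr in ("fearful", "happy", "neutral"):
                    rows.append((subj, task, lag, expr, 93 + rng.normal(0, 3)))
    df = pd.DataFrame(rows, columns=["subject", "task", "lag", "expr", "acc"])

    # 2 x 2 x 3 within-subject ANOVA on accuracy.
    res = AnovaRM(df, depvar="acc", subject="subject",
                  within=["task", "lag", "expr"]).fit()
    print(res.anova_table)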

Discussion

Response accuracy was better for fearful face stimuli than for happy and neutral face stimuli. However, responses to happy facial expression stimuli were less impaired by the attentional blink than responses to neutral ones. These findings are consistent with the hypothesis that the processing of emotional, especially potentially threatening, stimuli may occur without substantial attentional involvement. This distinction would be expected to provide a highly valuable biological …

Conclusions

The amplitudes of the N100, VPP, N170, N300, and P300 components and the latency of the P1 component elicited by viewing of emotionally expressive faces can be modulated by attentional resources. Component amplitudes and latencies tend to be larger and shorter, respectively, when attentional resources are plentiful. Furthermore, the ERP components have differential preferences for different emotional expressions in a manner consistent with multi-stage processing of emotional facial expressions.

Acknowledgments

This research was supported by the National Natural Science Foundation of China (30930031, 30900399), the Ministry of Education (PCSIRT, IRT0710), and the Global Research Initiative Program, NIH, USA (1R01TW007897). We thank Yu Fengqiong, Gan Tian, Li Fuhong, and Qiu Jiang for comments at various stages of this project. The authors have declared that no competing interests exist.

References (74)

  • Hagen, G.F., et al. (2006). P3a from visual stimuli: task difficulty effects. Int. J. Psychophysiol.
  • Herrmann, M.J., et al. (2002). Face-specific event-related potential in humans is independent from facial expression. Int. J. Psychophysiol.
  • Holmes, A., et al. (2003). The processing of emotional facial expression is gated by spatial attention: evidence from event-related brain potentials. Cogn. Brain Res.
  • Huang, Y.X., et al. (2006). Temporal course of emotional negativity bias: an ERP study. Neurosci. Lett.
  • Itier, R.J., et al. (2002). Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: a repetition study using ERPs. NeuroImage.
  • Itier, R.J., et al. (2006). Face, eye and object early processing: what is the face specificity? NeuroImage.
  • Jemel, B., et al. (2003). Is the N170 for faces cognitively penetrable? Evidence from repetition priming of Mooney faces of familiar and unfamiliar persons. Cogn. Brain Res.
  • Joyce, C., et al. (2005). The face-sensitive N170 and VPP components manifest the same brain processes: the effect of reference electrode site. Clin. Neurophysiol.
  • Kranczioch, C., et al. (2003). Event-related potential correlates of the attentional blink phenomenon. Cogn. Brain Res.
  • Krombholz, A., et al. (2007). Modification of N170 by different emotional expression of schematic faces. Biol. Psychol.
  • Linkenkaer-Hansen, K., et al. (1998). Face-selective processing in human extrastriate cortex around 120 ms after stimulus onset revealed by magneto- and electroencephalography. Neurosci. Lett.
  • Miltner, W.H., et al. (2005). Event-related brain potentials and affective responses to threat in spider/snake-phobic and non-phobic subjects. Int. J. Psychophysiol.
  • Münte, T.F., et al. (1998). Brain potentials reveal the timing of face identity and expression judgments. Neurosci. Res.
  • Narumoto, J., et al. (2001). Attention to emotion modulates fMRI activity in human right superior temporal sulcus. Cogn. Brain Res.
  • Pizzagalli, D.A., et al. (2002). Affective judgments of faces modulate early activity (∼160 ms) within the fusiform gyri. NeuroImage.
  • Pourtois, G., et al. (2005). Two electrophysiological stages of spatial orienting towards fearful faces: early temporo-parietal activation preceding gain control in extrastriate visual cortex. NeuroImage.
  • Rossion, B., et al. (1999). Task modulation of brain activity related to familiar and unfamiliar face processing: an ERP study. Clin. Neurophysiol.
  • Rossion, B., et al. (2003). Early lateralization and orientation tuning for face, word, and object processing in the visual cortex. NeuroImage.
  • Santos, I.M., et al. (2008). Differential effects of object-based attention on evoked potentials to fearful and disgusted faces. Neuropsychologia.
  • Schutter, D.J., et al. (2004). Functionally dissociated aspects in anterior and posterior electrocortical processing of facial threat. Int. J. Psychophysiol.
  • Schweinberger, S.R., et al. (2002a). Event-related potential evidence for a response of inferior temporal cortex to familiar face repetitions. Cogn. Brain Res.
  • Schweinberger, S.R., et al. (2002b). Human brain potential correlates of repetition priming in face and name recognition. Neuropsychologia.
  • Smith, N.K., et al. (2003). May I have your attention, please: electrocortical responses to positive and negative stimuli. Neuropsychologia.
  • Vuilleumier, P., et al. (2007). Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia.
  • Wild-Wall, N., et al. (2008). Interaction of facial expressions and familiarity: ERP evidence. Biol. Psychol.
  • Williams, L.M., et al. (2006). The when and where of perceiving signals of threat versus non-threat. NeuroImage.
  • Xu, Y., et al. (2005). The M170 is selective for faces, not for expertise. Neuropsychologia.