A common neural basis for receptive and expressive communication of pleasant facial affect
Introduction
The mental representation of facial expression includes both a visual code describing the physical structure of the face and a motor-program code describing how to produce the expression (Calder et al., 2000a). Separate lines of investigation have provided evidence that the visual representation of facial affect is closely linked to its motor representation. First, humans tend to show emotional facial expressions that are congruent with the expressions displayed by the sender. When people are exposed to emotionally expressive faces, they spontaneously react with distinct facial electromyographic (EMG) reactions in emotion-relevant facial muscles, even when the facial expressions are not consciously perceived (Dimberg and Petterson, 2000, Dimberg et al., 2000). Second, in the monkey's premotor and posterior parietal cortex, neurons have been described that discharge both when the monkey performs a specific goal-directed action and when it observes a similar action performed by another individual (di Pellegrino et al., 1992, Fogassi et al., 1998, Gallese et al., 1996, Rizzolatti et al., 1996a). Most importantly, mirror neurons located in lateral area F5 of the monkey's premotor cortex have been described that are triggered by the observation of communicative mouth gestures, that is, lip smacking (‘communicative mirror neurons’, Ferrari et al., 2003). Further evidence suggesting that the visual representation of facial affect may be intimately related to its motor representation comes from two recent functional imaging studies that addressed the neural correlates involved in imitation of static (Carr et al., 2003) and dynamic facial expressions (Leslie et al., 2004). These studies revealed that neural activation within several regions associated with observation of emotionally expressive faces, including premotor and superior temporal areas as well as the insula and amygdala, is modulated during imitation.
Together these findings suggest that the perception of emotional facial expressions in humans may incorporate neural circuitries implicated in their production. The actual brain regions involved in a shared representation network for perceiving and expressing specific emotions, however, are unclear. Whereas a recent functional imaging study has focussed on the role of emotional experience and its somatosensory representation in recognition of negative facial affect (Wicker et al., 2003), the present fMRI study addresses the possible involvement of expressive aspects in recognizing facial expressions. In contrast to previous imaging studies investigating modulatory effects of imitative facial movements on neural activity associated with observation of facial expressions (Carr et al., 2003, Leslie et al., 2004), the present study aimed to directly test for regions implicated in both observation and execution of emotional facial action. More specifically, we were interested in whether observation of a facial action pattern expressing pleasant emotion activates neural circuitries also involved in its execution.
To this end, BOLD-signal changes were monitored in 12 healthy subjects while they took part in four separate functional sessions. In two ‘observation sessions’ participants passively viewed movies depicting different faces with smiling or neutral expressions. During the two ‘execution sessions’ they were asked either to generate smile expressions themselves or to keep their facial muscles relaxed and fixate on a static cross. Whereas the voluntary, non-imitative production of different negative facial expressions is difficult and involves complex coordination of various facial muscles, voluntary smiles can easily be produced within a short time interval by pulling the bilateral lip corners upward, and they are unambiguously recognized as signals of pleasant emotion (Floyd and Burgoon, 1999). Further, since voluntary smile expressions can easily be generated, the risk that subjects use different strategies (e.g., imagining facial expressions or emotionally arousing events) to generate facial expressions can be minimized. The investigation of pleasant facial affect therefore allows a close matching between observed and executed facial expressions, which is a prerequisite for identifying shared neural representations involved in both observation and execution of emotional facial signals. In contrast to voluntary smiling, involuntary (emotion-induced) smiling or laughing does not allow the neural circuitries involved in action representation to be separated from those associated with the appreciation of humor and emotional experience. Since we aimed to examine the action representation system involved in the generation of emotional facial signals, emotion-induced facial expressions were not considered in the present study.
The occurrence of motion artefacts induced by facial movements was avoided using a compressed image acquisition protocol (Amaro et al., 2002), in which facial action was timed to coincide with a short gap between the acquisition of successive image volumes during which no data were acquired. To identify regions implicated in both perception and expression of pleasant facial affect, statistical analyses were focussed on overlaps between the activations determined by smile observation and execution. Based on the findings reviewed above, we hypothesized that perception of smile expressions activates sensorimotor circuitries also involved in their production.
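The timing logic of such a compressed (sparse) acquisition scheme can be sketched as follows. This is a minimal illustration only, not the authors' stimulus-delivery code; the acquisition and gap durations are hypothetical placeholders.

```python
# Sketch of a compressed (sparse) fMRI acquisition schedule: each cycle
# consists of a volume-acquisition period followed by a silent gap, and
# facial movements are cued to fall entirely inside the gap so that they
# cannot corrupt the acquired volumes. Durations below are assumed, not
# taken from the study.

ACQUISITION_S = 2.0   # time to acquire one image volume (assumption)
GAP_S = 1.0           # silent gap with no data acquisition (assumption)
CYCLE_S = ACQUISITION_S + GAP_S

def movement_window(volume_index: int) -> tuple[float, float]:
    """Return (start, end), in seconds from the start of the run, of the
    artefact-safe window following the given volume, i.e. the gap in
    which a facial action can be performed without affecting the scan."""
    start = volume_index * CYCLE_S + ACQUISITION_S
    return (start, start + GAP_S)

# Windows for the first three volumes of a run:
windows = [movement_window(i) for i in range(3)]
print(windows)  # [(2.0, 3.0), (5.0, 6.0), (8.0, 9.0)]
```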
Subjects
Twelve healthy volunteers (6 females, 6 males; mean age 24.5 years), all right-handed according to the criteria of the Edinburgh inventory (Oldfield, 1971), participated in the study. Subjects gave written informed consent according to the Declaration of Helsinki. The study protocol was approved by the local Ethical Committee (‘Ethikkommission der Medizinischen Fakultät der Technischen Universität München’).
Facial expression stimuli
Stimuli were selected and edited from video sequences (30 frames/s) of 4 female and 4
Visual action network: smile observation
Activations related to observation of dynamic smiles compared to observation of neutral facial expressions (smile-neutral) are shown in Fig. 2 (green), and local maxima of activated foci are listed in Table 1. This contrast revealed prominent activations in the region of the bilateral occipito-temporal junction mainly corresponding to motion-sensitive visual area V5 (BA 19/37) (Watson et al., 1993) that extended caudally into earlier visual areas (BA 18) and ventrally into the fusiform face area of
Discussion
The results of the present study provide evidence for a common neural basis of perceiving and expressing pleasant facial affect. Regions involved in this shared representation network were located in the right premotor cortex and pars opercularis of the inferior frontal gyrus, right parietal operculum (SII) and left anterior insula. Observation of smile expressions further yielded signal increases within the posterior STS, fusiform gyrus and ventral amygdala. We will first examine the areas
Conclusion
The results of the present study provide evidence for a common neural basis of perceiving and expressing pleasant facial affect. This network includes areas concerned with motor as well as somato- and limbic-sensory processing, that is, premotor cortex and SII/anterior insula. Together with temporal regions serving the visual analysis of facial expressive features, a mechanism that maps the observed expressions onto neural circuitries associated with the production of these expressions and its
Acknowledgments
This study was supported by the Deutsche Forschungsgemeinschaft (Ce 4.1 and SFB 462, TP C3) and the Kommission Klinische Forschung des Klinikums rechts der Isar (KKF).