ARTICLE, Behavioral/Systems/Cognitive

Emotion Processing in Chimeric Faces: Hemispheric Asymmetries in Expression and Recognition of Emotions

Tim Indersmitten and Ruben C. Gur
Journal of Neuroscience 1 May 2003, 23 (9) 3820-3825; DOI: https://doi.org/10.1523/JNEUROSCI.23-09-03820.2003
Brain Behavior Laboratory, Department of Psychiatry, University of Pennsylvania, Philadelphia, Pennsylvania 19104

Abstract

Since the discovery of facial asymmetries in emotional expressions of humans and other primates, hypotheses have related the greater left-hemiface intensity to right-hemispheric dominance in emotion processing. However, the difficulty of creating true frontal views of facial expressions in two-dimensional photographs has confounded efforts to better understand the phenomenon. We have recently described a method for obtaining three-dimensional photographs of posed and evoked emotional expressions and used these stimuli to investigate both intensity of expression and accuracy of recognizing emotion in chimeric faces constructed from only left- or right-side composites. The participant population included 38 (19 male, 19 female) African-American, Caucasian, and Asian adults. They were presented with chimeric composites generated from faces of eight actors and eight actresses showing four emotions: happiness, sadness, anger, and fear, each in posed and evoked conditions. We replicated the finding that emotions are expressed more intensely in the left hemiface for all emotions and conditions, with the exception of evoked anger, which was expressed more intensely in the right hemiface. In contrast, the results indicated that emotional expressions are recognized more efficiently in the right hemiface, indicating that the right hemiface expresses emotions more accurately. The double dissociation between the laterality of expression intensity and that of recognition efficiency supports the notion that the two kinds of processes may have distinct neural substrates. Evoked anger is uniquely expressed more intensely and accurately on the side of the face that projects to the viewer's right hemisphere, dominant in emotion recognition.

  • emotion
  • chimeric faces
  • hemispheric asymmetry
  • brain laterality
  • face perception
  • facial expression

Introduction

There is considerable evidence that in humans (Wolff, 1943; Campbell, 1978; Sackeim and Gur, 1978; Sackeim et al., 1978; Rubin and Rubin, 1980; Borod et al., 1989; Wittling and Roschmann, 1993; Borod et al., 1997), monkeys (Hauser, 1993), and chimpanzees (Parr and Hopkins, 2000; Fernandez-Carriba et al., 2002), emotions are expressed more intensely in the left hemiface (LHF). Because most facial muscles, particularly in the lower part, are innervated by the contralateral hemisphere, this finding has been interpreted as support for the hypothesis of right hemispheric dominance for emotion processing (Benton et al., 1975; Schwartz et al., 1975; Sackeim et al., 1982; Natale et al., 1983; Christman and Hackworth, 1993; Hugdahl et al., 1993; Adolphs et al., 1996; Dimberg and Petterson, 2000). The method used for studying hemiface effects followed Wolff's (1943) chimeric faces approach. Thus, Sackeim et al. (1978) found higher intensity ratings (IRs) for left-side than for right-side composites for all negative emotions (anger, sadness, disgust, and fear) but not for happiness.

Asymmetries in facial displays of emotions have implications for both expression and perception of emotions. In a face-to-face situation, the poser's LHF, which displays the higher intensity, falls into the perceiver's right visual field, which projects to the perceiver's left hemisphere. Considering right hemispheric overall dominance in emotion processing, and a perceiver bias to judge the left hemifaces as more similar to the whole face (Wolff, 1943; Gilbert and Bakan, 1973), this creates a situation in which the side of the poser's face that expresses greater emotional intensity is projected to the perceiver's hemisphere less adept at emotion processing. This byproduct of neuronal wiring in the human visual system may have an evolutionary advantage in that it compensates for a perceiver's bias by communicating greater intensity to the hemisphere that could miss subtler signals (Sackeim et al., 1978). Such hypotheses could be tested by examining hemiface asymmetries not only in intensity but also in the accuracy of conveying the emotions and by comparing posed to evoked expressions. These effects have not been examined.

The chimeric methodology has several limitations when standard photographs are used as stimulus material. Tilt effects are unpreventable and result in informational over-representation in one hemiface and under-representation in the other. Another shortcoming of extant stimuli is that emotional displays were obtained under varied conditions, without controlling for intensity or distinguishing between posed and evoked emotions. The differentiation between posed and evoked emotions is particularly central because of the dual role of facial expressions of emotion as reflecting hemispheric activation and in communication. Finally, although it has been assumed that higher intensity of expression portends better accuracy in recognizing the emotion, none of the studies have examined the relationship between expression intensity and recognition accuracy.

The first goal was to replicate the original finding by Sackeim et al. (1978) that composites made of the LHF [left–left (LL)-composites] are judged as more emotionally intense than right–right (RR)-composites. To avoid tilt effects in chimeric composites, the exact head-on position of faces was identified by using three-dimensional (3D) photographs of faces, expressing a set of emotions under standardized conditions (Gur et al., 2002). Our second goal was to investigate the laterality of composite effects on recognition accuracy. The hypothesis was that LL-composites differ from RR-composites in the efficiency with which they are recognized by observers. Although we expected higher intensity ratings to be associated with better recognition efficiency (RE), there was no previous research to justify a directional hypothesis. Our third goal was to examine the association between the laterality of intensity and accuracy.

Materials and Methods

Participants. Participants were 57 undergraduate students from Drexel University (Philadelphia, PA). Of these, eight were excluded because of previous medical conditions that affect brain function and 11 were excluded because of incomplete data, leaving a total of 38 participants (19 women, 19 men) for statistical analysis. All participants were right-handed and had between 12 and 14 years of education. Participants were between 17 and 31 years of age. The participants' ethnicity included four African Americans, 18 Caucasians, and 16 Asians.

Materials. Two tests were constructed from the Penn 3D facial emotion stimuli (Gur et al., 2002) and presented to the participants in a counterbalanced order. The same stimuli were used in both tests, with one test probing for intensity, the Penn Composite Intensity Rating Test (PCIRT), and the other probing for recognition of the emotion, the Penn Composite Recognition Efficiency Test (PCRET).

To create the stimuli for the PCIRT and PCRET, a total of 128 three-dimensional pictures of 16 actors (eight females and eight males) were selected from the available set. The group of actors included five African Americans, nine Caucasians, one Asian, and one Hispanic. Each actor showed four emotions (happiness, sadness, anger, and fear) in two conditions (posed and evoked). In the posed condition, the actor was told to show the specific emotion, whereas in the evoked condition, the actor was coached to recall an actual experience that had elicited the emotion and to re-experience it. Evoked emotions were truly felt by the actor and can be considered to represent genuine, if not spontaneous, emotions. In contrast, posed emotions were merely displayed without any affective valence and can be operationalized as faked emotions.

Although expressions were available in mild, moderate, and extreme intensities, only the moderately intense expressions were used in this study. All actors were between 12 and 63 years of age. For each face, the position of the head was corrected laterally, vertically, and medially by rotation using a virtual reality modeling language (VRML) player, until the face looked exactly toward the viewer in the precise head-on position. Positioning was done with the help of a wire skeleton of the texture created by the software (Fig. 1).

Fig. 1. VRML screenshot during the rotation of a three-dimensional face to determine the exact head-on position. A wire skeleton was used as texture to facilitate determination of the head-on position.

To create chimeric faces, duplicates of the pictures with reversed orientation were generated in Photoshop (Adobe Systems, San Jose, CA). Every original and its reversed version were divided vertically through the midline, and the two left hemifaces or the two right hemifaces were combined to make composite faces containing only left–left (LL-composites) or right–right (RR-composites) hemifaces (Fig. 2). This procedure ensured that the composites were exactly symmetrical. These chimeric faces constituted the 256 stimuli that were used for both tests and presented in the same pseudorandomized order.
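
For readers who want to script this mirroring-and-splicing step, the short sketch below shows one way to build such mirror-symmetric composites with the Pillow imaging library. It is an illustration under assumptions of our own (the file name face.png, an even image width, and an exactly frontal, non-mirrored view), not the Photoshop procedure used in the study.

    # A minimal sketch of mirror-symmetric chimeric composites with Pillow.
    # Assumptions (not from the study): "face.png" exists, has an even width,
    # and shows an exactly frontal, non-mirrored view of the poser.
    from PIL import Image, ImageOps

    face = Image.open("face.png").convert("RGB")
    w, h = face.size
    mirrored = ImageOps.mirror(face)  # left-right reversed duplicate of the picture

    def composite_from(image_half: str) -> Image.Image:
        """Join one vertical half of the original with the same half mirrored."""
        out = Image.new("RGB", (w, h))
        if image_half == "left":
            out.paste(face.crop((0, 0, w // 2, h)), (0, 0))
            out.paste(mirrored.crop((w // 2, 0, w, h)), (w // 2, 0))  # mirror of left half
        else:
            out.paste(mirrored.crop((0, 0, w // 2, h)), (0, 0))       # mirror of right half
            out.paste(face.crop((w // 2, 0, w, h)), (w // 2, 0))
        return out

    # In a standard (non-mirrored) frontal photograph, the poser's left hemiface
    # appears on the image-right side, so the LL-composite comes from the
    # image-right half and the RR-composite from the image-left half.
    composite_from("right").save("LL_composite.png")
    composite_from("left").save("RR_composite.png")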

Fig. 2. Examples of chimeric faces that are composed of only LL-composites or RR-composites of actors and actresses showing happiness, sadness, anger, and fear in the posed and evoked conditions.

Procedure. In the PCIRT, participants had to provide intensity ratings of the stimuli between 0 and 100 in intervals of 10. In the PCRET, participants had to recognize the stimuli as happy, sad, angry, fearful, or no emotion, yielding measures of accuracy and speed (i.e., recognition efficiency). The no emotion response was always wrong, because none of the displays were emotionally neutral. In both tests, participants had a maximum response time of 6 sec for each stimulus; if a participant did not respond within this window, a missing value was recorded for that stimulus. The tests were implemented using the PowerLaboratory platform (MacLaboratory, Devon, PA) (Chute and Westall, 1997) running on Apple Macintosh computers (Apple Computers, Cupertino, CA). All participants were tested within 2 d, with testing sessions on both days taking place between 9:00 A.M. and 4:00 P.M. The average test duration was 9 min for the PCIRT and 10 min for the PCRET.

Statistics. To test the hypotheses, laterality indices were calculated from the IRs and the RE data. The IR laterality for each LL- and RR-composite pair is defined as the percentage difference between the intensity ratings for the two composites relative to their average rating: 100 × [(LL-composite − RR-composite)/(0.5 × (LL-composite + RR-composite))]. An IR laterality of >0 means that LL-composites were judged to be more intense than RR-composites, as hypothesized. To obtain the RE laterality, we first calculated recognition efficiency (RE), defined as recognition accuracy divided by the logarithm of the reaction time (RT) for correct identifications: RE = N_correct/log RT_correct. The RE laterality for each LL/RR-composite pair was then defined as the same percentage difference applied to the RE scores: 100 × [(RE_LL-composite − RE_RR-composite)/(0.5 × (RE_LL-composite + RE_RR-composite))]. An RE laterality of >0 means that LL-composites were recognized more efficiently than RR-composites.
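
As a concrete illustration of these indices, the sketch below computes the IR and RE lateralities for a single LL/RR-composite pair. The numerical values and variable names are hypothetical and serve only to show the arithmetic; they are not data from the study.

    # Illustrative only: hypothetical scores for one LL/RR-composite pair.
    import math

    def laterality(ll: float, rr: float) -> float:
        """Percentage difference relative to the mean of the two composite scores."""
        return 100.0 * (ll - rr) / (0.5 * (ll + rr))

    # IR laterality from mean intensity ratings (0-100 scale).
    ir_lat = laterality(62.0, 58.0)            # > 0: LL-composites rated more intense

    # Recognition efficiency: accuracy divided by log RT of correct identifications.
    def recognition_efficiency(n_correct: int, mean_rt_correct_ms: float) -> float:
        return n_correct / math.log(mean_rt_correct_ms)

    re_lat = laterality(
        recognition_efficiency(7, 1850.0),     # LL-composites: 7/8 correct, slower
        recognition_efficiency(8, 1600.0),     # RR-composites: 8/8 correct, faster
    )                                          # < 0: RR-composites recognized more efficiently

    print(round(ir_lat, 2), round(re_lat, 2))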

Comparisons of condition (posed vs evoked) and effects of emotion on laterality scores were tested with a 2 (LL- vs RR-composites) × 2 (posed vs evoked conditions) × 4 (happiness, sadness, anger, and fear) factorial generalized estimating equations (GEE) model. Because of the different scales of the intensity rating (0–100) and the recognition accuracy (0–8), Spearman correlations were performed to test the association between intensity ratings and accuracy of recognition. All p values were two-tailed. The α level for rejecting the null hypothesis was set at p ≤ 0.05, and graphic presentation of results used means ± SEM, with n equal to the number of participants used for statistical analysis. All statistical procedures were performed with SAS (SAS Institute, Cary, NC) implemented on a Linux platform.
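
The analyses were run in SAS; as a rough, hedged equivalent, the sketch below shows how a comparable 2 × 2 × 4 repeated-measures GEE and the follow-up Spearman correlation could be set up in Python with statsmodels and SciPy. The long-format data file and every column name (subject, composite, condition, emotion, score, ir_laterality, re_laterality) are assumptions for illustration, not the study's actual data set.

    # Sketch of a 2 x 2 x 4 GEE with repeated measures per participant.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf
    from scipy.stats import spearmanr

    df = pd.read_csv("composite_scores_long.csv")   # hypothetical long-format table

    gee = smf.gee(
        "score ~ C(composite) * C(condition) * C(emotion)",
        groups="subject",                           # repeated measures within participant
        data=df,
        family=sm.families.Gaussian(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    print(gee.fit().summary())

    # Association between IR and RE laterality, averaged per participant.
    per_subject = df.groupby("subject")[["ir_laterality", "re_laterality"]].mean()
    rho, p = spearmanr(per_subject["ir_laterality"], per_subject["re_laterality"])
    print(rho, p)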

Results

IR laterality

The first hypothesis (i.e., that emotions are expressed more intensely in the LHF) was supported by the results of the factorial GEE. Emotion expressions in LL-composites were rated as significantly more intense than in RR-composites across all emotions, as indicated by a main effect of composite (χ2(1) = 10.76; p = 0.001). This was reflected by an overall IR laterality of 2.79 ± 0.65 (mean ± SEM). The effect for condition was also significant across all emotions (χ2(1) = 24.31; p < 0.0001), with IRs significantly higher in the evoked (60.8 ± 1.6) than in the posed condition (55.8 ± 1.6). The three-way interaction of composite × condition × emotion was significant (χ2(4) = 17.73; p < 0.01), legitimizing the evaluation of composite laterality effects for each emotion without correction for multiple comparisons. To examine individual emotions and conditions, one-sample two-tailed t tests were performed to determine whether the IR laterality differed significantly from zero. Happiness had a significant IR laterality favoring LL-composites (2.66 ± 0.78; t(37) = 2.47; p < 0.001), as did sadness (5.57 ± 1.57; t(37) = 3.55; p < 0.001) and fear (2.63 ± 1.28; t(37) = 2.63; p < 0.01). For these emotions, no significant composite × condition interaction was found in 2 × 2 factorial GEEs. For anger, there was a significant composite × condition interaction (χ2(3) = 9.08; p < 0.01): the laterality in the posed condition (5.07 ± 2.20) had the opposite direction of that in the evoked condition (−4.47 ± 1.54). The IR laterality values for emotions and conditions are shown in Figure 3. Descriptive statistics and t test results are shown in Table 1.

Fig. 3. IR lateralities for happiness, sadness, anger, and fear in the posed and evoked conditions. A laterality of >0 indicates an IR laterality toward LL-composites; a laterality of <0 indicates the opposite. A bracket with an asterisk means that the combined effect of posed and evoked emotions had a significant laterality effect.

RE laterality

The hypothesis that emotions are recognized more efficiently in one hemiface was tested with the same 2 × 2 × 4 factorial GEE design, applied to the recognition efficiency scores. There was a main effect of composite (χ2(1) = 5.80; p = 0.016): emotion expressions in RR-composites were recognized more efficiently (greater accuracy relative to RT) than in LL-composites across emotions (−2.87 ± 1.69; mean ± SEM of RE laterality). There was also a main effect for condition (χ2(1) = 12.43; p < 0.001), with evoked emotions perceived more efficiently (0.64 ± 0.02) than posed emotions (0.59 ± 0.02). The three-way interaction of composite × condition × emotion was also significant (χ2(4) = 14.76; p = 0.005).

Table 1. Descriptive statistics and t test results for IR lateralities of emotions and conditions

To decompose the three-way interaction, one-sample two-tailed t tests were performed on RE laterality effects without correction for multiple comparisons. Across both conditions, happiness had a positive laterality effect (1.58 ± 0.72; t(37) = 2.19; p < 0.05), indicating a slight advantage for LL-composites. Sadness showed a negative laterality, indicating better recognition efficiency for RR-composites (−8.68 ± 3.40; t(1) = 2.55; p = 0.01), whereas no significant RE laterality was found for anger or fear. Examining the effect of condition, happiness in the evoked condition was recognized more efficiently in LL-composites (2.98 ± 0.72; t(37) = 3.10; p < 0.01), whereas almost no RE laterality was found in the posed condition (0.45 ± 1.06). The RE laterality for sadness and fear remained significantly different from zero only in the posed condition; in both cases, the efficiency advantage was conferred on RR-composites. For anger, this advantage was significant in the evoked condition (t(37) = 2.84; p = 0.005). The RE lateralities for emotions and conditions are shown in Figure 4. Descriptive statistics and t test results are shown in Table 2.

Fig. 4. RE lateralities for happiness, sadness, anger, and fear in the posed and evoked conditions. A laterality of >0 indicates an RE laterality toward LL-composites; a laterality of <0 indicates the opposite. A bracket with an asterisk means that the combined effect of posed and evoked emotions had a significant laterality effect.

Table 2. Descriptive statistics and t test results for RE lateralities of emotions and conditions

Correlation between IR and RE laterality

The third hypothesis was not confirmed. Across all emotions, no significant correlation between the IR and RE laterality could be found (r = −0.02; p > 0.05).

Discussion

IR laterality

Facial emotion expressions were rated as more intense in LL-composites. This finding supports the first hypothesis (i.e., that emotions are expressed more intensely in the LHF), implicating greater right hemispheric involvement in emotional expression. Our design permits generalization of the effect across emotions, regardless of whether the emotion is posed or evoked.

However, the overall effect of greater left hemiface intensity of emotional expression was moderated by a significant three-way interaction of hemiface by condition and emotion. This interaction was attributable to the sole exception to this rule, evoked anger, which was expressed more intensely in the right hemiface (RHF). This unpredicted finding should be replicated before it is accepted. As pointed out by Sackeim et al. (1978), evoked anger is unique in that its purpose is to prepare the organism for conflict. The greater RHF intensity of evoked anger would project to the perceiver's right hemisphere, potentiating the impact on the hemisphere more dominant in emotion processing and thereby increasing the likelihood that the intensity of the emotion will be appreciated by the perceiver. Thus, anger could be an evolutionarily important sign for action, which is elaborated more thoroughly by the left hemisphere (Buck, 1986).

Unlike previous studies reporting no asymmetry for happiness (Sackeim and Gur, 1978), the present results indicated more intense happiness in the LHF. However, the comparatively weak effect was probably a corollary of a ceiling effect. Although all expressions were selected to be in the moderately intense range, happy facial expressions have distinct, easily identified features, and almost all were recognized with perfect accuracy. This prevented the formation of a large laterality effect in recognition efficiency. Future studies may use lower intensities for happiness, rendering recognition more difficult and thereby better equating performance across emotions.

RE laterality

The hypothesis that emotions are more efficiently recognized in one hemiface was also supported by our results. However, unlike intensity of expression, which was greater in LL-composites, facial emotion expressions were recognized more efficiently in RR-composites. This seems counterintuitive, because better recognition implies better accuracy of expression. In contrast, the expected correlation between intensity of expression and its ease of recognition, which probably exists across a wider range of intensities, was absent within the narrow range used in the present study, suggesting that the greater accuracy of expression in the RHF merits a more specific mechanistic explanation.

One possibility is that the LHF expresses emotions more intensely but also less specifically by mixing in other emotions. This possibility would be consistent with the model developed by Semmes (1968), which postulates a more diffuse functional organization in the right hemisphere compared with a more focal functional representation in the left hemisphere. The model was supported with anatomic data (Gur et al., 1980) and could also be tested with the present paradigm by examining whether intense expressions in the LHF do indeed contain a greater mixture of emotions. Another explanation is the greater right hemispheric involvement in emotion processing. Although emotional expressions are generated in the right hemisphere, the left hemisphere gives them the verbal label, which facilitates more accurate expression. This model would predict a shift in the time course of emotional expressions, a prediction that can be tested using movies of emotional displays. Differences in the time course of the response of the brain could be investigated with functional neuroimaging.

Another reason for the reversed hemiface effect of expression and recognition may relate to nonemotional aspects of facial processing, such as left hemisphere superiority in categorical, relative to right hemisphere superiority in coordinate, visuospatial processing (Slotnik et al., 2001). Manipulating the intensity of facial expressions resembles a unidimensional coordinate task: the same action units merely have to be exaggerated (Ekman et al., 1971). In contrast, manipulating the accuracy of facial expressions resembles a multidimensional categorical task (i.e., action units not contributing to the accurate expression of an emotion have to be shut off, whereas action units that do contribute have to be turned on). Moreover, different action units have to be shut off or turned on for different emotions to increase expression accuracy, which distinguishes the categorical from the coordinate task (Kosslyn et al., 1992). Relatedly, the opposite asymmetry for emotion intensity and accuracy reflects left hemispheric dominance in analytic processing and right hemispheric dominance in holistic processing (Levy and Sperry, 1968; Gazzaniga et al., 1998). Expressing an emotion intensely is likely a holistic process, because multiple action units have to be activated; a single inflated action unit would make the facial expression look bizarre rather than intense. In contrast, expressing an emotion accurately is likely analytic, because the activation of a few action units is sufficient for identification (e.g., the smile in happiness or the drooping corners of the mouth in sadness). Future studies investigating scanpaths and eye movements (Walker-Smith et al., 1977; Rizzo et al., 1987; Loughland et al., 2002) could clarify this issue.

The overall RHF superiority in the accuracy of emotional expression was moderated by a significant three-way interaction of laterality × condition × emotion. Happiness, posed anger, and evoked sadness were exceptions in showing better efficiency of recognition for the LHF. These emotions also showed more intense expressions on the left, and perhaps because of their greater social acceptance, they are less subject to left hemispheric modulation.

Limitations

Possibly constraining the generalizability of the findings is the relatively small number of actors included. Therefore, the study should be replicated with a new sample of faces. Another limitation is that the posers were all professional actors and may not represent the general population. The advantage of using professional actors is that they feel comfortable in the filming environment and are used to displaying requested emotions. However, it is questionable whether an actor's posed and evoked emotional expressions are ecologically valid operationalizations for authentic ones. Because of the asymmetry in the human face, the chimeric faces used as stimulus material varied in the extent of resemblance to authentic faces. This could have biased judgments and reaction times. Other methods could be explored to test similar effects.

The ethnic diversity of our sample and stimuli, with an almost equal distribution of Asian and Caucasian participants, raises questions about cultural and ethnic factors affecting emotion processing. Mandal et al. (2001) noted identical asymmetries for positive and negative emotions in composite hemifacial expressions for Japanese viewers, as reported by Sackeim and Gur (1978) for Caucasians. However, there is also evidence for differences in emotion processing between Asian and Caucasian cultures. Shioiri et al. (1999) found poorer emotion recognition but comparable emotion intensity ratings for Japanese individuals compared with Caucasians. Elfenbein and Ambady (2002) reported that recognition accuracy was higher when emotions were expressed and recognized by members of the same national, ethnic, or regional group, suggesting a within-group advantage. This advantage diminished in cultural groups that were exposed to each other. In addition, majority group members were poorer at judging minority group members than the reverse. The ethnic diversity of faces and responders in this study was designed to increase the generalizability of findings, but the sample is not sufficiently powered to examine such interactions. However, such paradigms can be used to examine ethnic differences in emotion processing and neural substrates for phenomena such as xenophobia and ethnic conflict.

Finally, this study addresses hemispheric asymmetries in processing rather than visual field asymmetries. The symmetry of the chimeric faces guarantees identical presentation to the perceiver's left and right visual fields. Therefore, the effects reported here can only be explained by asymmetries in the expression of emotions by the poser and not by perceptual asymmetries in the perceiver. Experiments manipulating the relationship between the projected hemiface and visual field could help establish the relationship between the laterality of expression and that of perception.

Conclusion

The combination of evidence for greater intensity of emotions, with the exception of evoked anger, on the left side of the face and better accuracy of expression on the right challenges current conceptualizations of an overall right hemispheric dominance in emotion processing. It indicates the operation of a more complex system, in which hemispheric asymmetry interacts with perceiver bias to match the emotional valence in both hemifaces. The lack of a correlation between the IR and RE laterality suggests dissociated neural structures, not only for separable emotions (Reuter-Lorenz and Davidson, 1981; Sprengelmeyer et al., 1998; Gorno-Tempini et al., 2001; Harmer et al., 2001) but also for expression and recognition and other aspects of emotion processing. Our findings agree with those of Magnussen et al. (1994), who argue for separable mechanisms for emotion expression and face identity and suggest bilateral hemispheric contribution to the perceptual analysis of emotional signals, depending on strength of emotion expression and sign. A similar dissociation may exist between posed and evoked expressions, most dramatically seen in the reversal of effects for evoked compared with posed anger. A better understanding of these interactions could help elucidate neural substrates for facial emotion processing in humans.

Footnotes

  • This work was supported by National Institutes of Health Grant MH-60772 and the Bosworth Fund. We thank James Loughead for helping us test the participants, Warren Bilker and Claire McGrath for statistical advice, Larry Macy for computer assistance, Stace L. Moore for manuscript preparation, and several colleagues who contributed to various phases of this study.

  • Correspondence should be addressed to Ruben C. Gur, Brain Behavior Laboratory, Department of Psychiatry, University of Pennsylvania, 10th Floor, Gates Building, Philadelphia, PA 19104. E-mail: gur{at}bbl.med.upenn.edu.

References

  1. Adolphs R, Damasio H, Tranel D, Damasio AR (1996) Cortical systems for the recognition of emotion in facial expressions. J Neurosci 16:7678–7687.
  2. Benton A, Hannay HJ, Varney NR (1975) Visual perception of line direction in patients with unilateral brain disease. Neurology 25:907–910.
  3. Borod JC, Vingiano W, Cytryn F (1989) Neuropsychological factors associated with perceptual biases for emotional chimeric faces. Int J Neurosci 45:101–110.
  4. Borod JC, Haywood CS, Koff E (1997) Neuropsychological aspects of facial asymmetry during emotional expression: a review of the normal adult literature. Neuropsychol Rev 7:41–60.
  5. Buck R (1986) The psychology of emotion. In: Mind and brain: essays in cognitive neuroscience (Le Doux J, Hirst W, eds), pp 275–300. Cambridge, UK: Cambridge UP.
  6. Campbell A (1978) Asymmetries in interpreting and expressing a posed facial emotion. Cortex 14:327–342.
  7. Christman S, Hackworth MD (1993) Equivalent perceptual asymmetries for free viewing of positive and negative emotional expressions in chimeric faces. Neuropsychologia 31:621–624.
  8. Chute DL, Westall RF (1997) PowerLaboratory. Devon, PA: MacLaboratory Incorporated.
  9. Dimberg U, Petterson M (2000) Facial reactions to happy and angry facial expressions: evidence for right hemisphere dominance. Psychophysiology 37:693–696.
  10. Ekman D, Perch R, Friesen WV (1971) Constants across cultures in the face and emotion. J Pers Soc Psychol 17:124–129.
  11. Elfenbein HA, Ambady N (2002) Is there an in-group advantage in emotion recognition? Psychol Bull 128:243–249.
  12. Fernandez-Carriba S, Loeches A, Morcillo A, Hopkins WD (2002) Functional asymmetry of emotions in primates: new findings in chimpanzees. Brain Res Bull 57:561–564.
  13. Gazzaniga MS, Ivry RB, Mangun GR (1998) Cognitive neuroscience: the biology of the mind. New York: Norton.
  14. Gilbert C, Bakan P (1973) Visual asymmetries in perception of faces. Neuropsychologia 11:355–362.
  15. Gorno-Tempini ML, Pradelli S, Serafini M, Pagnoni G, Baraldi P, Porro C, Nicoletti R, Umita C, Nichelli P (2001) Explicit and incidental facial expression processing: an fMRI study. NeuroImage 14:465–473.
  16. Gur RC, Packer IK, Hungerbuhler JP, Reivich M, Obrist WD, Amarnek WS, Sackeim HA (1980) Differences in the distribution of gray and white matter in human cerebral hemispheres. Science 207:1226–1228.
  17. Gur RC, Sara R, Hagendoorn M, Marom O, Hughett P, Turner T, Bajcsy R, Posner A, Gur RE (2002) A method for obtaining 3-dimensional facial expressions and its standardization for use in neurocognitive studies. J Neurosci Methods 115:137–143.
  18. Harmer CJ, Thilo KV, Rothwell JC, Goodwin GM (2001) Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nat Neurosci 4:17–18.
  19. Hauser MD (1993) Right hemisphere dominance for the production of facial expression in monkeys. Science 261:475–477.
  20. Hugdahl K, Iverson PM, Johnsen BH (1993) Laterality for facial expressions: does the sex of the subject interact with the sex of the stimulus face? Cortex 29:325–331.
  21. Kosslyn SM, Chabris CF, Marsolek CJ, Koenig O (1992) Categorical versus coordinate spatial relations: computational analyses and computer simulations. J Exp Psychol 18:562–577.
  22. Levy J, Sperry RW (1968) Differential perceptual capacities in major and minor hemispheres. Proc Natl Acad Sci USA 61:1151.
  23. Loughland CM, Williams LM, Gordon E (2002) Schizophrenia and affective disorder show different visual scanning behavior for faces: a trait versus state-based distinction? Biol Psychiatry 52:338–348.
  24. Magnussen S, Sunde B, Dyrnes S (1994) Patterns of perceptual asymmetry in processing facial expression. Cortex 30:215–229.
  25. Mandal MK, Harizuka S, Bhushan B, Mishra RC (2001) Cultural variation in hemifacial asymmetry of emotion expressions. Br J Soc Psychol 40:385–398.
  26. Natale M, Gur RE, Gur RC (1983) Hemispheric asymmetries in processing emotional expressions. Neuropsychologia 21:555–565.
  27. Parr LA, Hopkins WD (2000) Brain temperature asymmetries and emotional perception in chimpanzees, Pan troglodytes. Physiol Behav 71:363–371.
  28. Reuter-Lorenz P, Davidson RJ (1981) Differential contributions of the two cerebral hemispheres to the perception of happy and sad faces. Neuropsychologia 19:609–613.
  29. Rizzo M, Hurtig R, Damasio AR (1987) The role of scanpaths in facial recognition and learning. Ann Neurol 22:41–45.
  30. Rubin P, Rubin R (1980) Differences in asymmetry of facial expression between left- and right-handed children. Neuropsychologia 18:373–377.
  31. Sackeim HA, Gur RC (1978) Lateral asymmetry in intensity of emotional expression. Neuropsychologia 16:473–481.
  32. Sackeim HA, Gur RC, Saucy MC (1978) Emotions are expressed more intensely on the left side of the face. Science 202:433–435.
  33. Sackeim HA, Greenberg MS, Weiman AL, Gur RC, Hungerbuhler JP, Geschwind N (1982) Hemispheric asymmetry in the expression of positive and negative emotions. Arch Neurol 39:210–218.
  34. Schwartz GE, Davidson RJ, Maer F (1975) Right hemisphere lateralization for emotion in the human brain: interactions with cognition. Science 190:286–288.
  35. Semmes J (1968) Hemispheric specialization: a possible clue to mechanism. Neuropsychologia 6:11–26.
  36. Shioiri T, Someya T, Helmeste D, Tang SW (1999) Cultural difference in recognition of facial emotional expression: contrast between Japanese and American raters. Psychiatry Clin Neurosci 53:629–633.
  37. Slotnik SD, Moo LR, Tesoro MA, Hart J (2001) Hemispheric asymmetry in categorical versus coordinate visuospatial processing revealed by temporary cortical deactivation. J Cognit Neurosci 13:1088–1096.
  38. Sprengelmeyer R, Rausch M, Eysel UT, Przuntek H (1998) Neural structures associated with recognition of facial expressions of basic emotions. Proc R Soc Lond B Biol Sci 265:1927–1931.
  39. Walker-Smith GJ, Gale AG, Findlay JM (1977) Eye movement strategies involved in face perception. Perception 6:313–326.
  40. Wittling W, Roschmann R (1993) Emotion-related hemisphere asymmetry: subjective emotional responses to laterally presented films. Cortex 29:431–448.
  41. Wolff W (1943) The expression of personality. New York: Harper.