Abstract
This study is part of an effort to map neural systems involved in the processing of emotion, and it focuses on the possible cortical components of the process of recognizing facial expressions. We hypothesized that the cortical systems most responsible for the recognition of emotional facial expressions would draw on discrete regions of right higher-order sensory cortices and that the recognition of specific emotions would depend on partially distinct subsets of such cortical regions. We tested these hypotheses using lesion analysis in 37 subjects with focal brain damage. Subjects were asked to recognize facial expressions of six basic emotions: happiness, surprise, fear, anger, disgust, and sadness. Data were analyzed with a novel technique, based on three-dimensional reconstruction of brain images, in which anatomical descriptions of surface lesions and task performance scores were jointly mapped onto a standard brain space. We found that all subjects recognized happy expressions normally but that some subjects were impaired in recognizing negative emotions, especially fear and sadness. The cortical surface regions that correlated best with impaired recognition of emotion were in the right inferior parietal cortex and in the right mesial anterior infracalcarine cortex. We did not find impairments in recognizing any emotion in subjects with lesions restricted to the left hemisphere. These data provide evidence for a neural system important to processing facial expressions of some emotions, one involving discrete visual and somatosensory cortical sectors in the right hemisphere.
Clinical and experimental studies have suggested that the right hemisphere is preferentially involved in processing emotion in humans (Ley and Bryden, 1979; DeKosky et al., 1980; Ross, 1985; Silberman and Weingartner, 1986; Bowers et al., 1987, 1991; Blonder et al., 1991; Borod et al., 1992; Van Strien and Morpurgo, 1992; Borod, 1993; Darby, 1993). Earlier studies showed that damage to the right hemisphere can impair the processing of emotional faces or scenes (DeKosky et al., 1980) and that electrical stimulation of right temporal visual-related cortices can disrupt the processing of facial expressions (Fried et al., 1982). Damage to several discrete sectors of the right hemisphere has been reported to result in defects in processing emotion. Lesions in the right temporal and parietal cortices have been shown to impair emotional experience and arousal (Heller, 1993) and to impair imagery for emotion (Blonder et al., 1991; Bowers et al., 1991), and it has been proposed that the right hemisphere contains modules for nonverbal affect computation (Bowers et al., 1993), which may have evolved to subserve aspects of social cognition (Borod, 1993).
Much recent work has focused on the visual recognition of emotion signaled by human facial expressions. Selective impairments in recognizing facial expressions, sparing the ability to recognize identity, can occur after right temporoparietal lesions (Bowers et al., 1985). Specific anomia for emotional facial expressions has been reported after right middle temporal gyrus lesions (Rapcsak et al., 1989, 1993). The evidence that the right temporoparietal cortex is important in processing emotional facial expressions is corroborated by data from PET imaging (Gur et al., 1994) and neuronal recording (Ojemann et al., 1992) in humans.
The above findings suggest, therefore, that damage to right temporal or parietal cortices can impair recognition of emotional facial expressions, but they leave open the possibility that only specific anatomical sectors are involved and that not all emotions are impaired equally, as has been reported recently with respect to subcortical structures (Adolphs et al., 1994, 1995). Accordingly, the purpose of the present study was to extend the characterization of the system components involved in recognizing facial expressions to a deeper level of anatomical detail and to relate the anatomical findings to distinct emotions as opposed to emotion in general.
Based on the findings reviewed above, we undertook to test the following hypotheses: (1) that higher-order sensory cortices within the right, but not the left, hemisphere would be essential to recognize emotion in facial expressions; and (2) that partly different sets of such cortical regions might be important in processing different basic emotions. We are aware, of course, that brain regions in frontal cortex and subcortical nuclei may also be involved in processing emotion, but the present study concentrates on investigating the contribution of sensory cortices, and on one aspect of emotion processing: that of recognizing facial expressions of emotion.
Previous studies have often relied on single-case data and have used a variety of different experimental tasks, making comparisons and generalizations difficult. To obtain results that circumvent these problems, we tested our hypotheses in a large number of subjects with circumscribed lesions in left or right sensory neocortex on a carefully designed, quantitative task of the recognition of facial expressions of emotion (Adolphs et al., 1994, 1995), using both standard (Damasio and Damasio, 1989) and novel lesion analysis techniques. The results allow us to infer the existence of putative cortical systems important to processing facial expressions of emotions.
MATERIALS AND METHODS
Thirty-seven brain-damaged subjects [verbal IQ (WAIS-R) = 99 ± 10; age = 53 ± 16 (mean ± SD)], all of whom were right-handed, participated in a task of the recognition of facial expressions of emotion. We compared their performances to the mean performance of 15 normal controls (7 males, 8 females) of similar age and IQ [estimated verbal IQ (NART-R) = 104 ± 7; age = 55 ± 13]. Brain-damaged subjects were selected from the Patient Registry of the Division of Behavioral Neurology and Cognitive Neuroscience at the University of Iowa and had been fully characterized neuroanatomically and neuropsychologically according to the standard protocols of the Benton Neuropsychology Laboratory (Tranel, 1996) and the Laboratory of Neuroimaging and Human Neuroanatomy (Damasio and Damasio, 1989; Damasio and Frank, 1992). MR and/or CT scan data were available for each brain-damaged subject. Three-dimensional reconstructions of MR images were obtained wherever possible.
The neurological diagnoses of the subjects included stroke (n = 28), neurosurgical lobectomy for the treatment of epilepsy (n = 6), and herpes simplex encephalitis (n = 3).
Subject selection
Brain-damaged subjects were chosen on the basis of neuroanatomical criteria. Out of an initial pool of 68 subjects, we first chose any subjects who satisfied the inclusion and exclusion criteria below.
Inclusion criteria. Included were subjects with (1) stable, chronic lesions (>3 months post onset) that were (2) located in primary or higher-order sensory cortices.
We included subjects with lesions of any size.
Exclusion criteria. We excluded 31 subjects before data analysis, for the following reasons: (1) no clear lesions were visible on CT or MR scans taken at the time the subject was tested on our task; (2) the subject had predominantly subcortical or prefrontal lesions; (3) the subject was judged to be too aphasic to give a valid task performance; or (4) the subject had questionable or atypical cerebral dominance.
These criteria yielded an initial group of 34 subjects. After an examination of the distribution of the sites of lesions of these subjects, we found it necessary to control for the fact that some subjects with very large, posteriorly centered lesions nonetheless had some involvement of frontal cortex. Although it was not an aim of this study to examine frontal cortex, we decided to add 3 subjects (#1331, #1656, and #1569) with lesions primarily in the right frontal lobe, specifically to control for those right frontal sectors that were also involved in some of our subjects who had lesions centered in the right parietal cortex.
This brought our final group to 37 subjects, 22 with unilateral right hemisphere lesions, 13 with unilateral left hemisphere lesions, and 2 with bilateral lesions. The 2 subjects with bilateral lesions were included in both the left hemisphere group and the right hemisphere group for neuroanatomical analyses, but were excluded from statistical comparisons of left versus right hemisphere damage (both subjects had lesions in primary visual cortex and turned out to perform entirely normally on our task).
Experimental tasks
Subjects were shown black-and-white slides of faces with emotional expressions and were asked to judge the expressions with respect to several verbal labels (the adjectives that corresponded to the emotions we showed), as described previously (Adolphs et al., 1994,1995). We chose 39 facial expressions from Ekman and Friesen (Ekman, 1976) that had all been shown to be identified reliably by normal subjects at >80% success rate. Each of the 39 expressions was presented 6 times in two blocks separated by several hours. Six faces (both male and female) each of anger, fear, happiness, surprise, sadness, and disgust, as well as three neutral faces were projected on a screen, one at a time, in randomized order. Subjects had in front of them cards with the names of the emotions typed in large print and were reminded periodically of these by the experimenter. Before each rating of the faces on a new emotion label, subjects were involved in a brief discussion that clarified the meaning of that label through examples. Subjects were asked to judge each face on a scale of 0–5 (0 = not at all, 5 = very much) on the following six labels: happy, sad, disgusted, angry, afraid, surprised (1 adjective per block of slides), in random order. There was no time limit. Subjects gave verbal responses whenever possible or pointed to the numbers on a scale if they could not give verbal responses. Care was taken to ensure that all subjects knew which label they were using for the rating and that they used the scale correctly. All subjects understood the labels, as assessed by their ability to comprehend scenarios pertaining to that emotional label.
Neuropsychological analysis
We calculated the correlations between a subject’s ratings of an expression on the six emotion labels and the mean ratings given to that expression by 15 normal control subjects. This yielded a measure of recognition of facial expressions of emotion. The correlations were Z-transformed to normalize their distribution, averaged over faces that expressed the same emotion, and inverse Z-transformed to give the mean Pearson correlation to normal ratings for each emotion category.
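To make the averaging scheme concrete, the following is a minimal sketch in Python (assuming NumPy; the function name and the example correlation values are ours, for illustration only):

```python
import numpy as np

def mean_correlation(r_values):
    """Average Pearson correlations via the Fisher Z-transform.

    Correlations are not normally distributed, so each r is
    Z-transformed (arctanh), the transforms are averaged, and the
    mean is inverse-transformed (tanh) back to a correlation.
    """
    z = np.arctanh(np.asarray(r_values, dtype=float))  # Z-transform
    return np.tanh(z.mean())                           # inverse Z-transform

# Hypothetical example: one subject's correlations with normal ratings
# for the six faces expressing a given emotion.
print(mean_correlation([0.42, 0.55, 0.38, 0.61, 0.47, 0.50]))
```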
Neuroanatomical analysis
The neuroanatomical data were analyzed with a new method for quantitative visualization of lesion overlaps in two dimensions, MAP-2. We traced the surface damage of each subject’s brain in the group onto the corresponding regions of cortex in the image of a normal reference brain that had been reconstructed in three dimensions (Damasio and Frank, 1992). Straight lateral and mesial views were used. The method for transferring a lesion onto the normal brain is described below.
(a) In those cases in which a three-dimensional reconstruction of the lesioned brain was available, lateral and mesial views of the brain with the lesion were matched to the corresponding views of the normal brain. The surface contour of the lesion was then mapped onto the normal brain, taking into account its relation to sulcal and gyral landmarks (which had been color-coded previously in both brains).
(b) In those cases in which only two-dimensional MR or CT data were available, we used a modification of the template method (Damasio and Damasio, 1989) as follows.
(1) Using the program BRAINVOX (Damasio and Frank, 1992), the normal brain was resliced so as to match the slice orientation and thickness of the two-dimensional images of the lesioned brain. In this manner, we created a complete set of images matched for level and attitude between the two brains.
(2) For each matched pair of brain slices, we manually transferred the region that was lesioned from the subject’s brain onto the normal brain, taking care to maintain the same relations to identifiable anatomical landmarks.
(3) The cumulative transfer of lesions from each slice of the subject’s damaged brain onto the normal brain resulted in a series of normal brain slices with a trace of the subject’s entire lesion. When the normal brain slices were reconstructed in three dimensions, we obtained mesial and lateral views showing the lesion on the surface of the brain.
After lesions had been traced onto the normal reference brain, we verified the lesion transfer by visually comparing the lesion in the original subject’s brain to the transferred lesion in the normal reference brain. In all cases, the two representations of the subject’s lesion corresponded closely with respect to neuroanatomical landmarks.
We computed overlaps of subjects’ lesions so as to determine which lesion sites were shared among subjects. Additionally, we computed the mean neuropsychological scores associated with all the subjects who had lesions that included a particular neuroanatomical location, so as to obtain a measure of the extent to which different neuroanatomical loci contribute to task performance.
The lesion traces in the normal reference brain were convolved with a 2-pixel-wide Gaussian filter (pixel size = 0.937 mm). This minimized sharp discontinuities in the images by blurring the boundaries of the lesion trace. The composite traces for all the lesions, together with the neuropsychological data for each subject, were subsequently averaged as follows. Images were composed in a hue-saturation-lightness (HSL) space. Pixel hue was used to encode the average, or weighted average, scores of those subjects whose lesion included the pixel position; pixel saturation encoded the number of subjects who had lesions that included that pixel; and pixel lightness encoded the underlying view of the normal brain onto which the lesion and neuropsychological data were mapped. This procedure yielded a map of the superimposed lesions on the surface of the normal brain, color-coded to reflect the mean (or weighted mean) task performance score for all subjects who had a lesion that encompassed a particular neuroanatomical location.
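The following sketch illustrates the logic of this composite under our own assumptions; the paper does not specify the exact hue scale or filter width, and the study averaged Z-transformed scores (omitted here for brevity). The scipy and colorsys calls stand in for the original image-analysis software, and all names are hypothetical:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from colorsys import hls_to_rgb

def map2_composite(lesion_masks, scores, brain_gray):
    """Illustrative MAP-2 composite in hue-saturation-lightness space.

    lesion_masks: (n_subjects, H, W) binary lesion traces on the
                  reference brain; scores: (n_subjects,) task scores;
    brain_gray:   (H, W) grayscale view of the normal brain in [0, 1].
    """
    # Blur lesion boundaries (sigma is our guess at the 2-pixel-wide filter).
    blurred = np.stack([gaussian_filter(m.astype(float), sigma=1.0)
                        for m in lesion_masks])
    overlap = blurred.sum(axis=0)                  # lesions per pixel
    cover = np.maximum(overlap, 1e-9)              # avoid divide-by-zero
    mean_score = (blurred * np.asarray(scores)[:, None, None]).sum(0) / cover

    hue = 0.66 * mean_score                        # hypothetical score-to-hue map
    sat = np.clip(overlap / len(blurred), 0.0, 1.0)  # encodes overlap count
    rgb = np.empty(brain_gray.shape + (3,))
    for idx in np.ndindex(*brain_gray.shape):      # lightness = brain image
        rgb[idx] = hls_to_rgb(hue[idx], brain_gray[idx], sat[idx])
    return rgb
```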
We computed both mean and weighted mean neuropsychological scores in our analysis. Z-transforms of correlations were used in all averaging procedures. Mean scores are simply the average score of all the subjects whose lesion included a particular neuroanatomical location. Weighted mean scores are obtained by averaging subjects’ scores such that more weight is given to some subjects’ scores than to others, as described below. The rationale for computing weighted mean scores is that subjects with normal performances should contribute more to the mean performance index for a given pixel than subjects with impaired performances. Subjects with more normal performances, therefore, will tend to override subjects with more impaired performances when both share lesion sectors, consequently permitting us to infer which sectors are most important to normal task performance. For example, a subject with a large lesion might be impaired, but the lesion will give little information about the specific neuroanatomical substrate of the impairment. However, when other subjects with partly overlapping lesions perform normally, we can infer that the first subject’s impairment may depend on that sector of the lesion that does not overlap with the lesions of the subjects who performed normally.
Weighted mean scores were calculated by assigning a weight, w = 0.01 + 0.99/(1 + exp(−10(x − 0.5))), to each subject’s score (x), such that subjects with more normal scores (closer to 1) were weighted more than subjects with very defective scores (closer to 0). The function w(x) is a well-behaved sigmoid function commonly used to sum inputs in neural network simulations. This method in effect subtracts from an impaired subject’s lesion all sectors that are shared in common with lesions of subjects who are not impaired, allowing us to focus on those sectors of the lesion that correlate best with defective performance. During our analysis, we examined a large number of different functions of the form w(x) that varied in steepness and offset. In all cases, the analysis converged on very similar results, indicating that the method is robust for the data in our sample.
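A direct transcription of this weighting function, with a small usage example (scores are hypothetical; the study applied the averaging to Z-transformed scores, which we omit here for simplicity):

```python
import numpy as np

def w(x):
    """Sigmoid weight from the text: near-normal scores (x close to 1)
    receive weight near 1; defective scores (x close to 0) near 0.01."""
    return 0.01 + 0.99 / (1.0 + np.exp(-10.0 * (x - 0.5)))

def weighted_mean_score(scores):
    """Weighted mean of the scores of all subjects whose lesion includes
    a given pixel; normal performers dominate impaired ones."""
    s = np.asarray(scores, dtype=float)
    return float(np.sum(w(s) * s) / np.sum(w(s)))

print(w(np.array([0.1, 0.5, 0.9])))           # approx. [0.028, 0.505, 0.982]
print(weighted_mean_score([0.2, 0.95, 0.9]))  # approx. 0.90, pulled toward normal
```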
Multiple interactive regression analysis
We wanted to control for the possibility that impaired recognition of facial expressions of emotion might be attributable to other defects. Of special interest were general visuoperceptual function, IQ, and measures of depression. We examined subjects’ scores on the following neuropsychological tests: verbal and performance IQ (Wechsler, 1981), perceptual matching of unfamiliar faces (Benton et al., 1983), judgment of line orientation (Benton et al., 1983), the Rey–Osterrieth complex figure test (copy), three-dimensional block construction (Benton et al., 1983), the D-scale of the Minnesota Multiphasic Personality Inventory (Greene, 1980), the Beck Depression Inventory (Beck, 1987), and naming and recognition of famous faces (Tranel et al., 1995). We used an interactive regression analysis so as to examine to what extent performances on our experimental tasks covaried with performances on these neuropsychological control tasks.
RESULTS
We first examined the effects on emotion recognition caused by side of lesion (left or right) and by the emotion to be recognized in the task (happy, surprised, afraid, angry, disgusted, or sad) with a 2 × 6 ANOVA, with side of lesion as a between-subjects factor and type of emotion as a within-subjects factor. There was a main effect of emotion type: performances differed significantly depending on the specific emotion (F = 13.0; p < 0.0001). There was no significant main effect of side of lesion (F = 2.2; p = 0.14), but a significant interaction between side of lesion and emotion (F = 4.0; p = 0.002), showing that subjects with right hemisphere damage did not differ from subjects with left hemisphere damage with respect to recognition of emotion in general, but that they did differ with respect to specific emotions, as we had predicted.
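For readers wishing to reproduce this design, a mixed ANOVA with one between-subjects and one within-subjects factor can be computed, for example, with the pingouin library; the file name and column names below are hypothetical stand-ins for the study's data:

```python
import pandas as pd
import pingouin as pg

# Long-format table: one row per subject x emotion, with the mean
# correlation score and the side of lesion (hypothetical layout).
df = pd.read_csv("emotion_scores_long.csv")
# expected columns: subject, side, emotion, score

aov = pg.mixed_anova(data=df, dv="score",
                     within="emotion", subject="subject",
                     between="side")
print(aov[["Source", "F", "p-unc"]])  # main effects and interaction
```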
Analysis with respect to individual emotions revealed that different emotions were differentially impaired (Fig. 1). Recognition of happy expressions was not impaired, whereas recognition of several negative emotions, especially fear, was notably impaired. Recognition of happy faces differed significantly from recognition of all faces except angry faces, and recognition of afraid faces differed from recognition of all other faces (Scheffé test, p < 0.01). To analyze these data further with respect to the specific anatomical sectors that might be responsible for our results, we calculated surface overlaps between lesions together with mean performance scores for all subjects (see Materials and Methods for details). The results are depicted on the lateral and mesial views of the left and right hemispheres of a normal brain in the following sections.
Left hemisphere lesions
Fifteen subjects with lesions of the left hemisphere were tested on their recognition of facial expressions of emotion. None had difficulty recognizing any facial expressions of emotion. We computed average performances for all subjects sharing a lesion locus, as detailed in Materials and Methods. We show the mean (unweighted) performance scores for subjects with left hemisphere lesions in Figure 2a. To obtain a lower bound on subjects’ performance with regard to any emotion, we mapped the means of each subject’s lowest correlation on any of the six emotions.
Right hemisphere lesions
We tested 24 subjects with lesions of the right hemisphere. Several of these subjects were impaired on our task. The composite image pertaining to the analysis of right hemisphere lesions is shown in Figure 2b.
Initial analysis of mean performance scores showed that there are sectors in the right hemisphere that contribute differentially to impaired recognition of emotion (Fig. 2b). Anterior and inferior temporal cortex appeared not to be essential to the recognition of emotion in facial expressions, whereas parietal and mesial occipital cortices were involved when there was impaired recognition of emotion.
Subjects were not equally impaired on the recognition of all emotional expressions. The recognition of expressions of fear was the most impaired, whereas the recognition of expressions of happiness was not impaired (Figs. 1, 3). Although some subjects who were impaired in recognizing fear were also impaired in recognizing other negative emotions, the impaired recognition of negative emotions other than fear did not result in a mean impaired score at any anatomical location (Fig. 3). Possible exceptions to this observation are anger and sadness, which showed very small regions of somewhat impaired mean performance (Fig. 3); however, the relatively small number of subjects associated with these results (compare Fig. 1) does not allow us to draw any firm conclusions.
To examine directly the overlap of lesions of those subjects who were the most impaired in recognizing fear, we generated overlap images for various subject groups with respect to the lateral and mesial surfaces of the right hemisphere. We calculated the surface overlaps of the lesions of all subjects whose scores in recognition of fear were less than a specific cut-off. In all cases, this was equivalent to choosing the subject’s worst score on any emotion. We chose cut-offs of 0.5 and 0.3 and show these overlaps together with the lesion overlaps of the entire subject sample in Figure 4a. The maximal overlap of subjects with the most impaired performance is in parietal and mesial occipital sectors in right hemisphere. The top panel in Figure 4a shows the lesions of the entire subject pool and demonstrates that our results are not likely to be attributable to the way in which different neuroanatomical loci were sampled.
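The overlap computation itself is straightforward; the sketch below illustrates it under our own assumptions about data layout (binary lesion masks registered to the reference brain; all names are ours):

```python
import numpy as np

def overlap_below_cutoff(lesion_masks, fear_scores, cutoff):
    """At each pixel, count how many subjects whose fear-recognition
    score falls below `cutoff` have a lesion covering that pixel."""
    masks = np.asarray(lesion_masks, dtype=bool)   # (n_subjects, H, W)
    scores = np.asarray(fear_scores, dtype=float)  # (n_subjects,)
    return masks[scores < cutoff].sum(axis=0)      # (H, W) overlap counts

# Overlap maps analogous to Figure 4a (cut-offs from the text):
# overlap_05 = overlap_below_cutoff(masks, scores, 0.5)
# overlap_03 = overlap_below_cutoff(masks, scores, 0.3)
# overlap_all = masks.sum(axis=0)   # entire subject pool, for comparison
```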
As an additional method to extract specific sectors that may account for impaired performance, we used a weighted mean analysis in which subjects with higher (more normal) scores were weighted more than subjects with lower (more impaired) scores. With this analysis, sectors shared by subjects who performed normally and by subjects whose performance was impaired would show up as essentially normal (see Materials and Methods for details). Our analysis suggested that specific and circumscribed sectors on the lateral and mesial surfaces of the right hemisphere were most important in contributing to impaired recognition of fear; we call such loci “hot-spots.” On the lateral surface of the brain, the territory of the supramarginal gyrus and the posterior sector of the superior temporal gyrus appear to be hot-spots with this approach. On the mesial surface of the brain, there appears to be a hot-spot in a sector of the infracalcarine cortex corresponding to the anterior segment of the lingual gyrus (Fig. 4b). To determine the reliability of these findings, we next conducted statistical comparisons between subject groups. With respect to recognition of fear, subjects whose lesions included one of the two hot-spots (n = 9) differed significantly from subjects whose lesions did not include a hot-spot (n = 28; Mann–Whitney U test, p < 0.0001).
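The group comparison is a standard Mann–Whitney U test; for illustration (the scores below are hypothetical, not the study’s data, and the study’s groups were n = 9 vs. n = 28):

```python
from scipy.stats import mannwhitneyu

# Hypothetical fear-recognition scores, split by whether the lesion
# includes one of the two hot-spots.
hotspot_scores = [0.18, 0.25, 0.31, 0.40, 0.22]
no_hotspot_scores = [0.82, 0.90, 0.76, 0.88, 0.95, 0.79]

u, p = mannwhitneyu(hotspot_scores, no_hotspot_scores,
                    alternative="two-sided")
print(f"U = {u}, p = {p:.4f}")
```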
Thus, the regions of maximal overlap of lesion for impaired subjects (Fig. 4a) and the “hot-spots” obtained from the weighted MAP-2 analysis of all subjects (Fig. 4b) both point to two neuroanatomical regions: the inferior parietal cortex and the mesial anterior infracalcarine cortex. With respect to our subject sample, lesions within either of these two areas are the most important contributors to impaired recognition of emotional facial expressions, specifically fear.
Relationships between the processing of different emotions and between the processing of emotions and other neuropsychological measures
We found that recognition of fear tends to be more consistently impaired by specific brain lesions than does recognition of other negative emotions and that recognition of happiness is never impaired. Does impaired recognition of some emotions covary within subjects? For each subject, we calculated Pearson correlations between the performance scores on all the different emotions (we calculated correlations between Z-transforms). The mean results of this analysis for all subjects are given in Table 1. Bonferroni-corrected tests of the significance of these correlations suggest that (1) damage that includes the right inferior parietal cortex results in recognition impairments that correlate across most negative emotions, especially fear and sadness, and (2) damage that includes the right anterior infracalcarine cortex results in recognition impairments that appear to be more specific to fear, and that correlate for surprise and fear. Recognition scores on happy expressions did not correlate with the recognition of any other emotion for any group of subjects, suggesting that happy expressions are processed differently from all other expressions.
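A sketch of this correlation analysis, assuming a (subjects × emotions) array of Z-transformed scores (the function and variable names are ours):

```python
from scipy.stats import pearsonr

EMOTIONS = ["happy", "surprised", "afraid", "angry", "disgusted", "sad"]

def emotion_correlations(z_scores):
    """Pairwise Pearson correlations between subjects' Z-transformed
    scores on each emotion, Bonferroni-corrected for the 15 pairs."""
    n = len(EMOTIONS)
    n_pairs = n * (n - 1) // 2                     # 15 comparisons
    results = {}
    for i in range(n):
        for j in range(i + 1, n):
            r, p = pearsonr(z_scores[:, i], z_scores[:, j])
            # Bonferroni correction: multiply p by the number of tests.
            results[(EMOTIONS[i], EMOTIONS[j])] = (r, min(p * n_pairs, 1.0))
    return results
```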
We also wanted to investigate to what extent other factors such as visuoperceptual function, IQ, or depression might correlate with impaired recognition of facial expressions. We consequently examined subjects on a large number of neuropsychological tasks (see Materials and Methods; Table 2), including measures of visuoperceptual and visuospatial capability and depression. All of these variables, in addition to subject age and gender, were entered into an interactive multiple linear regression program so as to calculate the extent to which each of these variables could predict the scores on our experimental task of emotion recognition. Significant regressions were found only for the recognition of afraid and sad faces. For both of these emotions, age and performance IQ were the only significant predictors (Fig. 5). For fear, PIQ t-ratio = 3.71 (p < 0.01), age t-ratio = −2.58 (p = 0.018), and Beck Depression Inventory t-ratio = 1.7 (p = 0.1; not significant); adjusted R² = 52.1%. For sadness, PIQ t-ratio = 2.15 (p = 0.038), age t-ratio = −2.12 (p = 0.041); adjusted R² = 30.5%. Thus, age and performance IQ correlate with recognition of facial expressions of fear and sadness, although these two factors could not fully account for the impairments in recognizing the emotions. Importantly, there was no correlation between performance on our experimental task and performance on visuoperceptual discrimination tasks (compare Fig. 5), showing that the impairments in emotion recognition cannot be attributed to impaired perception but, instead, reflect a difficulty in recognizing the emotion signaled by the perceived face.
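A regression of this form can be reproduced with ordinary least squares, e.g., via statsmodels; the file name and column names below are hypothetical stand-ins for the study’s data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-subject table of experimental and control measures.
df = pd.read_csv("neuropsych_controls.csv")
# expected columns: fear_score, piq, age, beck (Beck Depression Inventory)

model = smf.ols("fear_score ~ piq + age + beck", data=df).fit()
print(model.summary())  # t-ratios and adjusted R-squared, as reported in the text
```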
To ensure that nonspecific visuoperceptual impairment could not account for our findings, we repeated our original ANOVA (first section of Results) with visuoperceptual performance as a covariate. We used the Benton Facial Discrimination Test, a task in which subjects have to match an unfamiliar face with one or more different aspects of that same face embedded in a number of other faces (compare Table 2 and Fig. 5 for subjects’ scores on this task). This test provides a sensitive measure of the ability to discriminate between different people’s faces and provides the most relevant control task for our purposes, because our experimental task also used faces as stimuli. The ANCOVA of emotion × side of lesion, using the scores on the Benton task as a covariate, yielded the same significant effects as we reported above.
DISCUSSION
The most salient results in this study are as follows. First, no impairment in the processing of facial expressions of emotion was found in subjects with lesions restricted to the left hemisphere; only damage in the right hemisphere was ever associated with an impairment. Second, most of the impaired processing of facial expressions of emotion correlated with damage to two discrete regions in the right neocortex: (1) the right inferior parietal cortex on the lateral surface, and (2) the anterior infracalcarine cortex on the mesial surface (Fig. 6). Third, expressions of happiness were recognized normally by all subjects. Fourth, the impaired recognition of facial expressions pertained to a few negative emotions, especially fear. An ANCOVA showed that these results cannot be explained on the basis of impaired visuoperceptual function but, instead, are specific to processing facial expressions of emotion. We attribute impaired recognition of facial expressions of fear to damage in the anatomical regions identified here, although it will be important to establish the reliability of this finding in additional subjects. The findings support the widely held notion that the right hemisphere contains essential components of systems specialized in the processing of emotion. However, the findings further suggest that impairments in the recognition of emotional facial expressions arise from damage to discrete and specific visual and somatosensory cortical system components, and that processing different emotions draws on different sets of such components. The results also provide specific predictions for future studies with alternate methods, such as functional imaging studies in normal subjects.
The impaired recognition of emotion that we report might also be a consequence of damage to essential white matter communications between visual and somatosensory cortices. It is probable, in fact, that most lesions we reported in either infracalcarine or inferior parietal cortices also disrupt underlying white matter. Future studies will need to address the possibility that damage to such white matter connections could result in impaired recognition of facial expressions of emotion.
Different emotions are differentially impaired
None of our subjects was impaired in recognizing happy faces, whereas several subjects had difficulty recognizing certain negative emotions. In attempting to account for this result, we propose that two factors may have resulted in a relative separation of the neural systems that process positive or negative emotions. First, there are fewer kinds of positive than negative emotions, which probably makes it more difficult to distinguish among negative emotions, at a basic level, than among positive emotions. In fact, it seems possible that, at a basic level, there is only one positive emotion, happiness, and that recognizing happiness is thus a simpler task than recognizing specific negative emotions. Second, virtually all happy faces contain some variant of a stereotypic signal, the smile. Our findings are also consistent with EEG studies that suggest the right hemisphere may be specialized for processing negative, but not positive, emotions (Davidson and Fox, 1982; Davidson, 1992).
With respect to the especially impaired recognition of fear and sadness, there are two possible explanations. One is that these two emotions are simply the most difficult ones to process and, therefore, those whose recognition is most vulnerable to impairment. Another is that there may be specific systems for processing specific negative emotions such as fear. The presence of a significant interaction in the ANOVA of lesion group × emotion, and the finding that recognition of fear differed significantly from recognition of all other emotions, suggest that lesions in these right hemisphere regions specifically impair the processing of fear. Additionally, there is no evidence from normal subjects to suggest that fear is any more difficult to process than other emotional expressions (Ekman, 1976) (our unpublished observations). Instead, we believe that there are right hemisphere systems dedicated to processing stimuli that signal fear. This proposal is also consonant with lexical priming studies indicating that the right hemisphere may be specialized to process stimuli related to threat (Van Strien and Morpurgo, 1992).
Networks for the acquisition and retrieval of information about emotion
Our working hypothesis regarding the recognition of expressions of emotion proposes that perceptual representations of facial expressions (in early visual cortices) normally lead to the retrieval of information from diverse neural systems (located “downstream”), including those that represent pertinent past states of the organism’s body and those that represent factual knowledge associated with certain types of facial expressions during development and learning (Damasio, 1994, 1995; Adolphs et al., 1995). The retrieval of previous body-state information would rely on structures such as the somatosensory and motor cortices, as well as limbic structures involved in visceral and autonomic/neuroendocrine control. Lack of access to such information would result in defective concept retrieval and, therefore, impaired performance on our task. (It should be clear that we are not suggesting that body-state information is necessarily accessed in the form of a conscious emotional experience during our task.)
The present findings on emotion recognition are thus especially interesting, because the right hemisphere is also preferentially involved in emotional expression and experience. For instance, there is substantial evidence that the right hemisphere has an important role in regulating the autonomic and somatovisceral components of emotion (Gainotti et al., 1993), and it has been proposed that right temporoparietal sectors regulate both the experience of emotion and autonomic arousal (Heller, 1993). When facial expressions are used as conditioned stimuli, conditioning of stimuli to autonomic responses is most vulnerable to right hemisphere lesions (Johnsen and Hugdahl, 1993), and right posterior hemisphere damage leads to impaired autonomic responses to emotionally charged stimuli (Morrow et al., 1981; Zoccolotti et al., 1982; Tranel and Damasio, 1994). Tachistoscopic presentation of emotional stimuli to the right hemisphere has been reported to result in larger blood pressure changes than do presentations to the left hemisphere (Wittling, 1990).
We would like to advance the hypothesis that the experience of some emotions, notably fear, during development would play an important role in the acquisition of conceptual knowledge of those emotions (by conceptual knowledge we mean all pertinent information, not just lexical knowledge). It seems plausible that the partial reevocation of such conceptual knowledge would be a prerequisite for the ability to recognize the corresponding emotions normally.
These considerations raise an important issue regarding the role of different neural systems in the acquisition and in the retrieval of information about emotions. We have described previously a subject who acquired bilateral amygdala damage early in life and who was impaired in recognizing facial expressions of fear (Adolphs et al., 1994, 1995). We subsequently reported in a collaborative study that two subjects who acquired bilateral amygdala damage in adulthood did not show the same impairment (Hamann et al., 1996). We believe that these findings support the following hypothesis: during development, the human infant/child acquires the connection between faces expressing fear and the conceptual knowledge of what fear is (which includes instances of the subject’s experience of fear). Such a process requires two neural components: (1) a structure that can link perceptual information about the face to information about the emotion that the face denotes; and (2) structures in which conceptual knowledge of the emotion can be recorded, and from where it can be retrieved in the future. Two candidates for structures fulfilling roles (1) and (2) would be, respectively, the amygdala and neocortical regions in the right hemisphere. Our previous data (Adolphs et al., 1994, 1995; Hamann et al., 1996) suggest that the amygdala is required during development so as to establish the networks that permit recognition of facial expressions of fear. Once established, however, these networks may function independently of the amygdala. The present study suggests two cortical sectors that are important components of the system by which adults retrieve knowledge about facial expressions of emotion. We therefore expect that impaired recognition of facial emotion could result from amygdala damage provided that the lesion occurred early in life, but could result from damage to right hemisphere cortical regions at any age. This framework is open to further testing in both human and nonhuman primates, part of which is currently under way in our laboratory.
Footnotes
This study was supported by National Institute of Neurological Diseases and Stroke Grant NS 19632. R.A. is a Burroughs-Wellcome Fund Fellow of the Life Sciences Research Foundation. We thank Randall Frank for help with image analysis and Kathy Rockland for many helpful discussions concerning neuroanatomy.
Correspondence should be addressed to Dr. Ralph Adolphs, Department of Neurology, University Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA 52242.