Temporal Cortex Activation in Humans Viewing Eye and Mouth Movements

Aina Puce, Truett Allison, Shlomo Bentin, John C. Gore and Gregory McCarthy
Journal of Neuroscience 15 March 1998, 18 (6) 2188-2199; DOI: https://doi.org/10.1523/JNEUROSCI.18-06-02188.1998
Author affiliations: Aina Puce (1, 2), Truett Allison (1, 3), Shlomo Bentin (5), John C. Gore (4), and Gregory McCarthy (1, 2, 3). 1Neuropsychology Laboratory, Veterans Administration Medical Center, West Haven, Connecticut 06516; Departments of 2Neurosurgery, 3Neurology, and 4Diagnostic Radiology, Yale University School of Medicine, New Haven, Connecticut 06510; and 5Department of Psychology and Center for Neural Computation, Hebrew University, Jerusalem 91905, Israel.

Abstract

We sought to determine whether regions of extrastriate visual cortex could be activated in subjects viewing eye and mouth movements that occurred within a stationary face. Eleven subjects participated in three to five functional magnetic resonance imaging sessions in which they viewed moving eyes, moving mouths, or movements of check patterns that occurred in the same spatial location as the eyes or mouth. In each task, the stimuli were superimposed on a radial background pattern that continually moved inward to control for the effect of movement per se. Activation evoked by the radial background was assessed in a separate control task. Moving eyes and mouths activated a bilateral region centered in the posterior superior temporal sulcus (STS). The moving check patterns did not appreciably activate the STS or surrounding regions. The activation by moving eyes and mouths was distinct from that elicited by the moving radial background, which primarily activated the posterior-temporal-occipital fossa and the lateral occipital sulcus—a region corresponding to area MT/V5. Area MT/V5 was also strongly activated by moving eyes and to a lesser extent by other moving stimuli. These results suggest that a superior temporal region centered in the STS is preferentially involved in the perception of gaze direction and mouth movements. This region of the STS may be functionally related to nearby superior temporal regions thought to be involved in lip-reading and in the perception of hand and body movement.

  • extrastriate cortex
  • eye movement
  • mouth movement
  • temporal lobe
  • superior temporal sulcus
  • gaze direction

Face recognition and analysis of facial expression form an important part of everyday interaction for humans and other primates. Electrophysiological studies in humans have demonstrated that discrete regions within the fusiform gyrus respond preferentially to faces and that stimulation of those regions can lead to transient prosopagnosia (Allison et al., 1994a,c). Neuroimaging data have provided further support for the role of ventral occipitotemporal cortex and, in particular, the fusiform gyrus in face perception (Sergent et al., 1992a; Haxby et al., 1994; Puce et al., 1995, 1996; Clark et al., 1996; Kanwisher et al., 1997; McCarthy et al., 1997). A close correspondence of ventral regions activated by faces in neuroimaging and electrophysiological studies has been recently demonstrated in the same individuals (Puce et al., 1997a). Activation by faces is not, however, limited to ventral occipitotemporal cortex. For example, in previous neuroimaging studies, we have shown discrete foci of activation to faces in lateral temporal cortex, particularly in the right hemisphere (Puce et al., 1995, 1996). We have also recorded event-related potentials (ERPs) sensitive to faces directly from lateral temporal cortex (Puce et al., 1997a).

Studies in nonhuman primates have suggested a functional differentiation of regions responsive to faces. Face-sensitive neurons are found within monkey inferior temporal (IT) cortex and within the superior temporal sulcus (STS) (Desimone, 1991; Gross, 1992; Perrett et al., 1992; Rolls, 1992). However, neurons within the STS are also sensitive to gaze and head direction and to face parts (Perrett et al., 1985, 1992; Yamane et al., 1988; Hasselmo et al., 1989). Some cells in the STS also respond to moving views of the head and body (Perrett et al., 1990) and to “biological motion” (Oram and Perrett, 1994) using point-light displays (Johansson, 1973).

It is possible that a similar functional distinction exists between ventral and lateral regions responsive to faces in humans. ERPs recorded directly from ventral cortex, primarily the fusiform gyrus, are larger to full faces than to isolated eyes (Allison et al., 1994b). By contrast, ERPs recorded over the lateral temporal scalp are larger to isolated eyes than to full faces (Bentin et al., 1996). Neuropsychological studies also suggest that portions of the temporal lobe are sensitive to face parts. For example, some patients with temporal lobe lesions are deficient in determining gaze direction, whereas others can no longer lip-read (Campbell et al., 1986; Perrett et al., 1988). Taken together, these results suggest the existence of neuronal systems sensitive to face parts located in lateral occipitotemporal cortex, in addition to face-perception mechanisms located in ventral occipitotemporal cortex.

In this study, we investigated the cortical activation patterns of subjects viewing faces in which the eyes repeatedly changed their direction of gaze or the mouth opened and closed. The results demonstrate that a region of superior temporal cortex, located primarily in the STS, is activated preferentially by moving eyes and mouths.

A preliminary report of these results has appeared (Puce et al., 1997b).

MATERIALS AND METHODS

Subjects. Eleven right-handed, neurologically normal subjects (six males) with an average age of 33.7 (range, 25–47) years participated in these studies. All subjects gave their informed consent for a protocol approved by the Human Investigation Committee of Yale University School of Medicine. Each subject participated in three to five imaging sessions.

Experimental tasks. There were six experimental tasks (Fig. 1, top panel). Each consisted of two subtasks (A and B) that alternated throughout each imaging run as described previously (Puce et al., 1995, 1996). The duration of each subtask was 6 sec (Fig. 1, bottom panel). Fifteen AB cycles were presented during the 192 sec duration of each imaging run. Each task was replicated four times; i.e., four imaging runs were acquired. Two of these runs began with subtask A (ABAB …), and two runs began with subtask B (BABA …). The starting order was counterbalanced across imaging runs.
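
This run structure can be laid out concretely. The following Python fragment is a minimal sketch of the block timing under the stated parameters (it is not the authors' stimulus-control software): it lists the subtask onsets for one 15-cycle run and builds the counterbalanced starting orders across the four runs of a task.

```python
# Minimal sketch (not the authors' stimulus-control code): onset schedule for
# one imaging run of the alternating block design.  Assumes subtask onsets
# fall on a regular 6 s grid; frame-level timing within a subtask is separate.

SUBTASK_DURATION = 6.0      # seconds per subtask (A or B)
CYCLES_PER_RUN = 15         # AB cycles per imaging run

def run_schedule(start_with="A"):
    """Return a list of (onset_sec, subtask_label) pairs for one run."""
    order = ("A", "B") if start_with == "A" else ("B", "A")
    schedule = []
    for cycle in range(CYCLES_PER_RUN):
        for i, label in enumerate(order):
            onset = (2 * cycle + i) * SUBTASK_DURATION
            schedule.append((onset, label))
    return schedule

# Counterbalanced starting orders across the four runs of a task.
task_runs = [run_schedule(s) for s in ("A", "B", "A", "B")]
print(task_runs[0][:4])   # [(0.0, 'A'), (6.0, 'B'), (12.0, 'A'), (18.0, 'B')]
```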

Fig. 1.

The top panel illustrates the six experimental tasks. In EYES, lateral eye movements were contrasted to a static face with the eyes looking straight ahead. In MOUTH, an open mouth was contrasted to a closed mouth in a static face. Eye movements were contrasted with mouth movements in the EYES versus MOUTH task. In SIMULATED (SIM) EYES and SIMULATED MOUTH, colored checkerboard patterns with checks reversing position in spatially equivalent positions (white arrows) to the real eyes and mouth were contrasted to a static checkerboard. In all of these tasks the radial background moved continuously in an inward direction (small white arrows) during the entire duration of the imaging run. In RADIAL, the face remained static, and the radial background either moved in the direction indicated by the white arrows or remained static. The effect of an inwardly moving radial background was generated by changing the color of the concentric rings on each frame (see bottom panel). The bottom panel depicts a schematic of a single cycle in the ABAB alternating design for the EYES versus MOUTH task. The duration of each subtask (A or B) was 6 sec. During each subtask, a series of 10 images (600 msec duration) was shown. In subtask A, the eyes shifted their position from the center to either left or right and back to center in a random manner. In subtask B, the mouth closed on alternate frames.

In the EYES task, one of two possible faces (male or female) was continuously present during the duration of the imaging run. In this and all other tasks, the male and female faces were used on alternate runs. The faces were in color and were superimposed on a radial background pattern consisting of three concentric black, white, and gray rings (Fig. 1, bottom panel). In subtask A, the eyes within the face seemed to move naturally from the center to the left and then back to center, or from the center to the right and then back to center. This apparent eye movement was achieved by presenting 10 successive pictures over the 6 sec duration of the subtask in which the eyes were either centered, fixated left, or fixated right, while the head stayed in register. The sequence of apparent eye movements was random, and there were equal numbers of right and left movements across runs. In subtask B, the eyes remained fixed at the center. Thus, the subject viewed alternating periods of eye movements and fixation on an otherwise stationary face. The purpose of this manipulation was to identify brain regions activated by movement of the eyes.
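
As a rough illustration of how such a frame sequence could be constructed, the sketch below builds one subtask A sequence of 10 frames from side-then-center excursion pairs; the construction details are assumptions, not the original stimulus code, and left/right balance is treated as a constraint applied across runs.

```python
# Minimal sketch (an assumption about sequence construction, not the original
# stimulus code): build one subtask A frame sequence of 10 frames (600 ms
# each) from side -> center excursion pairs with a randomly chosen side.
# In the experiment, left and right excursions were equated across runs.
import random

FRAMES_PER_SUBTASK = 10

def eye_subtask_frames(rng):
    frames = []
    while len(frames) < FRAMES_PER_SUBTASK:
        side = rng.choice(["left", "right"])   # direction of this excursion
        frames += [side, "center"]             # look to the side, return to center
    return frames[:FRAMES_PER_SUBTASK]

rng = random.Random(0)
print(eye_subtask_frames(rng))
# e.g. ['right', 'center', 'left', 'center', 'right', 'center', ...]
```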

During both subtasks of EYES, the radial background seemed to continuously move inward. This radial motion was designed to activate brain regions sensitive to motion per se and to diminish their contribution to the activation differences between eye movement and eye fixation.

The MOUTH task was similar to the EYES task. In subtask A, the mouth within the face seemed to open and close. In subtask B, the mouth remained closed. The radial background moved continuously in both subtasks, as described above. The purpose of this task was to identify brain regions activated by mouth movements.

In the EYES versus MOUTH task (Fig. 1, bottom panel), subtask A consisted of moving eyes as described above for EYES. Subtask B consisted of the moving mouth as described above for MOUTH. Thus, in this task the subject viewed alternating periods of eye and mouth movements. This task was designed to identify activations specific to eye or mouth movement while de-emphasizing activations common to both types of movement. The radial background continuously moved inward as described above.

In the SIMULATED EYES task, the face was replaced by an oval equal in area to the average area of the two faces used in the previous tasks. The oval contained a rectangular pattern of checks, the overall luminance and contrast of which were equal to the average luminance and contrast of the two individual faces. The check colors were chosen from the red–green–blue values of the faces and their inner components. The exposed area of the continuously moving radial background was the same as in the previous tasks. In subtask A, checks similarly located within the rectangle as the eyes were located within the face made discrete left and right movements. These movements had identical timing to the eye movements as described above. The movements, however, were not conjugate to avoid the illusion that the flesh-colored pattern was an abstract face. In subtask B, the checks did not move. This task was designed to determine whether activations generated by the moving eyes in the EYES and EYES versus MOUTH tasks were simply because of movements in a specific part of the visual field.
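
One plausible reading of the luminance and contrast matching is sketched below; the exact procedure is not described in the text, so this is an assumption rather than the authors' method.

```python
# Minimal sketch (the matching procedure is an assumption, not the authors'
# method): rescale a check pattern so its mean luminance and standard
# deviation (a simple contrast proxy) match the average of the two faces.
import numpy as np

def match_luminance_contrast(pattern, face_a, face_b):
    """pattern, face_a, face_b: 2-D arrays of luminance values."""
    target_mean = (face_a.mean() + face_b.mean()) / 2.0
    target_sd = (face_a.std() + face_b.std()) / 2.0
    standardized = (pattern - pattern.mean()) / pattern.std()
    return standardized * target_sd + target_mean
```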

In the SIMULATED MOUTH task, subtask A consisted of the movement of checks located within the rectangle in the same position as the mouth within the face and equal to it in area. The movement of the checks mimicked the opening and closing movements of the mouth. No movements occurred in subtask B. The radial background moved continuously during both subtasks A and B.

In the RADIAL task, a stationary face was presented during the entire imaging run. In subtask A, the radial background moved inward as described above. However, in subtask B the radial background did not move. This task was designed to identify brain areas activated by the radial motion.

Subjects were instructed to attend to the stimulus on the screen and to focus on a point midway between the eyes of the face for the duration of each imaging run. Similarly, for the control conditions using the checkerboards, subjects were instructed to focus on a point in space identical to that on the real face. The eye movements of subjects were not monitored while they were in the scanner.

Three separate imaging sessions were required to complete all six tasks. EYES, MOUTH, and RADIAL were run together in a single session. EYES versus MOUTH was run in another session, and SIMULATED EYES and SIMULATED MOUTH were run in a third session. Eleven subjects completed the EYES, MOUTH, and RADIAL tasks. Nine subjects completed the SIMULATED EYES and SIMULATED MOUTH tasks, and eight subjects completed EYES versus MOUTH. In addition to the above, six subjects returned for additional sessions in which the EYES, MOUTH, and RADIAL tasks were repeated, but in which images were acquired in oblique axial planes. Experimental timing and stimulus presentation were controlled by computer. All stimuli were back-projected onto a translucent screen mounted at the end of the patient gurney. Subjects viewed stimuli through a mirror mounted on the head coil. All stimuli subtended a visual angle of 5.4 × 5.4°.

Images were acquired using a 1.5 T General Electric Signa scanner with a standard quadrature head coil and ANMR echoplanar subsystem (ANMR Systems, Inc., Wilmington, MA). The subject’s head was positioned along the canthomeatal line and immobilized using a vacuum cushion and forehead and chin straps. For the three sessions constituting the main experiment, T1-weighted sagittal scans were used to select seven contiguous coronal slices beginning at the posterior edge of the splenium. Functional images were acquired using a gradient-echo echoplanar sequence [repetition time (TR), 1500 msec; echo time (TE), 45 msec; α = 60°; number of excitations (NEX), 1; voxel size, 3.2 × 3.2 × 7 mm]. Each imaging run consisted of 128 images per slice. Four radio frequency excitations were performed before image acquisition to achieve steady-state transverse relaxation. Higher-resolution anatomical images for these seven slices were acquired using a T1-weighted sequence [TR, 500 msec; TE, 11 msec; NEX, 2; field of view (FOV), 24 cm; slice thickness, 7 mm; skip factor, 0; imaging matrix, 128 × 64]. Whole-brain axial images were acquired using a spoiled gradient-recalled acquisition in a steady state sequence (TR, 25 msec; TE, 5 msec; α = 45°; NEX, 2; FOV, 24 cm; slice thickness, 2 mm; skip factor, 0; imaging matrix, 256 × 192).

For the six subjects who repeated the EYES, MOUTH, and RADIAL tasks, functional images were acquired from seven contiguous oblique axial slices aligned parallel to, and centered on, the right STS. These additional imaging runs were included to explore regions of the temporal lobe anterior to the coronal slices used in the primary experiment.

Data analysis. All functional imaging runs were screened for movement and other artifacts by examining center of mass plots supplemented by visual inspection of the image series in a cine loop. Activated voxels were then identified for each subject and task. Three images from each subtask were used for analysis. These were offset by 4.5 sec from subtask onset to compensate for the hemodynamic delay; i.e., images occurring at 4.5, 6.0, and 7.5 sec after the onset of each subtask were used for analysis. Because there were 15 cycles per run and four imaging runs per task, 180 images per subtask were available for comparison. There were two run pairs per imaging session, each pair consisting of one run in each subtask order (AB and BA). The alternate task orders were used to provide experimental replicates that would balance any systematic physiological artifacts, such as changes in breathing pattern, or physical artifacts associated with the onset of imaging. The AB run from each of the two run pairs was averaged into a single AB run, and an unpaired t test was performed voxel by voxel on that average. A similar unpaired t test was performed on the average of the two BA runs. The t test images from each replicate were then averaged. A criterion of t > 1.96 was used to identify positive activations in this resulting t map, i.e., nominally a p < 0.05 two-tailed test. However, because this criterion was applied to an average of two t maps, the probability of a voxel with purely random variation having a mean t value > 1.96 is 0.00125 (or 0.0025 when tested two-tailed). Activated voxels were then superimposed on higher-resolution anatomical images for each subject as the initial basis of analysis. Because the scaling involved with image interpolation can smooth the shape of the activation, all quantitative analyses were performed on uninterpolated activation images. The Talairach coordinates (Talairach and Tournoux, 1988) of activated voxels were then determined. Finally, the anatomical locations of activated voxels were determined by two investigators working together and were classified using the atlas of Duvernoy (1991). The activated voxels within each anatomical structure were then counted and further categorized as described below.
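
The voxel-wise statistics can be summarized in a short sketch. The fragment below is a re-creation of the procedure as described, under stated assumptions about data layout, and is not the original analysis code: volumes are attributed to subtasks after the 4.5 sec hemodynamic offset, an unpaired t test is computed per voxel on the averaged AB and BA runs, the two replicate t maps are averaged, and the result is thresholded at t > 1.96.

```python
# Minimal sketch of the voxel-wise analysis described above (a re-creation
# under stated assumptions, not the original analysis code).  `run` is a 4-D
# array (x, y, z, time) sampled at TR = 1.5 s; subtasks alternate every 6 s.
import numpy as np
from scipy import stats

TR = 1.5            # sec per image
SUBTASK_DUR = 6.0   # sec per subtask
HRF_OFFSET = 4.5    # sec added to each subtask onset (hemodynamic delay)

def volume_labels(n_vols, first_label):
    """Attribute volumes at 4.5, 6.0, and 7.5 s after each onset to that subtask."""
    labels, onset, label = {}, 0.0, first_label
    while onset < n_vols * TR:
        for k in range(3):
            idx = int(round((onset + HRF_OFFSET + k * TR) / TR))
            if idx < n_vols:
                labels.setdefault(idx, label)
        onset += SUBTASK_DUR
        label = "B" if label == "A" else "A"
    return labels

def t_map(run, first_label):
    labels = volume_labels(run.shape[-1], first_label)
    a = run[..., [i for i, l in labels.items() if l == "A"]]
    b = run[..., [i for i, l in labels.items() if l == "B"]]
    t, _ = stats.ttest_ind(a, b, axis=-1)     # unpaired t test per voxel
    return t

def activation_map(ab_runs, ba_runs, criterion=1.96):
    """Average each replicate's runs, compute t maps, average them, threshold."""
    t_ab = t_map(np.mean(ab_runs, axis=0), first_label="A")
    t_ba = t_map(np.mean(ba_runs, axis=0), first_label="B")
    return (t_ab + t_ba) / 2.0 > criterion    # boolean map of positive activations
```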

To simplify the initial anatomical analysis, the voxels were sorted into four anatomical groups based on contiguity and previous functional findings (Fig. 2). The dorsomedial region included the cingulate, superior parietal, superior occipital, angular, and supramarginal gyri, the intraparietal and cingulate sulci, and the precuneus. The lateral region included the superior temporal, middle temporal, inferior temporal, and middle occipital gyri, the Sylvian, superior temporal, inferior temporal, and lateral occipital sulci, and the parieto-temporo-occipital fossa (PTOF) (Vaina, 1994). The PTOF and nearby cortex constitute a movement-sensitive region (Watson et al., 1993; McCarthy et al., 1995; Tootell et al., 1995) probably homologous to the monkey movement-sensitive areas MT/V5 and MST (Maunsell and Van Essen, 1983; Desimone and Ungerleider, 1986; Tanaka and Saito, 1989; Lagae et al., 1994). We will use the term PTOF to refer to an anatomically defined region and the term MT/V5 for functionally defined movement-sensitive cortex. The ventral region comprised the fusiform, inferior occipital, and fourth occipital gyri, and the occipitotemporal and inferior occipital sulci. The ventral region includes those regions strongly activated by faces in previous functional magnetic resonance imaging (fMRI) studies (Puce et al., 1995, 1996; Clark et al., 1996; Kanwisher et al., 1997; McCarthy et al., 1997). The ventromedial region included the collateral and calcarine sulci, the lingual and cuneate gyri, and the cuneus and parieto-occipital fissure.

Fig. 2.

Four anatomical regions for classification of activated voxels (lateral, dorsomedial, ventromedial, and ventral) and their borders are outlined on the left side of a coronal anatomical image. Some of the structures falling within each region are shown on the right. STS, Superior temporal sulcus; MTG, middle temporal gyrus; ITS, inferior temporal sulcus; ITG, inferior temporal gyrus; OTS, occipitotemporal sulcus; FG, fusiform gyrus; CS, collateral sulcus; LG, lingual gyrus; CaS, calcarine sulcus; POF, parieto-occipital fissure; PrC, precuneus; Ci, cingulate gyrus and sulcus; SPG, superior parietal gyrus; IPS, intraparietal sulcus; AG/SuG, angular or supramarginal gyri.

Within-subjects, repeated-measures ANOVAs were computed in which the number of activated voxels was the dependent variable, and task, hemisphere (left or right), slice (1–7), and anatomical region (lateral, dorsomedial, ventromedial, and ventral) were independent variables. Four task comparisons were computed: (1) EYES, MOUTH, and RADIAL; (2) EYES, MOUTH, SIMULATED EYES, and SIMULATED MOUTH; (3) RADIAL, SIMULATED EYES, and SIMULATED MOUTH; and (4) the moving eyes and moving mouth subtasks from the EYES versus MOUTH task. Additional analyses were performed to look for anatomical patterns within the structures constituting the four anatomical regions.
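
For concreteness, an analysis of this form could be set up as sketched below; the data layout and file name are assumptions, and this is not the software actually used for the paper.

```python
# Minimal sketch (assumed data layout; not the authors' statistics software):
# a within-subjects repeated-measures ANOVA on activated-voxel counts with
# task, hemisphere, slice, and region as within-subject factors.
# "voxel_counts.csv" is a hypothetical long-format file with columns:
# subject, task, hemisphere, slice, region, n_voxels (one row per cell).
import pandas as pd
from statsmodels.stats.anova import AnovaRM

counts = pd.read_csv("voxel_counts.csv")

result = AnovaRM(
    data=counts,
    depvar="n_voxels",
    subject="subject",
    within=["task", "hemisphere", "slice", "region"],
).fit()
print(result.anova_table)   # F and p values for main effects and interactions
```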

RESULTS

Figure 3 presents results from a single subject for five experimental tasks. Five contiguous anatomical slices are shown, with the most anterior slice at the left. Discrete foci of activation (framed by white squares) in the right STS were observed in anterior slices 1 and 2 for EYES and MOUTH but not for SIMULATED EYES, SIMULATED MOUTH, or RADIAL. In contrast, all tasks activated the right PTOF in slice 4 (framed by white circles) with additional bilateral activation of the PTOF in slice 5. Similar patterns of activation were observed in the other subjects.

Fig. 3.

Individual subject activation data overlaid on T1-weighted coronal anatomical images. Slice 1 is the most anterior. In EYES and MOUTH, focal activation was observed in the right lateral cortex of the two most anterior slices (framed by white squares). No activation was seen in the same regions for the other tasks. Activation in all tasks (framed by white circles) was seen in another region of right lateral cortex posterior and inferior to that seen to EYES and MOUTH. In this and Figure 4, the right hemisphere appears on the left side of the image, and the red to yellow color scale indicates lower to higher t values of activation. In this and Figure 4, activation data have been scaled, translated, and interpolated to fit their anatomical counterparts.

Figure 4 presents results from another individual for EYES and RADIAL from coronal and axial imaging sessions. Activation of the right STS to EYES was observed in coronal slices 1–3 (Fig. 4A, white squares) and in the corresponding regions in axial slices 5 and 7 (Fig. 4B, white squares). Less extensive activation of the left STS was also observed (Fig. 4, A, slice 2, B, slice 7, white squares). There was little or no activation to RADIAL in the STS in these slices. In contrast, activation common to both tasks was seen in the PTOF and the lateral occipital sulcus (LOS) (Fig. 4, A, coronal slices 4–7, corresponding regions in B, axial slices 1–4).

Fig. 4.

Individual subject activation data for EYES (top) and RADIAL (bottom) overlaid on T1-weighted anatomical images. A, Coronal slices 1–7. Slice 1 is the most anterior. A region of activation in the right lateral cortex is seen in slices 1–3 to EYES but not to RADIAL (white squares). Extensive activation of lateral cortex bilaterally occurs in slices 4–7 for both EYES and RADIAL. Activation in the IPS (white circles) was also seen to EYES anteriorly in slices 1 and 2 and posteriorly to RADIAL in slices 6 and 7. B, Oblique axial slices (1–7) for the same subject and tasks. Slice 1 is the most ventral. Activation to EYES (white squares) but not to RADIAL is seen in slices 5 and 7, as in A.

Activation was observed in the intraparietal sulcus (IPS) to EYES (Fig. 4A, slices 1, 2, framed by white circles) and to RADIAL (Fig. 4A, slices 6, 7, framed by white circles). Activation was also observed in the calcarine cortex and collateral sulcus to RADIAL (Fig. 4, A, slices 4–7, B, slice 1).

Activation in the lateral region

Consistent with the illustrative data of Figures 3 and 4, the greatest number of activated voxels occurred within the lateral region for all conditions in all subjects. EYES and MOUTH produced activation mainly in the anterior slices (Fig. 5, top), whereas RADIAL produced activation mainly in slices 4 and 5 of the left hemisphere (p < 0.01 for task; p < 0.05 for slice; p < 0.01 for hemisphere × task × slice).

Fig. 5.

Voxel counts as a function of hemisphere (R, right; L, left) and slice for each region in 11 subjects. Lateral (top), dorsomedial (second from top), ventromedial (second from bottom), and ventral (bottom) for EYES (gray histograms), MOUTH (white histograms), and RADIAL (black histograms). Slice 1 is the most anterior, and slice 7 is the most posterior. EYES elicited more activation in slices 1–3 than the other two tasks, whereas RADIAL elicited the most prominent activation in slices 4–7 in the left hemisphere. In the dorsomedial region, the most prominent activation was elicited to EYES in slices 1 and 2 of the left hemisphere and to RADIAL in slices 5–7 of both hemispheres. In the ventromedial region, RADIAL elicited the most prominent activation in slices 4–7 of both hemispheres. The least activation was seen in the ventral region and was not different across tasks.

When the number of activated voxels in the right lateral region was examined as a function of anatomical structure, the combined STS and ITS accounted for 49 and 46% of the total activation for EYES and MOUTH, respectively (Fig. 6, left panel). The number of activated voxels for EYES was greater than that for MOUTH, but this difference did not reach statistical significance (p = 0.12). In contrast to the activation in STS and ITS by EYES and MOUTH, RADIAL mainly activated the left PTOF and the LOS (Fig. 6, right panel), which together accounted for 58% of the total activation across all slices in left lateral cortex.

Fig. 6.

Voxel counts as a function of hemisphere and anatomical structure for the lateral region for EYES (gray histograms), MOUTH (white histograms), and RADIAL (black histograms) in 11 subjects. In the right hemisphere, EYES produced the most activation in the STS, whereas in the left hemisphere the most activation occurred in the PTOF and LOS to RADIAL. Syl, Sylvian fissure; STG, superior temporal gyrus; STS, superior temporal sulcus; MTG, middle temporal gyrus; ITS, inferior temporal sulcus; ITG, inferior temporal gyrus; PTOF, parieto-temporo-occipital fossa; LOS, lateral occipital sulcus; MOG, middle occipital gyrus.

As illustrated in Figures 3 and 4, the activation in lateral cortex formed two discontinuous clusters, an anterior cluster elicited mainly by EYES and MOUTH and a posterior cluster elicited by all three tasks. The centroids of these clusters were calculated for EYES, MOUTH, and RADIAL. The anterior centroids were calculated in two ways: (1) an unrestricted method that included all activated voxels from the Sylvian fissure to the inferior temporal gyrus regardless of their proximity to the major activation cluster; and (2) a restricted method that included only activated voxels from the STS and ITS, in which the major activation occurred. A similar approach was used to calculate the posterior centroids. The unrestricted calculation included voxels from the PTOF, LOS, and middle occipital gyrus, whereas the restricted calculation included only voxels from the PTOF and LOS. As shown in Table 1, the centroids for the unrestricted and restricted calculations were virtually identical. Thus, the centroids calculated from the more restricted anatomical structures provide an accurate representation of the results. A graphical depiction of these centroids is shown superimposed on a sagittal view of a representative brain in Figure 7, in which the close spatial correspondence of the anterior centroids for EYES and MOUTH can be appreciated. The spatial overlap of the posterior centroids for EYES, MOUTH, and RADIAL is also apparent.
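
Computationally, a centroid of this kind reduces to averaging voxel coordinates. The sketch below is an assumption about the bookkeeping rather than the authors' code: it computes each subject's centroid as the mean Talairach coordinate of the activated voxels in the chosen structures and then takes the across-subject mean and SEM, with the restricted variants obtained by limiting the structure set.

```python
# Minimal sketch (bookkeeping is assumed, not the authors' code): per-subject
# centroids as the mean Talairach coordinate of activated voxels in the chosen
# structures, then the across-subject mean and SEM as reported in Table 1.
import numpy as np

def cluster_centroid(voxels, restrict_to=None):
    """voxels: iterable of (x, y, z, structure) tuples for one subject."""
    if restrict_to is not None:
        voxels = [v for v in voxels if v[3] in restrict_to]
    return np.mean([v[:3] for v in voxels], axis=0)

def group_centroid(per_subject_voxels, restrict_to=None):
    """Across-subject mean centroid and SEM, one value per axis (x, y, z)."""
    cs = np.array([cluster_centroid(v, restrict_to) for v in per_subject_voxels])
    return cs.mean(axis=0), cs.std(axis=0, ddof=1) / np.sqrt(len(cs))

# Restricted anterior centroid, e.g.:
# mean_xyz, sem_xyz = group_centroid(subject_voxel_lists, restrict_to={"STS", "ITS"})
```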

Table 1.

Activation centroids in Talairach coordinates (x, y, z) and SEM

Fig. 7.

Activation centroids to EYES, MOUTH, and RADIAL. Two centroids are shown: anteriorly for the STS/ITS and posteriorly for the PTOF/LOS for coronal and axial fMRI studies. A, Right hemisphere. B, Left hemisphere. Centroids are superimposed on a sagittal view of a representative brain, 44 mm from the midline. In this and Figure 9, coordinates in the y-axis (horizontal) and z-axis (vertical) are in the system of Talairach and Tournoux (1988), and the anterior commissure–posterior commissure line (horizontal line) and the anterior commissure at y = 0 (vertical line) are shown. The SEs around the centers of activation (x, y, z) for the coronal studies were EYES anterior (left, 1, 1, 2; right, 1, 1, 2), MOUTH anterior (left, 1, 3, 2; right, 3, 2, 2), EYES posterior (left, 1, 1, 2; right, 2, 2, 2), MOUTH posterior (left, 9, 2, 2; right, 3, 4, 2), and RADIAL posterior (left, 1, 2, 1; right, 2, 2, 2).

The preferential activation of STS to EYES and MOUTH is shown in Figure 8. Here, a single cycle of activation was created by averaging across all cycles for each of the EYES and MOUTH tasks for 10 subjects (one subject with no activation to any task was eliminated). The activated voxels in the right STS were interrogated across the image time series for all experimental runs. The magnetic resonance activation signal in the right STS (Fig. 8) increased steadily during eye or mouth movement and then decayed after movement cessation. The peak signal change was 0.7%. There was negligible activation in these same voxels by RADIAL.
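
The time course in Figure 8 is a cycle average of the STS voxel time series expressed as percent signal change. A minimal sketch of such a computation follows; the baseline definition is an assumption, because the text does not specify it.

```python
# Minimal sketch (not the authors' code): cycle-averaged percent signal change
# for the right-STS voxels, as plotted in Figure 8.  Assumes the series starts
# at a cycle boundary and uses the run mean as the %ΔS/S baseline (the exact
# baseline is not specified in the text).
import numpy as np

TR = 1.5
CYCLE_VOLS = 8          # 12 s cycle / 1.5 s per image

def cycle_average_pct(ts):
    """ts: 1-D time series averaged over the activated STS voxels in one run."""
    n_cycles = len(ts) // CYCLE_VOLS
    cycles = ts[: n_cycles * CYCLE_VOLS].reshape(n_cycles, CYCLE_VOLS)
    mean_cycle = cycles.mean(axis=0)
    baseline = ts.mean()
    return 100.0 * (mean_cycle - baseline) / baseline   # %ΔS/S over one cycle
```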

Fig. 8.

Time course of activation of the right STS for a single 12 sec cycle averaged over all cycles in each task for 10 subjects. Percent signal change (%ΔS/S) is shown on the y-axis for EYES, MOUTH, and RADIAL. For the first 6 sec of the cycle the relevant stimulus is in motion, whereas for the second half of the cycle it is stationary. The right STS is activated by EYES (solid line) and MOUTH (broken line) but not by RADIAL (dotted line).

Fewer voxels were activated within the lateral region during the EYES versus MOUTH task. Only 40% of the voxels activated in EYES were activated by moving eyes within EYES versus MOUTH. Moving mouths within EYES versus MOUTH produced 67% of the voxels activated by MOUTH. These results suggest considerable overlap in the activation by EYES and MOUTH. The statistical analysis for EYES versus MOUTH (Fig. 3, slices 1, 2) showed only a significant main effect of slice (p < 0.01), indicating that activation occurred primarily in anterior slices.

Even fewer voxels were activated in the SIMULATED EYES and SIMULATED MOUTH tasks. Significantly fewer voxels were activated when both control tasks were compared with EYES and MOUTH (task, p < 0.01) or with RADIAL (task, p < 0.01). No other significant effects or interactions were noted. The restricted posterior activation centroids for SIMULATED EYES and SIMULATED MOUTH were similar to those of EYES, MOUTH, and RADIAL (Table 1).

The results of the 11 subjects tested with coronal slices showed that the most consistent activation to EYES and MOUTH occurred in the most anterior slices. This raised the concern that our coronal slices may have been posterior to the main locus of activation. For this reason, the six subjects with the strongest activation in the lateral region were rescanned in an oblique axial-imaging study for the EYES, MOUTH, and RADIAL tasks. The patterns of activation in the axial study were similar to those seen in the coronal studies (Fig. 4). The restricted activation centroids in the STS and ITS for EYES and MOUTH in the axial study were similar to those in the coronal study for both hemispheres (Fig. 7, Table 1), indicating that the coronal study encompassed the main locus of activation to EYES and MOUTH. The posterior activation centroids in the axial study were virtually unchanged from the coronal study (Fig. 7, Table 1).

Activation in the dorsomedial region

Fewer voxels were activated within the dorsomedial region than the lateral region. The most prominent activation occurred for EYES in slices 1 and 2 of the left hemisphere (Figs. 4A, slices 1, 2, white circles; 5, second from top) and for RADIAL in slices 5–7 in both hemispheres (Figs. 4B, slices 6, 7, white circles; 5, second from top). These observations were confirmed by ANOVA, which revealed a significant main effect of task (p < 0.05) and a significant interaction effect of hemisphere × task × slice (p < 0.01). The IPS contributed 59, 45, and 70% of the activated voxels in EYES, MOUTH, and RADIAL, respectively. EYES preferentially activated the left anterior IPS, whereas RADIAL activated the posterior IPS in both hemispheres (Fig. 4, white circles).

Statistical comparison of the EYES versus MOUTH task revealed only a main effect for slice (p < 0.05), confirming that more activation occurred in the anterior slices. Greater activation occurred to EYES and MOUTH than to SIMULATED EYES and SIMULATED MOUTH in the anterior slices (task, p < 0.05; slice, p < 0.01; hemisphere × task × slice, p < 0.05). A comparison of RADIAL, SIMULATED EYES, and SIMULATED MOUTH revealed greater activation to RADIAL (task, p < 0.01; hemisphere × task × slice, p < 0.01).

Activation in the ventromedial region

Strong posterior activation occurred in slices 4–7 for the RADIAL task (slice, p < 0.01; task, p < 0.01), whereas EYES and MOUTH elicited negligible activation (Fig. 5, second from bottom). The collateral sulcus and the lingual gyrus combined produced 81% of the activation in this region.

Activation in the ventral region

The ventral region produced the fewest activated voxels of the four regions, with the greatest concentration occurring in slices 3–5 (slice, p < 0.05), but with no significant differences among RADIAL, EYES, and MOUTH (Fig. 5, bottom).

DISCUSSION

The major results of this study indicate that a region of the temporal lobe centered in the STS is activated when subjects view a face in which the eyes or mouth are moving (Figs. 7, 8). The active region comprises the posterior portion of the straight segment of the STS (Fig. 7). These activations were not attributable to movement per se. Nonfacial movement in the same part of the visual field as occupied by the eyes or mouth, or movement of a radial background, activated an area that was ventral and posterior to this region (the PTOF and LOS), corresponding to area MT/V5. As can be seen in Figure 9, the activation centroids in MT/V5 in the present study correspond closely to those reported in other studies of nonbiological motion.

Fig. 9.

Centroids of activation (STS/ITS for EYES and MOUTH and PTOF/LOS for RADIAL) in this study compared with centroids of activation for the perception of hand action or body movement (Bonda et al., 1996), hand grasping (Rizzolatti et al., 1996; Grafton et al., 1996), silent lip-reading of numbers (Calvert et al., 1997), nonanimate movement (Watson et al., 1993; McCarthy et al., 1995; Tootell et al., 1995), and the perception of static faces (Puce et al., 1995, 1996). A, Right hemisphere. B, Left hemisphere. Centroids of activation are superimposed on two sagittal views of a representative brain.

These results suggest that a discrete region of cortex centered in the STS is involved in the perception of eye and mouth movement. That such regions may be lateralized is suggested by Campbell et al. (1986), who reported that a prosopagnosic patient with a right occipitotemporal lesion was deficient in determining direction of gaze but could lip-read normally, whereas a patient with a left occipitotemporal lesion was alexic and could not lip-read but could recognize familiar faces and determine direction of gaze normally. We found that changes in direction of gaze activated the right STS more than the left. However, this difference did not reach statistical significance (p = 0.10). Calvert et al. (1997) reported that silent lip-reading of words activated a bilateral region of the superior temporal gyrus (presumably including cortex within the STS) 2.2–3.0 cm anterior to the bilateral regions described here (Fig. 9). The STS may also participate in the perception of biological motion. When subjects viewed point-light simulations of hand action, body movement, object motion, and random motion, a region of the STS was activated by hand and body movement but not by the other movement tasks (Bonda et al., 1996). Their activations were 0.5–1.5 cm posterior and superior to the region activated in our study (Fig. 9). In positron emission tomographic studies, Rizzolatti et al. (1996) found that the observation of grasping movements activated the left middle temporal gyrus and STS centered at y = −36 (Rizzolatti et al., 1996) and at y = −21 (Grafton et al., 1996). This region is considerably anterior (Fig. 9) to the region activated by hand action by Bonda et al. (1996) for reasons that are unclear. However, taken together these studies strongly implicate the human STS and adjacent cortex in the perception of facial and body movements of other individuals.

In previous fMRI studies, we reported two regions activated by faces (Puce et al., 1995, 1996). The major activation occurred in ventral occipitotemporal cortex, primarily within the fusiform gyrus. It is notable that this region showed negligible activation in the present study, presumably because of the continuous presence of a face during each task. We also reported activation of lateral cortex by faces, including activation within the PTOF and in and near the STS (Fig. 9) (Puce et al., 1995, their Fig. 7). The activation of these same regions in the present study by moving eyes and mouths suggests a functional dissociation between the ventral and lateral regions activated by faces.

Further support for a functional dissociation in face processing is derived from differences we have observed between intracranial and scalp ERP recordings. An intracranial ERP (N200), recorded primarily from the fusiform gyrus (Allison et al., 1994a,c), is evoked predominantly by faces and to a lesser extent by nonface stimuli. N200 is larger to faces than to eyes and other face parts viewed in isolation (Allison et al., 1994b). A similar face-specific ERP (N170) can be recorded from the lateral temporal scalp. N170 is larger to eyes viewed in isolation than to faces, leading Bentin et al. (1996) to conclude that N170 reflects activity in a different eye-sensitive region of cortex. The neural generator of the scalp-recorded N170, and hence the location of the eye-sensitive region, is unknown. Bentin et al. (1996) concluded on the basis of its location and orientation that the fusiform gyrus was an unlikely generator of N170 and instead proposed the occipitotemporal sulcus (OTS). The present study shows that moving eyes primarily activate the STS and not the OTS. The STS and adjacent surface cortex are favorably located for the generation of N170, but this issue is unresolved and complicated by the fact that Bentin et al. (1996) used static views of faces and isolated eyes. It may be that eye movement is necessary to engage the STS. However, combined with the present study, these results suggest that there are two separate systems participating in the processing of information relating to faces: a ventral region involved with faces and a lateral region concerned with face components, or the movement of face components. The former system would provide information necessary for the recognition of facial identity, whereas the latter would provide information necessary for the successful interpretation of facial gesture.

Direction of gaze is thought to be an important facial gesture. In monkeys, gaze direction is an important component of facial expressions, particularly those related to dominance and submission (Hinde and Rowell, 1962; Mendelson et al., 1982; Perrett et al., 1990; Perrett and Mistlin, 1990; Brothers and Ring, 1993). Given the importance of these facial signals, it is not surprising that some neurons in monkey temporal visual cortex (primarily in the STS) are sensitive to eye and head direction (Hasselmo et al., 1989; Perrett et al., 1985, 1992). These neurons may play a role in what Perrett et al. (1992) call “social attention,” signaling the direction of another individual’s attention. In the monkey temporal lobe, cells responsive to direction of gaze tend to be located within the STS, whereas cells responsive to face identity tend to be located in adjacent inferior temporal cortex (Yamane et al., 1988; Hasselmo et al., 1989; Perrett et al., 1990, 1992). In humans and monkeys, direction of gaze provides information in social situations, expresses intimacy, and allows inferences about the direction of attention of another individual (Kleinke, 1986; Perrett and Mistlin, 1990). We suggest that the superior temporal region activated by moving eyes (Fig. 9) is involved in the perception of direction of gaze.

This same region of superior temporal cortex also responded to mouth movement (Fig. 9). In monkeys, mouth movements are also an important component of facial gesture. For example, mouth opening and teeth baring are components of threat or fear for many species, whereas “smiling” denotes submission or a positive affect (Chevalier-Skolnikoff, 1973; Redican, 1982). It is possible that in humans the STS and surrounding cortex are involved in the interpretation of facial gestures involving the mouth. We have interpreted our results to mean that the activated portion of the STS is preferentially involved in the perception of dynamic facial movement. Although plausible, this interpretation remains unproven, because (1) we have not studied activation evoked by eye and mouth movement compared with static views of direction of gaze or mouth configuration; (2) we have not studied the possible activation of this region by complex but inanimate objects, e.g., a swinging pendulum; and (3) the responsiveness of monkey STS cells to moving eyes and mouths has not yet been reported.

Aside from the activations already discussed, the only other substantial activation occurred bilaterally in the IPS. The IPS is a large structure and is likely functionally diverse. For example, it is activated by viewing gratings (Gulyás and Roland, 1995), by viewing letter strings and faces (Puce et al., 1996), and by reading music (Sergent et al., 1992b). The functional significance of IPS activation in this study is unknown. However, the RADIAL task primarily activated the posterior portion of the IPS, suggesting that this region may be a component of the dorsal visual pathway dealing with movement and spatial location.

Finally, we note that EYES activated area MT/V5 in the right hemisphere only slightly less than did RADIAL (Fig. 6), although the radial background moved continuously during the EYES task. Thus, the continuously moving radial background did not control for movement per se in the EYES task in MT/V5. We consider four possible explanations. First, EYES may have activated a population of MT/V5 cells responsive to more central portions of the visual field, in addition to the cells responsive to the peripheral radial background. This explanation does not, however, account for the relative lack of MT/V5 activation by MOUTH, SIMULATED EYES, and SIMULATED MOUTH, which also included movements in the central portions of the visual field. Second, MT/V5 may be more sensitive to coherent motion, such as that produced by conjugate eye movements, than to the noncoherent motion of the other tasks. Third, activation of MT/V5 above that elicited by the moving radial background may represent attentional modulation (O’Craven et al., 1997). Moving eyes may be a highly salient stimulus and thus may engage attention more than the other tasks. Last, MT/V5, or a subregion of it, may in fact be sensitive to moving eyes. Single-unit recordings in monkey MT/MST have determined its responsiveness to moving slits, dots, optical flow, and other kinds of nonbiological movement (Maunsell and Van Essen, 1983; Desimone and Ungerleider, 1986; Tanaka and Saito, 1989; Lagae et al., 1994). A portion of the STS receives input from MST (Baizer et al., 1991). If the human STS has a similar connectivity, the region of STS described here may receive input from a region of MT/V5 that itself is responsive to eye movement. Whether a population of cells preferentially responsive to movements of animate objects is present in monkey MT/MST, and whether such results could explain the activation of MT/V5 by eye movements in this study, remain to be determined.

Footnotes

  • This work was supported by the Department of Veterans Affairs, by the US-Israel Binational Science Foundation, and by National Institute of Mental Health Grant MH-05286. We thank H. Sarofin for assistance.

    Correspondence should be addressed to Dr. Aina Puce, Neuropsychology Laboratory 116B1, Veterans Administration Medical Center, West Haven, CT 06516.

REFERENCES

  1. Allison T, Ginter H, McCarthy G, Nobre A, Puce A, Luby M, Spencer DD (1994a) Face recognition in the human extrastriate cortex. J Neurophysiol 71:821–825.
  2. Allison T, McCarthy G, Belger A, Puce A, Luby M, Spencer DD, Bentin S (1994b) What is a face?: electrophysiological responsiveness of human extrastriate visual cortex to human faces, face components, and animal faces. Soc Neurosci Abstr 20:316.
  3. Allison T, McCarthy G, Nobre A, Puce A, Belger A (1994c) Human extrastriate visual cortex and the perception of faces, words, numbers, and colors. Cereb Cortex 5:544–554.
  4. Baizer JS, Ungerleider LG, Desimone R (1991) Organization of visual inputs to the inferior temporal and parietal cortex in macaques. J Neurosci 11:168–190.
  5. Bentin S, Allison T, Puce A, Perez E, McCarthy G (1996) Electrophysiological studies of face perception in humans. J Cognit Neurosci 8:551–565.
  6. Bonda E, Petrides M, Ostry D, Evans A (1996) Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J Neurosci 16:3737–3744.
  7. Brothers L, Ring B (1993) Mesial temporal neurons in the macaque monkey with responses selective for aspects of social stimuli. Behav Brain Res 57:53–61.
  8. Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SCR, McGuire PK, Woodruff PWR, Iversen SD, David AS (1997) Activation of auditory cortex during silent lipreading. Science 276:593–596.
  9. Campbell R, Landis T, Regard M (1986) Face recognition and lip-reading: a neurological dissociation. Brain 109:509–521.
  10. Chevalier-Skolnikoff S (1973) Facial expression of emotion in nonhuman primates. In: Darwin and facial expression: a century of research in review, ed Ekman P (Academic, New York), pp 11–89.
  11. Clark VP, Keil K, Maisog JM, Courtney S, Ungerleider LG, Haxby JV (1996) Functional magnetic resonance imaging of human visual cortex during face matching: a comparison with positron emission tomography. NeuroImage 4:1–15.
  12. Desimone R (1991) Face-selective cells in the temporal cortex of monkeys. J Cognit Neurosci 3:1–8.
  13. Desimone R, Ungerleider LG (1986) Multiple visual areas in the caudal superior temporal sulcus of the macaque. J Comp Neurol 248:164–189.
  14. Duvernoy H (1991) The human brain (Springer, Vienna).
  15. Grafton ST, Arbib MA, Fadiga L, Rizzolatti G (1996) Localization of grasp representations in humans by positron emission tomography. 2. Observation compared with imagination. Exp Brain Res 112:103–111.
  16. Gross CG (1992) Representation of visual stimuli in inferior temporal cortex. Philos Trans R Soc Lond B Biol Sci 335:3–10.
  17. Gulyás B, Roland PE (1995) Cortical fields participating in spatial frequency and orientation discrimination: functional anatomy by positron emission tomography. Hum Brain Mapp 3:133–152.
  18. Hasselmo ME, Rolls ET, Baylis GC, Nalwa V (1989) Object-centered encoding by face-selective neurons in the cortex in the superior temporal sulcus of the monkey. Exp Brain Res 75:417–429.
  19. Haxby JV, Horwitz B, Ungerleider LG, Maisog JM, Pietrini P, Grady CL (1994) The functional organization of human extrastriate cortex: a PET-rCBF study of selective attention to faces and locations. J Neurosci 14:6336–6353.
  20. Hinde RA, Rowell TE (1962) Communication posture and facial expression in the rhesus monkey (Macaca mulatta). Proc Zool Soc Lond 138:1–21.
  21. Johansson G (1973) Visual perception of biological motion and a model of its analysis. Percept Psychophys 14:202–211.
  22. Kanwisher N, McDermott J, Chun MM (1997) The fusiform face area: a module in human extrastriate cortex specialized for face perception. J Neurosci 17:4302–4311.
  23. Kleinke CL (1986) Gaze and eye contact: a research review. Psychol Bull 100:78–100.
  24. Lagae L, Maes H, Raiguel S, Xiao D-K, Orban GA (1994) Responses of macaque STS neurons to optic flow components: a comparison of areas MT and MST. J Neurophysiol 71:1597–1626.
  25. Maunsell JHR, Van Essen DC (1983) Functional properties of neurons in middle temporal visual area of the macaque monkey. I. Selectivity for stimulus direction, speed, and orientation. J Neurophysiol 49:1127–1147.
  26. McCarthy G, Spicer M, Adrignolo A, Luby M, Gore J, Allison T (1995) Brain activation associated with visual motion studied by functional magnetic resonance imaging in humans. Hum Brain Mapp 2:234–243.
  27. McCarthy G, Puce A, Gore JC, Allison T (1997) Face-specific processing in the human fusiform gyrus. J Cognit Neurosci 9:604–609.
  28. Mendelson MJ, Haith MM, Goldman-Rakic PS (1982) Face scanning and responsiveness to social cues in infant rhesus monkeys. Dev Psychol 18:222–228.
  29. O’Craven KM, Rosen BR, Kwong KK, Treisman A, Savoy RL (1997) Voluntary attention modulates fMRI activity in human MT-MST. Neuron 18:591–598.
  30. Oram MW, Perrett DI (1994) Responses of anterior superior temporal polysensory (STPa) neurons to “biological motion” stimuli. J Cognit Neurosci 6:99–116.
  31. Perrett DI, Mistlin AJ (1990) Perception of facial characteristics by monkeys. In: Comparative perception, Complex signals, eds Stebbins WC, Berkley MA (Wiley, New York), 2:187–215.
  32. Perrett DI, Smith PAJ, Potter DD, Mistlin AJ, Head AS, Milner AD, Jeeves MA (1985) Visual cells in the temporal cortex sensitive to face view and gaze direction. Proc R Soc Lond B Biol Sci 223:293–317.
  33. Perrett DI, Mistlin AJ, Chitty AJ, Harries M, Newcombe F, de Haan E (1988) Neuronal mechanisms of face-perception and their pathology. In: Physiological aspects of clinical neuro-ophthalmology, eds Kennard C, Rose FC (Chapman and Hall, London), pp 137–154.
  34. Perrett DI, Harries MH, Mistlin AJ, Hietanen JK, Benson PJ, Bevan R, Thomas S, Oram MW, Ortega J, Brierly K (1990) Social signals analyzed at the cell level: someone is looking at me, something touched me, something moved! Int J Comp Psychol 4:25–54.
  35. Perrett DI, Hietanen JK, Oram MW, Benson PJ (1992) Organization and functions of cells responsive to faces in the temporal cortex. Philos Trans R Soc Lond B Biol Sci 335:23–30.
  36. Puce A, Allison T, Gore JC, McCarthy G (1995) Face-sensitive regions in human extrastriate cortex studied by functional MRI. J Neurophysiol 74:1192–1199.
  37. Puce A, Allison T, Asgari M, Gore JC, McCarthy G (1996) Differential sensitivity of human visual cortex to faces, letterstrings, and textures: a functional MRI study. J Neurosci 16:5205–5215.
  38. Puce A, Allison T, Spencer SS, Spencer DD, McCarthy G (1997a) A comparison of cortical activation evoked by faces measured by intracranial field potentials and functional MRI: two case studies. Hum Brain Mapp 5:298–305.
  39. Puce A, Allison T, Bentin S, Gore JC, McCarthy G (1997b) An fMRI study of changes in gaze direction and mouth position. NeuroImage 5:S161.
  40. Redican WK (1982) An evolutionary perspective on human facial displays. In: Emotion in the human face, Ed 2, ed Ekman P (Cambridge UP, Cambridge, UK), pp 212–280.
  41. Rizzolatti G, Fadiga L, Matelli M, Bettinardi V, Paulesu E, Perani D, Fazio F (1996) Localization of grasp representations in humans by PET: 1. Observation versus execution. Exp Brain Res 111:246–252.
  42. Rolls ET (1992) Neurophysiological mechanisms underlying face processing within and beyond the temporal cortical visual areas. Philos Trans R Soc Lond B Biol Sci 335:11–21.
  43. Sergent J, Ohta S, MacDonald B (1992a) Functional neuroanatomy of face and object processing: a positron emission tomography study. Brain 115:15–36.
  44. Sergent J, Zuck E, Terriah S, MacDonald B (1992b) Distributed neural network underlying musical sight-reading and keyboard performance. Science 257:106–109.
  45. Talairach J, Tournoux P (1988) Co-planar stereotaxic atlas of the human brain (Thieme, New York).
  46. Tanaka K, Saito H (1989) Analysis of motion of the visual field by direction, expansion/contraction, and rotation cells clustered in the dorsal part of the medial superior temporal area of the macaque monkey. J Neurophysiol 62:626–641.
  47. Tootell RBH, Reppas JB, Kwong KK, Malach R, Born RT, Brady TJ, Rosen BR, Belliveau JW (1995) Functional analysis of human MT and related visual cortical areas using magnetic resonance imaging. J Neurosci 15:3215–3230.
  48. Vaina L (1994) Functional segregation of color and motion processing in the human visual cortex: clinical evidence. Cereb Cortex 4:555–572.
  49. Watson JDG, Myers R, Frackowiak RSJ, Hajnal JV, Woods RP, Mazziotta JC, Shipp S, Zeki S (1993) Area V5 of the human brain: evidence from combined study using positron emission tomography and magnetic resonance imaging. Cereb Cortex 3:79–94.
  50. Yamane S, Kaji S, Kawano K (1988) What facial features activate face neurons in the inferotemporal cortex? Exp Brain Res 73:209–214.