Abstract
Our ability to interact with the immediate surroundings depends not only on an adequate representation of external space but also on our ability to represent the location of objects with respect to our own body and especially to our hands. Indeed, electrophysiological studies in monkeys revealed multimodal neurons with spatially corresponding tactile and visual receptive fields in a number of brain areas, suggesting a representation of visual peripersonal space with respect to the body. In this functional magnetic resonance imaging study, we localized areas in human intraparietal sulcus (IPS) and lateral occipital complex (LOC) that represent nearby visual space with respect to the hands (perihand space), by contrasting the response to a ball moving near-to versus far-from the hands. Furthermore, by independently manipulating sensory information about the hand, in the visual (using a dummy hand) and proprioceptive domains (by changing the unseen hand position), we determined the sensory contributions to the representation of hand-centered space. In the posterior IPS, the visual contribution was dominant, overriding proprioceptive information. Surprisingly, regions within LOC also displayed visually dominant, hand-related activation. In contrast, the anterior IPS was characterized by a proprioceptive representation of the hand, as well as showing tactile hand-specific activation, suggesting a homology with monkey parietal hand-centered areas. We therefore suggest that, whereas cortical regions within the posterior IPS and LOC represent hand-centered space in a predominantly visual manner, the anterior IPS uses multisensory information in representing perihand space.
Introduction
Peripersonal space is the multisensory space immediately surrounding our body, or more specifically the sector of space that closely surrounds a certain body part (Rizzolatti et al., 1981, 1997). A crucial factor that distinguishes the perception of our immediate surroundings from that of more distant space is our potential ability to interact with (i.e., to reach and grasp, or to avoid) objects within peripersonal space. Electrophysiological studies in macaque monkeys suggest that multisensory peripersonal space is represented in body part-centered coordinate frames, both in subcortical structures (the putamen) and in cortical regions of the frontal and parietal lobes (Hyvärinen, 1981; Rizzolatti et al., 1981; Colby et al., 1993; Graziano and Gross, 1993). Multisensory neurons within these areas have visual and, less commonly, also auditory receptive fields (RFs) mapping the space surrounding the monkey's body in close proximity to the somatosensory RF of the neuron.
In the monkey parietal cortex, several areas are related to peripersonal space processing: neurons with multisensory and spatially overlapping RFs have been reported in the ventral intraparietal area (VIP), most commonly centered on the head (Colby et al., 1993; Duhamel et al., 1997, 1998; Avillac et al., 2005; Schlack et al., 2005); larger tactile RFs with closely matching visual RFs, centered on the arm, torso, and face have been reported in area 7b (Leinonen and Nyman, 1979; Robinson and Burton, 1980a,b; Hyvärinen, 1981; Graziano and Gross, 1994), and some neurons in areas 2 and 5 have visual RFs that have been reported to follow the hand when it was moved to a new position in space (Obayashi et al., 2000).
In humans, most of the evidence for the existence of a multisensory system representing peripersonal space comes from neuropsychological studies with patients suffering from cross-modal extinction after a right hemisphere stroke. In studies of cross-modal extinction, a visual stimulus presented near to the patient's ipsilesional (right) hand often extinguished the perception of a simultaneous tactile stimulus on the patient's contralesional (left) hand. When the right visual stimulus was presented far from the patient's hand, however, the degree of extinction was reduced (di Pellegrino et al., 1997; Làdavas et al., 1998). Furthermore, when the hands were held in a crossed position (such that the left hand was positioned in the right hemispace and vice versa), visual stimulation near the right hand still induced significant extinction of left hand tactile stimuli. These findings are consistent with the electrophysiological findings from monkeys suggesting that the representation of peripersonal space is body part (i.e., hand) centered (Farnè et al., 2005). Other than these neuropsychological findings, and some behavioral evidence showing cross-modal interactions in near space (Spence et al., 2004), little is known about the nature of the representation of peripersonal space in humans, and of perihand space in particular (Culham et al., 2006).
We set out to find cortical areas in humans that represent visual information in hand-centered coordinates, by independently manipulating visual stimulus distance and both visual and proprioceptive feedback of hand position. We report here the finding of a transition, within the parietal cortex, from a strictly visual representation of the hand in the posterior intraparietal sulcus (IPS), to a representation based on both visual and proprioceptive information in the anterior IPS.
Materials and Methods
Subjects.
Eleven healthy volunteers without histories of neurological, psychiatric, or visual deficits were recruited (25–33 years of age; three females). All of the subjects were naive regarding the purpose of the study. All procedures were approved by the Tel-Aviv Sourasky Medical Center Ethics Committee, and subjects gave written informed consent before the experiment.
Magnetic resonance imaging.
All experiments were conducted on a whole-body 1.5 T General Electric (Milwaukee, WI) SIGNA Horizon LX8.25 scanner, equipped with a quadrature surface coil (Nova Medical, Wakefield, MA). Blood oxygenation level-dependent (BOLD) contrast was obtained with a multislice gradient echo-planar imaging (EPI) sequence (repetition time, 2000 ms; echo time, 55 ms; flip angle, 70°; field of view, 21 × 21 cm²; matrix size, 64 × 64), using a real-time system. The scanned volume included 19 near-axial slices of 4 mm thickness with 1 mm interslice gap, covering the whole brain (except for one subject, for whom a slice thickness of 4.5 mm was required). T1-weighted high-resolution (1.2 × 1.2 mm²) anatomical images with the same orientation and slice thickness as the EPI slices were also acquired after each scan to improve the quality of the spatial matching of the functional data with the anatomical data. The functional magnetic resonance imaging (fMRI) images were superimposed on high-resolution (1 × 1 × 1 mm³) whole-brain images acquired with a spoiled gradient (3D-SPGR) sequence (using a standard head coil), allowing accurate cortical segmentation, reconstruction, and volume-based statistical analysis.
Apparatus and procedure.
Because of mounting evidence from neuropsychological research showing disorders in perception of peripersonal space after right hemisphere injuries (i.e., causing deficits for the left side of the body, specifically for the left hand), we chose to study the perihand space of the left hand. We hoped that this procedure would enable us to better identify brain areas in the contralateral (right) hemisphere and therefore be useful in comparison with the data from patients with right hemisphere lesions.
The subjects lay in the magnetic resonance imaging scanner and viewed the apparatus through a mirror placed above their faces. Two cardboard targets were hung in the subject's field of view: one on a table above the subject's left thigh (“near” target) and another suspended centrally in their upper visual field, 70 cm further from the near position toward their feet (“far” target). A fixation point, on which the subject was instructed to fixate throughout all of the experiments (except for the “tactile test,” in which the subject's eyes were closed and covered; see below), was positioned halfway between the two targets.
The visual stimulus was a ball attached to a 70-cm-long stick, which was moved toward (stopping 2–5 cm short of the target) and away from one of the targets at a frequency of ∼1 Hz. The visual stimuli were delivered by a trained experimenter, who paced the movements with a 1 Hz metronome. The experimenter stood to the right of the subject, occluded from their sight by a curtain. The subject could not see the experimenter's hand and could see only the moving ball attached to the stick. The diameters of the balls (2.5 and 4 cm) were matched so that the visual image on the retina was of approximately the same size for the near and far targets, respectively (their retinal positions, however, were different). During the stimulation periods, subjects were required to determine whether the trajectory of the ball would hit the center of the target or not, by covertly responding “yes” or “no.” Subjects practiced this task before each scan, were able to maintain visual fixation while performing it, and reported responding “yes” and “no” approximately equally often. Eye movements were not monitored during scanning.
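As a back-of-the-envelope illustration of this size matching (the exact optical eye-to-target distances are not reported here, so the assumption that the two viewing distances differ by roughly the 70 cm target separation is ours, purely for illustration):

```latex
% Small-angle approximation: a ball of diameter d at viewing distance D
% subtends a visual angle theta ~ d / D. Matching the near and far balls requires
\frac{d_{\text{near}}}{D_{\text{near}}} \approx \frac{d_{\text{far}}}{D_{\text{far}}},
\qquad D_{\text{far}} - D_{\text{near}} \approx 70~\text{cm}
\;\Longrightarrow\;
D_{\text{near}} \approx \frac{d_{\text{near}}}{d_{\text{far}} - d_{\text{near}}} \times 70
  = \frac{2.5}{1.5} \times 70 \approx 117~\text{cm},
\qquad
\theta \approx \frac{2.5}{117} \approx \frac{4}{187} \approx 0.021~\text{rad} \approx 1.2^{\circ}.
```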
This procedure was repeated four times for each subject, under different experimental conditions (for experimental setup, see Fig. 1). In the “real-hand experiment,” the subject's left hand was placed (palm up) on the table and within view. The near target was positioned on the subject's hand, such that the moving ball was 2–5 cm away at its closest point. The subject's right hand rested comfortably by their right side (Fig. 1A). In the “retracted-hand experiment,” the subject's left hand was retracted toward their left shoulder and covered from sight. The near target was placed on the table at the same position as in the real-hand experiment (Fig. 1D). In the “occluded-hand experiment,” the subject's left hand was placed on the table in the same position as in the real-hand experiment, but this time it was occluded from the subject's sight by a cardboard shield, on which the near target was laid (Fig. 1B). In the “dummy-hand experiment,” the subject's left hand was again retracted to their left shoulder and covered. This time, a left-hand-and-wrist cosmetic prosthetic dummy hand was placed on the table, on which the near target was laid (Fig. 1C). After the dummy-hand experiment, subjects were interviewed to determine whether they experienced the “rubber hand illusion” during the scan (Botvinick and Cohen, 1998; Ehrsson et al., 2004). None reported experiencing any such illusion. For six of the subjects, we repeated the procedure with the left hand retracted and the near target placed on the table. In this “far-dummy-hand experiment,” the dummy hand was suspended in midair behind the far target. A sample of the experiments was videotaped and analyzed off-line to assess whether the visual stimuli were similar in frequency across experiments (see supplemental data, available at www.jneurosci.org as supplemental material).
Figure 1. Experimental setup: a 2 × 2 × 2 factorial design. A schematic illustration of the experimental conditions is shown. In each experiment, a moving ball was presented in either of two positions, near or far (with respect to the hand), while the subject was maintaining fixation between the two targets. Proprioceptive and visual information about hand position with respect to the near condition were manipulated between experiments. A, Real-hand experiment: having congruent visual and proprioceptive information. B, Occluded-hand experiment: proprioceptive information only. C, Dummy-hand experiment: illusory visual information. D, Retracted-hand experiment: with the hand away from the near target.
For nine of the subjects, we performed an additional tactile test at the end of these scans, in which they laid both of their hands palm-up with their eyes covered. In a block design, the 2.5 cm plastic ball-on-stick was moved at 1 Hz touching either the left hand or the right hand, in a similar manner as described above for the near target stimulus. The subjects were asked to keep their eyes closed and to assess on each trial if the touch they felt was in the same spatial position as the trial before (i.e., a one-back task).
Experimental design.
The experiments followed a repeated-measures block design. In each scan, two conditions (near target and far target for the majority of experiments, and “left hand” and “right hand” for the tactile test) were interleaved and repeated five times each, with a randomized order of presentation between subjects. In the real-hand experiment, a third condition was added in which the visual stimulus actually touched the subject's left hand at the end of each trajectory. In all experiments, each block lasted 12 s and was followed by a rest (i.e., fixation) period of 10 s. The first and last rest periods were longer (24 and 14 s, respectively). The order of the experiments (real hand, retracted hand, dummy hand, occluded hand, and far dummy hand) was randomized between subjects. The tactile test was always performed after completion of the other scans.
Data analysis.
Analysis was performed using the BrainVoyager 4.96 and BrainVoyager QX (BV QX) software packages (Brain Innovation, Maastricht, The Netherlands). The first seven images of each functional scan were discarded to allow for signal stabilization. Preprocessing included head motion correction, slice-time correction, and high-pass temporal filtering in the frequency domain (cutoff of three cycles per total scan time) to remove drifts and to improve the signal-to-noise ratio. The functional images were superimposed on two-dimensional anatomical images and incorporated into the three-dimensional data sets through trilinear interpolation. The complete data set was transformed into standard space (Talairach and Tournoux, 1988) and Z-normalized for the whole time course. A correction for temporal autocorrelation was applied. To compute statistical parametric maps for the individual and group analyses, we applied a general linear model (GLM) using predictors convolved with the canonical hemodynamic response function (Boynton et al., 1996).
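As a rough, self-contained sketch of this last step (not the BrainVoyager implementation), the following builds block (“boxcar”) predictors, convolves them with a canonical double-gamma HRF, and fits the GLM by ordinary least squares; the scan length, block onsets, and HRF shape parameters are assumptions based on the timings given under Experimental design.

```python
# Sketch of a block-design GLM: boxcar predictors convolved with a canonical
# double-gamma HRF, fitted by ordinary least squares (illustrative only).
import numpy as np
from scipy.stats import gamma

TR = 2.0                                   # repetition time (s)
n_vols = 124                               # ~248 s scan, assumed for illustration
t = np.arange(n_vols) * TR

def canonical_hrf(ts, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
    """Double-gamma haemodynamic response function, scaled to unit peak."""
    h = gamma.pdf(ts, peak) - ratio * gamma.pdf(ts, undershoot)
    return h / h.max()

def block_predictor(onsets, duration, t, tr):
    """Boxcar over the stimulation blocks, convolved with the canonical HRF."""
    box = np.zeros_like(t)
    for onset in onsets:
        box[(t >= onset) & (t < onset + duration)] = 1.0
    hrf = canonical_hrf(np.arange(0.0, 32.0, tr))
    return np.convolve(box, hrf)[: len(t)]

# Illustrative design: 24 s initial rest, then ten 12 s blocks (near and far
# alternating) separated by 10 s rest periods.
near_onsets = [24, 68, 112, 156, 200]
far_onsets = [46, 90, 134, 178, 222]
X = np.column_stack([
    block_predictor(near_onsets, 12, t, TR),
    block_predictor(far_onsets, 12, t, TR),
    np.ones(n_vols),                       # constant (baseline) term
])

y = np.random.randn(n_vols)                # placeholder voxel time course
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
near_vs_far = beta[0] - beta[1]            # "near versus far" contrast estimate
```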
Statistical significance was assessed using a cluster approach, in which the initial voxelwise threshold criterion was p < 0.01, and a subsequent spatial extent threshold of 20–40 contiguous functional voxels (986–1972 mm³), resulting in a whole-brain corrected value of p < 0.05 for each experiment (Forman et al., 1995) (implemented by Monte Carlo simulations in BV QX).
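The cluster-extent logic can be sketched as follows; only the final voxelwise-threshold-plus-minimum-cluster-size step is shown (the Monte Carlo estimation of the corrected p value is omitted), and the function and argument names are ours.

```python
# Schematic cluster-extent thresholding: a voxelwise t cutoff (here one-tailed,
# p < 0.01) followed by a minimum cluster size in contiguous voxels.
import numpy as np
from scipy import ndimage
from scipy.stats import t as t_dist

def cluster_threshold(t_map, df, p_voxel=0.01, min_voxels=20):
    """Return a boolean mask keeping only supra-threshold clusters >= min_voxels."""
    t_crit = t_dist.ppf(1.0 - p_voxel, df)       # voxelwise t cutoff
    supra = t_map > t_crit
    labels, n_clusters = ndimage.label(supra)    # face-connected clusters
    keep = np.zeros_like(supra)
    for i in range(1, n_clusters + 1):
        cluster = labels == i
        if cluster.sum() >= min_voxels:
            keep |= cluster
    return keep

# e.g. mask = cluster_threshold(t_map, df=n_subjects - 1, min_voxels=20)
```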
The activation time courses from individual subjects' data were obtained from statistically significant clusters only [voxelwise t test, p < 0.05, corrected for a minimum spatial extent of 10 functional voxels (488 mm³) in each region of interest (ROI)], after applying the GLM analysis. The average percentage signal change (between 8 and 16 s after block onset) was calculated per condition for all subjects showing significant activation in the real-hand experiment ROI. Paired two-tailed t tests were applied to assess significant differences between the mean percentage signal change across conditions and experiments (using a derived preference index) (see below). Voxels were selected on an individual basis if they showed a preference for the near condition (over the far condition) in the real-hand experiment and were within a given cortical area according to anatomical markers. Because of extended activation for some of the subjects, the size of each ROI was restricted to a maximum of 15 mm from the anatomical marker in each axis. For the lateral occipital complex (LOC) ROI, instead of the anatomical marker, we used a functional marker from a visual object localizer (taken from Culham et al., 2003) and combined it with the preference for the near stimulus in the real-hand experiment. The Talairach and Tournoux coordinates were determined for the mean center of gravity of each ROI across subjects: aIPS, the anterior and inferior end of the IPS (10 subjects; center x = 34, y = −37, z = 43); pIPS, the posterior activation found in the middle of the superior–inferior axis (11 subjects; 22, −66, 47); LOC (9 subjects; 43, −66, −5). To quantify the preference for the near over the far stimulus between experiments, we calculated a standardized preference index adapted from Field and Wann (2005). The index was computed, within each subject, as the difference between the average percentage signal change for the near and far conditions, divided by the group-averaged percentage signal change for the near condition: (near_subject − far_subject) / near_group.
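For concreteness, here is a minimal sketch of these ROI summary statistics (the per-subject preference index and the paired two-tailed t test between experiments); the variable names and placeholder values are illustrative only.

```python
# Per-subject preference index and paired t test between two experiments.
import numpy as np
from scipy.stats import ttest_rel

def preference_index(near, far):
    """near, far: per-subject mean % signal change (8-16 s after block onset)."""
    near = np.asarray(near, dtype=float)
    far = np.asarray(far, dtype=float)
    return (near - far) / near.mean()        # group-mean "near" in the denominator

rng = np.random.default_rng(0)               # placeholder data for 11 subjects
near_dummy, far_dummy = rng.normal(0.5, 0.1, 11), rng.normal(0.3, 0.1, 11)
near_retracted, far_retracted = rng.normal(0.4, 0.1, 11), rng.normal(0.4, 0.1, 11)

pi_dummy = preference_index(near_dummy, far_dummy)
pi_retracted = preference_index(near_retracted, far_retracted)
t_val, p_val = ttest_rel(pi_dummy, pi_retracted)   # paired, two-tailed
```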
Across-subjects statistical parametric maps were calculated using a hierarchical random-effects model analysis (Friston et al., 1999), after spatial smoothing with a three-dimensional Gaussian kernel of 6 mm (full-width at half-maximum). For the far-dummy-hand experiment, which was performed in six subjects, we used a fixed effect model analysis, corrected for false positives (whole-brain cluster analysis with the same initial t value cutoff, and p values as reported before). The statistical parametric maps were overlaid on a representative inflated cortical surface map, reconstructed from the T1-weighted 3D-SPGR scan. This procedure included segmentation of the white matter using a grow-region function, the smooth covering of a sphere around the segmented region, and the expansion of the reconstructed white matter into the gray matter. The surface was then unfolded, cut along the calcarine sulcus, and flattened.
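The full-width at half-maximum specification translates into a Gaussian sigma as sketched below; the voxel dimensions passed to the function are an illustrative assumption rather than the exact grid used by the analysis software.

```python
# FWHM-to-sigma conversion for 6 mm Gaussian spatial smoothing.
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_volume(volume, fwhm_mm=6.0, voxel_mm=(3.0, 3.0, 3.0)):
    """Apply 3D Gaussian smoothing; FWHM = 2*sqrt(2*ln 2)*sigma ~= 2.355*sigma."""
    sigma_mm = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    sigma_vox = [sigma_mm / v for v in voxel_mm]   # convert mm to voxel units per axis
    return gaussian_filter(volume, sigma=sigma_vox)

# e.g. smoothed = smooth_volume(beta_map, fwhm_mm=6.0)
```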
Results
We first localized cortical areas that selectively responded to a visual stimulus (a moving ping-pong ball attached to a stick) (Graziano et al., 1997) when approaching the immediate space surrounding the hand (perihand space). For this purpose, we contrasted the BOLD response to the ball approaching and receding from the left hand, with the response to a ball approaching and receding from a distant target far from the hand (near vs far conditions in the real-hand experiment) (for experimental setup, see Fig. 1A). Because the retinal projection of the ball in the two conditions was different, the preference for the near over the far ball could have resulted from low-level visual differences unrelated to distance from the hand (such as the distance from the eyes, for example). To control for this, we repeated the procedure in the retracted-hand experiment (Fig. 1D): areas that show preference for the near ball when the subject's hand is retracted away from both targets are likely to be sensitive to such changes in the retinal input. We therefore focus our attention on regions that show preference for the near stimulus (compared with the far), only when it is near the hand, suggesting that this preference is indeed hand related.
Second, we wanted to determine whether the preference for the near ball depended only on proprioceptive information concerning the hand, or also on visual information. To this end, we occluded the subject's hand from sight and repeated the experiment (Fig. 1B). In this occluded-hand experiment, although the subject's hand was placed close to the near target, no visual feedback of hand position was available, so that any specificity for the near ball was likely to be based on proprioceptive information (about the occluded hand's location). Additionally, we investigated the role of vision of a “hand” in the representation of perihand space in another experiment (dummy-hand experiment) by placing a realistic dummy hand by the near target, while the subject's own hand was retracted away. This created the illusory visual input of a hand positioned close to the near target, thus conflicting with veridical information from proprioception regarding the subject's actual hand position (Fig. 1C).
Model predictions
Figure 2 depicts three models that allowed us to determine whether an area is hand related and, if so, whether it is visually or proprioceptively driven. If neither visual nor proprioceptive hand position information affected stimulus processing, the preference for the near over the far ball in a given area would be constant and significant in all four experiments. Such an outcome is depicted in Figure 2A.
Figure 2. ROI analysis: model predictions. A schematic illustration of the expected hemodynamic relationship between the near (light lines) and far (dark lines) conditions across four different experiments. A, When the preference for the near condition is consistent regardless of visual and proprioceptive hand position information. B, When the preference for the near condition is observed only when visual information of the hand (real or dummy) indicates its position close to the near target. C, If the preference for the near condition is present only when there is proprioceptive information that the hand is positioned close to the near target.
Another possibility is that the preference for the near ball in perihand space (real-hand experiment) would not be replicated when the hand was retracted away, suggesting that the preference for the near stimulus is modulated by proprioceptive and/or visual hand position information. If the preference for the near ball is recovered when a dummy hand is visible close to the near target, and abolished again when no hand is visible, we can conclude that the area has a visually determined representation of hand position (Fig. 2B). Finally, if the preference for the near ball is significant when the real hand is placed by the near target, but not when it is retracted away (regardless of any visual information), we can conclude that the area has a proprioceptively determined representation of hand position (Fig. 2C). These different models guided our interpretation of the experimental results as described below.
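For clarity, the decision logic of these models can be written out schematically as below (including the mixed, visual-plus-proprioceptive case that becomes relevant for the anterior IPS); this is an interpretive aid rather than part of the analysis pipeline.

```python
# Schematic encoding of the Figure 2 model predictions. Each flag states
# whether a significant near > far preference was found in that experiment.
def classify_roi(pref):
    """pref: dict mapping 'real', 'retracted', 'occluded', 'dummy' -> bool."""
    if pref["real"] and pref["retracted"]:
        return "not hand-related: near preference regardless of hand position (Fig. 2A)"
    if pref["real"] and not pref["retracted"]:
        if pref["dummy"] and pref["occluded"]:
            return "hand-related: both visual and proprioceptive hand position"
        if pref["dummy"]:
            return "hand-related: visually determined hand position (Fig. 2B)"
        if pref["occluded"]:
            return "hand-related: proprioceptively determined hand position (Fig. 2C)"
    return "no consistent hand-related preference"

# e.g. classify_roi({"real": True, "retracted": False, "occluded": False, "dummy": True})
# -> "hand-related: visually determined hand position (Fig. 2B)"
```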
Statistical parametric maps (group analysis)
Localizing areas that represent visual information with respect to the hand
Our first step was to delineate hand-related areas, using a group analysis statistical parametric map approach. Areas with significantly higher BOLD signal in response to a visual stimulus moving near the hand, compared with stimuli moving far from the hand (real-hand experiment), are shown in Figure 3A and Table 1, experiment A. Overall, the activation was more prominent in the right hemisphere (RH) than in the left hemisphere (LH). This is in accordance with the fact that the near ball was presented in the left visual field (near the subject's left hand). In the right hemisphere, activation was found in the occipital cortex, spreading from the calcarine sulcus (CalS) to the posterior collateral sulcus (pColS), and to the most caudal part of the IPS (intersecting with the transverse occipital sulcus [intraparietal transverse occipital (IPTO)]). Activation was also prominent in the parietal cortex along the IPS, in the frontal cortex around the middle frontal sulcus and in the ventral premotor cortex (PMv), and in the right LOC.
Figure 3. Group results: determining the relative contributions of visual and proprioceptive information to the hand schema. fMRI differential activation (whole brain corrected, p < 0.05) for near versus far stimuli on representative inflated and unfolded maps of the right hemisphere (RH) and left hemisphere (LH). Shown are the areas with preference for the ball approaching the near target. A, When next to the subject's hand. B, When the subject's hand was occluded from sight. C, When a dummy hand was placed at the same position as the occluded hand, while the subject's own hand was retracted. D, When the subject's hand was retracted away from the near target. The comparison between the activation preference in the different experiments enables identification of putative hand-related areas in the cortex, as well as the factors (visual or proprioceptive) governing the hand position-related representation. Note that the mere presence of the dummy hand modulated parietal areas in a similar way to the real hand. A, Anterior; P, posterior; CS, central sulcus; ColS, collateral sulcus.
Table 1. Group results: fMRI peak activation coordinates within significant clusters of voxels for the contrast between near and far conditions across four experiments (A–D)
Clusters of voxels showing preference for the near ball position over the “far,” even when the hand was retracted away from the near target, are shown in Figure 3D and Table 1, experiment B. In the right hemisphere, such near preference was seen only in the occipital cortex, in the CalS, pColS, and in the IPTO. Because these areas showed preference for the near stimulus over the far in both the real-hand and retracted-hand experiments, we can conclude (based on the model prediction in Fig. 2A) that this preference is not related to the position of the ball with respect to the hand (i.e., perihand space). However, this is not the case in other areas, such as the IPS, LOC, and the frontal areas, for which the preference for the near ball in the real-hand experiment was lost when the hand was retracted. These areas, showing preference for the near ball only when presented close to the hand, can be regarded as representing visual information with respect to hand position.
Determining the relative contributions of visual and proprioceptive information to the hand schema
The hand schema is defined here as the cortical representation of the hand across sensory modalities. To study the relative contribution of proprioceptive information to the hand schema, we isolated the proprioceptive contribution to the observed BOLD response, by occluding the subject's hand from sight. The contrast between the near and far conditions for the occluded-hand experiment is shown in Figure 3B and Table 1, experiment C. When no visual information concerning hand position was available, activation was restricted to the occipital cortex, again along the CalS, the pColS, and the IPTO. Because these areas showed a similar preference in the retracted-hand experiment, we conclude that their near preference resulted from low-level visual differences between the conditions. At a lower threshold (voxelwise p < 0.04), we also noticed a small cluster in area PMv located in the inferior frontal gyrus (Table 1, experiment C), suggesting that in this area proprioceptive information contributes to the hand schema.
To determine the contribution of visual information to the hand schema, we placed a dummy hand in a natural position by the near target. The results for the contrast between the near and far conditions in the dummy-hand experiment are shown in Figure 3C and Table 1, experiment D. In addition to the occipital areas (CalS, pColS, and IPTO), activation selective for the near condition was also restored in the right hemisphere LOC and posterior parietal cortex, primarily in the posterior part of the IPS. Because these areas showed a preference for the near ball in perihand space, and did not show this preference when the hand was retracted away or occluded from sight, we can conclude (based on the model prediction in Fig. 2B) that the visual information from the dummy hand was sufficient to reactivate them. This suggests that the response of these areas to visual stimuli (placed in perihand space) is based primarily on visual information about hand position, regardless of veridical but conflicting information from proprioception. None of the frontal activation specific to the near ball in the real-hand experiment was retained in the dummy-hand experiment. However, at a reduced threshold (voxelwise p < 0.03, uncorrected), we did find a small cluster in PMv, which did not survive the whole brain cluster-size correction (see Materials and Methods and Table 1, experiment D). This suggests that, in accordance with electrophysiological studies in monkeys, human ventral premotor cortex might be sensitive both to visual and proprioceptive information regarding hand position. Given the marginal statistical significance of this result, however, this conclusion should be taken with caution.
Tactile representation of the hand
To study further the multisensory properties of areas identified as potentially representing perihand space, we conducted a tactile experiment in nine of the subjects, in which the experimenter repeatedly touched the subjects' hands with the same ball previously used as the visual stimulus, while subjects kept their eyes closed. When masked inclusively with the areas showing preference for the near stimulus in the real-hand experiment, the postcentral sulcus and the anterior section of the IPS also responded significantly to the tactile stimulation (Fig. 4). Generally, these areas were not modulated by illusory visual information (i.e., they did not show a preference for the near ball in the dummy-hand experiment) (a small overlap with the areas activated in the dummy-hand experiment was found in the right superior parietal gyrus and left inferior postcentral sulcus). Other sites with overlapping tactile and near-hand visual processing were found in the right central sulcus, posterior superior temporal sulcus, inferior frontal gyrus, and in the left parietal operculum (see supplemental Table 1, available at www.jneurosci.org as supplemental material).
Figure 4. Tactile properties of the hand-related visual areas. Activation on an inflated whole-brain map (shown in orange) represents areas that significantly responded to a tactile stimulus on the subject's left hand. We present only voxels that also show a preference for the near ball over the far in the real-hand experiment (in the absence of any tactile input; green lines). Areas showing preference for the near ball with illusory visual information of the hand position (dummy-hand experiment) are shown in pink. Note that there is little overlap between the tactile areas and the areas modulated by the seen (dummy or real) hand position. LH, Left hemisphere; RH, right hemisphere; CS, central sulcus. Cluster corrected (p < 0.05).
ROI analysis
The results described above were derived from statistical maps and are therefore threshold dependent. To further confirm the representation of visual information with respect to the hand, we conducted an ROI analysis, focusing on some of the higher-order visual areas activated in the real-hand experiment. The regions were chosen for each individual subject according to anatomical markers, as well as for their preference for the near over the far stimulus condition in the real-hand experiment (see Materials and Methods). Within three selected areas, the percentage BOLD signal change, relative to stimulus onset, was compared between conditions and experiments to determine how visual and proprioceptive hand position information contributed to the hand schema (i.e., to the difference between near and far conditions).
Intraparietal sulcus
Unlike the caudal part of the IPS (IPTO), which showed a significant preference for the near ball in all contrast maps (for an ROI analysis, see supplemental Fig. S1, available at www.jneurosci.org as supplemental material), the main section of the IPS showed hand-related properties (i.e., a preference for the near ball only when the hand was present). We focused on two ROIs in the right (contralateral) hemisphere: the posterior aspect of the IPS (pIPS; not to be confused with the most caudal part of the IPS) and the anterior aspect of the IPS (aIPS). ROI analyses of the anterior and posterior IPS in the left hemisphere can be found in the supplemental information (supplemental Fig. S3, available at www.jneurosci.org as supplemental material).
The posterior aspect of the IPS showed a preference for the near ball only with concurrent visual information about hand position (i.e., in both the real-hand and dummy-hand experiments) (Fig. 3A,C). The ROI analysis for this area (Fig. 5) confirmed this finding: the preference for the near ball was abolished when the hand was retracted (p = 0.10, t test comparing the area under the average hemodynamic response curve between the near and far conditions) (see Materials and Methods) and was recovered when a dummy hand was placed instead of the real hand (p = 0.0001). Without visual feedback of the hand, the preference for the near ball in the occluded-hand experiment remained nonsignificant (although p = 0.07). This pattern of results is consistent with the model suggested in Figure 2B, describing an area that represents visual stimuli in hand-centered coordinates, and in which hand position information is principally determined by vision. This result was further confirmed by a comparison between the preference indices for the near ball in the dummy-hand and retracted-hand experiments (see Materials and Methods). Because the only difference between these two experiments was the presence of the dummy hand positioned by the near ball, the significant difference in the preference index between these experiments (paired t test, p = 0.0002) indicates that the visual information concerning the position of the hand (real or illusory) was the main factor governing the response in the pIPS. (Note that there was little effect of proprioceptive information in this region: the retracted-hand and occluded-hand experiments yielded similar preference indices; paired t test, p = 0.23.)
Figure 5. ROI analysis: posterior IPS. A, Areas showing significant preference for the near over the far condition in the real-hand experiment superimposed on a representative inflated right hemisphere. The white-filled circle represents the average location (center of gravity) of the ROI between subjects. B, Averaged hemodynamic response curves of the percentage signal change for the near condition (dark colors) and the far condition (light colors). For each subject, clusters of voxels showing significantly greater activation for the near (compared with the far) condition in the real-hand experiment (within the posterior IPS) were selected as the ROI. These clusters were then closely examined in the three other experiments. The gray background denotes the time of presentation of the visual stimuli (in seconds). Asterisks denote statistical significance between the averaged time courses: *p < 0.05; **p < 0.01. Note that the visual presence of the (dummy or real) hand modulates the response of the region to the visual stimuli.
In the anterior part of the IPS, the contrast maps showed a preference for the near stimulus only in the real-hand experiment, suggesting that converging and matching proprioceptive and visual inputs are needed to generate activity in this region. However, in the ROI analysis (Fig. 6), we found a small but statistically significant preference for the near ball when the hand was present but not seen (occluded-hand experiment; p = 0.029). A comparison between the preference indices for the occluded-hand and retracted-hand experiments (which differ only in the proprioceptive information regarding hand position) yielded a significant difference between the two (paired t test, p = 0.01). In the same area, a comparison between the preference for the near ball in the dummy-hand and retracted-hand experiments (which differ only in visual aspects of hand position) also showed a significant difference (p = 0.02). We conclude that, in the aIPS, in addition to visual information, there is an important proprioceptive contribution to the determination of hand position for perihand space representation.
Figure 6. ROI analysis: anterior IPS. Notations are as in Figure 5. Note that the proprioceptive information of hand position (next to the near target) modulates the response of the region to the visual stimuli.
Lateral occipital complex
A further ROI analysis was aimed at the LOC. Surprisingly, our group results showed enhanced activation in LOC for visual stimuli near the hand, compared with far stimuli. This result was unexpected because this area is regarded as a higher-level visual area, which is not usually considered to be involved in spatial processing (Grill-Spector and Malach, 2004). As far as we know, LOC has never before been examined in this respect (i.e., modulation of activity dependent on the location of visual stimuli with respect to body parts). In our group results, a preference for the near stimulus was shown only with available visual hand position information (i.e., in the real- and dummy-hand experiments). The ROI analysis in LOC (Fig. 7) replicated these findings (dummy-hand experiment, paired t test for “near” vs “far,” p = 0.008). Without visual information regarding hand position, the preference for the near ball was no longer significant, regardless of the actual (i.e., proprioceptive) hand position (retracted-hand experiment, p = 0.13; occluded-hand experiment, p = 0.25). When compared with the retracted-hand experiment, only the preference for the near ball in the dummy-hand experiment was significant (preference indices: retracted hand vs dummy hand, p = 0.04; retracted hand vs occluded hand, p = 0.53), suggesting that in LOC, visual processing of perihand space is modulated by the seen position of the hand (whether real or illusory), although this effect was less prominent than in the aIPS.
Figure 7. ROI analysis: LOC. Notations are as in Figure 5.
Controlling for possible confounding effects of attention
One possible explanation for the observed selectivity of the BOLD response for visual stimuli approaching the subject's real hand (or dummy hand) is that the hand (or dummy hand) naturally captured the subject's visual attention. As a result, the representation of any stimulus near the hand might have been enhanced. This explanation emphasizes the hand as an object of interest, per se, rather than as a body part. Such an interpretation would therefore predict that any object of interest (regardless of whether it resembled the hand) would generate the same preference for stimuli near the attended object.
To account for these possible confounding effects of attention, for six of the subjects we conducted another experiment, in which we placed the far target on a dummy hand positioned far from the subject (far-dummy-hand experiment). Under such circumstances, the dummy-hand position clearly does not correspond anatomically with the normal body scheme. If the preference for the near ball in the hand-related areas resulted from attention being drawn to the hand (or dummy) as an object of interest, rather than as a hand per se, one would predict that the preference for the near ball should switch to the far ball (which now approaches the far dummy hand). However, if the preference for the near ball in the main experiment was attributable to these regions' representation of the hand (or dummy) as a body part, no such preference should be seen when the dummy is placed in the far position.
The statistical parametric maps for the far-dummy-hand experiment are presented in Figure 8. Preference for the near ball (over the far) was observed bilaterally in early visual areas, as well as in the right posterior collateral sulcus and the IPTO. Conversely, the preference for the far ball was significant only in occipital visual areas in the left hemisphere (in accordance with the position of the ball in the right visual field). We did not find significant differences between the BOLD responses to near and far stimuli in any of the previously identified hand-related areas. This result suggests that the preference for the near ball in the hand-related areas was specific only for hands in near space, and argues strongly against an attentional interpretation of our principal results.
Figure 8. Attentional control: placing the dummy hand outside perihand space. Areas showing preference for the near over the far ball (reds) and for the far ball (blues) over the near ball in the far-dummy-hand experiment (solid patches) and in the retracted-hand experiment (outline patches) are shown. Note that the two experiments differ only in the presence of the dummy hand by the far target. The overlap in responses in the two experiments suggests that, when the dummy hand is placed out of peripersonal space, it does not affect hand-related areas, as would be expected if the modulation was attributable to attention referred to the dummy hand. CS, Central sulcus.
Discussion
Several cortical areas showed preference for the near ball when it approached the left hand (compared with the far ball), but not when the same stimulus was presented with the hand retracted away (Fig. 3). Because the only variable changing between these two experiments was the position of the hand, this result can be explained in terms of hand position-dependent modulations of the sensitivity to visual stimuli moving within perihand space. By independently manipulating visual and proprioceptive information concerning hand position, we were able to identify three main patterns of visual processing: areas that represent visual information regardless of hand position (in the occipital cortex) (supplemental Fig. S1, available at www.jneurosci.org as supplemental material); areas modulated by purely visual aspects of hand position [in the posterior IPS (Fig. 5) and in LOC (Fig. 7)]; and regions modulated by both proprioceptive and visual information regarding hand position [in the anterior IPS (Fig. 6) and the ventral premotor cortex (Table 1)]. In the next section, we discuss the effect of the dummy hand in determining visually based hand representation.
Visually based hand schema
Perihand neurons in macaque ventral premotor cortex, as well as postural neurons in area 5, responded to a realistic dummy monkey hand, even if the position of the dummy conflicted with proprioceptive information regarding the true position of the monkey's hand (Graziano, 1999; Graziano et al., 2000). Similarly, in studies of cross-modal extinction in neuropsychological patients, placing a dummy hand near to an ipsilesional visual stimulus can increase the proportion of extinguished contralesional tactile stimuli during simultaneous presentation, although the patient's ipsilesional hand was retracted away from both the visual stimulus and the dummy hand (Farnè et al., 2000). In healthy human subjects, viewing a dummy hand being stroked by a paintbrush in synchrony with feeling similar strokes on their own occluded hand can create both an illusion of ownership of the dummy hand, and a change in felt hand position (the “rubber-hand illusion”) (Botvinick and Cohen, 1998).
In our study, the mere presence of the dummy hand in front of the subjects' retracted hands modulated the preference for a near stimulus in the posterior IPS and the LOC (Fig. 3C). This result was obtained although all subjects reported that they did not sense an illusion of ownership over the dummy hand. Therefore, this activation cannot be explained by the conscious perception of the rubber-hand illusion.
Furthermore, we excluded the possibility that the mere presence of the hand (or dummy hand) covertly captured the subjects' visual attention, thereby enhancing the representation of the near ball. In the far-dummy-hand experiment, although the far ball was approaching the dummy hand, there was no selective preference for it (unlike the near preference in the dummy-hand experiment) (Fig. 8). We therefore conclude that the preference for the near ball in the hand-related areas was not simply attributable to the representation of the hand (or dummy) as an attentionally capturing object per se, but rather was specific for hands in near space. Whether this preference is unique to the representation of the hand, or may also be modulated by the presence of other functionally equivalent objects in their spatial surrounding (such as suggested by the literature on tool use) (Maravita and Iriki, 2004), is an important area for future research. For additional discussion of the rubber-hand illusion and the dummy-hand effect in other research, see the supplemental information (available at www.jneurosci.org as supplemental material).
Visually based hand schema: posterior IPS
Activation in the posterior aspect of the medial IPS in humans has been shown to play a role in tasks requiring visuomotor coordination of hand movements with respect to targets (Chaminade and Decety, 2002; Simon et al., 2002; Grefkes et al., 2004). The same area has been reported to show a topographic mapping of space both for saccades and for pointing to targets, which is updated with eye movements (Medendorp et al., 2003, 2005). According to Ehrsson et al. (2004), activity in the medial wall of the IPS reflects the seen position of the hand. These, as well as our present findings, support the potential role of pIPS as an area that integrates visual and spatial information in hand-centered coordinates. Because this area was not modulated to the same extent by proprioceptive cues in the absence of visual feedback, we suggest that, in this hand-related area, the coordinate system is dominated by visual, rather than by somatosensory information concerning the hand. It is possible, however, that some proprioceptive information concerning hand position can also be derived from the pIPS.
LOC: a new candidate region for a visually based hand schema?
The other region that showed modulation of visual responses with respect to the seen position of the hand (or dummy hand) regardless of proprioception was LOC. Peripersonal space has typically been associated with vision-for-action in the dorsal stream (Graziano and Cooke, 2006). Why, therefore, would the ventral stream be involved in spatial representation of objects with relation to the hand? Several studies have demonstrated that LOC shows sensitivity to disparity-defined shape (Gilaie-Dotan et al., 2002), achieved by combining different depth cues (Brouwer et al., 2005; Welchman et al., 2005). This suggests that LOC may be sensitive not only to depth within objects but also to depth in space. Moreover, a recent fMRI study showed the involvement of ventral stream visual areas in peripersonal space: Lloyd et al. (2006) investigated the neural responses to noxious stimuli compared with innocuous stimuli, both of which touched a dummy hand. In addition to parietal and frontal areas, the fusiform gyrus also showed differential activation to the noxious stimuli only when the dummy hand was aligned with the subject's shoulders (compared with the same procedure repeated with the dummy hand in an unnatural position). This, in addition to the present results, suggests that the ventral stream might also be involved in the representation of stimuli in peripersonal space.
Our results revealed a second type of visual processing with respect to the hand: hand-related visual representation, which is modulated mainly by proprioception. In the next section, we will discuss these results.
Proprioceptively based hand schema
Neurons in macaque area 5 are predominantly modulated by proprioceptively determined hand position (i.e., are sensitive to hand position while the hand is hidden from view) but may also encode the position of the monkey's arm visually (Graziano et al., 2000). Some neurons in the ventral premotor cortex that respond to an approaching visual stimulus in perihand space also respond when the hand is occluded from sight (Graziano, 1999). In humans, Làdavas et al. (2000) have shown that proprioceptive information about the hand alone does not suffice to modulate multisensory processing in peripersonal space. We wanted to see whether the hand-related areas from our experiments would continue to respond selectively to a visual stimulus when it was presented close to the subjects' occluded hand.
Proprioceptive prevalence in hand representation: anterior IPS
In the ROI analysis, the most anterior part of the IPS showed a significant preference for the near ball (over the far) when the hand was occluded (Fig. 6). The same area was sensitive to the visual position of the hand (the preference indices in the dummy-hand experiment were significantly greater than in the retracted-hand experiment) and was activated by tactile stimulation applied to the left hand (Fig. 4). We therefore suggest that clusters within human aIPS are multisensory, showing perihand properties similar to those reported in single-unit studies of perihand space in macaque frontal and parietal cortices.
What do we know about aIPS from imaging studies? Consistent with the macaque anterior intraparietal (AIP) area, the human analog hAIP (Culham et al., 2006) is highly activated by visuomotor tasks such as visually guided grasping (Binkofski et al., 1998; Shikata et al., 2003; Frey et al., 2005) and also responds to hand manipulation without visual feedback (Binkofski et al., 1999; Jäncke et al., 2001; Stoeckel et al., 2003). These reports fit nicely with our results showing visual responses modulated by unseen (i.e., proprioceptive) changes in hand position. Furthermore, Grefkes et al. (2002) and Macaluso et al. (2003) both found visual–tactile integration in this area, in addition to activations contingent on motor responses. Finally, our Talairach and Tournoux coordinates for the ROI analysis (averaged center of mass between subjects, x = 34, y = −37, z = 43) approximately match those of the possible human VIP homolog (x = 38, y = −44, z = 46), which responded to multisensory stimuli approaching subjects' faces (Bremmer et al., 2001).
A possible homolog to monkey PMv
The activation patterns in PMv in our study [consistent with the functional localization of Mayka et al. (2006)] are of special interest, because this cortical area has been recognized as a region containing perihand neurons in monkeys. In humans, it has also been associated with the rubber-hand illusion (Ehrsson et al., 2004), suggesting that it might be highly influenced by the visually inferred position of the hand. The effects in this region, however, were not as prominent as in the other areas we focused on. We found some preference for the near ball in the absence of visual feedback of hand position, and to a lesser extent with only visual information available (Table 1). As in the aIPS, PMv also responded to tactile stimuli applied to the hand (Fig. 4). It is therefore possible that strong PMv activation requires concurrent visual and tactile stimulation within perihand space.
Laterality effects for peripersonal space
Our results show a clear right hemisphere prevalence (Fig. 3) in terms of the magnitude and extent of the resultant activation. It remains to be seen whether this laterality is attributable to the predominantly contralateral representation of the stimulus next to the left hand, to the position of the hand in the visual field, or rather to a general dominance of the right hemisphere in spatial processing (Jager and Postma, 2003). This is particularly interesting, given the presumed functional dissociation between the dorsal and ventral visual streams (Shmuelof and Zohary, 2005, 2006).
Conclusions
Using both group statistical parametric maps and ROI analyses, we found a decreasing gradient in the dominance of visual information along the IPS: in the most caudal part of the IPS (adjoining the transverse occipital sulcus), processing of visual information was independent of hand position; in the posterior IPS, as well as in clusters within LOC, we found that visual processing was modulated primarily by the visible position of the hand and, to a much lesser extent (if at all), by its veridical (proprioceptive) position. We therefore conclude that these areas may represent perihand space in visually driven hand coordinates (regardless of whether the hand is real or illusory). Finally, the anterior part of the IPS showed multisensory properties: along with a weakened visual modulation of hand-related processing, both significant proprioceptive modulation of visual processing and a significant response to purely tactile stimulation of the hand were revealed. We therefore conclude that aIPS represents perihand space in both somatosensory and visual hand coordinates.
Footnotes
This work was supported in part by Center of Excellence Grant 80009 from the Israel Academy of Sciences. N.P.H. was supported by a Science Research Fellowship from the Royal Commission for the Exhibition of 1851, Imperial College, London, United Kingdom. We thank Prof. Marshall Devor, Prof. Shaul Hochstein, Ayelet McKyton, and Lior Shmuelof for their useful suggestions.
Correspondence should be addressed to Tamar R. Makin, Neurobiology Department, Life Sciences Institute, Hebrew University, Jerusalem 91904, Israel. tamarmakin@pob.huji.ac.il