Abstract
Humans can judge grating orientation by touch. Previous studies indicate that the extrastriate cortex is involved in tactile orientation judgments, suggesting that this area is related to visual imagery. However, it has been unclear which neural mechanisms are crucial for the tactile processing of orientation, because visual imagery is not always required for tactile spatial tasks. We expect that such neural mechanisms involve multisensory areas, because our perception of space is highly integrated across modalities. The current study uses functional magnetic resonance imaging during the classification of grating orientations to evaluate the neural substrates responsible for the multisensory spatial processing of orientation. We hypothesized that a region within the intraparietal sulcus (IPS) would be engaged in orientation processing, regardless of the sensory modality. Sixteen human subjects classified the orientations of passively touched gratings and performed two control tasks with both the right and left hands. Tactile orientation classification activated regions around the right postcentral sulcus and IPS, regardless of the hand used, when contrasted with roughness classification of the same stimuli. Right-lateralized activation was confirmed in these regions by evaluating the hemispheric effects of tactile spatial processing with both hands. In contrast, visual orientation classification activated the left middle occipital gyrus when contrasted with color classification of the same stimuli. Furthermore, visual orientation classification activated a part of the right IPS that was also activated by the tactile orientation task. Thus, we suggest that a part of the right IPS is engaged in the multisensory spatial processing of grating orientation.
Introduction
There has been considerable interest in the neural mechanisms involved in judging tactile orientation, because this task is considered a valid indicator of tactile spatial acuity (Van Boven and Johnson, 1994a,b). The tactile system is less efficient than the visual system in both speed and accuracy when processing the spatial attributes of objects (Jones and Lederman, 2006). One possible heuristic for tactile spatial processing is that tactile inputs are translated into a corresponding visual representation that is further processed by the visual system (the visual mediation heuristic) (Lederman et al., 1990). The extrastriate cortex, near the parieto-occipital fissure, is activated during the tactile discrimination of grating orientations (Sathian et al., 1997; Zangaladze et al., 1999). Because this activation is associated with subjects reporting the use of visual imagery, it has been proposed that this area supports the visual mediation heuristic for gratings.
However, it is still unclear which neural mechanisms are crucial for processing tactile orientation, because visual mediation is not necessarily required for all tactile spatial tasks (Marmor and Zaback, 1976; Carpenter and Eisenberg, 1978). One possible mechanism might involve not only sensory-specific areas including the postcentral gyrus, but also multisensory areas, because our spatial perception of orientation is highly integrated. For instance, the anterior part of the intraparietal sulcus (IPS) was activated during the passive tactile discrimination of the shape of ellipsoids compared with a rest condition in a positron emission tomography (PET) study (Bodegard et al., 2001). This area was also active during tactile grating orientation discrimination relative to judging the spacing between gratings in a functional magnetic resonance imaging (fMRI) study (Zhang et al., 2005). Similarly, part of the IPS was activated during the visual discrimination of grating orientation compared with visual detection of the same stimulus in a PET study (Vandenberghe et al., 1996) and with detection of the dimming of the fixation point in an fMRI study (Faillenot et al., 2001). These findings indicate that a subregion within the human IPS might construct a multisensory representation of orientation. However, there is, as yet, little evidence for the existence of such a multisensory IPS subregion.
In the present study, we used fMRI to test the hypothesis that a region within the IPS is engaged in multisensory spatial processing during the classification of grating orientations. To visualize the neural substrates of this spatial processing, the orientation task for each sensory modality was contrasted with its own control condition (roughness classification for touch, color classification for vision). The premise is that orientation classification relies on some form of spatial reference system, whereas the control tasks do not. Each control condition was designed to control for the sensory input, responses, and task demands within one sensory modality, that is, touch or vision. To assess the existence of a multisensory subregion for orientation classification, we examined whether the IPS regions activated during the tactile orientation task were also activated during the visual orientation task.
Materials and Methods
Subjects
Sixteen healthy Japanese volunteers (10 males and 6 females) aged 22–47 years, including students and researchers, participated in the fMRI study. All participants were right-handed according to the Edinburgh handedness inventory (Oldfield, 1971). All subjects gave informed written consent, and the study was approved by the ethical committee of the National Institute for Physiological Sciences of Japan. None of the volunteers had a history of symptoms requiring neurological, psychological, or other medical care.
Tactile stimulus
The rectangular linear gratings were prepared from plastic sheets with a photosensitive layer (Makoto Craft, Yokohama, Japan). The height and width of the ridges were constant at 1.0 and 0.5 mm, respectively. The length and width of the gratings were 20 and ∼40 mm, respectively. We used nine different gratings, each of which was produced with one of three different orientations (−30, 0, and 30°) (Fig. 1 B) and one of three different degrees of roughness. Three gratings at each orientation contained different groove widths (∼1.50, 2.00, and 2.50 mm for −30°; ∼1.25, 1.75, and 2.25 mm for 0°; and ∼1.75, 2.25, and 3.00 mm for 30°).
These different groove widths were specified for each orientation to match the perceived magnitude of roughness between the orientations. The choice of groove widths was based on two psychophysical pilot experiments, which were conducted outside of the MR scanner. In the first experiment, we obtained psychophysical roughness functions for each of the three grating orientations using a conventional magnitude-estimation procedure. Twelve right-handed subjects participated in this experiment. Tactile stimulation of the right middle fingertip was performed using 24 gratings (eight surfaces for each orientation) (Fig. 1A). Each of the gratings was swept under the right middle finger three times. The subjects were asked to estimate the roughness magnitude of each grating. Each surface was presented three times. The order in which the surfaces were presented was pseudorandomized. The mean force was controlled at ∼30 g by a balance, and the average speed was controlled by the autostimulator at ∼50 mm/s. The other procedures, and the method of analysis, were described previously (Lederman and Taylor, 1972; Lederman, 1981). The psychophysical roughness functions were then used to select eight pairs (±30°) of groove-width values that matched the perceived roughness magnitudes of the eight 0° gratings. Five subjects chose a pair of gratings with ±30° orientations to achieve the closest match in perceived roughness magnitude with each of the 0° gratings. We chose the groove width that was selected most frequently across all subjects.
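For readers who wish to reproduce this matching procedure, the sketch below illustrates one way it could be computed, assuming the roughness functions are summarized as power functions of groove width (a common description of grating-roughness data); the magnitude estimates, groove widths, and coefficients in the example are illustrative placeholders, not the values measured here.

import numpy as np

# Illustrative magnitude-estimation data: mean roughness estimates for the
# 0-degree and 30-degree gratings as a function of groove width (mm).
# These numbers are placeholders, not the values measured in the study.
groove_0  = np.array([1.00, 1.25, 1.50, 1.75, 2.00, 2.25, 2.50, 2.75])
rough_0   = np.array([3.1, 4.0, 4.8, 5.9, 6.8, 7.9, 8.8, 9.9])
groove_30 = np.array([1.25, 1.50, 1.75, 2.00, 2.25, 2.50, 2.75, 3.00])
rough_30  = np.array([2.8, 3.5, 4.3, 5.0, 5.8, 6.6, 7.3, 8.1])

def fit_power(groove, roughness):
    """Fit roughness = a * groove**b by linear regression in log-log space."""
    b, log_a = np.polyfit(np.log(groove), np.log(roughness), 1)
    return np.exp(log_a), b

def matching_groove(a, b, target_roughness):
    """Invert the power function to find the groove width giving a target roughness."""
    return (target_roughness / a) ** (1.0 / b)

a0, b0 = fit_power(groove_0, rough_0)
a30, b30 = fit_power(groove_30, rough_30)

# For each 0-degree groove width used in the main task, find the 30-degree
# groove width predicted to give a matched roughness magnitude.
for g in (1.25, 1.75, 2.25):
    target = a0 * g ** b0
    print(f"0-deg groove {g:.2f} mm -> matched 30-deg groove "
          f"{matching_groove(a30, b30, target):.2f} mm")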
In the second pilot experiment, we chose three of the eight 0° gratings, such that the difference in groove width was equivalent between the grating pairs. Using the nine surfaces (three 0° gratings with three pairs of ±30° gratings), we confirmed that performance accuracy was matched between the orientation and roughness classification tasks.
Tactile stimulus application
Two sets of three gratings (six surfaces) were glued onto an L-shaped slider. One set contained gratings with the same perceived roughness magnitude but three different orientations; the other contained gratings with the same orientation but three different levels of roughness. The first set was used for the tactile orientation task, and the second set was used for the tactile sensorimotor (roughness) control task. Eighteen sliders were prepared (three surfaces × two sets × 18 sliders = 108 surfaces altogether). Twelve sliders were pseudorandomly chosen for each hand for each subject, so that the tactile orientation task and its sensorimotor control used identical surfaces. A single slider was used for each run. The order of surfaces in each set and the order of the two sets on a slider were pseudorandomized.
The subjects lay supine on a bed with their eyes open and their ears plugged and were instructed to relax. The subjects were asked to fixate on a white cross (viewing angle, 1.7 × 1.7°) on a semitransparent viewing screen, projected from an LCD projector (DLA-M200L; Victor, Yokohama, Japan) through a mirror. The subjects’ right arms were extended along the sides of their body and comfortably supported by a cushion. The subjects placed their right middle fingertips lightly on the surface of the slider through a bore in the plastic holder, with the other fingers resting on a plastic frame just above the surface (Fig. 1C). The distal–proximal axis of the middle finger was parallel to the body axis of the subjects. The bore size was 18 mm in the proximal–distal direction and 17 mm in the lateral–medial direction. The thickness of the plastic holder was 2 mm. The finger was immobilized against the horizontal movement of the slider. We explained to the subjects that they should avoid applying excessive pressure to the stimulator. The experimenter did not observe any conspicuous movement by any subject when presenting the stimuli to the fingertip. The vertical pressure was also monitored by a strain gauge during the experiment. The subjects’ left hands were extended along the side of their body and placed on a response box. The index, middle, and ring fingers of the left hand were placed on each of three buttons of the response box. The three buttons were aligned orthogonal to the body axis. The same procedure was followed when the left middle finger was stimulated, except that the stimulated and response hands were interchanged.
An experimenter moved the slider back and forth in the horizontal direction, guided by auditory cues. These cues were presented only to the experimenter through a pair of ceramic-condenser headphones (Hitachi Medical Corporation, Tokyo, Japan). The rail moved quietly, without making any sound that could be related to the task. The range of surface displacement was ∼50 mm and was demarcated with black ink on each stimulus plate. The mean speed of the slider was ∼50 mm/s (HT-5100; Ono Sokki, Yokohama, Japan).
Tactile task
The subjects first performed the three tactile tasks (orientation, its sensorimotor control, and motor control) and then performed the two visual tasks (orientation and its sensorimotor control). The tactile tasks were designed to reveal the cortical regions related to tactile orientation classification by contrasting it with its own control condition. This contrast was designed to subtract activity related to factors including the sensory input, task demands, and responses.
Both right and left hands were tested in the tactile tasks. The order of the two hand conditions was counterbalanced across subjects. A single run consisted of three 24 s blocks. In a single block, the subjects were engaged in one of the three tasks. The subjects completed 12 runs for each hand (Fig. 1D). The order of the tasks was pseudorandomized in each run. Before scanning, in the MRI room, the subjects practiced the tactile classification tasks using the same sets of gratings until they reached a criterion of at least seven correct judgments out of nine surfaces.
Tactile orientation task (TO).
The subjects were instructed to classify the orientation of the linear gratings under their middle finger while fixating on the visual cues on the screen (Fig. 1E, top). A single task block consisted of instruction (3 s), test (15 s), and response (6 s) periods. The instruction cue (viewing angle, 3.6 × 6.4°) was presented in Chinese characters for the first second of the instruction period. The finger pads were then stimulated by the three surfaces during the test period. For each surface, the slider was moved three times in 3 s: 50 mm in the left-to-right direction for the first second, 50 mm in the right-to-left direction for the next second, and 50 mm in the left-to-right direction for the final second. Three seconds of tactile stimulation alternated with a 3 s interstimulus interval. Each sweep always started from the smooth portions beside the gratings. During stimulation with each surface, a square of one of three different colors (blue, yellow, and red) was presented at the center of the screen (viewing angle, 1.7 × 1.7°). The order of the colors was pseudorandomized to avoid any possible association between a given color and a certain grating orientation. After the presentation of the third surface, the subjects were asked to press three buttons in succession, each assigned to a different orientation: the left button for left-oriented (−30°) gratings, the middle button for 0° gratings, and the right button for right-oriented (+30°) gratings. The subjects were asked to press all buttons accurately within 6 s. Eight functional volumes (24 s; 3 s per volume) were acquired in each block.
Sensorimotor control (roughness task, TSM).
The roughness classification of the grating surfaces was chosen as the tactile control condition. The premise was that the spatial processing of orientation relies on some form of spatial reference system, whereas roughness classification does not require such a stage of processing. Rather, roughness classification depends on comparing the perceived roughness magnitude of the current surface with those of the other stimulus surfaces (Lederman and Klatzky, 1997). This condition was designed to control for the sensory input, as well as the responses and task demands, of the orientation task. The experimental design was the same as in the orientation task, except for the arrangement of the stimuli and the task instructions. In each run, the surfaces for this task had the same orientation but three different levels of roughness. The subjects were instructed to classify the roughness of the linear gratings (Fig. 1E, middle). The order of the colors of the squares was again pseudorandomized to avoid any association between a given color and a certain grating roughness. The subjects pushed the left button for the smoothest gratings, the middle button for the intermediate gratings, and the right button for the roughest gratings.
Motor control.
This condition was designed to control for the visual input and the responses during the orientation task. In this condition, only visual (color) cues were presented on the screen, and there was no tactile stimulation (Fig. 1E, bottom). The subjects were instructed to report the colors of the squares in their order of appearance after the presentation of the third square, pushing the left button for blue, the middle button for yellow, and the right button for red.
In each run, a 12 s baseline period was added before the first task block, and a 15 s baseline period was added after the third task block. No baseline period was added between the task blocks. During the baseline condition, the subjects were instructed to relax and fixate on the white cross on the screen while their fingers were placed on the smooth portion of the surface beside the gratings. Altogether, 33 volumes (eight volumes times three tasks plus nine volumes for baseline) were collected during each run of the tactile tasks.
Visual task
The orientation task and its own control condition were performed visually to test for the existence of a multisensory subregion for orientation classification within the IPS. Because the visual task design was different from the tactile design in the timing of responses (see Discussion for an explanation of why this was the case), the visual orientation task was contrasted with its own sensorimotor control condition (color task), which contained the same timing of responses as the visual orientation task. To discourage the subjects from imagining the gratings visually during the tactile tasks, the visual task was performed after the tactile tasks were completed (Fig. 2). The response hand and the order of the tasks were counterbalanced within the subjects. Subjects completed two runs, each of which included six repetitions of each task (Fig. 2B).
Visual orientation task (VO).
The subjects were instructed to classify the orientation of the linear gratings presented on the screen. A single task block consisted of both instruction (3 s) and test (9 s) periods (Fig. 2C, top). The instruction cue (viewing angle, 3.6 × 6.4°) was presented in Chinese characters for the first second of the instruction period. During the test period, six gratings were presented (viewing angle, 8.0 × 8.0°) (Fig. 2A). The stimuli had different orientations (−30, 0, and +30°) and different colors (blue, yellow, and red). The orientation of the gratings changed six times, whereas the color of the gratings changed only once during each test period. Each stimulus was presented for 300 ms, and the interstimulus interval was 1200 ms. Immediately after the presentation of each individual grating, the subjects were asked to press the buttons assigned to each orientation. A single test block was always followed by a 9 s baseline period (Fig. 2B). During the baseline condition, the subjects were instructed to relax and fixate on the white cross on the screen. The subjects responded using the same buttons as in the tactile orientation task.
Sensorimotor control (color task, VSM).
This condition was designed to control for the sensory input, responses, and task demands of the visual orientation task. The experimental design was the same as in the orientation task, except for the arrangement of stimuli and the task instruction (Fig. 2C, bottom). During the test period, six gratings were presented (Fig. 2A). The color of the gratings changed six times, whereas the orientation of the gratings changed only once during each test period. The subjects were instructed to press the button assigned to each color immediately after the presentation of each individual grating. A 12 s baseline period was added before the first task block, and a 3 s baseline period was added after the final task block (together, five volumes). Altogether, 89 volumes [five baseline volumes + two tasks × six repetitions × (four volumes per task block + three volumes per baseline)] were obtained in each run. The subjects pressed the same response buttons as in the tactile motor control task.
Data acquisition and processing
Vertical force.
The vertical pressure of the fingertip on the surface was measured. The bore of the plastic frame was supported by a horizontal acrylic bar, to which two sheets of foil strain gauges were attached (KFP-120-C1-65; Kyowa Electronic Instruments, Tokyo, Japan). The force signals were sampled at 100 Hz and acquired with a personal computer (Thinkpad R51; IBM, Tokyo, Japan) through an amplifier (CDA-700A; Kyowa Electronic Instruments) and an external analog-to-digital converter (MP100A; BioPac Systems, Colorado Springs, CO). The force signals were filtered with a 10 Hz low-pass filter. The acquired data were further processed with Windows-based software (Acknowledge 3.7.0; BioPac Systems, Goleta, CA).
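The type of 10 Hz low-pass filter is not specified above; purely as an illustration, a zero-phase Butterworth filter such as the following could be applied to a 100 Hz force trace (the filter order and the simulated trace are assumptions for the example, not details of the recording system).

import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0      # strain-gauge sampling rate (Hz)
CUTOFF = 10.0   # low-pass cutoff (Hz)

def lowpass(force, fs=FS, cutoff=CUTOFF, order=4):
    """Zero-phase low-pass filtering of the vertical-force trace."""
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, force)

# Example: filter 60 s of simulated force data (values in grams-force).
t = np.arange(0, 60, 1 / FS)
raw = 30 + 2 * np.sin(2 * np.pi * 0.5 * t) + np.random.randn(t.size)
smoothed = lowpass(raw)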
MRI.
Functional MR images were acquired on a 3 tesla head scanner (Allegra; Siemens, Erlangen, Germany) with echoplanar imaging (EPI) capability. Standard sequence parameters were used for obtaining the functional images as follows: gradient-echo EPI; repetition time, 3000 ms; echo time, 30 ms; flip angle, 85°; 44 axial slices of 3 mm thickness with no interslice gap; field of view, 192 × 192 mm; and in-plane resolution, 3.0 × 3.0 mm. After the acquisition of functional images, T1-weighted high-resolution anatomical images were obtained (voxel size, 0.9 × 0.9 × 1 mm). Image processing and statistical analyses were performed using the statistical parametric mapping (SPM) package (SPM99; http://www.fil.ion.ucl.ac.uk/spm; Wellcome Department of Cognitive Neurology, London, UK) implemented in MATLAB (MathWorks, Sherborn, MA) (Friston et al., 1995a,b). The first five volumes of each fMRI run were discarded because of unsteady magnetization. Realigned images were normalized to a standard EPI template as defined by the Montreal Neurological Institute; this closely approximates to the space described in the Talairach and Tournoux (1988) atlas. The normalized EPI images were filtered using a Gaussian kernel of 8 mm full-width at half-maximum in the x, y, and z axes. The T1-weighted high-resolution anatomical images were normalized by the same procedure.
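As an aside for readers implementing comparable preprocessing outside SPM, the conversion from an 8 mm FWHM kernel to the standard deviation used by a generic Gaussian filter is sketched below; the voxel size and the random volume are illustrative, and SPM99 performs this step internally.

import numpy as np
from scipy.ndimage import gaussian_filter

FWHM_MM = 8.0
VOXEL_MM = np.array([3.0, 3.0, 3.0])   # normalized EPI voxel size (illustrative)

# FWHM -> standard deviation, then express in voxel units for each axis.
sigma_mm = FWHM_MM / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # ~3.40 mm
sigma_vox = sigma_mm / VOXEL_MM                           # ~1.13 voxels per axis

volume = np.random.rand(64, 64, 44)    # stand-in for one normalized EPI volume
smoothed = gaussian_filter(volume, sigma=sigma_vox)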
Statistical analysis
Behavioral data were collected and statistically evaluated with SPSS software (version 10.0J; SPSS Japan, Tokyo, Japan). Statistical analyses of fMRI data were conducted at two levels. First, individual task-related activation was evaluated. Second, to make inferences at a population level, individual data were summarized and incorporated into a random-effect model (Holmes and Friston, 1998; Friston et al., 1999).
Individual analysis.
We fitted a general linear model to the functional MRI data from each subject (Friston et al., 1994; Worsley and Friston, 1995). The time series for each voxel was high-pass filtered at 0.021 Hz and low-pass filtered with a canonical hemodynamic-response function. In the tactile tasks, the neural activity during the instruction periods, response periods, and three test periods was modeled with boxcar functions convolved with a canonical hemodynamic-response function. Twenty-four runs (12 right-hand and 12 left-hand runs) were included in a single design matrix for each subject. Each run included five boxcar regressors: one for the instruction periods of all task conditions, one for the response periods, and one for the test period of each of the three task conditions. The two runs of the visual task were included in a separate design matrix for each subject. The neural activity during the instruction and test periods was modeled with boxcar functions convolved with a canonical hemodynamic-response function. Each run included three regressors: one for the instruction periods of both conditions and one for the test period of each of the two conditions.
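The sketch below illustrates how a single boxcar regressor convolved with a canonical hemodynamic-response function could be constructed; the double-gamma parameters are the commonly cited SPM-style defaults, and the onset and duration correspond to the 15 s test period of the first tactile task block (these values are illustrative rather than an exact reproduction of the SPM99 design matrix).

import numpy as np
from scipy.stats import gamma

TR = 3.0        # repetition time (s)
N_VOLS = 33     # volumes per tactile run

def canonical_hrf(tr, duration=32.0):
    """Double-gamma HRF with commonly used SPM-style parameters (assumed)."""
    t = np.arange(0, duration, tr)
    peak = gamma.pdf(t, 6)          # positive response peaking near 6 s
    undershoot = gamma.pdf(t, 16)   # undershoot peaking near 16 s
    hrf = peak - undershoot / 6.0
    return hrf / hrf.sum()

def boxcar_regressor(onsets_s, duration_s, tr=TR, n_vols=N_VOLS):
    """Boxcar for one condition, convolved with the canonical HRF."""
    box = np.zeros(n_vols)
    for onset in onsets_s:
        start = int(round(onset / tr))
        stop = start + int(round(duration_s / tr))
        box[start:stop] = 1.0
    return np.convolve(box, canonical_hrf(tr))[:n_vols]

# Illustrative onset: test period of the first task block, after the initial
# 12 s baseline and 3 s instruction period.
test_regressor = boxcar_regressor(onsets_s=[15.0], duration_s=15.0)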
To test hypotheses about regionally specific condition effects, the estimates for each condition were compared by means of linear contrasts. The resulting set of voxel values for each comparison constituted an SPM of the t statistic [SPM{t}]. The SPM{t} was transformed to normal distribution units [SPM{z}]. In the tactile tasks, we first evaluated the contrast between the orientation and motor control conditions (TO − TM) and the contrast between the sensorimotor control and motor control conditions (TSM − TM) within the whole brain, to confirm that the somatosensory cortex was activated during tactile stimulation (Table 1). We then evaluated the contrast between the orientation and sensorimotor control conditions (TO − TSM) in the whole brain to identify the neural substrates for the spatial processing involved in orientation classification. The opposite contrast (TSM − TO) was also evaluated in the whole brain. The TO − TSM contrast was also computed separately for the left and right hands, to further examine any differences in brain activity attributable to the hand used to perform the task.
To examine multisensory activity in the IPS, the contrast between the visual orientation and control conditions (VO − VSM) was evaluated within the areas highlighted by the contrast between the tactile orientation task and its sensorimotor control condition (TO − TSM). The opposite contrast (VSM − VO) was also evaluated in the whole brain. The threshold for SPM{z} was set at Z > 2.33.
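For illustration, the toy example below shows how a linear contrast of GLM parameter estimates yields a voxelwise t value that is then converted to a z score; it is a simplified stand-in for SPM's implementation (no serial-correlation modeling or filtering), with an arbitrary design matrix and simulated data.

import numpy as np
from scipy import stats

def contrast_z(Y, X, c):
    """t statistic for contrast c on one voxel's time series Y under design X,
    converted to a z score (probability-matching transform, as in SPM{z})."""
    beta, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    df = Y.size - np.linalg.matrix_rank(X)
    sigma2 = resid @ resid / df
    var_c = sigma2 * c @ np.linalg.pinv(X.T @ X) @ c
    t = (c @ beta) / np.sqrt(var_c)
    return stats.norm.ppf(stats.t.cdf(t, df))

# Toy example: two condition regressors plus a constant; contrast "TO - TSM".
rng = np.random.default_rng(0)
X = np.column_stack([rng.random(33), rng.random(33), np.ones(33)])
Y = X @ np.array([1.0, 0.2, 5.0]) + rng.standard_normal(33)
z = contrast_z(Y, X, np.array([1.0, -1.0, 0.0]))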
Group analysis with random-effect model.
The weighted sum of the parameter estimates in an individual analysis constituted contrast images, which were used for the group analysis (Holmes and Friston, 1998; Friston et al., 1999). At the group level, we performed the same linear contrasts as in the individual analyses (Table 1). The contrast images obtained from the individual analyses represent the normalized task-related increment of the MR signal of each subject. For each contrast, a one-sample t test was performed for every voxel in the brain to obtain population inferences. The resulting set of voxel values for each contrast constituted the SPM{t}. The SPM{t} was transformed to normal distribution units [SPM{z}].
To evaluate hemispheric effects on tactile spatial processing, the TO − TSM contrast images (Table 1) were flipped in the horizontal (right–left) direction. Asymmetric involvement of the neural substrates for tactile spatial processing, regardless of the hand used, was assessed by comparing the unflipped and flipped images in a pairwise manner (Harada et al., 2004). The comparison was performed within the regions that showed activation in the TO − TSM contrast.
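A minimal sketch of this flip-and-compare procedure is given below, assuming each subject's contrast image is stored as a 3D array whose first axis runs left–right with the midline at the array center; the data are simulated, and the voxelwise paired t test stands in for the SPM random-effects comparison.

import numpy as np
from scipy import stats

def laterality_t(contrast_imgs):
    """Voxelwise paired t test between each subject's TO - TSM contrast image
    and its left-right flipped copy (x assumed to be the first array axis)."""
    imgs = np.asarray(contrast_imgs)       # shape: (n_subjects, x, y, z)
    flipped = imgs[:, ::-1, :, :]          # mirror about the midline
    t, p = stats.ttest_rel(imgs, flipped, axis=0)
    return t, p

# Toy data: 16 subjects, small image grid.
rng = np.random.default_rng(1)
imgs = rng.standard_normal((16, 20, 24, 20))
t_map, p_map = laterality_t(imgs)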
To examine multisensory activity in the IPS, the contrast between the visual orientation and control conditions (VO − VSM) was evaluated within the areas that showed activation in the TO − TSM contrast. The threshold for SPM{z} was set at Z > 2.33. The statistical threshold for the spatial extent test on the clusters was set at p < 0.05, corrected for multiple comparisons within the search volume (Friston et al., 1996). The thresholds of the spatial extent test were as follows: 5656 mm3 for TO − TM; 4320 mm3 for TSM − TM; 4240 mm3 for TO − TSM and TSM − TO; 4000 mm3 for the left hand (TO − TSM); 3800 mm3 for the right hand (TO − TSM); 744 mm3 for the hemispheric laterality test; and 1584 mm3 for VO − VSM.
Results
Task performance
Tactile tasks
The subjects classified the gratings equally well in the orientation and roughness tasks, regardless of which hand was used (Fig. 3A). A two-way repeated-measures ANOVA [(three task conditions: tactile orientation, sensorimotor control, and motor control) × (two hands: left and right)] on the accuracy scores (percentage correct) showed a significant effect of condition (F(2,30) = 20.5; p < 0.001). Bonferroni's t tests for multiple comparisons showed significant differences only between the motor control condition and the other conditions (p < 0.001). The response times were also similar between the orientation (TO) and sensorimotor control (TSM) tasks (Fig. 3B). A three-way repeated-measures ANOVA [(three task conditions: tactile orientation, sensorimotor control, and motor control) × (two hands: left and right) × (order of response: first, second, and third button press)] on the response times showed significant effects of condition (F(2,30) = 22.0; p < 0.001) and order (F(2,30) = 249; p < 0.001). Bonferroni's t tests for multiple comparisons showed significant differences between the motor control condition and the other conditions (p < 0.001).
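The analyses above were run in SPSS; purely as an illustration of the design, the sketch below runs an analogous two-way (condition × hand) repeated-measures ANOVA on synthetic accuracy data using statsmodels, with all numbers invented.

import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format accuracy data: 16 subjects x 3 conditions x 2 hands.
rng = np.random.default_rng(2)
rows = []
for subj in range(16):
    for cond in ("orientation", "roughness", "motor"):
        for hand in ("left", "right"):
            base = 95 if cond == "motor" else 85
            rows.append({"subject": subj, "condition": cond, "hand": hand,
                         "accuracy": base + rng.normal(0, 3)})
df = pd.DataFrame(rows)

# Two-way repeated-measures ANOVA: condition x hand, both within subjects.
result = AnovaRM(df, depvar="accuracy", subject="subject",
                 within=["condition", "hand"]).fit()
print(result)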
The vertical pressure applied by the right finger was nearly constant between conditions, whereas the pressure exerted by the left finger varied by condition (Fig. 3C). The same repeated-measures ANOVA performed on the vertical pressure showed a significant condition × hand interaction (F(2,30) = 4.92; p = 0.014). Multiple comparisons showed significant differences between the motor control and the other conditions (p < 0.01) for the left hand. However, there were no significant differences between the tactile orientation task and sensorimotor control condition for either the right hand (p > 0.9) or the left hand (p = 0.064).
Visual tasks
The subjects classified the gratings equally well, regardless of the task. Performance accuracy was matched between the conditions (mean ± SEM, 98.0 ± 0.5% for the orientation condition and 97.1 ± 0.8% for the control condition); a Student's t test showed no significant difference in response accuracy (p > 0.3). Response times for the orientation task were slightly faster than for the sensorimotor control condition (527.4 ± 16.8 ms for the orientation task and 557.9 ± 24.3 ms for the control condition). A Student's t test showed a significant difference between these response times (t(15) = 2.39; p = 0.030); however, the difference was only 5.7%, which was negligible within the context of this study.
fMRI results
Comparisons between the tactile task and motor control conditions (TO − TM and TSM − TM)
Table 2 shows the coordinates of the foci in the significantly activated areas. The contrast of the tactile orientation versus motor control condition (TO − TM) significantly activated the postcentral gyrus (presumably the primary somatosensory cortex), parietal operculum (PO), posterior insula, medial frontal cortex, lateral frontal cortex, posterior parietal cortex, precuneus, basal ganglia, and cerebellum bilaterally. This contrast also activated the right lateral prefrontal cortex, orbitofrontal cortex, and anterior insula (Table 2, Fig. 4AI). The contrast of the tactile sensorimotor control versus motor control condition (TSM − TM) activated the postcentral gyrus, PO, anterior and posterior insula, lateral prefrontal cortex, medial and lateral frontal cortex, posterior parietal cortex, precuneus, basal ganglia, and cerebellum bilaterally, and the right orbitofrontal cortex (Table 2, Fig. 4AII).
Comparisons between the tactile orientation and sensorimotor control conditions (TO − TSM and TSM − TO)
When the tactile orientation task was contrasted with its sensorimotor control condition (TO − TSM), areas around the IPS were activated bilaterally (Table 2, Fig. 4AIII). The cluster in each hemisphere extended from the postcentral sulcus anteriorly to the posterior part of the IPS posteriorly. The right cluster was conspicuously larger than the left cluster (17,912 mm3 for the right hemisphere and 4712 mm3 for the left hemisphere). When the TO − TSM contrasts were performed separately for each hand, areas around the right IPS were activated regardless of the hand used. The overlapping area between the two hands was located in the right postcentral sulcus (post-CS) and IPS (volume, 4720 mm3; coordinates for center of mass: x = 38, y = −47, and z = 56). When the sensorimotor control condition was contrasted with the orientation task (TSM − TO), activation was observed in the bilateral lateral prefrontal cortices, the right anterior insula, orbitofrontal cortex, lingual gyrus, basal ganglia, and amygdala, and the left lingual/fusiform gyrus, inferior/middle occipital gyrus, and cerebellum (Table 2, Fig. 4AIV). The left PO (coordinates: x = 54, y = −14, z = 14; Z value = 3.20) and right posterior insula (coordinates: x = 38, y = −12, z = 2; Z value = 3.06) were also activated, but the cluster sizes of these areas did not reach the statistical threshold.
Hemispheric effect
Right-lateralized activity was found in a cluster extending from the post-CS to the anterior part of the IPS (volume, 1736 mm3). The peak of the cluster was Z = 3.79 at the coordinates x = 48, y = −42, and z = 52 (Fig. 5).
Multisensory activation in the IPS for the classification of grating orientation
When the visual orientation task was contrasted with the sensorimotor-control condition (VO − VSM), the left middle occipital gyrus (MOG) was significantly activated (Table 3). This contrast also showed activation in a part of the right IPS within the cluster activated by the tactile orientation task (TO − TSM) (Fig. 6). The area of overlap extended from the anterior to the middle part of the IPS. The volume of this area was 1704 mm3 with the center of mass at coordinates x = 35, y = −51, and z = 58. Figure 7 shows two representative examples from the analyses of the individual data. No significant activation was observed when the sensorimotor control condition was contrasted with the orientation task (VSM − VO).
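The overlap volumes and centers of mass reported here and for the between-hands comparison can be obtained from two thresholded, binarized SPM{z} maps as sketched below; the affine matrix, grid size, and masks in the example are placeholders rather than the study's actual images.

import numpy as np

def overlap_stats(mask_a, mask_b, affine):
    """Volume (mm^3) and center of mass (mm) of the overlap of two binary
    masks, given the 4x4 voxel-to-mm affine of the normalized images."""
    overlap = mask_a & mask_b
    ijk = np.argwhere(overlap)                       # voxel indices of overlap
    voxel_vol = abs(np.linalg.det(affine[:3, :3]))   # mm^3 per voxel
    xyz = (affine @ np.c_[ijk, np.ones(len(ijk))].T)[:3].T
    return overlap.sum() * voxel_vol, xyz.mean(axis=0)

# Toy masks on a 3 mm grid (affine values are illustrative).
affine = np.diag([3.0, 3.0, 3.0, 1.0]); affine[:3, 3] = [-96, -96, -66]
a = np.zeros((64, 64, 44), bool); a[40:46, 15:22, 38:42] = True
b = np.zeros((64, 64, 44), bool); b[42:50, 13:20, 39:43] = True
volume_mm3, com_mm = overlap_stats(a, b, affine)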
Discussion
The right post-CS and the IPS regions were activated when the tactile orientation task was contrasted with the sensorimotor control condition (roughness task), regardless of the hand used. In contrast, when contrasted with its control condition (color task), the visual orientation task activated the left MOG. As hypothesized, a part of the right IPS was activated by both the tactile and visual orientation tasks and hence might represent a multisensory processing area.
Task design and behavioral performance
In the tactile pilot experiment, perceived roughness magnitude varied with grating orientation. Gratings feel rougher when the surface moves perpendicular to the ridges than when it moves along them. The roughness percept is likely enhanced by the skin consistently catching on the leading edges as the gratings are moved directly across the finger. The difference between the two other orientations may also be explained by mechanical effects attributable to differential skin catching.
The main task-design difference between the sensory modalities was in the timing of responses. The subjects responded after the presentation of each grating in the visual task, whereas in the tactile task they responded to all three gratings only after the presentation of the third surface. This difference derives from the nature of the tactile sensorimotor control task: the classification of surface roughness. Whereas subjects can rely on some form of spatial reference system to classify grating orientation, they must compare the perceived roughness of the current surface with that of the other surfaces to classify the gratings in terms of roughness. In a pilot study, subjects performed poorly at roughness classification when no reference surfaces were available. In the visual task, by contrast, using the same response timing as in the tactile task would have placed substantial demands on memory: the subjects would have had to retain more responses because of the shorter duration of the visual stimulus presentations.
The present experiment was designed to examine the existence of a multisensory area activated by both visual and tactile orientation tasks, as opposed to directly comparing the two modalities. Therefore, the orientation task for each sensory modality was contrasted with its own control condition, in which subjects were given the same sensory input, responded at the same time, and used the same fingers. In tactile tasks, furthermore, we excluded any activity during the response period from the task-related activity (test period) by using separate regressors. Collectively, the difference in the timing of responses between the sensory modalities should not explain the multisensory activation in the IPS.
It is known that the parietal-premotor cortical network is related to top-down attentional modulation (Corbetta, 1998; Hopfinger et al., 2000). In the present study, multiple brain areas were activated when the tactile tasks were compared with the motor control task. Because the motor control task was easier than the other conditions, this activation might reflect differences in attentional demands as well as in sensory and cognitive processing between the task conditions. In contrast, multisensory activation for the orientation judgment was found only in the IPS. In the tactile tasks, the sequential finger responses were easy to perform, as indicated by the virtually perfect accuracy in the motor control task. Because task accuracy was similar (below 90%) between the other two tactile tasks, we can assume that attentional demands were comparable between them. Furthermore, response times in the visual orientation task were significantly shorter than in its sensorimotor control task, indicating that any additional attentional demand fell on the control task rather than the orientation task. Collectively, it is unlikely that the multisensory activation was caused by higher attentional demands in the orientation task than in its own sensorimotor control.
In the orientation task, subjects needed to rely on a spatial reference system to classify the orientation of the gratings. In contrast, it was necessary to use other cues in the control conditions (i.e., roughness magnitude for tactile and color for visual modalities). Hence, contrasting the orientation tasks with their sensorimotor control conditions should highlight the spatial processing required during orientation classification.
Sensory-specific activation
The right post-CS and anterior IPS were activated by the tactile orientation judgment, regardless of the hand used. The asymmetric activation in these areas corresponded with the results of Harada et al. (2004), which showed right-lateralized activation in the IPS during the tactile discrimination of two-dot spatial patterns. These areas might play an important role in the extraction of spatial information such as grating orientation from the anterior part of the postcentral gyrus.
In contrast, the visual orientation task specifically activated the left MOG. This result confirms the findings of Faillenot et al. (2001), which showed left MOG activation during the visual discrimination of grating orientation. The MOG was also activated by other visuospatial tasks, including the judgment of line orientation (Kesler et al., 2004) and mental rotation tasks (Podzebenko et al., 2002). The MOG might work in concert with other cortical areas, such as the IPS, for the visuospatial processing of orientation.
Multisensory activation
The main finding of the current study is that the multisensory orientation judgment of gratings activated a subregion of the IPS. It has been known that the IPS is important for visual orientation judgment (Eacott and Gaffan, 1991). For instance, neurons in the anterior IPS of nonhuman primates are visually selective to the orientation of objects (Murata et al., 2000), whereas neurons in the posterior IPS are tuned for orientation in the fronto-parallel plane of elongated objects (Sakata and Taira, 1994). In humans, part of the IPS was also activated during the visual discrimination of grating orientation (Vandenberghe et al., 1996; Faillenot et al., 2001).
In contrast, it has been unclear how this region is involved in tactile orientation judgment. Previous studies indicated that the extrastriate area might be crucial, because it was activated during tactile orientation judgment (Sathian et al., 1997; Zangaladze et al., 1999). However, recent studies have also reported activation of the IPS during the tactile orientation judgment of gratings (Van Boven et al., 2005; Zhang et al., 2005). Together, the findings from the visual and tactile studies indicate that the IPS might be crucial for the multisensory judgment of orientation. However, there has been little direct evidence for a multisensory subregion within the human IPS that is involved in grating orientation judgment.
The current study extends the findings of previous studies by showing that a subregion of the right IPS is crucial for multisensory orientation judgment. Neural populations in this region might constitute a multisensory orientation-related network. This idea is supported by the results of previous fMRI studies (Grefkes et al., 2002; Saito et al., 2003; Zhang et al., 2004). For instance, Bremmer et al. (2001) showed that a ventral part of the IPS was activated by polysensory motion stimuli around the subject’s head. The authors proposed that the IPS subregion encodes sensory information from different sensory modalities in a body-centered frame of reference.
It is possible that the IPS subregion integrates the orientation representations from different sensory modalities into a supramodal representation within a single spatial reference system. The sensory-specific areas may extract the spatial information of grating orientation from early sensory areas, whereas the IPS subregion may transform such spatial information into a common spatial reference frame. In the present experiment, each sensory modality could encode grating orientation using several different systems of spatial reference, yet the subjects eventually responded with the same fingers. In other words, the subjects judged the grating orientations from both sensory modalities within the same type of spatial reference frame. This suggests that humans might be able to experience the orientation of gratings as a single, integrated representation. This hypothesis is supported by the notion that the posterior parietal cortex combines information from different sensory modalities to form a unified representation of space (Andersen et al., 1997). In spatial hemineglect, subjective orientation judgment is disrupted multimodally (Kerkhoff, 2001). Our results are consistent with this finding, implying that the right IPS might be crucial for orientation judgment.
Visual mediation heuristic
In contrast, we observed no strong activation of the extrastriate cortex during orientation judgments. According to the hypothesis of Sathian et al. (1997), the differences in results compared with previous studies might derive from the degree to which visual mediation was applied. The subjects in our study might have been able to judge grating orientation without visualizing the gratings as vividly as the subjects in previous studies. Congenitally blind subjects, who have no visual experience, can perform several haptic tasks that require spatial processing, including mental rotation tasks (Marmor and Zaback, 1976; Carpenter and Eisenberg, 1978), estimating the spatial density of textures (Merabet et al., 2004), and recognizing objects such as a mask, shoes, and bottles (Pietrini et al., 2004). These results suggest that the judgment of grating orientations might not always require visual mediation heuristics. In conclusion, a subregion in the middle IPS might play an important role in the multisensory judgment of grating orientation.
Footnotes
This work was supported by Grant-in-Aid for Scientific Research S#17100003 from the Japan Society for the Promotion of Science to N.S., by Grant-in-Aid for Scientific Research 018#17021045 from the Japanese Ministry of Education, Culture, Sports, Science, and Technology to N.S., and by grants from the National Sciences and Engineering Research Council of Canada and the Canadian Institutes of Health Research to S.J.L. We thank M. Shimura for his kind cooperation in crafting the surfaces of linear gratings; Y. Nawa, K. Yamanaka, Dr. H. Mochiyama, and M. Murase for their technical support; and T. Takei and Dr. A. Sano for their cooperation in recruiting subjects.
Correspondence should be addressed to Dr. Norihiro Sadato, Department of Cerebral Research, National Institute for Physiological Sciences, Okazaki, Aichi 444-8585, Japan. Email: sadato@nips.ac.jp