Numerous studies have investigated how information about the position of a target object perceived through the senses is converted into motor commands, so that an effector can act toward this target. One of the challenges the brain faces in solving this task is the conversion of spatial coordinates initially perceived with reference to the sensory organs (e.g., with respect to the retina in vision or with respect to the head in audition) into coordinates that can guide the chosen effector, usually the hand, toward the target. In the early 1990s, neurons were discovered in the macaque monkey brain that have a tactile receptive field (RF) centered on the hand or face and a visual RF around the location of the tactile RF (Colby et al., 1993; Graziano et al., 1994). Importantly, these neurons often respond to a visual stimulus only if it is no more than ∼30 cm away from the hand. These neurons therefore seem to represent only the space immediately around the body, the so-called "peripersonal" or "near" space. Furthermore, the visual RF of these neurons follows the hand when it is moved; in other words, visual stimuli near the hand are coded by these neurons with respect to the hand, not the eyes. The firing rate of these neurons is influenced both by vision of a fake monkey arm and by displacement of the unseen real arm, indicating that both visual and proprioceptive signals are used to determine arm position (and, thus, stimulus location with respect to the arm).
Evidence that such a representation of peripersonal space also exists in the human brain has come mainly from purely behavioral testing of healthy participants (Maravita et al., 2003) or from neuropsychological studies of patients suffering from hemispatial neglect. These patients sometimes show either an amelioration or an aggravation of their neglect syndrome when a stimulus is presented in their peripersonal compared with their extrapersonal space (Ladavas, 2002).
In a recent study published in The Journal of Neuroscience, Makin et al. (2007) provide compelling evidence for a representation of peripersonal space in humans using functional magnetic resonance imaging in healthy participants. In a 2 × 2 × 2 factorial design, the authors presented a small ball near and far from the left hand with respect to both visual and proprioceptive information [Makin et al. (2007), their Fig. 1 (http://www.jneurosci.org/cgi/content/full/27/4/731/F1)]. The stimulus was presented near the left thigh. In one condition, the hand was placed visibly on the left thigh, putting the hand near the stimulus both visually and proprioceptively. In a second condition, the hand was placed on the shoulder, such that the stimulus at the thigh was now far from the hand both visually and proprioceptively. In a third condition, the hand placed on the thigh was occluded by a cardboard shield (near was signaled only by proprioception, because visual information about the hand was absent). Finally, in a fourth condition, the participant's hand was placed on his or her shoulder, while a prosthetic (dummy) hand was placed on his or her thigh (vision near, proprioception far). All four of these conditions were also run with a far stimulus presented 70 cm toward the feet from the thigh stimulus, allowing for a comparison of near versus far stimuli in all conditions in which at least one sense signaled the thigh stimulus to be near. Because peripersonal neurons have both visual and tactile RFs, in a separate, purely tactile part of the experiment, Makin et al. (2007) touched the participant's hand with the stimulus while the participant closed his or her eyes.
Increased activity to the near over the far stimulus, when both vision and proprioception signaled stimulus proximity relative to the hand, was found in the ventral premotor cortex, the intraparietal sulcus (IPS), and the lateral occipital complex (LOC) [Makin et al. (2007), their Fig. 3 (http://www.jneurosci.org/cgi/content/full/27/4/731/F3)]. Moreover, posterior areas (LOC and posterior IPS) were activated more by the near than the far stimulus whenever the hand was visually close to the stimulus (i.e., the "hand on thigh" and "rubber hand on thigh" conditions) [Makin et al. (2007), their Fig. 5 (http://www.jneurosci.org/cgi/content/full/27/4/731/F5)]. These areas were not significantly modulated by proprioceptive information (i.e., their activation to a near stimulus was similar to that to a far one in the "hand on thigh but covered" and "hand on shoulder" conditions), although the effect only just missed significance (p = 0.07).
In contrast, the anterior IPS was significantly more activated by a near stimulus in the "hand on thigh but covered" condition, in which proximity to the stimulus was signaled only proprioceptively, albeit only at p = 0.03 (compare the nonsignificant p = 0.07 for the same condition in the posterior IPS) [Makin et al. (2007), their Fig. 6 (http://www.jneurosci.org/cgi/content/full/27/4/731/F6)]. At the same time, the rubber-arm condition (vision near only) did not show a significant far–near difference, so the relative influence of vision and proprioception differed between the posterior and anterior IPS. In addition, only the anterior IPS also responded to the purely tactile stimulation.
The authors interpret their results as indicating a caudorostral gradient in the kind of information used to determine the spatial relation of the stimulus to the hand. In the LOC and the posterior IPS, perihand space is defined primarily through vision, whereas in more anterior parts of the parietal lobe and in the frontal lobe, information about perihand space is formed from vision, proprioception, and somatosensation. However, the small statistical difference between the proprioceptive effects in the posterior and anterior IPS (p = 0.07 vs 0.03) suggests that this interpretation should be confirmed by replication. Furthermore, in macaques, at least neck-proprioceptive information reaches the part of the IPS subsumed here as posterior (Snyder et al., 1998).
The parietal and frontal areas reported by Makin et al. (2007) to be involved in the representation of peripersonal space are well in line with neurophysiological data from macaques and with imaging studies concerned with action planning and hand–object manipulation in humans. Thus, notwithstanding the open question of which sense, vision or proprioception, dominates the determination of hand position, this study provides compelling evidence for a human homolog of the monkey network representing peripersonal, or at least perihand, space. It remains to be determined whether the areas reported here represent the space around the hand alone and subserve mainly hand action or whether they mediate the peripersonal space around the rest of the body as well. This question has not been exhaustively answered even in the monkey brain.
Perhaps the most intriguing result is the involvement of visual area LOC, which is located in the ventral ("what") rather than the dorsal ("where") stream; the ventral stream has so far not been linked to the representation of peripersonal space, let alone shown to be modulated by hand position with respect to a stimulus. This study is therefore a good example of how human neuroimaging can base a hypothesis on neurophysiological work in monkeys, as well as generate new hypotheses and research questions, which may then be pursued with neurophysiological recording or functional imaging.
Footnotes
- Editor's Note: These short reviews of a recent paper in the Journal, written exclusively by graduate students or postdoctoral fellows, are intended to mimic the journal clubs that exist in your own departments or institutions. For more information on the format and purpose of the Journal Club, please see http://www.jneurosci.org/misc/ifa_features.shtml.
- Correspondence should be addressed to Tobias Schicke, Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany. tobias.schicke{at}uni-hamburg.de