Abstract
The neural control of hand movement involves coordination of the sensory, motor, and memory systems. Recent studies have documented the motor coordinates for hand shape, but less is known about the corresponding patterns of somatosensory activity. To initiate this line of investigation, the present study characterized the sense of hand shape by evaluating the influence of differences in the amount of grasping or twisting force, and differences in forearm orientation. Human subjects were asked to use the left hand to report the perceived shape of the right hand. In the first experiment, six commonly grasped items were arranged on the table in front of the subject: bottle, doorknob, egg, notebook, carton, and pan. With eyes closed, subjects used the right hand to lightly touch, forcefully support, or imagine holding each object, while 15 joint angles were measured in each hand with a pair of wired gloves. The forces introduced by supporting or twisting did not influence the perceptual report of hand shape, but for most objects, the report was distorted in a consistent manner by differences in forearm orientation. Subjects appeared to adjust the intrinsic joint angles of the left hand, as well as the left wrist posture, so as to maintain the imagined object in its proper spatial orientation. In a second experiment, this result was largely replicated with unfamiliar objects. Thus, somatosensory and motor information appear to be coordinated in an object-based, spatial-coordinate system, sensitive to orientation relative to gravitational forces, but invariant to grasp forcefulness.
Introduction
The control of hand use is one of the most sophisticated functions of the human nervous system. Behaviors like cooking and grooming involve making and breaking contact, squeezing and twisting objects, and using tools and utensils to move and scrape external surfaces. The purely motoric aspects of shaping and moving the hand are now reasonably well understood. We have learned that neuromuscular control of the hand is essentially distributed, but at the same time dominated by patterns of covariation in the excursion of individual joints and the activation of the individual motoneurons and motor cortical neurons (Weiss and Flanders, 2004; Thakur et al., 2008; Dombeck et al., 2009; Hendrix et al., 2009). Furthermore, our studies of sequences of human hand movements imply that the CNS continuously maintains a representation of the current hand shape, so that this state information can be treated as the initial condition upon which to generate new motor commands (Jerde et al., 2003; Flanders, 2011). Thus, the general nature of the motor output and the basic algorithm for producing ongoing movements are reasonably clear.
The neurophysiology of the somatosensory aspect is not as well understood. It is well known, from classic anatomical and physiological studies, that the human hand has an abundance of different types of somatosensory receptors. For example, there are stretch receptors in muscle, at the junctions between muscle and tendon, and in joint capsules. There are also many types of specialized sensory nerve endings in the skin, especially at the fingertips, and there are receptors in the connective tissue under the skin that can potentially signal more global patterns of stretch. Interestingly, by recording action potentials from cutaneous mechanoreceptor afferents, Edin (1992) showed that receptor endings in the skin on the back of the human hand could signal patterns of joint configuration, i.e., hand shape. Thus, like the motor output, the peripheral sensory input seems to be widely distributed, gathering different types of information from a range of locations. However, despite numerous elegant studies of the hand area of monkey somatosensory cortex (for review, see Romo and Salinas, 2003; Hsiao, 2008), it is still not clear how this system synthesizes and coordinates its many inputs.
Most hand use can be classified as active sensing, meaning that the sensory input reflects a combination of external sources and aspects produced by self-movement. Our working hypothesis in approaching this problem is that somatosensory perception is best understood as a process that arises from a combined sensorimotor coordinate system; i.e., sensory inputs are normally interpreted in light of motor information about self-movement. Thus, we predict that subjects' interpretation of a pattern of intrinsic joint configurations (hand shape) should be invariant to self-actions like weight support, squeezing, and twisting, even though these actions profoundly alter the pattern of somatosensory input. Likewise, when musculoskeletal biomechanics are changed by altering arm posture, the sensorimotor system should still maintain an accurate representation of intrinsic hand shape.
Materials and Methods
Protocol.
Five adult human subjects (three male, two female) were recruited for experiment 1. Two of these subjects (one male, one female) returned for experiment 2, for which we also recruited three new subjects (two male, one female). Each subject provided written, informed consent and reported being free of any known neurological or muscular disease. Two of the subjects were left-handed.
In experiment 1, before data collection, each subject was presented with six common items, and was allowed to interact with them using only the right hand. Objects were selected to sample a range of qualitatively different hand shapes, and they varied dramatically in weight and size, as shown in Figure 1. The heaviest object was the orange juice (OJ) carton. It was emptied of OJ and then partially refilled with room-temperature water to weigh 1385 g. For the egg, a chicken egg shell was emptied of fluid and refilled with sand to match the natural weight. To avoid using a heavy metal pan in the magnetic field of our measurement system, we substituted an aluminum strainer pan with a wooden handle (233 g) and recreated the inertial balance typical of a metal frying pan by placing a 571 g heavy plastic object in the strainer basket. The bottle was full of water.
In experiment 1, subjects were instructed to hold each object in a natural grip and to use the same right hand grip across repeated trials for a given object. After achieving familiarity with the objects, subjects were instructed to imagine, lightly touch, or forcefully support a particular object with their right hand. For the support condition, the subject was required to suspend the object in the air, or twist and hold the object in a fixed position. The bottle and doorknob were fixed to stationary bases and the subject used a natural action and then maintained a static twisting force (the bottle top was screwed on tight to keep it from turning and the doorknob was spring-loaded). The other objects were simply lifted and held above the table top (Fig. 1).
Our primary instruction to subjects always emphasized hand shape. Subjects were instructed that, after adopting their right hand posture for a specified object, they were to reproduce their right hand's shape with their left hand. An additional instruction was that, before shaping the left hand, the left forearm was to be held opposite the right forearm in space (matched condition), or dropped down to the side of the body with the left elbow fully extended (extended condition). Subjects were required to complete the entire task without the use of vision. They became familiar with the locations of the objects on the table and then, beyond the initial practice period, they were never allowed to look at either hand. The experimental procedure consisted of 16 blocks of trials. Each block contained one trial for each of the 36 object/contact/posture combinations, in different pseudorandom orders.
In experiment 2, we placed unfamiliar objects, one by one, on a table. The objects and the hands were always screened from the subjects' view. The objects (described in Table 1) were approximately the size of an orange or a grapefruit (∼12 cm tall), and weighed ∼100–500 g. Approximately half of the objects were flat, 2.4 cm thick, wooden cutouts, with various shapes (Santello and Soechting, 1998). In some cases, the object was approached from its front side with a horizontal reach, touched with all right hand fingertips, and then lifted slightly up, just off the table (front-lift condition). In other cases, the objects were approached by the right hand from above, and were touched lightly with all of the fingertips, but remained on the table (above-touch condition). As in experiment 1, subjects were instructed that, after either lifting or touching the object, they were to recreate their right hand's shape with their left hand either dropped to their side (extended condition) or held opposite their right hand in space (matched condition).
Unfamiliar objects in experiment 2
The experiment 2 procedure consisted of 15 blocks in which the trial order was pseudorandomized (avoiding sequential presentation of the same object). Each block contained one trial for each of the 24 object/contact/posture combinations. The last block of trials was an eyes-open condition, in which subjects were able to see both the object and the right hand.
Data acquisition.
In experiment 1, static left and right hand postures were recorded using 18-sensor Cybergloves (Virtual Technologies). Both gloves were open at the finger tips, thus exposing the pads of the fingers and thumb. In experiment 2, only the left hand wore a glove (leaving the right hand bare). Joint angle data were collected at 83.33 Hz for 0.5 s. Before the experiment, the joint angles were measured over a series of standard postures to compute calibration gains and offsets for each glove sensor. Furthermore, four of the standard postures were collected before and after the main experiment to ensure the stability of sensor readings over time. We also performed a control experiment to test for a direct pull of gravity on the resistive bend sensors that are sewn into the glove. We compared joint angle readings in the matched versus extended conditions while one subject held unfamiliar object C, D, or G (Table 1), with five randomized repeats of each pair. We examined the data from each of the 15 joint angles and found no difference (n = 225; mean difference, 0.16°; paired t test; p = 0.19).
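The per-sensor linear calibration described above can be sketched as a least-squares fit. The following Python code is an illustrative reconstruction, not the study's actual procedure (which used MATLAB and is not fully specified here); it assumes each bend sensor is modeled as angle ≈ gain × raw + offset, with the gain and offset estimated from standard postures of known joint angle. The raw sensor counts below are made-up values.

```python
# Hypothetical sketch of per-sensor glove calibration: fit a gain and
# offset for one bend sensor from standard postures with known angles.

def fit_gain_offset(raw, angle):
    """Ordinary least-squares fit of angle = gain * raw + offset."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_a = sum(angle) / n
    sxy = sum((r - mean_r) * (a - mean_a) for r, a in zip(raw, angle))
    sxx = sum((r - mean_r) ** 2 for r in raw)
    gain = sxy / sxx
    offset = mean_a - gain * mean_r
    return gain, offset

# Made-up raw counts recorded at standard postures of 0, 45, and 90 degrees
raw_counts = [102.0, 148.0, 195.0]
known_angles = [0.0, 45.0, 90.0]

gain, offset = fit_gain_offset(raw_counts, known_angles)
calibrated = [gain * r + offset for r in raw_counts]
```

Collecting a few of the standard postures again after the experiment, as the authors did, allows the same fit to be recomputed and compared against the original gains and offsets as a drift check.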
Forearm location and orientation data were recorded at 60 Hz using a 3Space Fastrak system (Polhemus). 3Space sensors were secured on the left and right wrists of the subjects, proximal to the wrist folds, to measure forearm position with respect to a fixed origin, the tabletop horizontal plane, and the vertical axis.
After each half-second trial, the recorded data were checked for stability and then averaged across time to give a single, 23-number vector for each hand specifying 15 intrinsic hand joint angles, two wrist angles, and six degrees of freedom of forearm position.
Data analysis.
In experiment 1, only 15 of the 16 collected repeats for each condition were analyzed. The first trial of each condition (Fig. 1, red traces) was discarded unless there was an obvious deviation from the mean in any of the other 15 trials, in which case the error trial was replaced by the first trial to preserve the 15-trial sample size. This happened only four times across all subjects (0.14% of trials). For experiment 2, there were three discarded error trials.
For both experiments, principal components analysis (PCA) and Mahalanobis distance (M-dist) were the primary analytical tools (MATLAB princomp, cov, and mean functions; MathWorks). Since each 15-dimensional hand shape corresponds to a location in principal component space (Fig. 2), the PCA was used to visualize and interpret changes in hand shape across different experimental conditions. M-dist was used to quantify reliable differences across experimental conditions. For comparison, we also calculated the standard (unscaled) Euclidean distance.
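The projection of hand shapes into principal component space can be illustrated with a short Python sketch (the study itself used MATLAB's princomp). The data here are synthetic, with only 4 joint angles rather than 15, purely for brevity; each row is one trial.

```python
# Illustrative PCA of trial-by-angle data: center, eigendecompose the
# covariance, and return per-trial scores on the top components.
import numpy as np

def pca_scores(X, n_components=2):
    """Center X (trials x angles) and return scores on the top PCs."""
    Xc = X - X.mean(axis=0)                 # subtract the mean posture
    cov = np.cov(Xc, rowvar=False)          # joint-angle covariance matrix
    evals, evecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    order = np.argsort(evals)[::-1]         # re-sort descending
    components = evecs[:, order[:n_components]]
    return Xc @ components                  # per-trial PC scores

# Synthetic example: 15 repeated trials of 4 joint angles with
# anisotropic variability, so PC1 captures the dominant covariation
rng = np.random.default_rng(0)
trials = rng.normal(size=(15, 4)) * np.array([10.0, 5.0, 1.0, 0.5])
scores = pca_scores(trials)
```

Each trial's pair of scores then corresponds to one symbol in plots such as Figure 2B, and the variance of the PC1 scores is, by construction, at least as large as that of PC2.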
The M-dist value is analogous to a Student's t value; it weights the distance between the means (mean1 and mean2) of two multidimensional datasets (dataset 1 and dataset 2) according to the variance in each dimension. The M-dist between two datasets is defined as follows:

\[ \text{M-dist} = \sqrt{(\mathrm{mean}_1 - \mathrm{mean}_2)^{\top}\, C_{\mathrm{pooled}}^{-1}\, (\mathrm{mean}_1 - \mathrm{mean}_2)} \]

where pooled covariance is as follows:

\[ C_{\mathrm{pooled}} = \frac{(n_1 - 1)\, C_1 + (n_2 - 1)\, C_2}{n_1 + n_2 - 2} \]

where \(C_1\) and \(C_2\) are the covariance matrices of the two datasets and \(n_1\) and \(n_2\) are the numbers of trials in each.
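A minimal Python sketch of this computation follows. It assumes the standard unbiased pooled covariance; with equal trial counts per condition, as in these experiments, the choice of pooling weights does not change the result. The datasets below are synthetic, with only 4 angle dimensions so that the sample covariance is well conditioned.

```python
# Mahalanobis distance between the means of two clouds of hand-shape
# vectors (trials x angles), scaled by the pooled covariance.
import numpy as np

def mahalanobis_between(X1, X2):
    """M-dist between the means of two (trials x angles) datasets."""
    n1, n2 = len(X1), len(X2)
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    c1 = np.cov(X1, rowvar=False)
    c2 = np.cov(X2, rowvar=False)
    # Unbiased pooled covariance of the two clouds
    pooled = ((n1 - 1) * c1 + (n2 - 1) * c2) / (n1 + n2 - 2)
    d = m1 - m2
    # sqrt(d' * pooled^-1 * d), solved without forming the inverse
    return float(np.sqrt(d @ np.linalg.solve(pooled, d)))

# Synthetic example: two hypothetical conditions of 15 trials each
rng = np.random.default_rng(1)
touch = rng.normal(size=(15, 4))
support = rng.normal(size=(15, 4)) + 1.0   # mean shifted by 1 in every angle
dist = mahalanobis_between(touch, support)
```

Because the difference of means is scaled by the pooled covariance, the same separation yields a larger M-dist when trial-to-trial variability is low, which is the property exploited in Figure 4.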
To test the influence of grasp forcefulness on the perception of hand shape, M-dist values were calculated between touch and support conditions. To examine the influence of forearm posture, M-dist values were calculated between extended and matched conditions. These values were calculated separately for each subject, hand, and object. The resulting M-dist datasets were then further examined using a two-way ANOVA, with Scheffé post hoc testing. For selected sets of angle measurements, we also used paired t tests when appropriate, with data paired by subject and/or condition or object. Statistical testing was done using SYSTAT software.
Results
General observations
Although the experimental design allowed us to test for changes in the perceptual report across time, we observed no such changes. For both experiments, we examined the values of several parameters across sequential blocks of trials in the 2-h-long experiment. Occasionally, the first trial was noticeably different (Fig. 1, red traces). By design, the first trial was excluded from analysis in experiment 1, but because we had fewer repeats, it was not excluded in experiment 2. In most cases, subjects seemed to quickly settle upon their strategy, and then stick with it. During the first block of trials, three of the subjects asked whether there was any explicit instruction regarding the posture of the left wrist when the left elbow was extended. We responded that there was not, and we then reiterated that the only instruction was to use the left hand to report the shape of the right hand.
Hand shapes and hand shape data are shown for the six familiar objects used in experiment 1. Data from the first trial are shown in red; data from the 15 repeated trials are depicted by the overlaid black traces. The PC and M-dist analyses used the 15 intrinsic joint angles; they did not include the wrist angles (the last two points in each trace). These data are from subject 5's right hand in the support/matched condition. ROT, Rotation; ABD, abduction.
Experiment 1
In the first experiment, open-tipped gloves were worn on both hands. By design, joint angle data from the two hands were not compared directly. Due to mechanical differences in the hands and gloves, and potential inaccuracies in the calibrations, all key comparisons were between experimental conditions within the same hand. Summary results could then be contrasted for the left and right hands.
Hand shape data were initially examined by plotting each trial in the coordinate system of the first two principal components (Figs. 2, 3). However, statistical comparisons across conditions were based on all 15 dimensions. As listed in Figure 1, these 15 dimensions included three angles of the thumb [flexion/extension of the thumb metacarpal–phalangeal (MCP) and proximal interphalangeal (PIP) joints, and thumb rotation]. Also included were four abduction angles between adjacent digits (Fig. 1, arrows), as well as the MCP and PIP joints of each finger (eight flexion/extension angles) (Fig. 2A). Positive angle values represent thumb internal rotation and abduction (pulling apart) of neighboring digits. For example, as shown in Figure 1, the notebook hand shape had a relatively large thumb internal rotation (almost 100°), and the egg hand shape had relatively large abductions between neighboring fingers (∼20°). Positive values also correspond to MCP and PIP flexion. Zero degrees for MCP and PIP flexion corresponds to a straight finger, as in the little finger PIP for the bottle top. Note that it is possible for the thumb PIP to hyperextend (negative values), as in the grasp of the strainer pan handle.
General characteristics of a principal component coordinate system for human hand shape. A, PCs, as computed in the study by Weiss and Flanders (2004). Note that the two principal axes (PC1 and PC2) depend on the set of hand shapes used to compute them, but they are always orthogonal (uncorrelated). B, Hand shape data plotted in the PC space computed for subject 1. Repeated trials for the six familiar objects are located in this space using data from the left hand in the support/matched condition. Triangles, Notebook; circles, bottle top; ovals, egg; empty squares, OJ carton; filled squares, strainer pan; pentagons, doorknob.
An example showing the difference between conditions for extended (open symbols) versus matched (filled symbols). A, Repeated trials for subject 1/notebook are shown for the left hand (triangles) and right hand (circles). B, Lines show the two-dimensional distance between condition averages (note that M-dist was actually calculated as the reliable separation between clouds of data points, using all 15 dimensions).
Based on the results of preliminary experiments and prior studies (Santello et al., 1998; Weiss and Flanders, 2004), the various objects were chosen such that the hand shapes occupied a wide range of distinct portions of principal component space (Fig. 2B). Using data from a representative subject and the principal component space calculated for that subject's left hand data, Figure 2B illustrates the typical range of left hand variability for repeated trials. Variability was smaller for the right hand, especially in the support (Fig. 3A) and touch conditions, where the hand was in contact with the object.
Figures 3 and 4 illustrate our method for evaluating the reliable difference between two sets of hand shapes. In Figure 3A, we have plotted data for the repeated trials when one subject grasped and lifted the notebook with the right hand and then reported the hand shape with the left arm posture either extended or matched. For both the left and right hands, the 15-dimensional clouds of data points largely overlapped, as can be observed in the two dimensions of the primary principal components. As illustrated in Figure 3B, for this comparison between extended and matched conditions, the separation between the cloud averages was very small for the right hand, but larger for the left hand. In this example, the separation was especially prominent for the imagine condition, although this was not generally the case across subjects and objects. For the notebook, in all three conditions (imagine, touch, and support) the extended arm posture was associated with a larger positive score for the second principal component (PC2).
Summary of M-dist for the left hand (white) and right hand (gray). Each bar represents the average (and SE) across the experiment 1 subjects and secondary conditions. A, M-dist values compare touch and support conditions. B, M-dist values compare extended versus matched conditions. Hatched areas represent Euclidean distances.
In comparing two clouds of data points, larger M-dist values result when the averages are far apart and the trial-to-trial variability is low. Thus, Figure 4 serves to quantify the reliable difference between conditions, with M-dist values from the performing hand (right hand) being used as a control/contrast for the reporting hand (left hand).
Figure 4A shows hand shape differences between the experimental conditions where the right hand exerted different amounts of force against the objects (touch vs support conditions), thus altering both the somatosensory and the motor activities associated with each object. Right hand M-dist values were significantly greater than left hand values. The same right–left trend was present in the Euclidean distances.
The forcefulness M-dist comparison included both matched and extended trials (n = 120, F = 223, p < 0.001). There was also a significant difference across objects, and a significant interaction between hand and object (both p < 0.01). Thus, the shape of the right hand was slightly changed to support the object (compared with light touch), but the perceptual report of the left hand was relatively insensitive to this change. This was true for all objects but was particularly prominent for supporting the heavy orange juice carton, which consistently involved a wider shape for the right hand (Fig. 1).
In contrast to the relative invariance of left hand shape across changes in right hand forcefulness, the left hand shape changed more when the left forearm posture changed. Using an expanded scale on the y-axis, Figure 4B shows the M-dist for the six objects. In this case, the M-dist between clouds of data points represents the difference in the shape of the right or left hand for trials where the left forearm was extended or matched. Opposite from the result in Figure 4A for the touch versus support comparison, Figure 4B shows that the left hand changed shape more dramatically than did the right hand during left forearm extension (n = 180, F = 157, p < 0.001). This comparison included imagine, touch, and support conditions. Post hoc testing revealed no significant differences in this left/right tendency across the six objects.
We sought to interpret this change in the left hand's report of perceived hand shape by viewing the data in principal component (PC) space. In the coordinate system created by the first three PCs, we examined pairs of data points representing the extended and matched conditions. In Figure 5, we illustrate this using a principal component space calculated from the full set of left hand data, combined across all five subjects. Especially for the egg (top) and the strainer pan (bottom), we observed shifts that were consistent across subjects and across imagine, touch, and support situations (the three data pairs and three lines for each subject). In Table 2, we list the results of a paired t test for each of the six familiar objects. Consistent matched-to-extended shifts were seen in at least one of the first four principal components for all objects except the OJ carton.
Consistent changes in left hand shape for extended (open triangles) versus matched (filled circles) conditions for two familiar objects. Experiment 1 subjects are coded by colors: blue, subject 1; green, 2; red, 3; cyan, 4; magenta, 5. The data from each subject are represented by three pairs of data points: averages for imagine, touch, and support conditions.
Shifts in PC scores from the matched to the extended condition in experiment 1
To illustrate the direction of the shifts, in Figure 5, the lines connect filled circles, representing the matched condition, with open triangles, representing the extended condition. For the egg, the instruction to replicate right hand shape with the left forearm extended toward vertical resulted in left hand shapes with more negative scores for PC1, PC2, and PC3 (lines move down and to the left). When the right hand was shaped around the real or imagined handle of the strainer pan, extending the left forearm toward vertical resulted in left hand shapes with more positive scores for PC1 and PC3 (lines move to the right).
To understand these tendencies, we need to consider the patterns of covariation in intrinsic joint angles that represent the primary principal component axes. In Figures 6 and 7, extremes of the positive and negative axes were added to the average hand shape (zero on PC plots). The first principal component always represents the main pattern of covariation across the set of hand shapes used to compute it. As illustrated in Figure 2A, in previous studies (Santello et al., 1998; Weiss and Flanders, 2004), it mainly represented finger MCP flexion (positive scores indicate a closed fist) and finger MCP extension (negative scores indicate an open hand). The same was true for the present study (Fig. 6, top). Thus, a more negative PC1 represents more extended finger MCP joints (Fig. 7, top left).
Principal components calculated from data combined across all subjects, objects, and conditions in experiment 1. Extremes of the positive and negative axes were added to the average hand shape (Fig. 5, zero on PC plots). Note that PC1 is largely comprised of MCP flexion (positive) and extension (negative); PC2 is PIP flexion (negative); PC3 (positive) is thumb internal rotation/thumb PIP extension, as well as ring and little finger MCP extension toward zero (straight) and PIP flexion. ROT, Rotation; ABD, abduction.
Top, An approximated visualization of the tendencies for changes in hand shape that would result from an increased negative contribution of PC1 and PC2 (left), or an increased positive contribution of PC3 (right). Bottom, Overlay of positive (solid line) and negative (dashed line) extremes of the principal component axes. Arrows highlight the directions of the observed matched-to-extended shifts. Negative PC2 features a curl of the index and middle interphalangeal joints with the MCPs extended (extd) and the PIPs flexed (flex). Positive PC3 features the same sort of interphalangeal curl in the ring and little fingers.
The second principal component is always orthogonal to the first, and it accounts for the second largest amount of the variance in the dataset. In the present study, it was largely characterized by finger PIP flexion in its negative version (Fig. 6, middle, dashed line; Fig. 7, left). The third principal component is orthogonal to PC1 and PC2 and accounts for the third largest amount of variance. In this case, positive PC3 was thumb internal rotation/thumb PIP extension, as well as ring and little finger MCP extension toward zero (straight) and PIP flexion (Fig. 6, bottom; Fig. 7, right). The positive fourth principal component (data not shown) featured flexion of the index and middle finger MCPs and the ring finger PIP.
Thus, when instructed to replicate the intrinsic shape of the right hand with the left forearm extended toward vertical, subjects distorted the left hand's shape in a consistent manner for most objects. Visualizing these shifts in hand shape, as in Figure 7 (egg, left; strainer pan handle, right), led us to hypothesize that subjects' perceptual reports may have been biased toward a memory of the sensation of holding the object in its proper spatial orientation, relative to gravity. If so, the prediction is that in addition to shifting the hand shape in an object-specific manner, subjects should also adjust the angles of the left wrist in an object-specific manner.
The left wrist data support this hypothesis. In Figure 8, we use colored lines (subject code) to connect pairs of wrist angles for corresponding extended and matched conditions. For each object, we plot wrist pitch (extension or flexion) and wrist yaw (little finger toward forearm or thumb toward forearm) as a function of forearm elevation angle (0°, horizontal; 90°, vertical). Depending on the object, data lines parallel to dashed +45° or −45° diagonal lines would represent perfect counter-rotation of the arm and wrist to maintain the hand in a fixed spatial orientation. The data lines show a moderate tendency toward this counter-rotation; in most cases complete counter-rotation is anatomically impossible.
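The counter-rotation geometry described above can be illustrated numerically: if the wrist rotates to cancel a change in forearm elevation, the sum of the two angles stays constant, and a regression of wrist angle on forearm elevation has slope −1 (the ±45° diagonals in Fig. 8). The following Python sketch uses made-up angle values, not data from the study, to contrast perfect and partial counter-rotation.

```python
# Hypothetical illustration of wrist/forearm counter-rotation: the slope
# of wrist angle regressed on forearm elevation measures how completely
# the wrist cancels the forearm's reorientation (-1 = perfect).

def slope(x, y):
    """Least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Forearm elevation moving from horizontal (0 deg) toward vertical (-90 deg)
elevation = [0.0, -30.0, -60.0, -90.0]

# Perfect counter-rotation: wrist extension exactly cancels the elevation
# change, keeping the hand's spatial orientation fixed (slope = -1)
wrist_perfect = [-e for e in elevation]

# Partial counter-rotation, as the data showed: only part of the change
# is canceled, since full cancellation is often anatomically impossible
wrist_partial = [0.0, 18.0, 35.0, 50.0]
```

A slope between −1 and 0 for the partial case corresponds to the "moderate tendency toward counter-rotation" visible as data lines that tilt toward, but do not reach, the dashed diagonals.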
In experiment 1, wrist angles were associated with the spatial orientation of the forearm in an object-appropriate manner. Perfect counter-rotation of wrist pitch or yaw angles would produce data lines parallel to the dashed, diagonal lines. Experiment 1 subjects are coded by colors: blue, subject 1; green, 2; red, 3; cyan, 4; magenta, 5. To simplify each plot, only one line is shown for each subject; it is the average across imagine, touch, and support conditions. Symbols in each panel show the results of paired t tests comparing wrist angles for extended (extd) versus matched (flex) conditions. n = 15 (5 subjects × 3 conditions); *p < 0.05; **p < 0.01; ***p < 0.001; NS (not significant), p > 0.05.
For the bottle top and the egg, only one subject (Fig. 8, red lines) came close to perfect counter-rotation, but all subjects exhibited the object-appropriate tendency. For these objects, the hand was held palm-down (Fig. 1) so the appropriate counter-rotation would produce data lines parallel to the −45° diagonal, with wrist extension (positive) corresponding to the forearm being positioned downward toward vertical (negative). For the notebook, the palm was up and, again, all subjects exhibited the appropriate counter-rotation. Note that the notebook-associated shift in wrist posture was opposite to that for the egg. Interestingly, the shift in hand shape was also opposite, as evidenced by the positive, rather than negative, shift in PC2 (Table 2).
For the OJ carton and the strainer pan, where the back of the hand was more vertical, as was appropriate, the tendency was more significant in wrist yaw than in wrist pitch. Consistent with the results for various aspects of hand shape (Figs. 2, 4; Table 2), the wrist pitch pattern for the OJ carton was unique.
Experiment 2
In the second experiment, only the left hand wore a wired glove, leaving the right hand free to interact more fully with the unfamiliar objects (Table 1). At the end of each session, we asked the subjects how many objects they thought they had interacted with. They were generally unaware of the exact number, but as early as the second block of trials, most said that they had noticed that some objects were being repeated. We had hypothesized that there might be changes across time, as subjects became more familiar with the objects, but this was not generally observed.
In experiment 2, the main results of experiment 1 were replicated for some subjects and some objects. The perceptual report of hand shape was influenced by forearm orientation even when the objects were unfamiliar, as measured by the left hand M-dist values for extended versus matched conditions. The mean M-dist value in experiment 2 was 6.5 ± 0.4 SE, compared with the experiment 1 mean value of 6.7 ± 0.3 SE. Likewise, the left hand Euclidean distance averaged 19.0° for both experiments (1.1 SE for experiment 1; 0.9 SE for experiment 2).
However, in experiment 2, we encountered more variation across subjects and objects. This is apparent in the trends for wrist angles (Fig. 9). For the front-lift condition, subjects 2, 4, and 5 tended to counter-rotate in yaw (Fig. 9A–H). Subject 1 also showed this yaw counter-rotation, but perhaps because of the fit of his body geometry in our experimental set up, he used a limited range of forearm elevation angles. Subject 3 did not vary his wrist yaw across conditions.
In experiment 2, wrist angles were sometimes related to the spatial orientation of the forearm. A–H, For objects approached from the front and lifted, wrist yaw angle is plotted against forearm elevation angle. I–L, For objects that stayed on the table and were approached from above, wrist pitch angle is plotted against forearm elevation angle. Perfect counter-rotation of wrist yaw or pitch angles would correspond to data lines parallel to the dashed unity-slope lines. Experiment 2 subjects are coded by colors: green, 1; magenta, 2; teal, 3; yellow, 4; brown, 5. The results of a paired t test are indicated in each panel. n = 5; *p < 0.05; NS (not significant), p > 0.05.
However, all subjects did counter-rotate left wrist pitch for objects that were touched from above with a horizontal palm (above-touch condition), but remained on the horizontal table (Fig. 9I–L). Data lines parallel to the dashed lines represent perfect counter-rotation of the left wrist, maintaining the spatial orientation of an imagined hand-held object.
Discussion
Because of the close cooperation between the somatosensory and motor systems, we hypothesized that the report of the left hand would be invariant to self-produced changes in the grasp forcefulness of the right hand and the posture of the left arm. Our results supported the first prediction but not the second. Instead, we found that, in solving the task of replicating intrinsic hand shape, subjects often used a strategy that involved counter-rotation of the forearm and hand, tending to maintain the spatial orientation of the hand. In conjunction with this strategy, however, subjects moved the shape of the left hand away from the correct replication of the right hand shape. This suggests that completing this sensorimotor matching task may have involved drawing on the imagined feeling of the inertial or spatial properties of the object, as it would feel supported against gravity or sitting firmly on a horizontal table.
Frames of reference
Although we did not anticipate this explanation for the hand-shape errors, it makes intuitive sense in retrospect. Before these experiments, we had no reason to expect the intrinsic joints of the hand to participate in a spatial stabilization of an imagined object. However, the partial counter-rotation of the wrist for this apparent purpose is consistent with the results of previous studies (Soechting and Flanders, 1993; Carrozzo and Lacquaniti, 1994; Flanders and Soechting, 1995; Kappers, 1999). In our prior studies, human subjects were asked to place a rod, held in the hand in a five-finger precision grip, in a specific, spatially defined orientation (e.g., 45° to vertical). By assigning this task to a wide range of locations in three-dimensional space, we could see that the errors in producing the instructed spatial orientation of the rod showed a bias toward a frame of reference fixed to the forearm. This error in following the task instruction suggested the existence of a forearm-centered frame of reference. Conversely, in the present study, the instruction was in an intrinsic coordinate system and the errors were biased toward a spatial coordinate system. In some cases, it may have been unnatural or even impossible for subjects to achieve full counter-rotation of the forearm and wrist, and therefore this strategic spatial goal was shared with the thumb and fingers.
Sensorimotor memory
There is a great deal of support for the idea that neural coding takes place in some sort of intermediate coordinate system—a merging of multisensory and/or motor coordinates (Jay and Sparks, 1987; Andersen and Zipser, 1988; Flanders et al., 1992; Kakei et al., 1999; Ostry et al., 2010). Another well-established concept is that an internal model is formed in memory and is apparently updated by comparing an efference copy signal, related to the expected somatosensory feedback, with the actual somatosensory feedback (Nowak et al., 2004; Bensmail et al., 2010). The inverse internal model, by definition, stores mechanical parameters such as the inertia of hand-held objects (Flanders, 2011). Loh and colleagues (2010) recently demonstrated that this memory system can provide trial-to-trial improvements in smoothly moving an external object. In the present study, this representation was apparently drawn upon to aid in the task of producing a report of a target hand shape. If our interpretation is correct, it implies that, for hand-held objects, mechanical properties like inertial geometry and gravitational pull are components of this memory (Flanders et al., 2003; Shockley et al., 2004). This internal memory should be especially useful for familiar objects (Deshpande et al., 2010) and would be in the appropriate form for comparison with somatosensory feedback (Nowak et al., 2004).
Multisensory integration
We had hypothesized that the somatosensory perception of hand shape should be invariant to the changes in somatosensory input that are produced by active motor output. This was motivated by classical concepts as well as recent experimental results (Smith et al., 2009; Weiss and Flanders, 2011), which led us to propose a model in which an efference copy cancellation of an actively produced somatosensory input takes place at spinal, subcortical, and cortical levels (Weiss and Flanders, 2011). For the task in the present experiments, coordination of the somatosensory–motor information within a spatial frame of reference likely took place in multisensory areas of the parietal cortex, including somatosensory and vestibular areas (Angelaki et al., 2009; Azañón et al., 2010). Elegant studies have suggested that these areas must carry a representation of the influence of gravity on moving objects (McIntyre et al., 2001; Zago et al., 2009). The present study expands this notion to include a representation of the memory of the pull of gravity on a familiar object, as sensed by the combined somatosensory–motor system intrinsic to the human hand.
Conclusions
For interacting with familiar objects and objects on tabletops, our results suggest the following control process. First, an internal model of the object (and tabletop) is accessed; it is a memory formed through prior interactions. The motor system is commonly thought to use an internal model of the relation between forces and movements. We now suggest that the somatosensory system uses a memory model of the overall pattern of cutaneous and proprioceptive input that normally results from handling a particular object in a gravitational frame of reference. For this matching task, the next stage in the control would be to use the motor mapping from desired movement to motor unit activation; the motor output should then continue until the desired pattern of somatosensory input is achieved. This essentially implies that motor commands are driven by a desired somatosensory state. Remarkably, in our study, the motor commands appeared to be driven toward an imagined somatosensory state.
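The control process outlined above can be caricatured as a simple feedback loop in which motor output continues until the sensed pattern matches a remembered target. The sketch below is schematic, not the authors' model; the forward map, gain, and tolerance are all illustrative assumptions.

```python
# Schematic sketch of somatosensory-driven control: joint angles are nudged
# until a stand-in forward model's predicted somatosensory pattern matches a
# remembered (or imagined) target pattern. All names and values are
# hypothetical and purely illustrative.

def sensory_feedback(joint_angles):
    # Stand-in forward model: here, sensed pattern = joint angles themselves.
    return list(joint_angles)

def match_target(joint_angles, target_pattern, gain=0.5, tol=1e-3, max_steps=100):
    """Drive motor output until the sensed pattern matches the target."""
    angles = list(joint_angles)
    for _ in range(max_steps):
        error = [t - s for t, s in zip(target_pattern, sensory_feedback(angles))]
        if max(abs(e) for e in error) < tol:
            break  # desired somatosensory state achieved
        angles = [a + gain * e for a, e in zip(angles, error)]
    return angles

final = match_target([0.0, 0.0, 0.0], [30.0, 45.0, 10.0])
print([round(a, 1) for a in final])  # -> [30.0, 45.0, 10.0]
```

In this caricature, the "target" could be a remembered pattern for a familiar object or, as in the present study, an imagined one; either way, the loop is closed on the somatosensory state rather than on the movement itself.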
Footnotes
- This work was supported by National Institute of Neurological Disorders and Stroke Grant R01 NS027484. We thank Professor John F. Soechting.
- The authors declare no competing financial interests.
- Correspondence should be addressed to Martha Flanders, Department of Neuroscience, University of Minnesota, 6-145 Jackson Hall, 321 Church Street SE, Minneapolis, MN 55455. fland001@umn.edu