Visuo-motor imagery of specific manual actions: A multi-variate pattern analysis fMRI study
Highlights
► We applied fMRI with MVPA to investigate coding of imagined and performed actions.
► Left anterior parietal cortex showed common imagined-performed coding for specific actions.
► Action-specific representations can be elicited by imagery alone.
► Apparent visuo-motor coding by a human ‘mirror neuron system’ may partially be driven by imagery.
Introduction
Humans are social beings with a highly developed brain that enables them to interact with others and their environment in complex ways not seen in other animals. The ability to predict the consequences of our actions is crucial for such interactions, not only while performing actions—explained by ‘forward’ models (Kawato, 1999, Wolpert et al., 2003)—but also when imagining the outcome of our actions without actually executing them (Wolpert et al., 2003).
Although imagery of actions may occur without any overt behavior, a long line of evidence shows that imagined and overt actions may share a cognitive mechanism. For example, Shepard and Metzler (1971) found that reaction times in a mental object-matching task increased linearly with rotation angle. Mental imagery of actions has also attracted interest in sport psychology, and mental practice is commonly reported in elite athletes (Hall and Rodgers, 1990). Several studies have shown that not only physical practice but also imagined practice, although generally less effective, can improve sport performance (Hinshaw, 1991, Feltz and Landers, 1983).
Apart from improving existing skills, it has also been suggested that imagery is important in acquiring new action skills (Annett, 1995). For example, Mackay (1981) demonstrated that sentence production improves not only when sentences are produced out loud but also when they are rehearsed silently. Beneficial effects of imagery have also been observed in typing (Wohldmann et al., 2007), music (Brodsky et al., 2008), dance (Golomer et al., 2008) and surgery (Hall, 2002). Analogous mechanisms may be involved when learning from observation: actions performed by others could be represented during observation, and such representations could then be re-activated, through mental imagery, during action execution (e.g., Sheffield, 1961, Cross et al., 2009, Calvo-Merino et al., 2005, Buccino et al., 2004, Cross et al., 2006).
Inspired by the discovery of ‘mirror neurons’ in macaque premotor and parietal cortex (neurons that fire both when the monkey executes an action and when it observes the same action; di Pellegrino et al., 1992, Gallese et al., 1996), functional magnetic resonance imaging (fMRI) studies have investigated the neural correlates of such putative action representations shared across different modalities, with a prime focus on imitation, observation, and execution. These studies have consistently found that several areas in the frontal and parietal cortex show an increased response during the imitation, observation, and execution of actions, a result that has often been interpreted as evidence for human mirror neurons (e.g., Molenberghs et al., 2011, Rizzolatti and Fabbri-Destro, 2008, Gazzola and Keysers, 2008; but see Brass and Heyes, 2005, Welberg, 2008, Dinstein et al., 2008b, Hickok, 2009, Iacoboni and Dapretto, 2006). Fewer studies have investigated the role of imagery, although there is evidence that imagined actions engage similar areas as observed or executed actions (Filimon et al., 2007, Lotze et al., 1999, Ehrsson, 2003). Crucially, the large majority of studies did not investigate the representations of specific actions (Gazzola and Keysers, 2008) and, given the limited spatial resolution of fMRI, could not rule out that observed and executed actions are subserved by different but spatially overlapping neural populations (Dinstein et al., 2008b). The few studies that have investigated action-specific representations yielded mixed results and interpretations, with some arguing for distinct neural populations for observed and executed actions (Dinstein et al., 2008a, Lingnau et al., 2009) and others for a cross-modal visuo-motor population (Chong et al., 2008, Kilner et al., 2009, Oosterhof et al., 2010, Oosterhof et al., 2012).
To complicate this debate further, mental imagery offers a possible interpretation of ostensible visuo-motor coding of observed and executed actions. For example, when participants observe actions they may also imagine executing those actions, without any overt behavior. Conversely, participants may imagine observing their own actions while they execute unseen actions. Such imagery effects could engage unimodal action-specific representations that are similar during both observed and executed actions, and would therefore appear (incorrectly) to reflect cross-modal coding. Indeed, neuroimaging studies have provided evidence for shared representations of imagined and performed or observed movements of specific body parts in somatosensory (Stippich et al., 2002) and occipito-temporal cortices (Orlov et al., 2010). However, it is unclear whether mental imagery of specific actions can explain the apparent cross-modal coding of visuo-motor representations identified in previous studies (Chong et al., 2008, Kilner et al., 2009, Oosterhof et al., 2010, Oosterhof et al., 2012).
We sought to investigate this specific possibility empirically and, more generally, to extend our understanding of the neural representations underlying action imagery. While they were scanned with fMRI, participants performed two manual actions on some trials (with their own right, dominant, hand in view) and imagined the same actions without overt behavior on other trials. We used multi-variate pattern analysis (MVPA; Edelman et al., 1998, Haxby et al., 2001, Norman et al., 2006, Haynes and Rees, 2006) to distinguish the neural patterns associated with the two manual actions. To anticipate our findings, in the crucial analysis comparing patterns of activity produced during performing vs. imagining actions, we found that the left anterior parietal cortex represents specific actions in a manner shared across both conditions. These findings are the first evidence that the neural coding of specific imagined actions resembles that of overtly performed actions, and they raise questions about the interpretation of studies investigating cross-modal visuo-motor coding of actions.
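The cross-condition logic of this analysis can be sketched as follows. This is a minimal illustration with simulated voxel patterns, not the study's actual pipeline: the voxel and trial counts, noise levels, and the simple nearest-template correlation classifier are all illustrative assumptions (the study itself used a support vector machine, via LIBSVM).

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_voxels = 20, 100

# Simulated voxel patterns: each action (lift vs. slap) has a distinct
# mean pattern that is shared across the perform and imagine conditions.
means = {"lift": rng.normal(0, 1, n_voxels),
         "slap": rng.normal(0, 1, n_voxels)}

def trials(action, n=n_trials):
    """Return n noisy trials around the action's mean pattern."""
    return means[action] + rng.normal(0, 2.0, (n, n_voxels))

# 'Train' on performed actions: average the trials into one template
# per action.
templates = {a: trials(a).mean(axis=0) for a in means}

# Test on imagined actions: classify each trial by its highest
# correlation with a training template.
correct = total = 0
for action in means:
    for trial in trials(action):
        guess = max(templates,
                    key=lambda a: np.corrcoef(trial, templates[a])[0, 1])
        correct += guess == action
        total += 1

print(f"cross-condition accuracy: {correct / total:.2f}")  # chance = 0.50
```

Because the two actions share their underlying patterns across conditions in this simulation, a classifier trained only on "performed" trials generalizes to "imagined" trials, which is the signature of cross-condition coding that the study tested for.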
Section snippets
Participants
Twelve right-handed, healthy adult volunteers were recruited from the Bangor University community. All participants had normal or corrected-to-normal vision, satisfied all volunteer screening requirements, and gave informed consent. Procedures were approved by the Ethics Committee of the School of Psychology at Bangor University. Participants were compensated £15.
Design and procedure
The same setup was used as in Oosterhof et al. (2010, Experiment 2) in which participants manipulated a cup-shaped
Univariate activation mapping
The conjunction group analysis (Table 1, Fig. 1) revealed several regions with an increased response during both the perform and imagery conditions. Seven local maxima were identified in the left hemisphere and four in the right hemisphere, including bilateral planum temporale, left anterior parietal cortex, left supplementary motor area, and several clusters in frontal cortex bilaterally. The most robust activation was found in the bilateral planum temporale, probably due to auditory stimulus
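The logic of such a conjunction analysis can be illustrated with a toy minimum-statistic example; the t-values, voxel counts, and threshold below are made up, and the actual analysis used group-level statistics with appropriate correction.

```python
import numpy as np

rng = np.random.default_rng(1)
n_voxels = 1000

# Hypothetical voxelwise t-statistics for each condition vs. baseline.
t_perform = rng.normal(0, 1, n_voxels)
t_imagine = rng.normal(0, 1, n_voxels)

# Plant a region (the first 50 voxels) that responds in both conditions.
t_perform[:50] += 5
t_imagine[:50] += 5

threshold = 3.0
# Minimum-statistic conjunction: a voxel survives only if it exceeds
# the threshold in BOTH the perform and the imagery maps.
conjunction = np.minimum(t_perform, t_imagine) > threshold
print(f"{conjunction.sum()} voxels active in both conditions")
```

Taking the minimum of the two maps before thresholding ensures that a voxel strongly active in only one condition cannot pass, which is what distinguishes a conjunction from a simple union of two activation maps.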
Discussion
Using fMRI MVPA we investigated how the human brain represents performed and imagined actions. In the unimodal perform analysis, when participants performed two distinct object-related actions (lifts and slaps) while observing their hand and the object that was manipulated, we found that spatially distributed patterns dissociated the two actions across large portions of the cortex that included auditory, visual, somatosensory, and motor areas. The involvement of these areas is not surprising,
Acknowledgments
This research was supported by the ESRC (grant to SPT and PED), and the Wales Institute of Cognitive Neuroscience. NNO was supported by a fellowship awarded by the Boehringer Ingelheim Fonds. We would like to thank Marius Peelen for helpful discussions, and Emily Cross, Angelika Lingnau, Nick Peatfield, Marius Peelen, Richard Ramsey, and two anonymous reviewers for helpful comments on an earlier draft of this manuscript.
References (87)
- Continuous carry-over designs for fMRI. Neuroimage (2007).
- Motor imagery: perception or action? Neuropsychologia (1995).
- The spatiotemporal organization of auditory, visual, and auditory–visual evoked potentials in rat cortex. Brain Res. (1995).
- Polysensory and cortico-cortical projections to frontal lobe of squirrel and rhesus monkeys. Electroencephalogr. Clin. Neurophysiol. (1969).
- Imitation: is cognitive neuroscience solving the correspondence problem? Trends Cogn. Sci. (2005).
- Neural circuits underlying imitation learning of hand actions: an event-related fMRI study. Neuron (2004).
- fMRI adaptation reveals mirror neurons in human inferior parietal cortex. Curr. Biol. (2008).
- AFNI: software for analysis and visualization of functional magnetic resonance neuroimages. Comput. Biomed. Res. (1996).
- Building a motor simulation de novo: observation of dance by dancers. Neuroimage (2006).
- The power of simulation: imagining one's own and other's behavior. Brain Res. (2006).
- A mirror up to nature. Curr. Biol.
- Human cortical representations for reaching: mirror neurons for execution, observation, and imagery. Neuroimage.
- Brain areas underlying visual mental imagery and visual perception: an fMRI study. Cogn. Brain Res.
- Contribution from neurophysiological and psychological methods to the study of motor imagery. Brain Res. Rev.
- Imagery practice and the development of surgical skills. Am. J. Surg.
- Thinking ahead: the case for motor imagery in prospective judgements of prehension. Cognition.
- Internal models for motor control and trajectory planning. Curr. Opin. Neurobiol.
- Neural systems shared by visual imagery and visual perception: a positron emission tomography study. Neuroimage.
- Towards a true neural stance on consciousness. Trends Cogn. Sci.
- Single-neuron responses in humans during execution and observation of actions. Curr. Biol.
- Beyond mind-reading: multi-voxel pattern analysis of fMRI data. Trends Cogn. Sci.
- A comparison of volume-based and surface-based multi-voxel pattern analysis. Neuroimage.
- Topographic representation of the human body in the occipitotemporal cortex. Neuron.
- Can cognitive processes be inferred from neuroimaging data? Trends Cogn. Sci.
- Return of the mental image: are there really pictures in the brain? Trends Cogn. Sci.
- Incongruent imagery interferes with action initiation. Brain Cogn.
- Reading the mind's eye: decoding category information during mental imagery. Neuroimage.
- The mirror system and its role in social cognition. Curr. Opin. Neurobiol.
- A new method for improving functional-to-structural MRI alignment using local Pearson correlation. Neuroimage.
- Threshold-free cluster enhancement: addressing problems of smoothing, threshold dependence and localisation in cluster inference. Neuroimage.
- Somatotopic mapping of the human primary sensorimotor cortex during motor imagery and motor execution by functional magnetic resonance imaging. Neurosci. Lett.
- The mental representation of music notation: notational audiation. J. Exp. Psychol. Hum. Percept. Perform.
- Action observation and acquired motor skills: an fMRI study with expert [...]. Cereb. Cortex.
- LIBSVM: a library for support vector machines. ACM Trans. Intell. Syst. Technol.
- Imagery and perception share cortical representations of content and location. Cereb. Cortex.
- Sensitivity of the action observation network to physical and observational learning. Cereb. Cortex.
- Brain structures participating in mental simulation of motor behavior: a neuropsychological interpretation. Acta Psychol.
- Understanding motor events: a neurophysiological study. Exp. Brain Res.
- Brain areas selective for both observed and executed movements. J. Neurophysiol.
- Executed and observed movements have different distributed representations in human aIPS. J. Neurosci.
- Neurons in primary motor cortex engaged during action observation. Eur. J. Neurosci.
- Toward direct visualization of the internal shape representation space by fMRI. Psychobiology.
- Imagery of voluntary movement of fingers, toes, and tongue activates corresponding body-part-specific motor representations. J. Neurophysiol.