Research Articles, Behavioral/Cognitive

State-Dependent TMS Reveals Representation of Affective Body Movements in the Anterior Intraparietal Cortex

Noemi Mazzoni, Christianne Jacobs, Paola Venuti, Juha Silvanto and Luigi Cattaneo
Journal of Neuroscience 26 July 2017, 37 (30) 7231-7239; DOI: https://doi.org/10.1523/JNEUROSCI.0913-17.2017
Noemi Mazzoni
1Department of Psychology, Faculty of Science and Technology, University of Westminster, W1W 6UW London, United Kingdom
2Department of Psychology and Cognitive Science, University of Trento, 38068 Rovereto (TN), Italy

Christianne Jacobs
1Department of Psychology, Faculty of Science and Technology, University of Westminster, W1W 6UW London, United Kingdom
3Faculty of Psychology and Educational Sciences, Université Catholique de Louvain, 1348 Louvain-la-Neuve, Belgium

Paola Venuti
2Department of Psychology and Cognitive Science, University of Trento, 38068 Rovereto (TN), Italy

Juha Silvanto
1Department of Psychology, Faculty of Science and Technology, University of Westminster, W1W 6UW London, United Kingdom

Luigi Cattaneo
4Department of Neuroscience, Biomedicine and Movement, Section of Physiology and Psychology, University of Verona, 37134 Verona, Italy

Abstract

In humans, recognition of others' actions involves a cortical network that comprises, among other cortical regions, the posterior superior temporal sulcus (pSTS), where biological motion is coded, and the anterior intraparietal sulcus (aIPS), where movement information is elaborated in terms of meaningful goal-directed actions. This action observation system (AOS) is thought to encode neutral voluntary actions, and possibly some aspects of the affective motor repertoire, but the role of AOS areas in processing affective kinematic information has never been examined. Here we investigated whether the AOS plays a role in representing dynamic emotional bodily expressions. In the first experiment, we assessed behavioral adaptation effects of observed affective movements. Participants watched sequences of happy or fearful whole-body point-light displays (PLDs) as adapters and were then asked to perform an explicit categorization of the emotion expressed in test PLDs. Participants were slower at categorizing either emotion when it was congruent with the emotion in the adapter sequence. We interpreted this effect as adaptation to the emotional content of PLDs. In the second experiment, we combined this paradigm with TMS applied over the right aIPS, the right pSTS, or the right half of the occipital pole (corresponding to Brodmann area 17 and serving as a control site) to examine the neural locus of the adaptation effect. TMS over the aIPS (but not over the other sites) reversed the behavioral cost of adaptation, specifically for fearful contents. This demonstrates that the aIPS contains an explicit representation of affective body movements.

SIGNIFICANCE STATEMENT In humans, a network of areas, the action observation system, encodes voluntary actions. However, the role of these brain regions in processing affective kinematic information has not been investigated. Here we demonstrate that the aIPS contains a representation of affective body movements. First, in a behavioral experiment, we found an adaptation after-effect for emotional PLDs, indicating the existence of a neural representation selective for affective information in biological motion. To examine the neural locus of this effect, we then combined the adaptation paradigm with TMS. Stimulation of the aIPS (but not of the pSTS or the control site) reversed the behavioral cost of adaptation, specifically for fearful contents, demonstrating that the aIPS contains a representation of affective body movements.

  • adaptation
  • anterior intraparietal sulcus
  • biological motion
  • emotional bodily expressions
  • emotions
  • TMS

Introduction

Perception of movements of other living beings is crucial for survival in most species, to the extent that many vertebrate species have specialized neural systems for action observation. In humans, a widespread network of interconnected brain areas [known as the action observation system (AOS)] underlies the comprehension of conspecifics' body movements and actions. This network includes the posterior superior temporal sulcus (pSTS; Puce and Perrett, 2003), and two mirror system areas, the putative human anterior intraparietal area (aIPS) and the ventral premotor/caudal inferior frontal gyrus complex (PMv/cIFG; Cattaneo and Rizzolatti, 2009). Several TMS studies have demonstrated that stimulating the pSTS, aIPS, and PMv/cIFG regions produces selective impairment in visual recognition of neutral actions (Grossman et al., 2005; Pobric and Hamilton, 2006; Candidi et al., 2008; Cattaneo et al., 2010; van Kemenade et al., 2012). But is the AOS also important for the encoding of the emotional aspects of biological motion?

The perception of affective stimuli, regardless of stimulus type, generally enhances the neural response of core affective systems situated within the limbic system (Adolphs, 2002; Phillips et al., 2003), but emotional body movements are complex and their perception also activates a more widespread network of subcortical and cortical regions, related to the analysis of visual body features and, more generally, to action observation and preparation (de Gelder, 2006; de Gelder et al., 2010, 2015; Tamietto and de Gelder, 2011). It is thus crucial to understand whether the activation within the AOS is a mere side effect of the type of stimuli (body actions), independent of their content, or whether AOS activity is causally linked to emotion recognition. This issue has been explored in only two TMS studies, which found that perturbation of the pSTS (Candidi et al., 2011) and the inferior parietal lobule (Engelen et al., 2015) selectively improved the recognition of fearful body images. However, a limitation of both of these studies was that participants observed static images; human bodies are dynamic in nature, and the brain substrates used in processing static postures are likely to differ from those engaged in the perception of body movements. Furthermore, although conventional TMS paradigms can reveal the causal role of cortical regions in cognitive functions, they do not inform us about the neural representations in those regions.

Here we examined whether specific regions of the action observation network contain representations of affective body movements. This was accomplished by the use of state-dependent TMS, which enables the selectivity of neural representations in a cortical region to be assessed (Silvanto et al., 2008; Romei et al., 2016). This approach has previously been used to examine the selectivity of neural representations in various cognitive functions such as color and motion perception (Silvanto et al., 2007a; Cattaneo and Silvanto, 2008), numerical cognition (Kadosh et al., 2010), and action observation (Cattaneo et al., 2010, 2011; Sato et al., 2011; Jacquet and Avenanti, 2015). To examine the role of the AOS in encoding the emotional aspects of dynamic biological motion, we used point-light displays (PLDs), also referred to as biological motion (BM) stimuli (Johansson, 1973), which allow motion signals to be isolated from other visual cues. The kinematic information contained in PLDs is sufficient for detecting the emotional content of human movements (Dittrich et al., 1996; Atkinson et al., 2004, 2007, 2012; Clarke et al., 2005; Chouchourelou et al., 2006; Alaerts et al., 2011). In Experiment 1, we examined behavioral adaptation effects of observed affective PLDs. We found an adaptation-like bias, with incongruent stimuli recognized faster than congruent ones. In Experiment 2, we used the TMS-adaptation paradigm to examine the cortical locus of the adaptation effects observed in Experiment 1. TMS over the aIPS, but not over the pSTS or a visual control area, reversed the behavioral adaptation for fearful stimuli, indicating that this region contains neural representations selective for the fearful characteristics of human movements.

Materials and Methods

Visual stimuli and validation of emotional valence.

A total of 20 PLDs were presented, depicting 10 different expressions of happiness and 10 of fear. These stimuli are part of a wider dataset created by Atkinson et al. (2004, 2012). The PLDs consisted of 2-s-long digitized video clips (for details, see Atkinson et al., 2012), displaying a single actor represented as 13 white dot-lights moving on a black background. The dots were positioned over the head and the main joints (1 dot over each ankle, knee, hip, elbow, shoulder, and hand) of the actor. Examples of the stimuli can be viewed at http://community.dur.ac.uk/a.p.atkinson/Stimuli.html. We selected happy and fearful stimuli because they are approximately equally arousing emotions with opposite emotional valences (positive or negative).

Before the main experiments, we ran a pilot study to validate the PLDs in terms of the quantity of movement they contained and the type and intensity of the portrayed emotion. Sixteen healthy adults took part in this pilot experiment (13 females, mean age = 29.63 years, SD = 7.65 years). All participants provided informed consent before participating in the experiment. They were seated in front of a 24 in monitor at a distance of ∼60 cm. The stimuli were presented foveally. Each PLD was presented once, and for each video participants were asked to recognize the conveyed emotion among four options (fear, happiness, neutral, and other) by pressing the corresponding button on the keyboard. The response options (appearing on the screen after each stimulus) were indicated with a label placed over the keys “F–H, J”, and the key assignment was randomized across participants. After the emotion recognition task, participants were asked to rate the “intensity of the emotion” and the “quantity of movement” on a scale from 1 to 5, using the numeric keys at the top of the keyboard. Stimuli were presented and responses recorded with E-Prime 2.0 (Psychology Software Tools). For each individual PLD, we calculated the accuracy of emotion categorization, the rated intensity of the emotion, and the rated quantity of movement. Data distributions were tested for normality with the Shapiro–Wilk test. Accuracy, intensity, and movement ratings were not normally distributed, so they were analyzed using a nonparametric test for paired data, the Wilcoxon signed-rank test with continuity correction. Significance thresholds were Bonferroni-corrected for three multiple comparisons (for each variable, we compared results between the three emotional valences; hence, the critical α was set at p < 0.017). There were no significant differences between the happy and fearful movements for accuracy, movement, or intensity, whereas, predictably, the neutral movements were rated as less intense than the two emotions (Table 1). This implies that the stimuli used in Experiments 1 and 2 (i.e., fearful and happy PLDs) do not differ in terms of 1) recognizability between the emotional categories, 2) intensity of the expressed emotion, or 3) quantity of movement contained in the stimuli.
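
These pairwise comparisons can be sketched in R as follows. This is a minimal illustration rather than the authors' original script; the data frame pilot and its column names are hypothetical and assume one row per participant with per-valence mean ratings.

    # Bonferroni-corrected paired Wilcoxon tests (illustrative sketch).
    # 'pilot' is a hypothetical data frame with one row per participant and
    # columns such as intensity_happy, intensity_fear, intensity_neutral.
    alpha <- 0.05 / 3  # three pairwise comparisons per variable -> ~0.017

    wilcox.test(pilot$intensity_happy, pilot$intensity_fear,
                paired = TRUE, correct = TRUE)
    wilcox.test(pilot$intensity_happy, pilot$intensity_neutral,
                paired = TRUE, correct = TRUE)
    wilcox.test(pilot$intensity_fear, pilot$intensity_neutral,
                paired = TRUE, correct = TRUE)
    # A difference counts as significant only if its p-value falls below alpha.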

Table 1.

Results of comparisons between the three emotional valences of PLDs for accuracy, intensity, and movement assessed in the pilot study

Experiment 1: behavioral assessment of adaptation to observed emotional body movements

Participants.

Twenty-six healthy adults (14 females, 12 males; mean age = 23.58 years, SD = 2.95 years) took part in the behavioral study (Experiment 1). All participants had normal or corrected-to-normal vision. Before the experiment, all participants provided written informed consent in accordance with the Declaration of Helsinki.

Design and procedure.

Participants were seated in a comfortable chair in front of a 24 in computer screen at a distance of ∼60 cm. E-Prime version 2.0 (Psychology Software Tools) was used for stimulus presentation and response recording. The study consisted of 12 adaptation blocks (6 with happy and 6 with fearful adapters), each comprising a 1 min adaptation period followed by eight test trials. Each block began with a white central fixation cross on a black background, lasting 10 s. This was followed by an adaptation period in which the same PLD was repeated 30 times (for a total duration of 60 s). Participants were asked to simply watch the stimuli and focus on the emotion expressed by the actor. The order of adaptation blocks was randomized. At the end of adaptation, a screen appeared asking participants to “Get ready for the task”, after which eight test stimuli (4 fearful and 4 happy PLDs) were presented. Half of the test stimuli were emotionally congruent and half were emotionally incongruent with the adapter, and their order was randomized. The test and adapter stimuli belonged to the same dataset; that is, a given stimulus could serve as an adapter in one block and as a test stimulus in another, but never as both within the same block (within each block, the adapter was always different from the test stimuli presented thereafter). The movie clip was presented centrally. Simultaneously with the stimulus presentation, the question “Which emotion?” appeared on the upper part of the screen, and the two response options (“Fear” and “Happiness”) were presented on the lower part of the monitor. For each test stimulus, participants were asked to categorize the expressed emotion as fast as possible by key-press. The response options were indicated with a label placed over the keys “G” and “H”, and the key-emotion correspondence was randomized across participants. Participants responded using the index and middle finger of their right hand. The PLD was presented for a maximum of 2 s, whereas the question and response options remained on screen until participants responded. Accuracy and response times (RTs) were recorded.
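
The block and trial structure described above can be illustrated with the short R sketch below; it is purely illustrative (the experiment itself was programmed in E-Prime), and the variable names are hypothetical.

    # Illustrative reconstruction of the Experiment 1 design (not the E-Prime
    # script): 12 adaptation blocks, each followed by 8 test trials.
    set.seed(1)  # arbitrary seed so the example is reproducible

    adapters <- sample(rep(c("fear", "happiness"), each = 6))  # 12 blocks
    design <- do.call(rbind, lapply(seq_along(adapters), function(b) {
      tests <- sample(rep(c("fear", "happiness"), each = 4))   # 8 test PLDs
      data.frame(block = b,
                 adapter = adapters[b],
                 test = tests,
                 congruence = ifelse(tests == adapters[b],
                                     "congruent", "incongruent"))
    }))
    head(design)  # 4 congruent and 4 incongruent test trials per block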

Data analyses.

The dependent variable was the mean RT. Only correct responses were included in the analyses (the overall error rate was 4.43%). The data distributions failed the normality (Shapiro–Wilk test) and homoscedasticity (Bartlett test) checks. To normalize the distribution, the averaged RTs were log-transformed before the analyses (logRT). A two-way repeated-measures ANOVA was conducted with the emotional content of the test stimuli (“emoTest”: fear or happiness) and the emotional congruence between test and adapter stimuli (congruent or incongruent) as within-subject factors. Post hoc comparisons were performed with two-tailed paired-samples t tests, with correction of the significance threshold for multiple comparisons whenever appropriate. All analyses were performed using R v3.3.1 (R Development Core Team, 2016).
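
A minimal R sketch of this analysis is given below. It is not the authors' original code; the long-format data frame rt_data and its columns (subject, emoTest, congruence, rt, correct trials only) are assumptions made for illustration.

    # Sketch of the Experiment 1 analysis (illustrative, not the original code).
    # 'rt_data' is a hypothetical long-format data frame of correct trials with
    # columns: subject, emoTest, congruence, rt (in ms).

    # Per-subject mean of log-transformed RTs in each condition
    rt_means <- aggregate(log(rt) ~ subject + emoTest + congruence,
                          data = rt_data, FUN = mean)
    names(rt_means)[4] <- "logRT"
    rt_means$subject <- factor(rt_means$subject)

    # 2 x 2 repeated-measures ANOVA with both factors within-subject
    fit <- aov(logRT ~ emoTest * congruence +
                 Error(subject / (emoTest * congruence)),
               data = rt_means)
    summary(fit)

    # Post hoc paired t test on the main effect of congruence
    cong <- aggregate(logRT ~ subject + congruence, data = rt_means, FUN = mean)
    t.test(cong$logRT[cong$congruence == "congruent"],
           cong$logRT[cong$congruence == "incongruent"],
           paired = TRUE)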

Experiment 2: effects of TMS on perceptual adaptation

Participants.

Seventeen healthy adults (11 females, 6 males; mean age = 25.63 years, SD = 5.17 years) participated in the TMS experiment (Experiment 2). Three participants were excluded from the analysis because of difficulties in determining their resting motor threshold: in these participants, TMS over M1 did not produce any visible hand twitch, and no motor sensation was perceived. Hence, the final analyses were performed on a total of 14 participants. Participants in the TMS experiment were screened for MRI and TMS contraindications before the experiment and received a £15 voucher for their participation. All participants had normal or corrected-to-normal vision. Before the experiment, all participants provided written informed consent. The protocol was approved by the University of Westminster's ethics committee in accordance with the Declaration of Helsinki.

Neuronavigation and identification of stimulation sites on individual anatomy.

We used MRI-guided neuronavigation (BrainInnovation BV) for accurate positioning of the TMS coil. For each participant, a high-resolution T1-weighted MPRAGE scan (176 partitions, 1 × 1 × 1 mm, flip angle = 7°, TI = 1000 ms, TE = 3.57 ms, TR = 8.4 ms) was acquired before the TMS experiment. Structural MRI images were obtained with a 1.5 T whole-body TIM Avanto system (Siemens Healthcare) with a 32-channel head coil at the Birkbeck/University College London Centre for NeuroImaging. A 3D reconstruction of the gray matter surface and the scalp was created for each participant and coregistered to the participant's head in order to position the coil over the site of stimulation and to monitor coil position throughout the experiment. In each participant, three different sites in the right hemisphere were stimulated: the right pSTS, the right aIPS, and a posterior occipital control area located next to the midline. The three loci were identified on the basis of macro-anatomical landmarks. Specifically, the pSTS was targeted over the transition between its posterior segment and its horizontal segment (for an overview of STS anatomy, see Ochiai et al., 2004). We defined the aIPS as the most rostral part of the IPS, at the intersection between the postcentral gyrus and the IPS (Caspers et al., 2006). Control TMS was applied to a site corresponding to a secondary visual area not primarily implicated in coding the emotional aspects of visual stimuli, located between BA 17 and BA 18 (Fig. 1).

Figure 1.

Representation of stimulation sites and respective anatomical landmarks. Right, Individual renderings of the gray-white matter border in each of the 14 participants. Left, The same brains as on the right, shown with the main anatomical landmarks used for localization of the TMS targets. Blue, central sulcus; green, postcentral sulcus; yellow, intraparietal sulcus; purple, Sylvian fissure; red, superior temporal sulcus. The three stimulation sites (aIPS, pSTS, and control) are represented with white spots.

TMS.

Biphasic TMS pulses were applied with a figure-of-eight coil (D70 mm) connected to a Magstim Rapid2 stimulator. We first determined, in each participant, the visually assessed resting motor threshold (rMT), defined as the stimulator output intensity necessary to obtain a visible twitch in the contralateral intrinsic hand muscles in exactly 50% of trials in a series of at least eight consecutive pulses (Rossini et al., 1994). The intensity of stimulation in the actual experiment was set to 120% of the individual rMT, with a maximum of 65% of maximal stimulator output to limit coil overheating and discomfort to participants. The coil was attached to a Magstim coil stand and placed tangentially to the scalp. For the aIPS site, coil orientation was medial-lateral with the handle pointing laterally and slightly posteriorly (70° from the midline), to induce a current in the underlying cortical tissue approximately perpendicular to the IPS. A similar orientation was used for stimulation of the pSTS, but with the coil handle pointing upward. Because of the proximity of the pSTS to the ears, in some participants the coil orientation was changed to minimize discomfort. For the occipital (control) stimulation, the coil was positioned perpendicular to the midline with the handle pointing outward. TMS was delivered in triplets: in every trial, participants received three 10 Hz pulses time-locked to the onset of the PLD, starting synchronously with the visual stimulus.

Procedure.

The TMS paradigm was identical to that used in Experiment 1. Every block consisted of a 1 min adaptation period followed by eight test trials. A total of 12 adapter stimuli (6 happy and 6 fearful PLDs) and 96 test stimuli were presented for each of the three stimulation sites. The order of adaptation blocks was randomized. During the adaptation period, the same PLD was repeated 30 times (for 60 s). Participants were asked to simply watch the adapter stimuli and focus on the emotion expressed by the actor. At the end of adaptation, eight test stimuli (4 fearful and 4 happy PLDs) were presented. Half of the test stimuli were emotionally congruent (i.e., same emotion) and half were emotionally incongruent (i.e., different emotion) with the adapter, and their order was randomized. Participants were asked to categorize the expressed emotion (fear or happiness) as fast as possible by key-press, using the index and middle finger of their right hand (Fig. 2). Accuracy and RTs were recorded. The three stimulation sites (right pSTS, right aIPS, and the control site) were stimulated on the same day, with a 30 min delay between sessions. The order of stimulation sites was counterbalanced across participants. Participants wore earplugs and were seated in a comfortable chair in a quiet room, in front of a 24 in computer screen at a distance of 60 cm, with their head on a chinrest.

Figure 2.

Timeline of the TMS experiment. Every block consisted of a 1 min adaptation period followed by eight test trials. During the adaptation period, the same PLD was presented 30 times and participants were asked to simply watch the adapter stimuli. At the end of adaptation, 4 fearful and 4 happy test stimuli were presented. Participants were asked to categorize the expressed emotion by key-press. Three 10 Hz TMS pulses were applied at the onset of every test stimulus.

Data analyses.

All analyses were performed using R v3.3.1 (R Development Core Team, 2016). The dependent variable was the mean RT. Only correct responses were included in the analyses. Data were tested for normality (Shapiro–Wilk test) and homoscedasticity (Bartlett test). To normalize the distribution, the averaged RTs were log-transformed before the analyses (logRT). A three-way repeated-measures ANOVA (3 × 2 × 2) was performed, with the site of TMS stimulation (“stimSite”), the emotional valence of the test stimuli (emoTest), and the emotional congruence between test and adapter stimuli (“congruence”) entered as within-subject factors. Post hoc comparisons were performed with two-tailed paired-samples t tests, and the significance threshold for the p values was corrected for multiple comparisons when appropriate. As a measure of effect size, generalized eta-squared (η²) is reported where appropriate. In addition, we calculated Cohen's d for the significant comparisons using a bootstrap resampling method (Gerlanc and Kirby, 2015), with the number of bootstrap resamples (R) set at 2000. Bootstrap Cohen's d effect size measures and their corresponding 95% confidence intervals (CIs) are also reported when appropriate.
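
The following R sketch illustrates this analysis under the same assumptions as above (a hypothetical data frame rt2 of per-subject condition means; not the authors' original code). Generalized eta-squared is not computed here; packages such as ez report it directly.

    # Sketch of the Experiment 2 analysis (illustrative, not the original code).
    # 'rt2' is a hypothetical data frame of per-subject mean logRTs with columns:
    # subject (factor), stimSite, emoTest, congruence, logRT; one row per cell,
    # rows ordered by subject within each condition.

    # 3 x 2 x 2 repeated-measures ANOVA, all factors within-subject
    fit <- aov(logRT ~ stimSite * emoTest * congruence +
                 Error(subject / (stimSite * emoTest * congruence)),
               data = rt2)
    summary(fit)

    # Bootstrap Cohen's d (2000 resamples) for one paired contrast, e.g.
    # congruent vs incongruent happy test stimuli under aIPS stimulation
    library(bootES)  # Gerlanc and Kirby (2015)
    aips_happy <- subset(rt2, stimSite == "aIPS" & emoTest == "happiness")
    diffs <- aips_happy$logRT[aips_happy$congruence == "incongruent"] -
             aips_happy$logRT[aips_happy$congruence == "congruent"]
    bootES(diffs, R = 2000, effect.type = "cohens.d", ci.conf = 0.95)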

Results

Experiment 1: behavioral evidence of perceptual adaptation to the emotional content of PLDs

In Experiment 1, the overall error rate was 4.43%. A summary of the results of Experiment 1 is presented in Table 2 and Figure 3. The two-way ANOVA showed a significant main effect of congruence (F(1,25) = 7.31, p = 0.012), with incongruent stimuli recognized faster than congruent ones, whereas the interaction between emoTest and congruence was not significant (F(1,25) = 0.856, p = 0.364; η² = 0.014; Cohen's d = −0.236; CI = −0.660, 0.166).

Table 2.

Mean and SE of RTs in all the conditions in Experiment 1

Figure 3.

Visualization of the results of Experiment 1. The performance of each participant is represented by a black bar. The gray columns represent the mean RTs in the congruent and incongruent conditions. The main analysis revealed an adaptation after-effect for affective PLDs, with congruent stimuli recognized significantly more slowly than incongruent ones. The asterisk indicates a significant difference between the congruent and incongruent conditions.

Experiment 2: state-dependent effects of TMS over aIPS on explicit categorization of fearful PLDs

In Experiment 2, the overall error rate was 3.87%. The three-way ANOVA showed a significant main effect of congruence (F(1,13) = 14.994, p = 0.002), with congruent stimuli recognized more slowly than incongruent ones (mean RTs: congruent = 1194 ms; incongruent = 1148 ms), indicating an adaptation after-effect for affective PLDs and confirming the results of the behavioral experiment (Experiment 1). More importantly, we found a significant three-way interaction between stimSite, emoTest, and congruence (F(2,26) = 3.546, p = 0.043). To better understand this interaction, we performed three separate 2 × 2 repeated-measures ANOVAs, one for each stimulation site, with emoTest and congruence as within-subject factors. We found a significant main effect of congruence at the control site (F(1,13) = 9.329; p = 0.009; η² = 0.017) and at the pSTS (F(1,13) = 9.393; p = 0.009; η² = 0.029), showing that the adaptation after-effect persisted and hence that TMS did not modulate it at those two sites. In contrast, the ANOVA for the aIPS showed a significant interaction between emoTest and congruence (F(1,13) = 8.474; p = 0.012; η² = 0.022) but no significant main effects. In particular, the adaptation after-effect was still present for happy test stimuli (p = 0.009; Cohen's d = −0.311; CI = −1.114, 0.459), with incongruent stimuli recognized faster than congruent ones. Conversely, the adaptation after-effect was completely abolished for fearful test stimuli, to the point that we observed a trend toward an inversion of the adaptation effect, i.e., congruent test stimuli were recognized faster than incongruent ones (p = 0.066; Cohen's d = 0.267; CI = −0.459, 1.075).

Discussion

Perceptual adaptation to emotional content of PLDs

In the first experiment, we investigated the perceptual aftereffects produced by repeated observation of emotional PLDs. When categorizing an affective PLD, participants' performance was markedly biased (slower RTs) by previous exposure to congruent emotions. Adaptation aftereffects for PLDs have been reported previously for different features of biological motion, including gender characteristics (Troje et al., 2006), action category (van Boxtel and Lu, 2013; de la Rosa et al., 2014), and the spatial orientation of observed bodily trajectories (Jackson and Blake, 2010; Theusner et al., 2011). Judgments on hand-object interactions in PLDs are also susceptible to visual adaptation: viewing the grasping of a light object biases judgments of subsequently grasped objects, which appear heavier (Barraclough et al., 2009). In addition, a number of studies have reported adaptation aftereffects for affective facial (Russell and Fehr, 1987; Webster et al., 2004; Fox and Barton, 2007; Webster and MacLeod, 2011) and vocal expressions (Skuk and Schweinberger, 2013; Bestelmeyer et al., 2014). Whether emotional bodily movements can produce adaptation aftereffects, however, had remained unexplored. Our study fills this gap, providing the first evidence that the perception of emotional whole-body movements can undergo selective perceptual adaptation.

Absence of state-dependent effects of TMS on the early visual cortex (control condition)

The aim of Experiment 2 was to examine the neural locus of this adaptation effect for affective dynamic bodily expressions. Following control stimulation, we found adaptation after-effects similar to those observed in Experiment 1, i.e., a disadvantage in recognizing PLDs emotionally congruent with the adapter sequences (Fig. 4). Given the assumptions of TMS-adaptation paradigms, we did not expect any effect of TMS on this region, because the adapted features (bodily movements) are not thought to be coded in the early visual cortex. Indeed, the earliest visual representation of bodies along the visual pathways is in the lateral occipital complex, considerably more rostral than the area we chose as a control (Downing et al., 2001). Studies in blindsight patients suggest that the processing of emotional information can occur effectively despite lesions of the early visual areas, whether conveyed by faces (de Gelder et al., 1999; Morris et al., 2001) or by body postures (de Gelder and Hadjikhani, 2006). Accordingly, in another study, TMS perturbation of V1 impaired the discrimination of neutral, but not emotional, body postures, supporting the hypothesis that the encoding of emotional content does not depend on V1 (Filmer and Monsell, 2013).

Figure 4.

Visualization of results from Experiment 2. Mean RTs are shown, classified according to emotion in the test PLD (happiness or fear); congruence with the adapter sequence (congruent or incongruent, indicated in the figure as “congr.” or “inc.” respectively); and to the site of TMS (aIPS, pSTS, or occipital control). The vertical bars represent the SE.

Absence of state-dependent effects of TMS on the pSTS

In contrast to the early visual cortex, the pSTS is tuned to biological motion. However, to our surprise, no state-dependent effects of TMS were found there. We interpret this finding in light of the functional specialization of the pSTS. The integrity of the STS is fundamental to biological motion identification (Vaina et al., 1990; Grossman et al., 2005; Saygin, 2007); it encodes low-level pictorial aspects of BM (Cattaneo et al., 2010) and represents bodily movements separately for different body parts (upper limb, face, whole body, gaze; Hein and Knight, 2008), probably in a viewpoint-invariant manner (Grossman et al., 2010). In one TMS study, stimulation of the pSTS improved the visual matching of body forms specifically for fearful body postures (Candidi et al., 2011). However, that type of task relies on the pictorial analysis likely encoded in the pSTS, whereas we asked participants to recognize the emotional meaning of dynamic PLDs, which is potentially related to the higher level of action representation implemented in the aIPS (Fogassi et al., 2005; Hamilton and Grafton, 2006; Shmuelof and Zohary, 2006; Cattaneo et al., 2010). Similarly, another study (Tseng et al., 2014) showed that the specific effect of static fearful facial displays as distracters in a visual search task could be disrupted by anodal transcranial direct current stimulation over the right pSTS.

State-dependent effects of TMS on the aIPS

TMS over the aIPS significantly reduced the cost of adaptation and even produced a reversal of this cost, turning it into a behavioral advantage. This finding is diagnostic of the presence, in the stimulated area, of neurons that were affected by adaptation (Silvanto et al., 2008; Silvanto and Pascual-Leone, 2008; Romei et al., 2016). TMS-adaptation is based on the phenomenon that the impact of TMS depends on the ongoing activity in the targeted region: TMS, which in the absence of adaptation impairs behavior, can induce a facilitatory effect if neurons in the targeted area have undergone adaptation (Silvanto et al., 2007a, b). The outcome of this differential effect of TMS on adapted versus nonadapted neuronal representations is the removal or reversal of the behavioral adaptation effect (Silvanto et al., 2007b; Silvanto and Muggleton, 2008; Romei et al., 2016). In the present study, the removal of behavioral adaptation to fearful stimuli by TMS over the aIPS indicates that this region contains neuronal representations tuned to affective movements.

Interestingly, the effects of TMS over the aIPS were limited to fearful PLDs and were absent for happy PLDs. What do we know about action representation in the aIPS? Several lines of evidence in both human (Arfeller et al., 2013) and nonhuman primates (Borra et al., 2008; Nelissen et al., 2011; Rizzolatti et al., 2014) indicate that action representations are hierarchically organized between a low-level pictorial representation in the pSTS and a more abstract representation of action goals in the parietofrontal system, including the aIPS (Tunik et al., 2007; Cattaneo et al., 2010). The aIPS generalizes actions across effectors (Cattaneo et al., 2010) and is capable of encoding action invariants such as action endpoints, outcomes, and environmental changes produced by actions (Hamilton and Grafton, 2006, 2008). In Experiment 2, we found evidence that the explicit recognition of the emotional component of body movements relies in part on the parietal node of the AOS. Visual observation of emotional body movements produces activity in several brain networks, such as visual regions, the limbic network, and the AOS (de Gelder et al., 2004, 2010; Tamietto et al., 2007; Pichon et al., 2008; van de Riet et al., 2009; Meeren et al., 2013). There are several different neural mechanisms by which the human brain can identify and categorize observed affective displays. The capacity to recognize nonverbal affective communications generally relies on a core system that is likely located within the limbic system (LeDoux, 1996; Öhman and Mineka, 2001; Adolphs et al., 2003). However, our findings indicate that, at least for explicit processes, some subtypes of emotional body movements may be encoded as purposeful, goal-directed actions in the aIPS. Conversely, the pSTS, being the site of simple movement representation, does not seem to contain a specific representation of affective movements.

Dissociation between fear and happiness in the aIPS

State-dependent effects of TMS in the aIPS were specific to fearful PLDs (Fig. 4). Why do fearful stimuli seem to be predominantly represented in the aIPS compared with happy stimuli? A possible explanation is that the affective state of fear itself is represented in the aIPS. Alternatively, it is possible that the motor patterns expressing fear have characteristics that are best encoded by the aIPS, which preferentially processes goal-directed, purposeful movements (Cattaneo et al., 2010). The fearful bodily movements in our stimuli were in most cases directed toward a position in space, as they depicted self-protective or avoidance movements directed away from specific threatening agents (see example videos at http://community.dur.ac.uk/a.p.atkinson/Stimuli.html). By contrast, the happy stimuli (e.g., exulting, clapping hands, joyful hopping) were not directed toward or away from specific sectors of space. Therefore, the fear–happiness dissociation could be explained by a higher degree of goal-directedness or spatial orientation in fearful movements compared with happy ones. From an evolutionary point of view, emotional movements are communicative in nature, and the brain's prompt reaction to them is essential for survival (Darwin, 1872; Ekman, 1957; Grèzes et al., 2007). In this sense, each emotional subtype has its own identity, and its affective state is not dissociable from its stereotyped communicative motor behavior. The effective communication of fearful content is more likely to rely on goal-directed and spatially oriented actions than that of happiness. We therefore favor the hypothesis that fearful movements have a more “praxic” and “goal-directed” quality than happy ones. In line with this, several studies have reported that the motor system is specifically tuned to fearful body movements, as shown by changes in corticospinal excitability in response to fearful body postures (Borgomaneri et al., 2012, 2015), fearful facial expressions (Schutter et al., 2008), and negative natural complex scenes (Borgomaneri et al., 2014). However, the role of corticospinal activity in action comprehension remains unclear.

Conclusions

We conclude that, when performing explicit categorizations (i.e., a high-level cognitive task), the human brain treats fearful emotional body movements as goal-directed actions. This conclusion is supported by the specific recruitment of the cortical network specialized in processing actions. The AOS therefore contains representations of affective movements, as long as these are interpreted as purposeful, goal-directed, meaningful actions. By contrast, the pSTS is known to encode BM according to its characteristic kinematics, distinguishing it from nonhuman motion, and apparently does not specifically encode either fearful or happy bodily actions.

Footnotes

  • This work was supported by the ERC (336152) to J.S. and the F.R.S.-F.N.R.S. (“Charge de recherches”) to C.J. We thank Antony P. Atkinson (Durham University), Paola Ricciardelli, and Rossana Actis-Grosso (University of Milano-Bicocca) for sharing with us the stimuli; Birkbeck-University College of London Centre for NeuroImaging (BUCNI); and Christina Moutsiana, Benjamin de Haas, and Iroise Dumontheil for technical assistance during MRI scan acquisition.

  • The authors declare no competing financial interests.

  • Correspondence should be addressed to Dr. Noemi Mazzoni, ODFLab, Department of Psychology and Cognitive Science, University of Trento, Via Matteo del Ben 5b, 38068 Rovereto (TN), Italy. noemi.mazzoni@unitn.it

References

  1. Adolphs R (2002) Neural systems for recognizing emotion. Curr Opin Neurobiol 12:169–177. doi:10.1016/S0959-4388(02)00301-X pmid:12015233
  2. Adolphs R, Tranel D, Damasio AR (2003) Dissociable neural systems for recognizing emotions. Brain Cogn 52:61–69. doi:10.1016/S0278-2626(03)00009-5 pmid:12812805
  3. Alaerts K, Nackaerts E, Meyns P, Swinnen SP, Wenderoth N (2011) Action and emotion recognition from point light displays: an investigation of gender differences. PLoS One 6:e20989. doi:10.1371/journal.pone.0020989 pmid:21695266
  4. Arfeller C, Schwarzbach J, Ubaldi S, Ferrari P, Barchiesi G, Cattaneo L (2013) Whole-brain haemodynamic after-effects of 1-Hz magnetic stimulation of the posterior superior temporal cortex during action observation. Brain Topogr 26:278–291. doi:10.1007/s10548-012-0239-9 pmid:22772359
  5. Atkinson AP, Dittrich WH, Gemmell AJ, Young AW (2004) Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33:717–746. doi:10.1068/p5096 pmid:15330366
  6. Atkinson AP, Tunstall ML, Dittrich WH (2007) Evidence for distinct contributions of form and motion information to the recognition of emotions from body gestures. Cognition 104:59–72. doi:10.1016/j.cognition.2006.05.005 pmid:16831411
  7. Atkinson AP, Vuong QC, Smithson HE (2012) Modulation of the face- and body-selective visual regions by the motion and emotion of point-light face and body stimuli. Neuroimage 59:1700–1712. doi:10.1016/j.neuroimage.2011.08.073 pmid:21924368
  8. Barraclough NE, Keith RH, Xiao D, Oram MW, Perrett DI (2009) Visual adaptation to goal-directed hand actions. J Cogn Neurosci 21:1806–1820. doi:10.1162/jocn.2008.21145 pmid:18855549
  9. Bestelmeyer PE, Maurage P, Rouger J, Latinus M, Belin P (2014) Adaptation to vocal expressions reveals multistep perception of auditory emotion. J Neurosci 34:8098–8105. doi:10.1523/JNEUROSCI.4820-13.2014 pmid:24920615
  10. Borgomaneri S, Gazzola V, Avenanti A (2012) Motor mapping of implied actions during perception of emotional body language. Brain Stimul 5:70–76. doi:10.1016/j.brs.2012.03.011 pmid:22503473
  11. Borgomaneri S, Gazzola V, Avenanti A (2014) Temporal dynamics of motor cortex excitability during perception of natural emotional scenes. Soc Cogn Affect Neurosci 9:1451–1457. doi:10.1093/scan/nst139 pmid:23945998
  12. Borgomaneri S, Vitale F, Gazzola V, Avenanti A (2015) Seeing fearful body language rapidly freezes the observer's motor cortex. Cortex 65:232–245. doi:10.1016/j.cortex.2015.01.014 pmid:25835523
  13. Borra E, Belmalih A, Calzavara R, Gerbella M, Murata A, Rozzi S, Luppino G (2008) Cortical connections of the macaque anterior intraparietal (AIP) area. Cereb Cortex 18:1094–1111. doi:10.1093/cercor/bhm146 pmid:17720686
  14. Candidi M, Urgesi C, Ionta S, Aglioti SM (2008) Virtual lesion of ventral premotor cortex impairs visual perception of biomechanically possible but not impossible actions. Soc Neurosci 3:388–400. doi:10.1080/17470910701676269 pmid:18979387
  15. Candidi M, Stienen BM, Aglioti SM, de Gelder B (2011) Event-related repetitive transcranial magnetic stimulation of posterior superior temporal sulcus improves the detection of threatening postural changes in human bodies. J Neurosci 31:17547–17554. doi:10.1523/JNEUROSCI.0697-11.2011 pmid:22131416
  16. Caspers S, Geyer S, Schleicher A, Mohlberg H, Amunts K, Zilles K (2006) The human inferior parietal cortex: cytoarchitectonic parcellation and interindividual variability. Neuroimage 33:430–448. doi:10.1016/j.neuroimage.2006.06.054 pmid:16949304
  17. Cattaneo L, Rizzolatti G (2009) The mirror neuron system. Arch Neurol 66:557–560. doi:10.1001/archneurol.2009.41 pmid:19433654
  18. Cattaneo L, Sandrini M, Schwarzbach J (2010) State-dependent TMS reveals a hierarchical representation of observed acts in the temporal, parietal, and premotor cortices. Cereb Cortex 20:2252–2258. doi:10.1093/cercor/bhp291 pmid:20051360
  19. Cattaneo L, Barchiesi G, Tabarelli D, Arfeller C, Sato M, Glenberg AM (2011) One's motor performance predictably modulates the understanding of others' actions through adaptation of premotor visuo-motor neurons. Soc Cogn Affect Neurosci 6:301–310. doi:10.1093/scan/nsq099 pmid:21186167
  20. Cattaneo Z, Silvanto J (2008) Investigating visual motion perception using the transcranial magnetic stimulation-adaptation paradigm. Neuroreport 19:1423–1427. doi:10.1097/WNR.0b013e32830e0025 pmid:18766024
  21. Chouchourelou A, Matsuka T, Harber K, Shiffrar M (2006) The visual analysis of emotional actions. Soc Neurosci 1:63–74. doi:10.1080/17470910600630599 pmid:18633776
  22. Clarke TJ, Bradshaw MF, Field DT, Hampson SE, Rose D (2005) The perception of emotion from body movement in point-light displays of interpersonal dialogue. Perception 34:1171–1180. doi:10.1068/p5203 pmid:16309112
  23. Darwin CR (1872) The expression of the emotions in man and animals. London: John Murray.
  24. de Gelder B (2006) Towards the neurobiology of emotional body language. Nat Rev Neurosci 7:242–249. doi:10.1038/nrn1872 pmid:16495945
  25. de Gelder B, Hadjikhani N (2006) Non-conscious recognition of emotional body language. Neuroreport 17:583–586. doi:10.1097/00001756-200604240-00006 pmid:16603916
  26. de Gelder B, Vroomen J, Pourtois G, Weiskrantz L (1999) Non-conscious recognition of affect in the absence of striate cortex. Neuroreport 10:3759–3763. doi:10.1097/00001756-199912160-00007 pmid:10716205
  27. de Gelder B, Snyder J, Greve D, Gerard G, Hadjikhani N (2004) Fear fosters flight: a mechanism for fear contagion when perceiving emotion expressed by a whole body. Proc Natl Acad Sci U S A 101:16701–16706. doi:10.1073/pnas.0407042101 pmid:15546983
  28. de Gelder B, Van den Stock J, Meeren HK, Sinke CB, Kret ME, Tamietto M (2010) Standing up for the body: recent progress in uncovering the networks involved in the perception of bodies and bodily expressions. Neurosci Biobehav Rev 34:513–527. doi:10.1016/j.neubiorev.2009.10.008 pmid:19857515
  29. de Gelder B, de Borst AW, Watson R (2015) The perception of emotion in body expressions. Wiley Interdiscip Rev Cogn Sci 6:149–158. doi:10.1002/wcs.1335 pmid:26263069
  30. de la Rosa S, Streuber S, Giese M, Bülthoff HH, Curio C (2014) Putting actions in context: visual action adaptation aftereffects are modulated by social contexts. PLoS One 9:e86502. doi:10.1371/journal.pone.0086502 pmid:24466123
  31. Dittrich WH, Troscianko T, Lea SE, Morgan D (1996) Perception of emotion from dynamic point-light displays represented in dance. Perception 25:727–738. doi:10.1068/p250727 pmid:8888304
  32. Downing PE, Jiang Y, Shuman M, Kanwisher N (2001) A cortical area selective for visual processing of the human body. Science 293:2470–2473. doi:10.1126/science.1063414 pmid:11577239
  33. Ekman P (1957) A methodological discussion of nonverbal behavior. J Psychol 43:141–149. doi:10.1080/00223980.1957.9713059
  34. Engelen T, de Graaf TA, Sack AT, de Gelder B (2015) A causal role for inferior parietal lobule in emotion body perception. Cortex 73:195–202. doi:10.1016/j.cortex.2015.08.013 pmid:26460868
  35. Filmer HL, Monsell S (2013) TMS to V1 spares discrimination of emotive relative to neutral body postures. Neuropsychologia 51:2485–2491. doi:10.1016/j.neuropsychologia.2013.09.029 pmid:24071594
  36. Fogassi L, Ferrari PF, Gesierich B, Rozzi S, Chersi F, Rizzolatti G (2005) Parietal lobe: from action organization to intention understanding. Science 308:662–667. doi:10.1126/science.1106138 pmid:15860620
  37. Fox CJ, Barton JJ (2007) What is adapted in face adaptation? The neural representations of expression in the human visual system. Brain Res 1127:80–89. doi:10.1016/j.brainres.2006.09.104 pmid:17109830
  38. Gerlanc D, Kirby K (2015) bootES: bootstrap effect sizes. R package version 1.2. https://CRAN.R-project.org/package=bootES
  39. Grèzes J, Pichon S, de Gelder B (2007) Perceiving fear in dynamic body expressions. Neuroimage 35:959–967. doi:10.1016/j.neuroimage.2006.11.030 pmid:17270466
  40. Grossman ED, Battelli L, Pascual-Leone A (2005) Repetitive TMS over posterior STS disrupts perception of biological motion. Vision Res 45:2847–2853. doi:10.1016/j.visres.2005.05.027 pmid:16039692
  41. Grossman ED, Jardine NL, Pyles JA (2010) fMR-adaptation reveals invariant coding of biological motion on the human STS. Front Hum Neurosci 4:15. doi:10.3389/neuro.09.015.2010 pmid:20431723
  42. Hamilton AF, Grafton ST (2006) Goal representation in human anterior intraparietal sulcus. J Neurosci 26:1133–1137. doi:10.1523/JNEUROSCI.4551-05.2006 pmid:16436599
  43. Hamilton AF, Grafton ST (2008) Action outcomes are represented in human inferior frontoparietal cortex. Cereb Cortex 18:1160–1168. doi:10.1093/cercor/bhm150 pmid:17728264
  44. Hein G, Knight RT (2008) Superior temporal sulcus—it's my area: or is it? J Cogn Neurosci 20:2125–2136. doi:10.1162/jocn.2008.20148 pmid:18457502
  45. Jackson S, Blake R (2010) Neural integration of information specifying human structure from form, motion, and depth. J Neurosci 30:838–848. doi:10.1523/JNEUROSCI.3116-09.2010 pmid:20089892
  46. Jacquet PO, Avenanti A (2015) Perturbing the action observation network during perception and categorization of actions' goals and grips: state-dependency and virtual lesion TMS effects. Cereb Cortex 25:598–608. doi:10.1093/cercor/bht242 pmid:24084126
  47. Johansson G (1973) Visual perception of biological motion and a model for its analysis. Percept Psychophys 14:201–211. doi:10.3758/BF03212378
  48. Kadosh RC, Muggleton N, Silvanto J, Walsh V (2010) Double dissociation of format-dependent and number-specific neurons in human parietal cortex. Cereb Cortex 20:2166–2171. doi:10.1093/cercor/bhp273 pmid:20051361
  49. LeDoux JE (1996) The emotional brain. New York: Simon and Schuster.
  50. Meeren HK, de Gelder B, Ahlfors SP, Hämäläinen MS, Hadjikhani N (2013) Different cortical dynamics in face and body perception: an MEG study. PLoS One 8:e71408. doi:10.1371/journal.pone.0071408 pmid:24039712
  51. Morris JS, DeGelder B, Weiskrantz L, Dolan RJ (2001) Differential extrageniculostriate and amygdala responses to presentation of emotional faces in a cortically blind field. Brain 124:1241–1252. doi:10.1093/brain/124.6.1241 pmid:11353739
  52. Nelissen K, Borra E, Gerbella M, Rozzi S, Luppino G, Vanduffel W, Rizzolatti G, Orban GA (2011) Action observation circuits in the macaque monkey cortex. J Neurosci 31:3743–3756. doi:10.1523/JNEUROSCI.4803-10.2011 pmid:21389229
  53. Ochiai T, Grimault S, Scavarda D, Roch G, Hori T, Rivière D, Mangin JF, Régis J (2004) Sulcal pattern and morphology of the superior temporal sulcus. Neuroimage 22:706–719. doi:10.1016/j.neuroimage.2004.01.023 pmid:15193599
  54. Öhman A, Mineka S (2001) Fears, phobias, and preparedness: toward an evolved module of fear and fear learning. Psychol Rev 108:483–522. doi:10.1037/0033-295X.108.3.483 pmid:11488376
  55. Phillips ML, Drevets WC, Rauch SL, Lane R (2003) Neurobiology of emotion perception I: the neural basis of normal emotion perception. Biol Psychiatry 54:504–514. doi:10.1016/S0006-3223(03)00168-9 pmid:12946879
  56. Pichon S, de Gelder B, Grezes J (2008) Emotional modulation of visual and motor areas by dynamic body expressions of anger. Soc Neurosci 3:199–212. doi:10.1080/17470910701394368 pmid:18979376
  57. Pobric G, Hamilton AF (2006) Action understanding requires the left inferior frontal cortex. Curr Biol 16:524–529. doi:10.1016/j.cub.2006.01.033 pmid:16527749
  58. Puce A, Perrett D (2003) Electrophysiology and brain imaging of biological motion. Philos Trans R Soc Lond B Biol Sci 358:435–445. doi:10.1098/rstb.2002.1221 pmid:12689371
  59. R Development Core Team (2016) R: a language and environment for statistical computing. Vienna: R Foundation for Statistical Computing.
  60. Rizzolatti G, Cattaneo L, Fabbri-Destro M, Rozzi S (2014) Cortical mechanisms underlying the organization of goal-directed actions and mirror neuron-based action understanding. Physiol Rev 94:655–706. doi:10.1152/physrev.00009.2013 pmid:24692357
  61. Romei V, Thut G, Silvanto J (2016) Information-based approaches of noninvasive transcranial brain stimulation. Trends Neurosci 39:782–795. doi:10.1016/j.tins.2016.09.001 pmid:27697295
  62. Rossini PM, Barker AT, Berardelli A, Caramia MD, Caruso G, Cracco RQ, Dimitrijević MR, Hallett M, Katayama Y, Lücking CH, Maertens de Noordhout AL, Marsden CD, Murray NMF, Rothwell JC, Swash M, Tomberg C (1994) Non-invasive electrical and magnetic stimulation of the brain, spinal cord and roots: basic principles and procedures for routine clinical application: report of an IFCN committee. Electroencephalogr Clin Neurophysiol 91:79–92. doi:10.1016/0013-4694(94)90029-9 pmid:7519144
  63. Russell JA, Fehr B (1987) Relativity in the perception of emotion in facial expressions. J Exp Psychol Gen 116:223–237. doi:10.1037/0096-3445.116.3.223
  64. Sato M, Grabski K, Glenberg AM, Brisebois A, Basirat A, Ménard L, Cattaneo L (2011) Articulatory bias in speech categorization: evidence from use-induced motor plasticity. Cortex 47:1001–1003. doi:10.1016/j.cortex.2011.03.009 pmid:21501836
  65. Saygin AP (2007) Superior temporal and premotor brain areas necessary for biological motion perception. Brain 130:2452–2461. doi:10.1093/brain/awm162 pmid:17660183
  66. Schutter DJ, Hofman D, Van Honk J (2008) Fearful faces selectively increase corticospinal motor tract excitability: a transcranial magnetic stimulation study. Psychophysiology 45:345–348. doi:10.1111/j.1469-8986.2007.00635.x pmid:18221448
  67. Shmuelof L, Zohary E (2006) A mirror representation of others' actions in the human anterior parietal cortex. J Neurosci 26:9736–9742. doi:10.1523/JNEUROSCI.1836-06.2006 pmid:16988044
  68. Silvanto J, Pascual-Leone A (2008) State-dependency of transcranial magnetic stimulation. Brain Topogr 21:1–10. doi:10.1007/s10548-008-0067-0 pmid:18791818
  69. Silvanto J, Muggleton NG (2008) New light through old windows: moving beyond the “virtual lesion” approach to transcranial magnetic stimulation. Neuroimage 39:549–552. doi:10.1016/j.neuroimage.2007.09.008 pmid:17945512
  70. Silvanto J, Muggleton NG, Cowey A, Walsh V (2007a) Neural adaptation reveals state-dependent effects of transcranial magnetic stimulation. Eur J Neurosci 25:1874–1881. doi:10.1111/j.1460-9568.2007.05440.x pmid:17408427
  71. Silvanto J, Muggleton NG, Cowey A, Walsh V (2007b) Neural activation state determines behavioral susceptibility to modified theta burst transcranial magnetic stimulation. Eur J Neurosci 26:523–528. doi:10.1111/j.1460-9568.2007.05682.x pmid:17650122
  72. Silvanto J, Muggleton N, Walsh V (2008) State-dependency in brain stimulation studies of perception and cognition. Trends Cogn Sci 12:447–454. doi:10.1016/j.tics.2008.09.004 pmid:18951833
  73. Skuk VG, Schweinberger SR (2013) Adaptation aftereffects in vocal emotion perception elicited by expressive faces and voices. PLoS One 8:e81691. doi:10.1371/journal.pone.0081691 pmid:24236215
  74. Tamietto M, de Gelder B (2011) Sentinels in the visual system. Front Behav Neurosci 5:6. doi:10.3389/fnbeh.2011.00006 pmid:21373367
  75. Tamietto M, Adenzato M, Geminiani G, de Gelder B (2007) Fast recognition of social emotions takes the whole brain: interhemispheric cooperation in the absence of cerebral asymmetry. Neuropsychologia 45:836–843. doi:10.1016/j.neuropsychologia.2006.08.012 pmid:16996092
  76. Theusner S, de Lussanet MH, Lappe M (2011) Adaptation to biological motion leads to a motion and a form aftereffect. Atten Percept Psychophys 73:1843–1855. doi:10.3758/s13414-011-0133-7 pmid:21598067
  77. Troje NF, Sadr J, Geyer H, Nakayama K (2006) Adaptation aftereffects in the perception of gender from biological motion. J Vis 6(8):850–857. doi:10.1167/6.8.7 pmid:16895463
  78. Tseng LY, Tseng P, Liang WK, Hung DL, Tzeng OJ, Muggleton NG, Juan CH (2014) The role of superior temporal sulcus in the control of irrelevant emotional face processing: a transcranial direct current stimulation study. Neuropsychologia 64:124–133. doi:10.1016/j.neuropsychologia.2014.09.015 pmid:25261612
  79. Tunik E, Rice NJ, Hamilton A, Grafton ST (2007) Beyond grasping: representation of action in human anterior intraparietal sulcus. Neuroimage 36:T77–T86. doi:10.1016/j.neuroimage.2007.03.026 pmid:17499173
  80. Vaina LM, Lemay M, Bienfang DC, Choi AY, Nakayama K (1990) Intact “biological motion” and “structure from motion” perception in a patient with impaired motion mechanisms: a case study. Vis Neurosci 5:353–369. doi:10.1017/S0952523800000444 pmid:2265150
  81. van Boxtel JJ, Lu H (2013) Impaired global, and compensatory local, biological motion processing in people with high levels of autistic traits. Front Psychol 4:209. doi:10.3389/fpsyg.2013.00209 pmid:23630514
  82. van de Riet WA, Grezes J, de Gelder B (2009) Specific and common brain regions involved in the perception of faces and bodies and the representation of their emotional expressions. Soc Neurosci 4:101–120. doi:10.1080/17470910701865367 pmid:19255912
    OpenUrlCrossRefPubMed
  83. ↵
    1. van Kemenade BM,
    2. Muggleton N,
    3. Walsh V,
    4. Saygin AP
    (2012) Effects of TMS over premotor and superior temporal cortices on biological motion perception. J Cogn Neurosci 24:896–904. doi:10.1162/jocn_a_00194 pmid:22264195
    OpenUrlCrossRefPubMed
  84. ↵
    1. Webster MA,
    2. MacLeod DI
    (2011) Visual adaptation and face perception. Philos Trans R Soc Lond B Biol Sci 366:1702–1725. doi:10.1098/rstb.2010.0360 pmid:21536555
    OpenUrlAbstract/FREE Full Text
  85. ↵
    1. Webster MA,
    2. Kaping D,
    3. Mizokami Y,
    4. Duhamel P
    (2004) Adaptation to natural facial categories. Nature 428:557–561. doi:10.1038/nature02420 pmid:15058304
    OpenUrlCrossRefPubMed
Keywords

  • adaptation
  • anterior intraparietal sulcus
  • biological motion
  • emotional bodily expressions
  • emotions
  • TMS
