Abstract
Current models of face perception propose that initial visual processing is followed by activation of nonvisual somatosensory areas that contributes to emotion recognition. To test whether there is a pure and independent involvement of somatosensory cortex (SCx) during face processing over and above visual responses, we directly measured participants' somatosensory-evoked activity by tactually probing the state of SCx (105 ms after visual face onset) during an emotion discrimination task, while controlling for visual effects. Discrimination of emotional versus neutral expressions enhanced early somatosensory-evoked activity between 40 and 80 ms after stimulus onset, suggesting visual emotion processing in SCx. This effect was source localized within primary, secondary, and associative somatosensory cortex. Emotional face processing influenced somatosensory responses to both face (congruent body part) and finger (control site) tactile stimulation, suggesting a general process that includes nonfacial cortical representations. Gender discrimination of the same facial expressions did not modulate somatosensory-evoked activity. We provide novel evidence that SCx activation is not a byproduct of visual processing but is independently shaped by face emotion processing.
Introduction
Sensorimotor and somatosensory activity during processing of facial expressions provides an index of internal embodiment and supports simulationist models of face-based emotion recognition (Haxby et al., 2000; Goldman and Sripada, 2005; Niedenthal, 2007; Atkinson and Adolphs, 2011). Whereas activity in motor and premotor regions has been well documented (van der Gaag et al., 2007; Banissy et al., 2011), the properties of activity in somatosensory cortex (SCx) are less well understood (Hussey and Safford, 2009). Hierarchical models of face processing propose that emotional and other types of face processing require a series of interactions between visual areas (occipital face area, fusiform face area, superior temporal sulcus) that feed forward to central and frontal regions (Haxby et al., 2000; Calder and Young, 2005; Fairhall and Ishai, 2007). It remains unclear whether activity in the SCx evoked by emotional face processing is a byproduct of earlier emotion processing in visual regions (Pitcher et al., 2008) or whether SCx might have an independent role over and above visual processing of emotional stimuli.
Neuroimaging studies have associated activity increases in right SCx (rSCx) with facial expression discrimination (Winston et al., 2003; Vuilleumier and Pourtois, 2007). Moreover, patients with rSCx lesions perform poorly in expression recognition tasks (Adolphs, 2002; Atkinson and Adolphs, 2011), although we note that such lesions often extend beyond somatosensory cortex (Calder et al., 2001; Pourtois et al., 2004). More direct evidence for the involvement of rSCx in facial expression processing comes from two transcranial magnetic stimulation (TMS) studies showing that TMS over rSCx disrupts performance in emotion discrimination tasks (Pourtois et al., 2004; Pitcher et al., 2008). Importantly, TMS disruption of the right occipital face area and rSCx suggests sequential involvement of visual (60–100 ms) and somatosensory (100–170 ms) areas in expression recognition (Pitcher et al., 2008). Whether rSCx contributes uniquely to emotional face recognition, independently of visual processing, remains unclear.
To assess the selective involvement of rSCx in emotional face processing, we directly measured the state of SCx during a visual facial emotion discrimination task by applying tactile probes that elicited somatosensory electrocortical activity (Auksztulewicz et al., 2012). To control for attentional effects of the facial expression, participants additionally performed a gender discrimination task on the identical face stimuli. To isolate the response of SCx over and above the effects induced by lower processing regions, we subtracted purely visually evoked activity from tactually probed somatosensory activity during facial processing. Finally, to understand whether SCx responses reflect a specific or general process, we tactually probed somatosensory activity on a body part congruent with the emotional stimuli (face) and on a noncongruent control site (finger).
Materials and Methods
Participants.
Seventeen right-handed participants with normal or corrected-to-normal vision took part in the experiment. One participant was excluded from the analysis because of excessive artifacts in the EEG signal, resulting in a final sample of 16 (5 males; aged 23–39 years, mean = 28.19 years; laterality quotient 83.2%; Oldfield, 1971). Participants gave informed consent, with approval by the Ethics Committee, School of Social Sciences, City University London.
Stimuli and procedure.
A set of 90 pictures depicting happy, fearful, and neutral expressions was initially taken from the Karolinska Directed Emotional Faces set (Lundqvist et al., 1998). Faces were grayscaled and enclosed in a rectangular frame (140 × 157 pixels) excluding most of the hair and nonfacial contours. Eight volunteers, none of whom participated in the subsequent study, judged the strength of emotion expressed in the faces on a visual analog scale (100 = "extremely emotional"; 0 = "not emotional at all"). Based on these judgments, we selected 40 emotional faces (20 fearful, 20 happy; mean ± SD, 76.53 ± 6.95) and the 20 neutral faces rated closest to "not emotional at all" (mean ± SD, 10.76 ± 4.74); half of the faces were male.
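For illustration only, this rating-based selection reduces to ranking each category by its mean pilot rating. The sketch below uses placeholder random ratings, invented category labels, and a helper function of our own; none of these are the study's materials.

```python
import numpy as np

# Placeholder ratings: 8 raters x 90 faces on the 0-100 emotionality scale
# (random data standing in for the pilot volunteers' judgments).
rng = np.random.default_rng(0)
ratings = rng.uniform(0, 100, size=(8, 90))
labels = np.array(["fearful"] * 30 + ["happy"] * 30 + ["neutral"] * 30)
mean_rating = ratings.mean(axis=0)              # mean emotionality per face

def top_k(category, k, ascending=False):
    """Indices of the k faces in a category with the most extreme mean ratings."""
    idx = np.where(labels == category)[0]
    order = idx[np.argsort(mean_rating[idx])]
    return order[:k] if ascending else order[-k:]

fearful = top_k("fearful", 20)                  # 20 highest-rated fearful faces
happy = top_k("happy", 20)                      # 20 highest-rated happy faces
neutral = top_k("neutral", 20, ascending=True)  # 20 closest to "not emotional at all"
```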
Tactile stimulation was applied using two 12 V solenoids, each driving a metal rod with a blunt conical tip that contacted participants' skin when current passed through the solenoid. One solenoid was placed on the tip of the left index finger and one on the left cheek (face). To mask sounds made by the tactile stimulators, white noise (65 dB, measured at the participants' head) was presented through two loudspeakers placed 90 cm from the participants' head and 25 cm to either side of the participants' midline.
During the visual–tactile conditions, trials started with the presentation of a fixation cross (500 ms), followed by a neutral, fearful, or happy face (600 ms). Tactile stimuli were delivered to the left index finger (visual–tactile finger condition [VTFIC]) or to the left cheek (visual–tactile face condition [VTFAC]) 105 ms after face onset (Pitcher et al., 2008). To control for induced visual effects in the somatosensory response, we included a visual-only condition (VOC), in which the same facial stimuli were presented without tactile stimulation (Fig. 1). Participants first completed 180 practice trials that contained no experimental material (30 trials per condition: 10 fearful, 10 neutral, and 10 happy). The main experiment consisted of 1800 randomized trials presented in two blocks (900 trials per block and task: 300 neutral, 300 fearful, and 300 happy faces).
Figure 1. Timeline of VTFAC, VTFIC, and VOC in the emotion and gender tasks. In VOC, faces were presented alone. In VTFAC and VTFIC, tactile probes were delivered 105 ms after face onset. In both tasks, on 20% of trials, participants were asked to indicate either the emotion or the gender after presentation of the face stimulus.
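For concreteness, the trial timing described above can be summarized in a few lines of illustrative Python; only the durations come from the Methods, while the condition labels and event names are ours.

```python
# Trial timing (ms); all durations taken from the Methods above.
FIXATION_MS = 500      # fixation cross
FACE_MS = 600          # face presentation
TACTILE_SOA_MS = 105   # tactile probe onset relative to face onset

def trial_events(condition):
    """Return (time_ms, event) pairs for one trial of a given condition."""
    events = [(0, "fixation_on"),
              (FIXATION_MS, "face_on"),
              (FIXATION_MS + FACE_MS, "face_off")]
    if condition in ("VTFAC", "VTFIC"):    # tactile probe on cheek or finger
        events.append((FIXATION_MS + TACTILE_SOA_MS, "tactile_on"))
    return sorted(events)
```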
In 20% of the trials of each block, participants were asked whether the face stimulus was happy (10%) or fearful (10%) (emotion discrimination) and, in a separate block, whether it was male (10%) or female (10%) (gender discrimination). Participants were explicitly told to ignore the tactile stimuli, to observe the faces on the screen closely, and to respond vocally (yes/no) as quickly as possible whenever a question was presented (maximum response time 3000 ms); this ensured that participants directed attention to the task. Participants were given a break between blocks, and block order was randomized across participants. Participants were seated in a dimly lit, sound-attenuated, and electrically shielded chamber in front of a monitor at a distance of 80 cm. Visual stimuli were presented centrally on a black background using E-Prime software (Psychology Software Tools).
EEG recording and data analysis.
EEG was recorded with active electrodes from 60 scalp electrodes mounted equidistantly on an elastic electrode cap (M10 montage; EasyCap). All electrodes were referenced to the right mastoid and rereferenced to the average reference off-line. Vertical and bipolar horizontal electrooculograms were recorded for artifact correction purposes. Continuous EEG was recorded using a BrainAmp amplifier (BrainProducts; 500 Hz sampling rate). Off-line EEG analysis was performed using Vision Analyzer software (BrainProducts). The data were digitally low-pass filtered at 40 Hz, and ocular correction was performed (Gratton et al., 1983). The EEG signal was epoched into 600 ms segments, starting 100 ms before tactile stimulus onset on VTFAC and VTFIC trials and starting 5 ms after visual onset on VOC trials (i.e., 100 ms before the time at which the tactile stimulus would have occurred, given the 105 ms visual–tactile asynchrony). Segments were then baseline corrected to the first 100 ms, and artifact rejection was applied, eliminating epochs with amplitudes exceeding ±100 μV.
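This pipeline can be sketched schematically in Python/NumPy; the sketch assumes a fourth-order Butterworth filter and a simple peak-to-threshold rejection rule (neither is reported for the Vision Analyzer implementation actually used), so it is an approximation, not the original software.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 500                    # sampling rate (Hz)
LOWPASS_HZ = 40.0           # digital low-pass cutoff
PRE_MS, POST_MS = 100, 500  # 600 ms epochs, 100 ms pre-tactile baseline
REJECT_UV = 100.0           # artifact threshold (±100 µV)

def preprocess(eeg, onsets):
    """Filter, epoch, baseline-correct, and artifact-reject continuous EEG.
    `eeg` is channels x samples (µV); `onsets` are tactile-onset sample indices."""
    b, a = butter(4, LOWPASS_HZ / (FS / 2), btype="low")   # filter order assumed
    eeg = filtfilt(b, a, eeg, axis=-1)                     # zero-phase 40 Hz low-pass
    pre, post = PRE_MS * FS // 1000, POST_MS * FS // 1000
    segs = np.stack([eeg[:, o - pre:o + post] for o in onsets])
    segs -= segs[:, :, :pre].mean(axis=2, keepdims=True)   # baseline: first 100 ms
    keep = np.abs(segs).max(axis=(1, 2)) <= REJECT_UV      # drop epochs > ±100 µV
    return segs[keep]
```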
Single-subject ERPs for each condition (VOC, VTFAC, and VTFIC), emotion (happy, fearful, neutral), and task (emotion, gender) were calculated and used to compute ERP grand averages across subjects. Specifically, single-subject average ERPs were computed for trials in the VOC, containing only visual-evoked potentials (VEPs), and for trials in the VTFAC and VTFIC, which contained both VEPs and somatosensory-evoked potentials (SEPs). To eliminate any contamination of SEPs by VEPs, single-subject averages of VOC trials were subtracted from single-subject averages of both VTFAC and VTFIC trials (for subtraction methods, see Dell'Acqua et al., 2003). The resulting difference somatosensory-evoked activity was averaged across participants and contrasted for happy, fearful, and neutral faces.
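In array terms, the subtraction reduces to a per-subject, per-emotion difference of condition averages. A minimal sketch, with the array and dictionary names assumed for illustration:

```python
# Minimal sketch of the VEP subtraction (names assumed). Each input is a
# single-subject average (channels x samples), time-locked as described above.
def difference_sep(vt_avg, voc_avg):
    """Subtract the visual-only average from a visual-tactile average,
    leaving the 'difference somatosensory-evoked activity'."""
    return vt_avg - voc_avg

# Per subject and emotion, e.g.:
#   sep_face = difference_sep(avg_vtfac["fearful"], avg_voc["fearful"])
#   sep_finger = difference_sep(avg_vtfic["fearful"], avg_voc["fearful"])
# Grand averages are then means of these differences across subjects.
```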
To analyze the emotion effect on early and mid-latency somatosensory activity, mean voltages of the difference somatosensory-evoked activity were computed in consecutive 20 ms time windows from 0 to 160 ms after tactile stimulus onset. Analyses were restricted to 18 electrodes located close to and over somatosensory cortex, where the early and mid-early SEPs are maximal (corresponding to FC1/2, FC3/4, FC5/6, C1/2, C3/4, C5/6, CP1/2, CP3/4, and CP5/6 of the 10/20 system) (Fig. 2B). Factors of the analysis were as follows: emotion (happy, fearful, neutral), task (emotion, gender), tactile stimulation locus (face, finger), lateralization (left, right hemisphere), region (anterior, central, posterior), site (dorsal, dorsolateral, lateral), and time window (0–20, 20–40, 40–60, 60–80, 80–100, 100–120, 120–140, 140–160 ms), allowing us to define the timing of emotion processing in SCx. When appropriate, Greenhouse–Geisser adjustments to the degrees of freedom were applied, and p values were corrected for multiple comparisons using Bonferroni correction.
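The window-averaging step can be sketched as follows, assuming the epoching above; the statsmodels call in the comment illustrates only a reduced two-factor design (variable and column names are ours), whereas the full seven-factor ANOVA was run in standard statistics software.

```python
import numpy as np

FS = 500                                                 # sampling rate (Hz)
WINDOWS_MS = [(w, w + 20) for w in range(0, 160, 20)]    # eight 20 ms windows

def window_means(sep, t0_sample):
    """Mean voltage per 20 ms window; `sep` is a channels x samples
    difference-SEP average and `t0_sample` the index of tactile onset."""
    return np.array([sep[:, t0_sample + int(a * FS / 1000):
                            t0_sample + int(b * FS / 1000)].mean(axis=1)
                     for a, b in WINDOWS_MS])            # windows x channels

# With the means in long format (one row per subject x emotion x window),
# a reduced repeated-measures ANOVA could be fit with statsmodels:
#   from statsmodels.stats.anova import AnovaRM
#   AnovaRM(df, depvar="mean_uv", subject="subject",
#           within=["emotion", "window"]).fit()
```

Note that statsmodels' AnovaRM does not itself apply Greenhouse–Geisser or Bonferroni corrections, so those would need to be handled separately.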
Additionally, to ensure that the emotion manipulation was effective, mean voltages of the VEPs time-locked to face onset in the VOC were analyzed at occipital sites (corresponding to O1/2, O9/10, and PO9/10 of the 10/20 system), where early emotional face processing is typically observed (Williams et al., 2004, 2006; Conty et al., 2012). A repeated-measures ANOVA with the factor emotion (happy, fearful, neutral) was conducted on mean amplitudes in the P120 (120–150 ms) and N170 (170–190 ms) time windows (Williams et al., 2006; Conty et al., 2012).
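The component-level follow-up comparisons amount to paired t-tests on per-subject window means; a sketch with hypothetical input arrays, mirroring the follow-up contrasts reported in the Results:

```python
from scipy.stats import ttest_rel

# Hypothetical per-subject mean occipital amplitudes (length-16 arrays),
# one array per condition, within a component window such as the P120.
def compare_emotion_effects(happy, fearful, neutral):
    """Paired t-test contrasting the two emotion effects,
    each expressed relative to neutral."""
    return ttest_rel(happy - neutral, fearful - neutral)
```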
Electrophysiological source analysis.
Standardized Low Resolution Brain Electromagnetic Tomography (sLORETA) was used to estimate the brain generators associated with emotional modulations of the difference somatosensory-evoked activity. sLORETA provides an approximate 3D discrete solution to the inverse EEG problem (Pascual-Marqui, 2002). Source estimations were performed on single-subject data to determine the regions likely to be differentially activated when observing emotional faces (happy/fearful) relative to neutral faces, and happy relative to fearful faces. Only time windows in which facial emotion significantly modulated mean amplitudes of the difference somatosensory-evoked activity were subjected to source localization.
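The study used the standalone sLORETA software; for readers wishing to reproduce the approach, MNE-Python also exposes an sLORETA solver. A hedged sketch, with the evoked response, forward model, and noise covariance assumed as inputs rather than constructed here:

```python
from mne.minimum_norm import make_inverse_operator, apply_inverse

def sloreta_sources(evoked, fwd, noise_cov, tmin=0.040, tmax=0.080):
    """Estimate sLORETA sources for a difference somatosensory-evoked
    response and crop to a significant window. Assumed inputs: an
    mne.Evoked, a forward solution, and a baseline noise covariance."""
    inv = make_inverse_operator(evoked.info, fwd, noise_cov)
    stc = apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="sLORETA")
    return stc.crop(tmin, tmax)

# Emotion contrasts can then be formed by subtracting source estimates,
# e.g., stc_emotional - stc_neutral, and inspected for peak regions.
```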
Results
Emotional modulation of difference somatosensory-evoked activity
We performed a repeated-measures ANOVA on mean difference somatosensory-evoked activity with factors facial emotion (happy, fearful, neutral), task (emotion, gender), tactile stimulation locus (face, finger), lateralization (left, right hemisphere), region (anterior, central, posterior), site (dorsal, dorsolateral, lateral), and time window (0–20, 20–40, 40–60, 60–80, 80–100, 100–120, 120–140, 140–160 ms). Results showed a main effect of tactile stimulation locus (F(1,15) = 19.54, p < 0.01), a tactile locus × time window × lateralization × region × site interaction (all p < 0.05), and an emotion × task × time window × region × site interaction (F(56,840) = 1.99, p = 0.04). Because the latter interaction involved emotion and task but not stimulation locus, we computed separate ANOVAs for the emotion and gender tasks, collapsing over tactile face and tactile finger trials. In the emotion task, we found significant emotion × lateralization × region (F(4,60) = 4.22, p < 0.01) and emotion × time window × region × site (F(56,840) = 2.56, p = 0.01) interactions. Given the interaction involving emotion × time window, we performed ANOVAs for each of the sequential 20 ms time windows. The first significant emotion effects appeared in the 40–60 ms window, with enhanced amplitude when observing emotional relative to neutral faces (emotion × lateralization, F(2,30) = 4.42, p = 0.02; emotion × lateralization × region, F(4,60) = 3.37, p = 0.02). Follow-up t tests contrasting the emotion effects (amplitudes on neutral trials subtracted from happy and from fearful trials) showed that the emotion effects differed over the right hemisphere (t(15) = 3.23, p = 0.01) and at central sites (t(15) = 2.48, p = 0.04). The 60–80 ms time window showed similar happy and fearful emotion effects (emotion × lateralization, F(2,30) = 5.72, p < 0.01), with happy minus neutral differing from fearful minus neutral amplitudes over the right hemisphere (t(15) = 2.58, p = 0.04). Furthermore, difference somatosensory-evoked activity to emotional faces was significantly enhanced in the following time windows: 80–100 ms (emotion × region × site, F(8,120) = 2.88, p = 0.02), 120–140 ms (emotion × lateralization × region, F(4,60) = 3.76, p = 0.01; emotion × region × site, F(8,120) = 2.56, p = 0.04), and 140–160 ms (emotion × region × site, F(8,120) = 3.55, p = 0.01), although follow-up tests of these emotion interactions did not reach significance. There were no significant main effects of, or interactions involving, emotion in the 0–20, 20–40, or 100–120 ms time windows. In the gender task, there were no significant main effects or interactions involving the factor emotion (Fig. 2A,C). Overall, these results show an early and independent somatosensory sensitivity to emotional face processing.
Figure 2. A, Grand average difference somatosensory-evoked activity when observing fearful (red), happy (blue), and neutral (black) faces, for the electrodes where differences were strongest in the visual–tactile (face and finger) conditions for both the emotion (top) and gender (bottom) tasks. B, Electrodes over somatosensory (red) and visual (blue) areas included in the ANOVA. C, Topographical maps showing enhanced somatosensory activity for the fearful, neutral, and happy conditions, respectively. D, Pseudo-3D representation of sLORETA statistical maps showing regions where maximal fearful versus neutral, happy versus neutral, and happy versus fearful differential activity was source localized at latencies of 40–60 and 60–80 ms (happy vs neutral, t = 0.905, p = 0.01). E, Grand average VEPs at the occipital electrode positions shown in B for which maximum amplitude differences in the P120 and N170 time windows were observed. *p < 0.05.
Source localization analysis
Source estimation was performed on the periods in which facial emotion significantly modulated mean amplitude of the difference somatosensory-evoked activity and identified a set of regions whose activity peaked for the fearful versus neutral, happy versus neutral, and happy versus fearful contrasts (Fig. 2D). In the 40–60 ms period, maximum differential activity for the fearful versus neutral, happy versus neutral, and fearful versus happy contrasts was source localized bilaterally in primary (Brodmann areas [BA] 1/2/3), secondary (BA 40), and associative (BA 7) SCx. For the 60–80 ms latency, clusters of sources were found in primary SCx (BA 1/2; happy vs neutral, happy vs fearful), secondary SCx (BA 40; happy vs neutral, fearful vs neutral), and associative SCx (BA 5/7; fearful vs neutral, fearful vs happy) across both hemispheres.
Emotional modulation of VEP amplitudes
To ensure that the emotion manipulation was effective, we analyzed effects of emotion on VEPs time-locked to face onset in the VOC at occipital sites, where early emotional effects on face processing are typically observed (Williams et al., 2004, 2006). Emotional modulations of the VEPs were found for emotional relative to neutral conditions in both the emotion and gender tasks (Fig. 2E). An ANOVA with emotion as a factor, conducted at occipital electrode sites (corresponding to O1/2, O9/10, and PO9/10 of the 10/20 system), revealed a main effect of emotion, with significant P120 (F(2,30) = 5.20, p = 0.01) and N170 (F(2,30) = 4.75, p = 0.02) enhancements for emotional versus neutral faces. Follow-up t tests on the emotion effect (happy − neutral and fearful − neutral trials across electrodes) showed significant differences in the P120 window (t(15) = 2.28, p = 0.03) but not in the N170 window (t(15) = 1.55, not significant). This result accords with previous observations of early VEP modulation when attention is directed to faces (Williams et al., 2004, 2006; Conty et al., 2012) and confirms the effectiveness of the visual manipulation.
Discussion
This study investigated the selective involvement of SCx during emotional face processing by means of somatosensory-evoked activity. We directly probed the state of the SCx with tactile stimulation on the cheek (face) and finger during a visual facial expression discrimination task and a gender discrimination task. Moreover, we isolated somatosensory responses from visually induced effects by use of an ERP subtraction method. If SCx activity during emotional visual processing simply reflected carryover activation from visual regions, then tactually evoked responses should not be differentially affected by emotional valence. We further ensured this by controlling for visually evoked responses and carryover somatosensory effects for emotional and neutral faces in the absence of tactile input. Our results show that discriminating emotional expressions significantly enhances pure early somatosensory-evoked activity. Moreover, we found that emotional expressions modulated both tactually probed face and finger somatosensory activity. Importantly, we also found that differential activity between emotional and neutral expression processing is initially source localized within primary, secondary, and associative SCx. Our results contribute to the idea that emotional simulation is not purely conceptual but involves the representation of the actual body (Keysers et al., 2010).
The main finding was that facial emotional expressions, as opposed to neutral expressions, enhanced early somatosensory activity, suggesting an active and independent role of the SCx during facial emotion discrimination. No such emotional effects were found in the control gender task. TMS, fMRI, and lesion studies have suggested that SCx participates in emotional face processing (Adolphs et al., 2000; Pourtois et al., 2004; Pitcher et al., 2008; Atkinson and Adolphs, 2011). Pitcher et al. (2008) showed that visual and somatosensory regions contribute to emotion recognition at different processing times. However, the similar behavioral impairments resulting from visual and somatosensory TMS disruption suggest a similar role, possibly driven by visual areas given the difference in timing. To elucidate whether SCx has a specific role over and above visual areas in facial emotion processing, we compared somatosensory-evoked activity while participants viewed happy, fearful, and neutral faces. Importantly, visually related activity was subtracted from somatosensory activity, yielding visually independent somatosensory activity that was enhanced when processing emotional as opposed to neutral faces. This provides novel evidence that SCx activity is shaped by emotional processing independently of, and over and above, visual responses during facial emotion perception.
To further investigate the specificity of the SCx response in emotion processing, we tactually probed the face and finger somatosensory regions on different trials during a facial emotion discrimination task. Based on recent interventional TMS work in an emotion discrimination task, we aimed to target, in particular, the somatosensory response after the initial visual processing, between 100 and 170 ms (as described by Pitcher et al., 2008). Therefore, facial and finger tactile probes were presented 105 ms after visual presentation onset. We found a modulation of early somatosensory-evoked activity (between 40 and 80 ms) when perceiving emotional versus neutral faces. This differential response in the SCx was evoked similarly in the VTFAC and the VTFIC. This result supports the notion that general emotional processing relies on widespread activation of the SCx related to the observed emotion, including changes in nonfacial cortical representations (Tamietto et al., 2009; Atkinson and Adolphs, 2011). Furthermore, it supports simulation theories positing a close link between the observed emotion and the observer's body and, specifically, a general body representation response during the perception of emotional (happy, fearful) versus neutral faces.
Interestingly, a specific response of the SCx during emotional processing has previously been suggested in a TMS study (Pitcher et al., 2008). That study showed different performance for TMS over the finger area of rSCx as opposed to TMS over a region previously shown to respond to facial emotional processing (Winston et al., 2003), but one that did not fully match the coordinates of the face representation in SCx reported in fMRI studies (Blatow et al., 2007). Therefore, in the latter condition, additional regions concerned with facial emotion processing may have been targeted. By contrast, the current study directly taps into the facial representation of SCx by using tactile probes.
With regard to the type of emotional stimuli used in the present work, previous fMRI studies have largely investigated the processing of fearful expressions (Vuilleumier and Pourtois, 2007). Pourtois et al. (2004) demonstrated selective interference in judging fearful expressions (as opposed to happy expressions) when applying TMS over rSCx. However, evidence for somatosensory activity associated with other facial expressions, including happy expressions, has also been reported in lesion (Adolphs et al., 2000), fMRI (Winston et al., 2003), and TMS studies (Pitcher et al., 2008). Comparison across these studies is difficult because of the different methodologies, tasks, and emotional stimuli. The current study compares somatosensory-enhanced activity during emotion discrimination of fearful, happy, and neutral faces. The results suggest a general response to emotionally salient facial expressions with a certain degree of valence-specific internal representation in somatosensory areas, as shown by the different early (40–80 ms) somatosensory response patterns to happy and fearful emotions. Further emotion interactions at later latencies (80–100, 120–160 ms) suggest additional SCx involvement in emotion processing, but follow-up tests comparing the different emotion effects did not reach significance. The temporal properties of emotion processing in SCx could be explored in more detail using interventional approaches, such as TMS, which can test the temporally resolved relevance of the valence sensitivity in SCx during emotional processing.
An important result of our study concerns the neural sources of the differences between emotional and neutral face processing. We found that the neural sources of the maximum peak difference in evoked somatosensory activity when perceiving emotional versus neutral faces are localized in primary, secondary, and associative SCx. This result accords well with studies on affective blindsight reporting that partially cortically blind patients exhibit responses in primary and associative somatosensory areas to emotional stimuli presented in the blind field (Anders et al., 2004; Van den Stock et al., 2011), as well as with the other studies mentioned above (Adolphs et al., 2000; Winston et al., 2003; Pourtois et al., 2004; Vuilleumier and Pourtois, 2007; Pitcher et al., 2008).
In conclusion, this study provides novel evidence for a distinctive role of somatosensory cortex in emotional face processing that is independent of activation in early visual areas. Our methodology offers a direct test of the hypothesis that the activation state of SCx is influenced by emotional processing, by directly probing the responsiveness of SCx to tactile input during emotional processing of faces. Specifically, we show that observing emotional (happy, fearful) versus neutral facial expressions enhances pure somatosensory-evoked activity in primary, secondary, and associative SCx, demonstrating a direct involvement of rSCx in emotional face processing. No such effects occurred in the control gender discrimination task. Finally, this response was modulated in a similar manner in the congruent and noncongruent tactile conditions. Overall, our results support simulationist models of emotion and demonstrate a specific role of SCx in emotional face processing.
Footnotes
This work was supported by the Spanish Ministerio de Economía y Competitividad (Grant AP2008-00664 to A.S. and Grants RYC-2008-03090 and PSI2012-34558/PSIC to B.C.-M.), and the City University London Pump Priming Scheme (B.C.-M., B.F.).
The authors declare no competing financial interests.
Correspondence should be addressed to Dr. Beatriz Calvo-Merino, Department of Psychology, City University London, Northampton Square, EC1V 0HB, London, United Kingdom. b.calvo@city.ac.uk
This article is freely available online through the J Neurosci Author Open Choice option.