Research Articles, Behavioral/Cognitive

Role of Inferior Frontal Junction (IFJ) in the Control of Feature versus Spatial Attention

Sreenivasan Meyyappan, Abhijit Rajan, George R. Mangun and Mingzhou Ding
Journal of Neuroscience 22 September 2021, 41 (38) 8065-8074; DOI: https://doi.org/10.1523/JNEUROSCI.2883-20.2021
Sreenivasan Meyyappan
1J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, Florida 32611
2Center for Mind and Brain, University of California, Davis, California 95618
Abhijit Rajan
1J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, Florida 32611
George R. Mangun
2Center for Mind and Brain, University of California, Davis, California 95618
3Departments of Psychology and Neurology, University of California, Davis, California 95616
Mingzhou Ding
1J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, Florida 32611

Abstract

Feature-based visual attention refers to preferential selection and processing of visual stimuli based on their nonspatial attributes, such as color or shape. Recent studies have highlighted the inferior frontal junction (IFJ) as a control region for feature but not spatial attention. However, the extent to which IFJ contributes to spatial versus feature attention control remains a topic of debate. We investigated in humans of both sexes the role of IFJ in the control of feature versus spatial attention in a cued visual spatial (attend-left or attend-right) and feature (attend-red or attend-green) attention task using fMRI. Analyzing cue-related fMRI using both univariate activation and multivoxel pattern analysis, we found the following results in IFJ. First, in line with some prior studies, the univariate activations were not different between feature and spatial attentional control. Second, in contrast, the multivoxel pattern analysis decoding accuracy was above chance level for feature attention (attend-red vs attend-green) but not for spatial attention (attend-left vs attend-right). Third, while the decoding accuracy for feature attention was above chance level during attentional control in the cue-to-target interval, it was not during target processing. Fourth, the right IFJ and visual cortex (V4) were observed to be functionally connected during feature but not during spatial attention control, and this functional connectivity was positively associated with subsequent attentional selection of targets in V4, as well as with behavioral performance. These results support a model in which IFJ plays a crucial role in top-down control of visual feature but not visual spatial attention.

SIGNIFICANCE STATEMENT Past work has shown that the inferior frontal junction (IFJ), a prefrontal structure, is activated by both attention-to-feature (e.g., color) and attention-to-location, but the precise role of IFJ in the control of feature- versus spatial-attention is debated. We investigated this issue in a cued visual spatial (attend-left or attend-right) and feature (attend-red or attend-green) attention task using fMRI, multivoxel pattern analysis, and functional connectivity methods. The results show that (1) attend-red versus attend-green can be decoded in single-trial cue-evoked BOLD activity in IFJ but not attend-left versus attend-right and (2) only right IFJ modulates V4 to enhance task performance. This study sheds light on the function and hemispheric specialization of IFJ in the control of visual attention.

  • feature attention
  • fMRI
  • inferior frontal junction
  • MVPA

Introduction

Voluntary attention to sensory events can be deployed based on stimulus attributes, such as their spatial location(s), their features (e.g., color, form, or texture), or conjunctions of these attributes as objects (Treisman, 1969; Driver, 2001). Attention cuing paradigms (Posner, 1980; Kingstone, 1992) have enabled the measurement of brain activity related to the control of attention separately from brain activity related to subsequent selective stimulus processing during different forms of preparatory attention, including spatial (Harter et al., 1989; Corbetta et al., 2000; Hopf and Mangun, 2000; Hopfinger et al., 2000), feature (Giesbrecht et al., 2003; Slagter et al., 2007; Snyder and Foxe, 2010), and object attention (Noah et al., 2020).

Functional imaging studies in humans have shown that in the period following an attention-directing cue but in advance of the target stimulus (cue-to-target interval), the dorsal attention network (DAN), principally comprising the intraparietal sulcus (IPS) and frontal eye fields (FEFs), is activated irrespective of whether attention is directed to a spatial location or a nonspatial feature (Giesbrecht et al., 2003; Slagter et al., 2007; Egner et al., 2008; Rajan et al., 2021). This has led to the notion that the DAN is an important network supporting the top-down control of attention regardless of the to-be-attended stimulus attribute. During the cue-to-target interval, the DAN is thought to maintain the attentional set and issue top-down control signals to sensory-specific cortex to bias sensory processing so that the attended information is facilitated and irrelevant stimulus information is suppressed (T. Liu et al., 2003; Corbetta et al., 2008; C. Wang et al., 2016).

Regions outside the classically defined DAN have, however, also been implicated in attentional control, notably the inferior frontal junction (IFJ). IFJ is a prefrontal area situated at the confluence of the precentral and inferior frontal sulci. Early work showed that IFJ plays a role in cognitive control, using paradigms such as task switching (Brass and von Cramon, 2004; Brass et al., 2005; Derrfuss et al., 2005) and Stroop (Neumann et al., 2005; Derrfuss et al., 2005). During the control of spatial attention, IFJ is activated along with the DAN (see also Giesbrecht et al., 2003; Asplund et al., 2010), leading to the notion that IFJ is involved in the control of spatial attention and that it may even be a part of the DAN (Corbetta et al., 2008). Other work has challenged this notion, however, by showing that, during spatial attention, IFJ activity following a cue is short-lived, suggesting that IFJ is more involved in interpreting cues and facilitating attentional orienting than in controlling sustained attention to spatial locations per se (Tamber-Rosenau et al., 2018). In contrast, several lines of more recent research have focused on the role of the IFJ in feature- and object-based attention, suggesting that nonspatial attentional control characterizes the role of IFJ (Zanto et al., 2010; Baldauf and Desimone, 2014; Gong and Liu, 2020). The specificity of IFJ's role in the control of spatial versus feature attention remains a significant open question in our understanding of attentional control.

Hemispheric differences in the IFJ control of attention are similarly debated. Zanto et al. (2010) showed that attending color increased right IFJ (rIFJ)-V4 connectivity. Baldauf and Desimone (2014) reported stronger connectivity between rIFJ-FFA during attention to faces and rIFJ-PPA during attention to houses (see also Lee and Geng, 2017). In contrast, other studies have found either no hemispheric differences in IFJ contributions to attentional control (Zhang et al., 2018; Gong and Liu, 2020) or left lateralized activity in IFJ for object attention (Orlandi and Proverbio, 2020).

We investigated the role of IFJ in preparatory spatial versus feature attention control using multivariate analysis of fMRI data and explored the laterality of IFJ control of attention by computing functional connectivity between IFJ and V4 and relating it to selective target processing and behavioral performance.

Materials and Methods

Overview

The experimental protocol was approved by the Institutional Review Board at the University of Florida. Twenty (5 female/15 male) right-handed graduate and undergraduate students took part in the study. The participants reported no prior history of neuropsychological disorders and provided written informed consent. The data from this experiment have been used to investigate different questions in our previously published work (Rajan et al., 2019, 2021). Because the analyses conducted here followed this prior work, and effect sizes were taken to be similar to those significant prior findings, no new power analysis was performed before the experiment.

Experimental design

The timeline of a typical trial is illustrated in Figure 1. Participants were required to fixate a central cross throughout each trial; this was verified during training and testing using an SR Research EyeLink 1000 MR-compatible eye tracker. Two sets of dots continuously marked the two possible peripheral target locations, which were 3.6 degrees lateral to the upper left and upper right of the fixation cross (closest edge). The start of the trial was signaled by an auditory cue, which directed the participants to covertly direct their attention to either a spatial location (left or right) or a color (red or green). Following a delay period, varied randomly from 3000 to 6600 ms, two colored rectangles (red or green) were presented for a duration of 200 ms, one in each of the two peripheral locations. The subject's task was to report the orientation of the rectangle (target) appearing at the cued location or having the cued color, and to ignore the other rectangle (distractor). On feature (color) trials, the two rectangles were always of opposite colors; on spatial trials, the two rectangles were either the same color or opposite colors. On 8% of the trials (invalid trials), only one rectangle was displayed, appearing at the uncued location or in the uncued color; in this case, the participants were required to report the orientation of that single rectangle. These invalidly cued trials were included to provide a measure of the behavioral benefits of attentional cuing (Posner, 1980). An intertrial interval, varied randomly from 8000 to 12,800 ms following the target, elapsed before the start of the next trial. Trials were organized into blocks, with each block consisting of 25 trials and lasting ∼7 min. Each participant completed 10-14 blocks over 2 days.

Figure 1.

Experimental paradigm. Each trial started with an auditory cue (500 ms) instructing the subject to covertly attend to a spatial location (left or right) or to a color (red or green). Following a variable cue-to-target delay (3000-6600 ms), two colored rectangles were displayed (200 ms), one in each of the two peripheral locations. Participants were asked to report the orientation of the rectangle (horizontal or vertical) appearing at the cued location or having the cued color. On some trials (8%), the cue was invalid: only one rectangle appeared, either not at the cued location or not in the cued color, and participants were required to report its orientation. When the "none" cue was presented (neutral trials), the subject did not prepare to attend to spatial or feature information, and instead prepared to discriminate the orientation of a rectangle presented over a gray patch. An intertrial interval (ITI), varied randomly from 8000 to 12,800 ms following target onset, elapsed before the start of the next trial. On a portion of trials (20%), the cue was not followed by targets (cue-only trials).

In addition to spatial and color cues, there was a third type of "neutral" cue (the word "none"), which comprised 20% of the total number of trials. Neutral cues provided no information about what to attend, but instead warned the subject to prepare to report the orientation of a rectangle presented over a gray patch; these trials were included to provide behavioral measures comparing focused (spatial or feature) versus neutral attention and were not analyzed here for BOLD activity. Valid spatial and feature cues were followed (with a delay of 3000-6600 ms) 80% of the time by the two colored rectangles (red or green), while on the remaining 20% of trials the cue was not followed by the targets (cue-only trials).

All the participants went through a training session before scanning. Since the study required participants to maintain central fixation for long durations and pay covert attention to the cued sensory attribute while fixating, participants were screened based on their ability to maintain eye fixation at the center of the monitor during the training session. At the end of the training session, behavioral accuracy >70% was considered satisfactory for enrollment in the fMRI study.

fMRI acquisition and preprocessing

A 3T Philips Achieva scanner with a 32-channel head coil (Philips Medical Systems) was used to collect fMRI data. The EPI sequence parameters were as follows: TR = 1.98 s; TE = 30 ms; flip angle = 80°; FOV = 224 mm; slice number = 36; voxel size = 3.5 × 3.5 × 3.5 mm; matrix size = 64 × 64. The slices were oriented parallel to the plane connecting the anterior and posterior commissures.

The fMRI BOLD signals were preprocessed using the statistical parametric mapping toolbox (SPM) and custom scripts written in MATLAB. Preprocessing steps included slice timing correction, realignment, spatial normalization, and smoothing. Slice timing correction was conducted using sinc interpolation to correct for differences in slice acquisition time within an EPI volume. The images were then spatially realigned to the first image of each session by a 6-parameter rigid body spatial transformation to account for head movement during data acquisition. Each participant's images were then normalized and registered to the MNI space. All images were further resampled to a voxel size of 3 × 3 × 3 mm, and spatially smoothed using a Gaussian kernel with 7 mm FWHM. Slow temporal drifts in baseline were removed by applying a high-pass filter with cutoff frequency set at 1/128 Hz.

Statistical analysis

GLM analysis of cue-evoked response

The GLM method, as implemented in the SPM toolbox, was used to analyze the BOLD responses to cues. Eight task-related events were included in the GLM analysis as regressors. Five of them were used to model the cue-related BOLD activity; only trials with correct responses were included. We used two additional regressors to account for BOLD responses evoked by target stimuli: one for validly and one for invalidly cued targets. Recall that the trials could be either validly cued or invalidly cued, and the subject was expected to respond to both. Finally, one regressor was used to model the trials with incorrect responses. The HRF used in the GLM analysis was the default HRF in the SPM toolbox where the delay was 6 s. At the group level, cue-evoked fMRI activations were obtained by applying a parametric one-sample t test and thresholding the results at p < 0.05 after correcting for multiple comparisons using the false discovery rate (FDR) method.
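The core of this GLM approach — one regressor per event type, built by convolving the event onsets with a canonical hemodynamic response function (HRF) and fit by least squares — can be sketched as follows. This is an illustrative Python reimplementation, not the authors' SPM/MATLAB code; the double-gamma parameters (response peaking near 6 s, undershoot near 16 s) approximate SPM's default canonical HRF, and the function names are ours.

```python
import numpy as np
from math import gamma as gamma_fn

def gamma_pdf(t, shape):
    """Gamma density with unit scale; building block of the double-gamma HRF."""
    out = np.zeros_like(t, dtype=float)
    pos = t > 0
    out[pos] = t[pos] ** (shape - 1) * np.exp(-t[pos]) / gamma_fn(shape)
    return out

def canonical_hrf(tr, duration=32.0):
    """Double-gamma HRF sampled at the TR (peak ~5-6 s, undershoot ~16 s),
    approximating SPM's default canonical HRF; normalized to unit sum."""
    t = np.arange(0.0, duration, tr)
    hrf = gamma_pdf(t, 6.0) - gamma_pdf(t, 16.0) / 6.0
    return hrf / hrf.sum()

def build_design(onsets_per_condition, n_scans, tr):
    """One GLM column per condition: a delta train at the event onsets
    convolved with the HRF, truncated to the scan length."""
    hrf = canonical_hrf(tr)
    X = np.zeros((n_scans, len(onsets_per_condition)))
    for j, onsets in enumerate(onsets_per_condition):
        stick = np.zeros(n_scans)
        for onset in onsets:
            stick[int(round(onset / tr))] = 1.0
        X[:, j] = np.convolve(stick, hrf)[:n_scans]
    return X
```

Given such a design matrix (plus an intercept and nuisance columns), the per-condition β estimates follow from ordinary least squares, e.g. `np.linalg.lstsq(X, y, rcond=None)`.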

ROI definition

Bilateral IFJ (bIFJ) ROIs were defined by applying the above GLM analysis to cue-evoked BOLD activity. As described above, activation evoked by spatial and feature cues was subjected to a statistical threshold of p < 0.05 (FDR correction). Voxels meeting this threshold and lying in the proximity of previously published coordinates of IFJ (Brass et al., 2005) were taken to be the IFJ and were used as the ROI in this study. Individually, the peak activation was within the IFJ ROI so defined for 19 of 20 subjects; for the remaining subject, the peak activation was within two voxels of the IFJ ROI. The DAN ROI was defined in a similar fashion (Rajan et al., 2021). The V4 ROI was defined by the overlap between feature cue-evoked activations (p < 0.05 FDR) and the retinotopically defined hV4 in the probabilistic maps of L. Wang et al. (2015). The bIFJ ROI comprised 233 voxels: 82 voxels in rIFJ with peak activation at MNI (45, 14, 28) and 151 voxels in left IFJ (lIFJ) with peak activation at MNI (−42, 14, 31). The bilateral V4 ROI comprised 34 voxels, with left and right V4 peak activations at MNI (−21, −79, −14) and (24, −73, −8), respectively. The DAN ROI comprised 1390 voxels, with peak activation coordinates at left FEF (−27, −1, 52), right FEF (36, 2, 49), left IPS (−15, −70, 52), and right IPS (24, −64, 52).

Estimation of single-trial BOLD response

In addition to the conventional GLM analysis, we also applied multivoxel pattern analysis (MVPA) to neural activity following cues and targets, with cue-evoked activity yielding information on attentional control and target-evoked activity yielding information on attentional selection. Because MVPA is performed at the single-trial level, a β-series regression method (Rissman et al., 2004) was used to estimate the BOLD response on each trial in every voxel. In this method, cues and targets in trials with correct responses were each assigned individual regressors, and one additional regressor was assigned to all cues and targets with incorrect responses. The regressors were modeled in the conventional GLM framework using custom MATLAB scripts developed within the SPM toolbox. Single-trial BOLD responses estimated in this way were used for both MVPA and ROI-based univariate analysis. For the latter, single-trial BOLD activations were averaged over the voxels in a given ROI and across trials of a given trial type and then compared between trial types. FDR corrections were applied where appropriate.
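The β-series idea can be sketched in a few lines: give each trial its own regressor, and the least-squares fit then returns one amplitude estimate (β) per trial. This is a minimal Python illustration, not the published MATLAB implementation; the `hrf` argument is assumed to be a pre-sampled hemodynamic response kernel.

```python
import numpy as np

def beta_series(y, onsets, tr, hrf):
    """Rissman-style beta-series regression for one voxel's time series:
    one regressor per trial plus an intercept, fit by least squares,
    yielding one beta per trial."""
    n_scans = len(y)
    X = np.zeros((n_scans, len(onsets) + 1))
    X[:, -1] = 1.0                      # intercept / baseline column
    for j, onset in enumerate(onsets):
        stick = np.zeros(n_scans)
        stick[int(round(onset / tr))] = 1.0
        X[:, j] = np.convolve(stick, hrf)[:n_scans]
    betas, *_ = np.linalg.lstsq(X, y, rcond=None)
    return betas[:-1]                   # per-trial betas, intercept dropped
```

Collecting these per-trial βs across all voxels of an ROI gives the trial-by-voxel matrices used below for MVPA and for the β-series connectivity analysis.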

MVPA

MVPA was performed using a linear support vector machine implemented in the Statistics and Machine Learning Toolbox of MATLAB. Trials served as instances, and the single-trial β estimates of the voxels within a given ROI served as features. For feature attention, decoding was between attend-red and attend-green; for spatial attention, decoding was between attend-left and attend-right. A 10-fold cross-validation procedure was applied to determine the classification (decoding) accuracy. The cross-validation analysis was repeated 25 times over different fold partitions to avoid any bias that might result from one specific division of the trials into 10 folds. The 250 fold-level accuracies (25 repetitions × 10 folds) were averaged to obtain the decoding accuracy for a subject. Group-level accuracy was determined by averaging the classification accuracy across subjects.
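The repeated cross-validation scheme can be sketched as follows. This is an illustrative Python version, not the MATLAB original; for brevity a nearest-centroid rule (itself a linear classifier) stands in for the linear SVM, and all function names are ours.

```python
import numpy as np

def nearest_centroid_predict(Xtr, ytr, Xte):
    """Minimal linear classifier (nearest class centroid), standing in
    for the paper's linear SVM in this sketch."""
    c0 = Xtr[ytr == 0].mean(axis=0)
    c1 = Xtr[ytr == 1].mean(axis=0)
    d0 = ((Xte - c0) ** 2).sum(axis=1)
    d1 = ((Xte - c1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

def repeated_kfold_accuracy(X, y, k=10, repeats=25, seed=0):
    """Average accuracy over `repeats` random partitions into k folds,
    mirroring the 25 x 10-fold scheme described in the text."""
    rng = np.random.default_rng(seed)
    accs = []
    for _ in range(repeats):
        order = rng.permutation(len(y))
        for fold in np.array_split(order, k):
            train = np.setdiff1d(order, fold)
            pred = nearest_centroid_predict(X[train], y[train], X[fold])
            accs.append((pred == y[fold]).mean())
    return float(np.mean(accs))
```

Here `X` is the trials-by-voxels matrix of single-trial βs for one ROI and `y` the binary cue labels (e.g., attend-red vs attend-green); averaging the returned per-subject accuracies gives the group-level accuracy.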

To determine whether decoding accuracy was above chance level, we adopted two approaches. First, using the Wilcoxon signed rank test, the decoding accuracies from a given MVPA were compared against the a priori chance-level accuracy of 50%, with significance set at p < 0.05 after adjusting for multiple comparisons with the FDR technique (Benjamini and Yekutieli, 2001). Second, this approach was complemented by a nonparametric permutation approach (Stelzer et al., 2013), in which trial labels were randomized to construct the null-hypothesis distribution of chance-level decoding accuracy, from which the threshold decoding accuracy corresponding to p < 0.001 was derived. A decoding accuracy was considered above chance level if it passed both thresholds.
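The permutation approach can be sketched as follows: shuffle the trial labels many times, recompute the decoding accuracy each time, and take a high quantile of the resulting null distribution as the chance-level threshold. This Python sketch is illustrative only (it uses a split-half nearest-centroid decoder rather than the paper's cross-validated SVM), and the function names are ours.

```python
import numpy as np

def _split_half_acc(X, y, rng):
    """Tiny stand-in decoder: random split-half nearest-centroid accuracy."""
    order = rng.permutation(len(y))
    tr, te = order[: len(y) // 2], order[len(y) // 2:]
    c0 = X[tr][y[tr] == 0].mean(axis=0)
    c1 = X[tr][y[tr] == 1].mean(axis=0)
    pred = (((X[te] - c1) ** 2).sum(1) < ((X[te] - c0) ** 2).sum(1)).astype(int)
    return float((pred == y[te]).mean())

def permutation_threshold(X, y, n_perm=1000, alpha=0.001, seed=0):
    """Null distribution of decoding accuracy under shuffled labels;
    the (1 - alpha) quantile serves as the chance-level threshold
    (cf. Stelzer et al., 2013)."""
    rng = np.random.default_rng(seed)
    null = [_split_half_acc(X, rng.permutation(y), rng) for _ in range(n_perm)]
    return float(np.quantile(null, 1.0 - alpha))
```

An observed accuracy is then called above chance only if it exceeds this permutation threshold (and, per the text, also survives the signed-rank test against 50%).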

Beta-series functional connectivity analysis

Functional connectivity analysis was performed using single-trial β values (Rissman et al., 2004). The purpose of this analysis was to test whether cue-related functional connectivity between IFJ and V4 was modulated by the type of anticipatory attention (feature vs spatial). Specifically, functional connectivity was computed at the subject level by averaging the β values over the voxels within a given ROI and correlating the resulting β series between ROIs across trials (Pearson correlation). For feature attention, attend-red and attend-green trials were combined; similarly, for spatial attention, attend-left and attend-right trials were combined. The correlation coefficients from individual subjects were subjected to a Fisher r-to-z transformation, which made them approximately Gaussian distributed; the group-level functional connectivity measure was obtained by averaging these z values and was compared between cue types.
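For one subject and one pair of ROIs, the β-series connectivity computation reduces to a few lines. A minimal Python sketch (illustrative, not the authors' code), assuming each input is a trials-by-voxels matrix of single-trial βs:

```python
import numpy as np

def beta_series_connectivity(betas_roi1, betas_roi2):
    """Beta-series functional connectivity (Rissman et al., 2004):
    average the single-trial betas over voxels within each ROI,
    correlate the two resulting series across trials (Pearson),
    and return the Fisher r-to-z transformed value."""
    s1 = betas_roi1.mean(axis=1)   # trials x voxels -> one value per trial
    s2 = betas_roi2.mean(axis=1)
    r = np.corrcoef(s1, s2)[0, 1]
    return float(np.arctanh(r))    # Fisher r-to-z
```

The z values from individual subjects can then be averaged for the group-level measure and compared between feature and spatial cue trials.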

Results

Behavioral analysis

We assessed the effects of attention (cuing) using two-way ANOVAs: cue validity (valid vs invalid) by attention type (spatial vs feature). For reaction time (RT), we found a statistically significant main effect of cue validity [Fvalidity(1,19) = 27.87, p = 1 × 10−6, η2 (effect size) = 0.26]. There was no statistically significant main effect of attention type on RT [Fattention type(1,19) = 0.01, p = 0.93, η2 = 6 × 10−5], nor was there a significant interaction between cue validity and attention type [Fvalidity × attention type(1,19) = 0.01, p = 0.92, η2 = 8 × 10−5]. Separate t tests for RT confirmed that subjects responded significantly faster on validly cued trials than on invalidly cued trials for both spatial attention (t = 4.35, p = 3 × 10−4, d = 0.97; Fig. 2A) and feature attention (t = 6.67, p = 2 × 10−6, d = 1.49; Fig. 2B). For accuracy, although validly cued targets had slightly higher mean accuracy, the ANOVA revealed no significant main effect of either cue validity [Fvalidity(1,19) = 0.78, p = 0.38, η2 = 0.01] or attention type [Fattention type(1,19) = 0.02, p = 0.88, η2 = 2 × 10−4], nor a significant interaction between the two [Fvalidity × attention type(1,19) = 0.01, p = 0.90, η2 = 1 × 10−4]. Separate t tests for accuracy confirmed these nonsignificant results (Fig. 2C,D). Figure 2E,F shows that spatial and feature attention both led to better performance than not attending to specific information (neutral trials, in which the word "none" was the cue).

Figure 2.

Behavioral analysis. RTs to targets (collapsed over visual field and color) were significantly faster for validly cued (attended) than for invalidly cued (unattended) targets for both spatial attention (A) and feature attention (B). Accuracy (% correct) for target discrimination (orientation of the target rectangle) did not differ between validly and invalidly cued targets for either spatial (C) or feature (D) attention. E, RTs to targets in neutral trials (when the word "none" was the cue) were significantly slower than RTs to either attended spatial or feature targets, which did not differ from each other. F, Similarly, accuracy for target discrimination in neutral trials was significantly lower than for either attended spatial or feature targets, which did not differ from each other. **p < 2 × 10−3. ***p < 1 × 10−5.

For each trial type, as shown in Table 1, the patterns for behavioral measures were consistent with the above analyses, with the exception of accuracy for the attend-red trial type. Therefore, to investigate any possible differences across the attention trial types in how subjects allocated attention, we performed separate t tests on each attention trial type (attend-left, attend-right, attend-red, and attend-green) for each measure (RT and accuracy). These separate analyses showed that, in line with the ANOVA results, there were highly significant cue validity effects for RT, but no significant cue validity effects for accuracy.

Table 1.

RT and accuracy for different attention conditions and cue validities

To assess whether there were any differences in task difficulty or arousal among the four attention trial types (attend-left, attend-right, attend-red, and attend-green), we conducted a one-way ANOVA (four levels) on the responses to the attended (cued) targets. Neither RT nor accuracy differed significantly among the four attention trial types [RT: F(3,76) = 0.88, p = 0.45, η2 = 0.03; accuracy: F(3,76) = 0.72, p = 0.54, η2 = 0.02]. These patterns for RT and accuracy suggest that the attention conditions did not differ in overall task difficulty or arousal.

Together, these RT and accuracy results provide behavioral evidence that subjects deployed covert attention to the cued sensory attributes according to task instructions.

Univariate analysis of cue-evoked BOLD activity

Cue-evoked whole-brain responses were first analyzed using the conventional GLM method. As shown in Figure 3, color cues, spatial cues, and space + color cues all activated the DAN (Fig. 3A–C), as well as bIFJ (Fig. 3D–F), consistent with previous reports (Giesbrecht et al., 2003; Slagter et al., 2007; Egner et al., 2008; Greenberg et al., 2010; T. Liu and Hou, 2013; Y. Liu et al., 2016; Rajan et al., 2021). Other regions activated included bilateral temporal cortex, bilateral dorsal ACC, bilateral anterior insula, bilateral precuneus, bilateral inferior frontal gyrus, bilateral putamen, bilateral lingual gyrus, bilateral thalamus, and left SMA (see Table 2). Furthermore, comparing cue-related BOLD activations in the IFJ ROI (Fig. 3G) between spatial cues and color cues revealed no statistically significant difference between the two conditions (Fig. 3H). Here, the ROI-based univariate analysis in Figure 3H was done using the same single-trial BOLD activation data used in MVPA and functional connectivity analyses, to achieve consistency across analyses (see Materials and Methods).

Table 2.

MNI coordinates and corresponding Z scores for brain areas activated by both spatial and feature cues

Figure 3.

Univariate analysis of cue-evoked activity. A–C, Both spatial cues and color cues activate DAN (FEF and IPS/SPL). D–F, Both spatial cues and color cues activate bIFJ. G, Coronal slice showing the IFJ ROIs. H, Univariate BOLD activations in IFJ were not significantly different between attend space and attend feature. *p < 0.05.

Multivariate analysis of cue-evoked BOLD activity in IFJ and DAN

Single-trial cue-evoked BOLD activity from bIFJ, lIFJ, and rIFJ was subjected to MVPA (see Fig. 4). The decoding accuracies for attend-red versus attend-green (feature attention) were as follows: 57.26 ± 1.79% (bIFJ), 57.41 ± 1.59% (lIFJ), 56.92 ± 1.45% (rIFJ), all significantly above chance (see Materials and Methods; Fig. 4A). The decoding accuracy for attend-left versus attend-right (spatial attention), however, was not significantly above chance (bIFJ: 51.08 ± 1.63%; lIFJ: 52.00 ± 1.86%; rIFJ: 52.96 ± 1.70%; Fig. 4A). Moreover, the decoding accuracy for attend-red versus attend-green was significantly higher than that for attend-left versus attend-right in bIFJ (p = 0.02; d = 0.60) and in lIFJ (p = 0.05; d = 0.46), and marginally higher in rIFJ (p = 0.09; d = 0.40). These results support the idea that IFJ is more involved in the control of feature attention than spatial attention.

Figure 4.

Multivariate analysis of cue-evoked BOLD activation in IFJ and in DAN. A, In IFJ, decoding accuracy for attend-red versus attend-green was significantly above chance level, whereas decoding accuracy for attend-left versus attend-right was not. In addition, attend-feature decoding accuracy was significantly higher than attend-space decoding accuracy in bIFJ and lIFJ, and marginally higher in rIFJ. B, In DAN, decoding accuracies for both attend-red versus attend-green (feature) and attend-left versus attend-right (spatial) were above chance level, but the two were not significantly different from each other. C–E, Cue-evoked decoding accuracies for attend-red versus attend-green in bIFJ, lIFJ, and rIFJ were significantly correlated with that in DAN. #p ≤ 0.09. *p ≤ 0.05. **p < 0.02.

Applying MVPA to the DAN, we found significantly above chance level decoding for both feature (attend-red vs attend-green) and spatial (attend-left vs attend-right) attention (Fig. 4B). The mean accuracy for feature attention was slightly higher than that for spatial attention, but the difference was not statistically significant (p = 0.53; d = 0.14) (see also Rajan et al., 2021). Further, the decoding accuracy between attend-red and attend-green in DAN was significantly correlated with that in bIFJ (r = 0.79, d = 2.61; Fig. 4C), lIFJ (r = 0.88, d = 3.62; Fig. 4D), and rIFJ (r = 0.60, d = 1.49; Fig. 4E), indicating that these regions may work in tandem to effect top-down control of feature-based attention.

Analysis of target-evoked activity in IFJ

In addition to the role of the IFJ during preparatory attentional control, we also investigated the involvement of IFJ in attentional selection during target processing, by analyzing the target-related BOLD activity with both univariate and MVPA decoding analyses. As shown in Figure 5A, univariate analysis revealed that both lIFJ and rIFJ were significantly activated in the post-target period during spatial (bIFJ: p = 9 × 10−8, d = 2.06; lIFJ: p = 1 × 10−7, d = 1.91; rIFJ: p = 9 × 10−7, d = 1.65) and feature attention (bIFJ: p = 1 × 10−7, d = 1.90; lIFJ: p = 1 × 10−7, d = 1.96; rIFJ: p = 6 × 10−6, d = 1.44). For the MVPA analyses (Fig. 5B), however, decoding accuracies in the IFJ during target processing were not significantly above chance for either spatial attention (attend-left vs attend-right; bIFJ: 51.08 ± 1.63%; lIFJ: 52.20 ± 1.72%; rIFJ: 51.60 ± 1.98%) or feature attention (attend-red vs attend-green; bIFJ: 54.40 ± 1.80%; lIFJ: 53.40 ± 1.89%; rIFJ: 54.60 ± 2.04%). These findings suggest that IFJ is not a prime structure for attentional selection of task-relevant stimuli during target processing.

Figure 5.

Target-evoked activity in the IFJ. A, Univariate analyses showed that the IFJ was similarly activated during spatial and feature attention in the post-target period. B, MVPA decoding accuracies for attend-left versus attend-right (spatial) and for attend-red versus attend-green (feature) were not significantly above chance level. *p < 0.05.

Relation between IFJ and V4: behavioral consequences of cue-related functional connectivity

During covert attention, biasing signals issued by attention control regions, such as IFJ, propagate to sensory regions to influence the processing of the upcoming stimulus (Desimone and Duncan, 1995; Giesbrecht et al., 2003, 2006; Slagter et al., 2007). V4 is a visual sensory area that has been extensively investigated for its role in processing color information (Lueck et al., 1989; Zeki et al., 1991; Murphey et al., 2008; Zanto et al., 2010). V4's role in visual spatial attention has also been well demonstrated (Moran and Desimone, 1985; Spitzer et al., 1988; Motter, 1993; Heinze et al., 1994; Luck et al., 1997; Mangun et al., 1997; Tootell et al., 1998; Kastner et al., 1999; Hopfinger et al., 2000; Klein et al., 2014). Indeed, in our data, cue-evoked decoding accuracy for attend-left versus attend-right was significantly above chance level at 58.8% in V4. How does IFJ interact with V4 in the cue-to-target interval during anticipatory feature and spatial attention? What are the behavioral consequences of such interaction? We investigated these questions by computing the functional connectivity between V4 and rIFJ and separately between V4 and lIFJ (Fig. 6A). lIFJ did not show significant functional connectivity with V4 for either color cues (p = 0.17, d = −0.32) or spatial cues (p = 0.81, d = −0.06) (Fig. 6B). In contrast, rIFJ functional connectivity with V4 following color cues was significantly >0 (p = 5 × 10–4, d = 0.93), whereas the functional connectivity between rIFJ and V4 following spatial cues was not significantly different from zero (p = 0.23, d = 0.28). In addition, the functional connectivity between rIFJ and V4 following color cues was greater than that following spatial cues (p = 0.05, d = 0.46) (Fig. 6C).

Figure 6.

Relation between IFJ and V4. A, Schematic representation of cue-evoked functional connectivity between IFJ and V4. Solid line indicates significant functional connectivity between rIFJ and V4. Dashed line indicates nonsignificant functional connectivity between rIFJ and V4. The thickness of the solid line represents the connectivity strength. B, C, Cue-related functional connectivity between IFJ and V4 for attend feature and attend space trials. rIFJ-V4 functional connectivity during feature attention was significantly higher than that during spatial attention, whereas lIFJ showed no significant functional connectivity for either type of trial. D, Subjects with stronger rIFJ-V4 functional connectivity during feature attention had higher behavioral accuracy. E, Cue-evoked decoding accuracy in rIFJ was positively associated with target-evoked decoding accuracy in V4. *p ≤ 0.05. **p < 0.0005.
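Cue-evoked functional connectivity of this kind can be sketched in the spirit of a beta-series analysis (Rissman et al., 2004, cited in the references): correlate trial-by-trial cue-evoked response amplitudes between two regions, Fisher z-transform the correlations, and test them against zero at the group level. The data below are synthetic, and the variable names are placeholders rather than the authors' actual estimates.

```python
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(1)
n_subjects, n_trials = 19, 80

z_vals = []
for _ in range(n_subjects):
    shared = rng.normal(size=n_trials)                  # common trial-wise drive
    beta_rIFJ = shared + rng.normal(scale=1.0, size=n_trials)
    beta_V4 = shared + rng.normal(scale=1.0, size=n_trials)
    r = np.corrcoef(beta_rIFJ, beta_V4)[0, 1]           # trial-series correlation
    z_vals.append(np.arctanh(r))                        # Fisher z-transform

t, p = ttest_1samp(z_vals, 0.0)                         # group-level test vs zero
print(f"group rIFJ-V4 connectivity: t = {t:.2f}, p = {p:.1e}")
```

The Fisher z-transform makes the per-subject correlation values approximately normal, so a one-sample t-test against zero is an appropriate group-level test of whether connectivity is reliably greater than zero.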

The behavioral implications of the cue-evoked functional connectivity between rIFJ and V4 were examined by relating behavioral accuracy to the strength of the rIFJ-V4 functional connectivity during feature attention control. Specifically, subjects were divided into two equal groups (median split) according to the magnitude of the cue-evoked rIFJ-V4 functional connectivity: subjects exhibiting high rIFJ-V4 functional connectivity (high connectivity group) and subjects exhibiting low rIFJ-V4 functional connectivity (low connectivity group). As shown in Figure 6D, the high connectivity group exhibited better behavioral performance than the low connectivity group (p = 0.04, d = 0.55).
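A minimal sketch of the median-split step, under the assumption (made here only for illustration, with synthetic values) that behavioral accuracy is loosely coupled to connectivity strength:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
n_subjects = 20
connectivity = rng.normal(size=n_subjects)              # per-subject rIFJ-V4 strength
# Behavioral accuracy weakly coupled to connectivity (assumed for the demo)
behavior = 0.8 + 0.05 * connectivity + rng.normal(scale=0.02, size=n_subjects)

# Median split: order subjects by connectivity, take bottom and top halves
order = np.argsort(connectivity)
low_group = behavior[order[: n_subjects // 2]]
high_group = behavior[order[n_subjects // 2:]]

t, p = ttest_ind(high_group, low_group)                 # two-sample comparison
print(f"high vs low connectivity: t = {t:.2f}, p = {p:.3f}")
```

A median split guarantees two equal-sized groups but discards the continuous variation in connectivity; a complementary check would be a direct correlation between connectivity and behavior across subjects.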

Relation between IFJ and V4: IFJ attention control and V4 target selection

Preparatory attention signals in IFJ are thought to influence target processing in visual sensory areas, such as V4. To test this, we correlated cue-evoked decoding accuracy between attend-red and attend-green in IFJ with target-evoked decoding accuracy between attend-red and attend-green in V4. Although the decoding accuracy within V4 was not above chance level (48%), it varied greatly from subject to subject. Leveraging this variability, target-evoked decoding accuracy in V4 was found to be significantly correlated with cue-evoked decoding accuracy in rIFJ (r = 0.50, p = 0.02, d = 1.15; Fig. 6E), but not with that in lIFJ (r = 0.31, p = 0.18, d = 0.65; not shown), suggesting that the strength of the attention control signals in rIFJ, rather than lIFJ, played a role in enhancing attentional selection of the stimulus in V4. In contrast, cue-evoked decoding accuracy in V4, which was also not significantly above chance (51%), was not significantly correlated with target-evoked decoding accuracy in V4 (r = 0.11, p = 0.63, d = 0.22). Further, cue-evoked decoding accuracy in IFJ and that in V4 were not significantly correlated (r = 0.03, p = 0.90, d = 0.06).

Patterns of eye movements

To examine whether systematic eye movement patterns distinguished the attention conditions, we analyzed the eye-tracking data using support vector machines. Decoding accuracy was at chance level in the interval 0-3000 ms after cue onset for all pairs of attention conditions: attend-left versus attend-right, attend-red versus attend-green, and attend-space versus attend-feature. These eye-tracking analyses have been reported in a recent paper on the same data and are included here for completeness (Rajan et al., 2021).
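This control analysis can be sketched as follows (synthetic gaze data, not the recorded eye-tracking signals): a linear SVM attempts to decode the attention condition from gaze-position features, and with no condition-related signal present, cross-validated accuracy stays near chance.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_trials, n_timepoints = 100, 30                   # x/y gaze samples per trial
# Pure-noise gaze features: no systematic difference between conditions
gaze = rng.normal(size=(n_trials, 2 * n_timepoints))
labels = np.repeat([0, 1], n_trials // 2)          # e.g., attend-left vs attend-right

acc = cross_val_score(SVC(kernel="linear"), gaze, labels, cv=5).mean()
print(f"gaze decoding accuracy = {acc:.2f}")       # near chance (0.5) by construction
```

Chance-level gaze decoding is what licenses the interpretation that the fMRI decoding results reflect covert attention rather than systematic eye movements.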

Discussion

In this study, we examined the role of IFJ in top-down preparatory attentional control for spatial versus feature attention using fMRI, machine learning decoding, and functional connectivity. In particular, we investigated whether IFJ control was specific to feature attention, whether it differed between the two cerebral hemispheres, and whether IFJ was also engaged during target selection in visual cortex.

Univariate fMRI analysis showed that both spatial cues and color cues similarly evoked preparatory attention-related BOLD activity in bIFJ, as has been observed previously (e.g., Giesbrecht et al., 2003). MVPA, however, revealed that, while decoding accuracy for attentional control in left and right IFJ was significantly above chance for feature attention, it was not for spatial attention; in the DAN, in contrast, decoding was significant for both feature and spatial attention. Decoding of subsequent target-evoked activity in the IFJ was not significant for either spatial or feature attention, demonstrating that IFJ is involved primarily in preparatory attentional control and not selective processing of targets. Nonetheless, the decoding accuracy for feature attentional control in rIFJ did predict the decoding accuracy of attentional selection of the subsequent targets in V4. In line with this, cue-related functional connectivity with visual area V4 was stronger for rIFJ than for lIFJ, and this enhanced connectivity was positively associated with behavior: the higher the rIFJ-V4 functional connectivity, the better the behavioral performance across subjects.

IFJ as a region for attention control

Activation of the IFJ during attentional control has been consistently reported. Some researchers have hypothesized that IFJ may act as a relay station, sending top-down signals from the DAN to visual cortex during endogenous attention and bottom-up reorienting signals triggered by salient stimuli from the ventral attention network to the DAN during exogenous attention (He et al., 2007; Corbetta et al., 2008). Others have reported that IFJ may have a role in both spatial and feature attention, together with the DAN (Giesbrecht et al., 2003; Slagter et al., 2007). Still others have concluded that IFJ activity evoked by a cue is transient during the preparatory attention period, and is primarily involved in interpreting the cue and orienting attention toward the attended hemifield rather than in controlling and sustaining covert attention per se, with the latter being conducted mainly by the DAN (Asplund et al., 2010; Tamber-Rosenau et al., 2018). In our data, spatial cues elicited significant activity within the IFJ, but attend-left versus attend-right cues were not distinguished in either the univariate (results not shown) or the multivariate analysis. A similar analysis in the DAN revealed significantly above-chance decoding of attend-left versus attend-right cues, suggesting that spatial attention control is effected by the DAN rather than the IFJ (Rajan et al., 2021). These results, enabled by our experimental design and MVPA, provide critical evidence that IFJ is engaged transiently following spatial cues but is not involved in maintaining covert spatial attention in the cue-to-target interval (e.g., Tamber-Rosenau et al., 2018), the latter being carried out instead by the DAN.

Role of IFJ in controlling top-down attention to feature

Zanto et al. (2010) suggested that IFJ is involved in the control of feature attention (see also Giesbrecht et al., 2003). In a visual working memory paradigm, they asked participants to view and remember four different sets of dots during encoding (two defined by color and two by motion), and to report during retrieval whether the target was present in one of the two stimuli encoded in the attended feature domain (color or motion). Using functional connectivity with visual cortex ROIs as seed regions (V4 for color and MT for motion), they showed that, during encoding, IFJ voxels exhibited stronger IFJ-V4 connectivity when attending color, and stronger IFJ-MT connectivity when attending motion. Baldauf and Desimone (2014) presented face-house compound stimuli to subjects in a one-back working memory paradigm and asked the subjects to attend either faces or houses. Using γ frequency synchrony measured with MEG, they found increased functional interaction between IFJ and FFA when attending faces, and increased functional interaction between IFJ and PPA when attending houses, thus extending the work of Zanto et al. (2010) to the domain of object attention.

In recent work, Gong and Liu (2020) asked participants to pay attention to one of two subcategories of stimuli within a feature domain: for example, clockwise versus counterclockwise motion in the motion domain, or red versus green in the color domain. The participants' task was to detect brief changes in luminance in colored moving dots in the attended feature domain. Multivoxel patterns evoked by attention to different stimuli within a given feature domain were found to be decodable in IFJ. While our current results showing decoding of attended color information in the cue-to-target interval in IFJ are consistent with Gong and Liu (2020), there are two crucial differences between our study and theirs. First, in their study, a stimulus (colored moving dots) was present throughout the trial, and the subjects were to detect a change in the stimulus. As a result, no distinction could be made between preparatory attentional control and activity during target processing. By including a cue-to-target interval in our paradigm, it was possible to isolate attentional control from stimulus selection (Hopfinger et al., 2000; Woldorff et al., 2004; Grent-'t-Jong and Woldorff, 2007), and therefore to conclude that the IFJ is involved in controlling attention to stimulus features (in the absence of sensory input) but not in attentional selection of target stimuli. Second, a major design feature of our paradigm was the inclusion of both attention to space and attention to feature in the same experiment, which enabled us to conclude that IFJ controls sustained preparatory attention to feature but not to space.

Hemispheric differences in IFJ control of attention

Zanto et al. (2011) showed that repetitive transcranial magnetic stimulation of the rIFJ impaired attention to color (but not motion). Other studies have found no functional differences between the left and right IFJ during either color or motion attention (Gong and Liu, 2020), although effects of contralaterality of control have been reported (Zhang et al., 2018). In our data, decoding success in left versus right IFJ did not differ, but hemispheric differences were revealed in the functional connectivity analyses, where we found that rIFJ-V4 connectivity was significant for feature attention and stronger than for spatial attention; this enhanced rIFJ-V4 functional connectivity was associated with better behavioral performance. lIFJ-V4 connectivity, on the other hand, was not significant for either feature or space trials. Our results thus support the notion that attention to feature is more right lateralized in IFJ when interactions with sensory cortex are considered. Further supporting this notion, cue-related decoding accuracy in rIFJ, but not in lIFJ, was associated with the efficacy of target selection in V4. Together, these findings suggest that the rIFJ plays a more prominent role in influencing sensory brain regions during preparatory control of feature attention.

The role of IFJ in attentional selection

The findings reported here and the current literature support the notion that IFJ is involved in the preparatory control of feature attention. Whether IFJ plays a role in attentional selection of the target stimuli remains less clear. Baldauf and Desimone (2014) observed an increase in BOLD activity in rIFJ when participants attended to the color (red or green) of moving dots compared with passive viewing of the dots. Bichot et al. (2015) observed an increase in firing rates in neurons in ventral prearcuate regions (a putative nonhuman primate analog of IFJ) when the attended stimulus was the primary target in a visual search task; inactivation of the ventral prearcuate region led to reduction or elimination of feature, but not spatial, attention effects in V4 neurons during target processing (Bichot et al., 2019). On the other hand, IFJ, sometimes considered to be at the confluence of the dorsal and ventral attention networks, has been suggested to play a role in dynamically switching its association between the DAN and the ventral attention network during preparatory attention rather than in target stimulus selection, as shown by He et al. (2007) and others (Asplund et al., 2010; Tamber-Rosenau et al., 2018). Thus, although IFJ activation during target stimulus selection might be expected, it has not been clear whether IFJ plays a critical role during target processing in addition to setting up that selection during the prior control (cue) period. We provide evidence that IFJ activity is not critical for successful attention-related target selection: during target processing, the attention conditions could not be discriminated in IFJ by either univariate or MVPA analyses, for spatial or feature attention.

Together, the findings presented here support an explicit role of IFJ in preparatory attentional control for feature-based but not location-based attention. Further, sensory biasing by IFJ was right lateralized in our color attention task, and the strength of the connectivity of the rIFJ to V4 predicted both the success of target selection in V4 and the subjects' task performance.

Footnotes

  • The authors declare no competing financial interests.

  • This work was supported by National Institute of Mental Health Grant MH117991 to G.R.M. and M.D. All data will be publicly available on the National Institute of Mental Health Data Archive.

  • Correspondence should be addressed to Mingzhou Ding at mding@bme.ufl.edu or George R. Mangun at mangun@ucdavis.edu

SfN exclusive license.

References

  1. Asplund CL, Todd JJ, Snyder AP, Marois R (2010) A central role for the lateral prefrontal cortex in goal-directed and stimulus-driven attention. Nat Neurosci 13:507–512. doi:10.1038/nn.2509
  2. Baldauf D, Desimone R (2014) Neural mechanisms of object-based attention. Science 344:424–427. doi:10.1126/science.1247003
  3. Benjamini Y, Yekutieli D (2001) The control of the false discovery rate in multiple testing under dependency. Ann Statist 29:1165–1188. doi:10.1214/aos/1013699998
  4. Bichot NP, Heard MT, DeGennaro EM, Desimone R (2015) A source for feature-based attention in the prefrontal cortex. Neuron 88:832–844. doi:10.1016/j.neuron.2015.10.001
  5. Bichot NP, Xu R, Ghadooshahy A, Williams ML, Desimone R (2019) The role of prefrontal cortex in the control of feature attention in area V4. Nat Commun 10:5727. doi:10.1038/s41467-019-13761-7
  6. Brass M, von Cramon DY (2004) Decomposing components of task preparation with functional magnetic resonance imaging. J Cogn Neurosci 16:609–620. doi:10.1162/089892904323057335
  7. Brass M, Derrfuss J, Forstmann B, von Cramon DY (2005) The role of the inferior frontal junction area in cognitive control. Trends Cogn Sci 9:314–316. doi:10.1016/j.tics.2005.05.001
  8. Corbetta M, Kincade JM, Ollinger JM, McAvoy MP, Shulman GL (2000) Voluntary orienting is dissociated from target detection in human posterior parietal cortex. Nat Neurosci 3:292–297. doi:10.1038/73009
  9. Corbetta M, Patel G, Shulman GL (2008) The reorienting system of the human brain: from environment to theory of mind. Neuron 58:306–324. doi:10.1016/j.neuron.2008.04.017
  10. Derrfuss J, Brass M, Neumann J, von Cramon D (2005) Involvement of the inferior frontal junction in cognitive control: meta-analyses of switching and Stroop studies. Hum Brain Mapp 25:22–34. doi:10.1002/hbm.20127
  11. Desimone R, Duncan J (1995) Neural mechanisms of selective visual attention. Annu Rev Neurosci 18:193–222. doi:10.1146/annurev.ne.18.030195.001205
  12. Driver J (2001) A selective review of selective attention research from the past century. Br J Psychol 92:53–78.
  13. Egner T, Monti JM, Trittschuh EH, Wieneke CA, Hirsch J, Mesulam MM (2008) Neural integration of top-down spatial and feature-based information in visual search. J Neurosci 28:6141–6151. doi:10.1523/JNEUROSCI.1262-08.2008
  14. Giesbrecht B, Woldorff MG, Song AW, Mangun GR (2003) Neural mechanisms of top-down control during spatial and feature attention. Neuroimage 19:496–512. doi:10.1016/S1053-8119(03)00162-9
  15. Giesbrecht B, Weissman DH, Woldorff MG, Mangun GR (2006) Pre-target activity in visual cortex predicts behavioral performance on spatial and feature attention tasks. Brain Res 1080:63–72. doi:10.1016/j.brainres.2005.09.068
  16. Gong M, Liu T (2020) Biased neural representation of feature-based attention in the human frontoparietal network. J Neurosci 40:8386–8395. doi:10.1523/JNEUROSCI.0690-20.2020
  17. Greenberg AS, Esterman M, Wilson D, Serences JT, Yantis S (2010) Control of spatial and feature-based attention in frontoparietal cortex. J Neurosci 30:14330–14339. doi:10.1523/JNEUROSCI.4248-09.2010
  18. Grent-'t-Jong T, Woldorff MG (2007) Timing and sequence of brain activity in top-down control of visual-spatial attention. PLoS Biol 5:e12. doi:10.1371/journal.pbio.0050012
  19. Harter MR, Miller SL, Price NJ, Lalonde ME, Keyes AL (1989) Neural processes involved in directing attention. J Cogn Neurosci 1:223–237. doi:10.1162/jocn.1989.1.3.223
  20. He BJ, Snyder AZ, Vincent JL, Epstein A, Shulman GL, Corbetta M (2007) Breakdown of functional connectivity in frontoparietal networks underlies behavioral deficits in spatial neglect. Neuron 53:905–918. doi:10.1016/j.neuron.2007.02.013
  21. Heinze HJ, Mangun GR, Burchert W, Hinrichs H, Scholz M, Munte TF, Gos A, Scherg M, Johannes S, Hundeshagen H (1994) Combined spatial and temporal imaging of brain activity during visual selective attention in humans. Nature 372:543–546. doi:10.1038/372543a0
  22. Hopf JM, Mangun GR (2000) Shifting visual attention in space: an electrophysiological analysis using high spatial resolution mapping. Clin Neurophysiol 111:1241–1257. doi:10.1016/s1388-2457(00)00313-8
  23. Hopfinger JB, Buonocore MH, Mangun GR (2000) The neural mechanisms of top-down attentional control. Nat Neurosci 3:284–291. doi:10.1038/72999
  24. Kastner S, Pinsk MA, De Weerd P, Desimone R, Ungerleider LG (1999) Increased activity in human visual cortex during directed attention in the absence of visual stimulation. Neuron 22:751–761. doi:10.1016/s0896-6273(00)80734-5
  25. Kingstone A (1992) Combining expectancies. Q J Exp Psychol 44:69–104. doi:10.1080/14640749208401284
  26. Klein BP, Harvey BM, Dumoulin SO (2014) Attraction of position preference by spatial attention throughout human visual cortex. Neuron 84:227–237. doi:10.1016/j.neuron.2014.08.047
  27. Lee J, Geng JJ (2017) Idiosyncratic patterns of representational similarity in prefrontal cortex predict attentional performance. J Neurosci 37:1257–1268. doi:10.1523/JNEUROSCI.1407-16.2016
  28. Liu T, Hou Y (2013) A hierarchy of attentional priority signals in human frontoparietal cortex. J Neurosci 33:16606–16616. doi:10.1523/JNEUROSCI.1780-13.2013
  29. Liu T, Slotnick SD, Serences JT, Yantis S (2003) Cortical mechanisms of feature-based attentional control. Cereb Cortex 13:1334–1343. doi:10.1093/cercor/bhg080
  30. Liu Y, Bengson J, Huang H, Mangun GR, Ding M (2016) Top-down modulation of neural activity in anticipatory visual attention: control mechanisms revealed by simultaneous EEG-fMRI. Cereb Cortex 26:517–529. doi:10.1093/cercor/bhu204
  31. Luck SJ, Chelazzi L, Hillyard SA, Desimone R (1997) Neural mechanisms of spatial selective attention in areas V1, V2, and V4 of macaque visual cortex. J Neurophysiol 77:24–42. doi:10.1152/jn.1997.77.1.24
  32. Lueck CJ, Zeki S, Friston KJ, Deiber MP, Cope P, Cunningham VJ, Lammertsma AA, Kennard C, Frackowiak RS (1989) The colour centre in the cerebral cortex of man. Nature 340:386–389. doi:10.1038/340386a0
  33. Mangun GR, Hopfinger JB, Kussmaul CL, Fletcher EM, Heinze HJ (1997) Covariations in ERP and PET measures of spatial selective attention in human extrastriate visual cortex. Hum Brain Mapp 5:273–279. doi:10.1002/(SICI)1097-0193(1997)5:4<273::AID-HBM12>3.0.CO;2-F
  34. Moran J, Desimone R (1985) Selective attention gates visual processing in the extrastriate cortex. Science 229:782–784. doi:10.1126/science.4023713
  35. Motter BC (1993) Focal attention produces spatially selective processing in visual cortical areas V1, V2, and V4 in the presence of competing stimuli. J Neurophysiol 70:909–919. doi:10.1152/jn.1993.70.3.909
  36. Murphey DK, Yoshor D, Beauchamp MS (2008) Perception matches selectivity in the human anterior color center. Curr Biol 18:216–220. doi:10.1016/j.cub.2008.01.013
  37. Neumann J, Lohmann G, Derrfuss J, von Cramon DY (2005) Meta-analysis of functional imaging data using replicator dynamics. Hum Brain Mapp 25:165–173. doi:10.1002/hbm.20133
  38. Noah S, Powell T, Khodayari N, Olivan D, Ding M, Mangun GR (2020) Neural mechanisms of attentional control for objects: decoding EEG alpha when anticipating faces, scenes, and tools. J Neurosci 40:4913–4924. doi:10.1523/JNEUROSCI.2685-19.2020
  39. Orlandi A, Proverbio AM (2020) ERP indices of an orientation-dependent recognition of the human body schema. Neuropsychologia 146:107535. doi:10.1016/j.neuropsychologia.2020.107535
  40. Posner MI (1980) Orienting of attention. Q J Exp Psychol 32:3–25. doi:10.1080/00335558008248231
  41. Rajan A, Meyyappan S, Walker H, Samuel IB, Hu Z, Ding M (2019) Neural mechanisms of internal distraction suppression in visual attention. Cortex 117:77–88. doi:10.1016/j.cortex.2019.02.026
  42. Rajan A, Meyyappan S, Liu Y, Samuel IB, Nandi B, Mangun GR, Ding M (2021) The microstructure of attentional control in the dorsal attention network. J Cogn Neurosci 33:965–919. doi:10.1162/jocn_a_01710
  43. Rissman J, Gazzaley A, D'Esposito M (2004) Measuring functional connectivity during distinct stages of a cognitive task. Neuroimage 23:752–763. doi:10.1016/j.neuroimage.2004.06.035
  44. Slagter HA, Giesbrecht B, Kok A, Weissman DH, Kenemans JL, Woldorff MG, Mangun GR (2007) fMRI evidence for both generalized and specialized components of attentional control. Brain Res 1177:90–102. doi:10.1016/j.brainres.2007.07.097
  45. Snyder AC, Foxe JJ (2010) Anticipatory attentional suppression of visual features indexed by oscillatory alpha-band power increases: a high-density electrical mapping study. J Neurosci 30:4024–4032. doi:10.1523/JNEUROSCI.5684-09.2010
  46. Spitzer H, Desimone R, Moran J (1988) Increased attention enhances both behavioral and neuronal performance. Science 240:338–340. doi:10.1126/science.3353728
  47. Stelzer J, Chen Y, Turner R (2013) Statistical inference and multiple testing correction in classification-based multi-voxel pattern analysis (MVPA): random permutations and cluster size control. Neuroimage 65:69–82. doi:10.1016/j.neuroimage.2012.09.063
  48. Tamber-Rosenau BJ, Asplund CL, Marois R (2018) Functional dissociation of the inferior frontal junction from the dorsal attention network in top-down attentional control. J Neurophysiol 120:2498–2512. doi:10.1152/jn.00506.2018
  49. Tootell RB, Hadjikhani N, Hall EK, Marrett S, Vanduffel W, Vaughan JT, Dale AM (1998) The retinotopy of visual spatial attention. Neuron 21:1409–1422. doi:10.1016/S0896-6273(00)80659-5
  50. Treisman AM (1969) Strategies and models of selective attention. Psychol Rev 76:282–299. doi:10.1037/h0027242
  51. Wang C, Rajagovindan R, Han SM, Ding M (2016) Top-down control of visual alpha oscillations: sources of control signals and their mechanisms of action. Front Hum Neurosci 10:15. doi:10.3389/fnhum.2016.00015
  52. Wang L, Mruczek RE, Arcaro MJ, Kastner S (2015) Probabilistic maps of visual topography in human cortex. Cereb Cortex 25:3911–3931. doi:10.1093/cercor/bhu277
  53. Woldorff MG, Hazlett CJ, Fichtenholtz HM, Weissman DH, Dale AM, Song AW (2004) Functional parcellation of attentional control regions of the brain. J Cogn Neurosci 16:149–165. doi:10.1162/089892904322755638
  54. Zanto TP, Rubens MT, Bollinger J, Gazzaley A (2010) Top-down modulation of visual feature processing: the role of the inferior frontal junction. Neuroimage 53:736–745. doi:10.1016/j.neuroimage.2010.06.012
  55. Zanto TP, Rubens MT, Thangavel A, Gazzaley A (2011) Causal role of the prefrontal cortex in top-down modulation of visual processing and working memory. Nat Neurosci 14:656–661. doi:10.1038/nn.2773
  56. Zeki S, Watson JD, Lueck CJ, Friston KJ, Kennard C, Frackowiak RS (1991) A direct demonstration of functional specialization in human visual cortex. J Neurosci 11:641–649.
  57. Zhang X, Mlynaryk N, Ahmed S, Japee S, Ungerleider LG (2018) The role of inferior frontal junction in controlling the spatially global effect of feature-based attention in human visual areas. PLoS Biol 16:e2005399. doi:10.1371/journal.pbio.2005399
Keywords

  • feature attention
  • fMRI
  • inferior frontal junction
  • MVPA

Copyright © 2023 by the Society for Neuroscience.
JNeurosci Online ISSN: 1529-2401