Research Articles, Behavioral/Cognitive

Differential Brain Mechanisms of Selection and Maintenance of Information during Working Memory

Romain Quentin, Jean-Rémi King, Etienne Sallard, Nathan Fishman, Ryan Thompson, Ethan R. Buch and Leonardo G. Cohen
Journal of Neuroscience 8 May 2019, 39 (19) 3728-3740; DOI: https://doi.org/10.1523/JNEUROSCI.2764-18.2019
Author affiliations: 1Human Cortical Physiology and Neurorehabilitation Section, NINDS, NIH, Bethesda, Maryland 20892 (R. Quentin, E. Sallard, N. Fishman, R. Thompson, E.R. Buch, and L.G. Cohen); 2New York University, New York, New York 10003 (J.-R. King); and 3Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany (J.-R. King).

Figures

Figure 1.

Behavioral task and performance. A, Visual working memory task. The stimulus appears for 100 ms and is composed of the following four different visual attributes: left and right spatial frequency (each chosen from among five possible: 1, 1.5, 2.25, 3.375, or 5.06 cycles/°) and left and right orientation (each chosen from among five possible: −72°, −36°, 0°, 36°, and 72°; 0° = vertical). After a delay of 800 ± 50 ms, the cue appears for 100 ms and indicates which visual attribute of the stimulus the participant has to compare with the upcoming probe. A left or right solid line cue indicates the left or right orientation, respectively, and a left or right dotted line indicates the left or right spatial frequency of the stimulus, respectively. After a 1500 ± 50 ms delay, the probe appears and the participant is required to answer whether the cued stimulus attribute is the same or different from the corresponding probe attribute. In the trial depicted in the figure, the solid line cue pointing to the left instructs the participant to compare the orientation on the left side of the stimulus with the orientation in the probe (the correct answer in this case is “different”). We refer to the time between the stimulus and the cue as the stimulus epoch, the time between the cue and the probe as the cue epoch, and the time after the probe as the probe epoch. B, Behavioral performance. The values are the mean percentage and SD of correct responses across participants. The mean performance across all trials was 83 ± 3.6%. Participants were more accurate when they had to remember an orientation than a spatial frequency (85% vs 81%, p < 0.001, paired t test). Performance was similar for trials with a left cue and a right cue.
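
The orientation-versus-spatial-frequency comparison above is a within-participant contrast. As a minimal sketch only, the paired t test can be run with SciPy as shown below; `acc_orientation` and `acc_spatial_freq` are hypothetical per-participant accuracy vectors, not the authors' data.

```python
# Minimal sketch of a within-participant (paired) comparison of accuracy.
# acc_orientation / acc_spatial_freq are hypothetical arrays with one mean
# accuracy value per participant and trial type (placeholder values below).
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
acc_orientation = rng.normal(0.85, 0.03, size=24)   # placeholder values
acc_spatial_freq = rng.normal(0.81, 0.03, size=24)  # placeholder values

t_stat, p_value = ttest_rel(acc_orientation, acc_spatial_freq)
print(f"mean accuracy: {acc_orientation.mean():.2%} vs {acc_spatial_freq.mean():.2%}")
print(f"paired t test: t = {t_stat:.2f}, p = {p_value:.3g}")
```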

Figure 2.

Neural dynamics of visual perception, selection rule, and memory content in evoked and time–frequency domains. A, Time course of MEG decoding performance. The x-axis corresponds to the time relative to each event (stimulus, cue, and probe; see top), and the y-axis corresponds to the decoding performance for the stimulus attributes, the selection rule, the memory content, and the probe attributes. Vertical gray bars indicate the visual presentation of each image (stimulus, cue, and probe). Color-filled areas depict significant temporal clusters of decoding performance (cluster-level, p < 0.05 corrected). Variance (thickness of the line) is shown as the SEM across participants. Note the successful decoding of the four visual attributes of the stimulus, the spatial and feature rules, the memory content (cued − uncued) for both spatial frequency and orientation, and the two attributes of the probe. The asterisks indicate the significance of the mean decoding performance over the entire corresponding epoch (***p < 0.001, **p < 0.01, and *p < 0.05). B, Decoding performance in the time–frequency domain. The x-axis corresponds to the time relative to each event (stimulus, cue, and probe; see top), and the y-axis depicts the frequency of MEG activity (between 2 and 60 Hz). Significant clusters of decoding performance are contoured with a dotted line. Note the successful decoding in the time–frequency domain of the four visual attributes of the stimulus, both the spatial and the feature rules, and the two attributes of the probe, but not the memory content.
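
Time-resolved multivariate decoding of this kind is commonly implemented with a sliding estimator over sensor data. The sketch below illustrates the general approach with MNE-Python and scikit-learn; `epochs` (an mne.Epochs object) and the per-trial label vector `y` (e.g., the cue side) are assumed to exist, and the pipeline choices are illustrative rather than the authors' exact settings.

```python
# Sketch of time-resolved decoding from MEG sensor data.
# `epochs` and `y` are assumed to exist; this is not the authors' exact pipeline.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import SlidingEstimator, cross_val_multiscore

X = epochs.get_data()          # shape: (n_trials, n_sensors, n_times)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
time_decoder = SlidingEstimator(clf, scoring="roc_auc", n_jobs=1)

# Cross-validated decoding score at every time sample.
scores = cross_val_multiscore(time_decoder, X, y, cv=5)  # (n_folds, n_times)
mean_score = scores.mean(axis=0)
```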

Figure 3.

    Spatial source representation of stimuli, selection rule, and memory content. A, Encoding of visual attributes during the stimulus epoch. The calcarine cortex, the cuneus, and lateral occipital regions encoded the visual attributes of the stimulus during the stimulus epoch. B, Selection rule during the cue epoch. A large cortical network, including the ventrolateral prefrontal regions and the insula, encoded the selection rule. C, Memory content during the cue epoch. The neural representation of memory content involves an occipitotemporal brain network. D, Decoding performances from the source signal. Time course of decoding performance during the stimulus epoch for the visual encoding of the spatial frequency (average of left and right spatial frequency) and the line orientation (average of left and right orientation), and during the cue epoch for the rules (the cue side and the cue type) and the memory content (the cued orientation and the cued spatial frequency) in the source space. Note that these decoding performances in source space are similar to the decoding performance in sensor space shown in Figures 4 and 5. ***p < 0.001.
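
Source-space decoding of this kind typically projects the sensor-level epochs through an inverse operator and then applies the same time-resolved estimators to the source time courses. A minimal sketch under that assumption is shown below; `epochs`, `inverse_operator`, and `y` are assumed to exist, and the inverse parameters are illustrative, not the authors' settings.

```python
# Sketch: project epochs to source space, then decode as in sensor space.
# `epochs`, `inverse_operator`, and `y` are assumed; parameters are illustrative.
import numpy as np
from mne.minimum_norm import apply_inverse_epochs
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import SlidingEstimator, cross_val_multiscore

stcs = apply_inverse_epochs(epochs, inverse_operator, lambda2=1.0 / 9.0,
                            method="dSPM", verbose=False)
X_src = np.array([stc.data for stc in stcs])  # (n_trials, n_sources, n_times)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_multiscore(SlidingEstimator(clf, scoring="roc_auc"),
                              X_src, y, cv=5)
```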

Figure 4.

    The selection rule is encoded in a persistent and stable pattern of low-frequency brain activity. On the left, the time course (x-axis) of decoding performance (y-axis) during the cue epoch for the cue side (top) and the cue type (bottom) during the working memory task when the cue is associated with the selection rule (blue) and the control one-back task when it is not (gray). Note that decoding performance was significantly higher in the working memory task than in the control one-back task. The time generalization matrices (middle panels), in which each estimator trained on time t was tested on its ability to predict the variable at time t′, identified stable neural representations for both spatial and feature rules. The right panel shows the decoding in the time–frequency domain. Note that both rules are maintained within the low-frequency alpha (∼10 Hz) and theta (∼3 Hz) band activity. ***p < 0.001.
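
Temporal generalization matrices like these are obtained by training an estimator at each time point and testing it at every other time point; sustained off-diagonal decoding indicates a stable neural code. A sketch with MNE-Python's GeneralizingEstimator, under the same assumptions as the earlier sketches (`X`, `y` assumed):

```python
# Sketch of temporal generalization: each estimator trained at time t is
# tested at all times t'. X (trials x sensors x times) and y are assumed.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import GeneralizingEstimator, cross_val_multiscore

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
gen_decoder = GeneralizingEstimator(clf, scoring="roc_auc", n_jobs=1)

# Scores have shape (n_folds, n_train_times, n_test_times); a square of high
# off-diagonal scores indicates a stable neural representation.
scores = cross_val_multiscore(gen_decoder, X, y, cv=5)
gen_matrix = scores.mean(axis=0)
```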

Figure 5.

    The sustained and frequency-specific neural representation of the rule is not present in the control one-back task. Estimators trained on cue side and cue type during the working memory task are tested during the same task (blue) or are tested during the control task (gray). On the left, the time course (x-axis) of decoding performance (y-axis) during the cue epoch for the cue side (top) and the cue type (bottom). The right panel shows the decoding in the time–frequency domain. These analyses served as a control for the analyses in Figure 4.
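
Cross-task testing of this kind amounts to fitting time-resolved estimators on one set of epochs and scoring them on another. A hedged sketch is given below; `X_wm`/`y_wm` and `X_control`/`y_control` are hypothetical arrays for the working memory and one-back tasks, not the authors' variables.

```python
# Sketch of cross-task generalization: fit time-resolved estimators on the
# working memory task and score them on the control one-back task.
# X_* (trials x sensors x times) and y_* (one label per trial) are hypothetical.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import SlidingEstimator

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
time_decoder = SlidingEstimator(clf, scoring="roc_auc", n_jobs=1)

time_decoder.fit(X_wm, y_wm)                                # train on the WM task
scores_control = time_decoder.score(X_control, y_control)   # test on the control task
```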

Figure 6.

Spatial source representation of the selection rule in the theta (3 Hz) and alpha (10 Hz) bands. A, Selection rule during the cue epoch decoded from theta power (3 Hz). A large cortical network, including the ventrolateral prefrontal regions and the insula, encoded the selection rule in the theta band (left), with the corresponding decoding performance in time–frequency source space (right). B, Selection rule during the cue epoch decoded from alpha power (10 Hz). A posterior network encoded the selection rule in the alpha band (left), with the corresponding decoding performance in time–frequency source space (right). ***p < 0.001, **p < 0.01.
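
Decoding from theta or alpha power rather than from the broadband signal requires a single-trial time–frequency decomposition first. A minimal sketch with Morlet wavelets is shown below, assuming `epochs` and `y` as before; the frequencies and cycle counts are illustrative only.

```python
# Sketch: decode the selection rule from single-trial power at one frequency
# (e.g., ~3 Hz theta or ~10 Hz alpha). Parameters are illustrative only.
import numpy as np
from mne.time_frequency import tfr_morlet
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import SlidingEstimator, cross_val_multiscore

freqs = np.array([3.0, 10.0])                 # theta and alpha, in Hz
power = tfr_morlet(epochs, freqs=freqs, n_cycles=freqs / 2.0,
                   return_itc=False, average=False)
# power.data: (n_trials, n_sensors, n_freqs, n_times); pick one frequency.
X_theta = power.data[:, :, 0, :]

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_multiscore(SlidingEstimator(clf, scoring="roc_auc"),
                              X_theta, y, cv=5)
```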

Figure 7.

The memory content is transiently reactivated 500 ms after the cue. A, Time course of decoding performance (y-axis) during the cue epoch for the cued orientation (five possible orientations) and the cued spatial frequency (five possible spatial frequencies) during the working memory task and their corresponding time generalization analysis. B, Same analysis for the uncued orientation and spatial frequency. Note that the decoding performance was significantly above chance for the cued but not the uncued orientation and spatial frequency. Additionally, decoding was significantly higher for the cued than the uncued item (see Fig. 2 for this difference). ***p < 0.001, **p < 0.01.

Figure 8.

    Different neural representations of memory and perceptual content. A, Left, Average decoding performance for each participant when estimators are trained on the stimulus attributes (average of orientation and spatial frequency) during the stimulus epoch and tested either during the same epoch or on the corresponding memory content during the cue epoch. Right, Average decoding performance for each participant when estimators are trained on the memory content (average of orientation and spatial frequency) during the cue epoch and tested either during the same epoch or on the corresponding stimulus attribute during the stimulus epoch. B, Time generalization matrix trained on the stimulus attribute (left, line orientation; right, spatial frequency) during the stimulus epoch (y-axes) and tested on the memory content during stimulus (orange matrix) and cue (red matrix) epochs (x-axis). C, Time generalization matrix trained on the memory content during the cue epoch (y-axes) and tested on the stimulus attribute during stimulus (left orange matrix) and cue (right red matrix) epochs (x-axis). Note that an estimator trained to decode a visual feature during perception cannot decode the corresponding memory content during the cue epoch, and that an estimator trained to decode a memory content during the cue epoch cannot decode the corresponding stimulus feature during perception. ***p < 0.001.
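
Cross-epoch generalization of this kind can be sketched by fitting a time-generalizing estimator on one epoch's data and scoring it on the other epoch. The snippet below is an illustration only; `X_stim`, `X_cue`, and `y_attr` are hypothetical arrays (stimulus-epoch data, cue-epoch data, and per-trial attribute labels), not the authors' variables.

```python
# Sketch of cross-epoch generalization: estimators trained on the stimulus
# epoch are tested on the cue epoch. X_stim, X_cue, and y_attr are hypothetical.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import GeneralizingEstimator

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
gen = GeneralizingEstimator(clf, scoring="accuracy", n_jobs=1)

gen.fit(X_stim, y_attr)                 # train during perception
scores_cue = gen.score(X_cue, y_attr)   # test on the memory content
# Near-chance scores_cue would indicate distinct perceptual and mnemonic codes,
# as reported in this figure.
```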

Figure 9.

    Small saccades are informative about the side of the cue, but not about the memory content. Time course of decoding performance during the cue epoch for the cue side, the cue type, the cued orientation, and the cued spatial frequency from eye position data only. ***p < 0.001.
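
A control analysis of this kind reuses the same time-resolved decoders on gaze data alone. A sketch is given below; `X_eye` (per-trial horizontal and vertical eye position over time) and `y_side` (cue side labels) are hypothetical names, not the authors' variables.

```python
# Sketch of the eye-movement control: decode the cue side from gaze position
# only. X_eye is a hypothetical array (n_trials, 2, n_times) of x/y eye
# coordinates; y_side labels the cue side.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import SlidingEstimator, cross_val_multiscore

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
eye_decoder = SlidingEstimator(clf, scoring="roc_auc", n_jobs=1)
scores_eye = cross_val_multiscore(eye_decoder, X_eye, y_side, cv=5)
# Above-chance scores for the cue side, but not for the memory content,
# would mirror the pattern shown in this figure.
```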

Keywords

  • decoding
  • magnetoencephalography
  • MVPA
  • selection rule
  • temporal dynamics
  • working memory
