Journal of Neuroscience
Research Articles, Behavioral/Cognitive

Cortical Processing of Arithmetic and Simple Sentences in an Auditory Attention Task

Joshua P. Kulasingham, Neha H. Joshi, Mohsen Rezaeizadeh and Jonathan Z. Simon
Journal of Neuroscience 22 September 2021, 41 (38) 8023-8039; DOI: https://doi.org/10.1523/JNEUROSCI.0269-21.2021
Joshua P. Kulasingham (1), Neha H. Joshi (1, 2), Mohsen Rezaeizadeh (1, 2), and Jonathan Z. Simon (1, 2, 3)

(1) Department of Electrical and Computer Engineering, (2) Institute for Systems Research, (3) Department of Biology, University of Maryland, College Park, Maryland 20742

Figures

Figure 1.

Stimulus structure. A, The foreground, background, and mix waveforms for the initial section of the stimulus for a two-speaker attend-language trial. The sentence, equation, word, and symbol structures are shown. The word and symbol rhythms are clearly visible in the waveforms. The mix was presented diotically and is the linear sum of both streams. B, The frequency spectrum of the Hilbert envelope of the entire concatenated stimulus for the attend-sentences condition (432-s duration). The sentence (0.67 Hz), equation (0.55 Hz), word (2.67 Hz), and symbol (2.78 Hz) rates are indicated by colored arrows under the x-axis. Clear word and symbol rate peaks are seen in the foreground and background, respectively, while the mix spectrum has both peaks. Note that there are no sentence rate or equation rate peaks in the stimulus spectrum. The appearance of harmonics of the equation rate is consistent with the limited set of math symbols used.
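The envelope-spectrum computation described for panel B can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' pipeline: the sampling rate, window length, and synthetic amplitude-modulated test signal below are assumptions chosen only to make the sketch runnable.

```python
import numpy as np

def hilbert_envelope(x):
    """Magnitude of the analytic signal, built via the standard FFT
    construction (zero negative frequencies, double positive ones)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(spec * h))

def envelope_spectrum(waveform, fs):
    """Amplitude spectrum of the Hilbert envelope of a 1-D waveform,
    as in Figure 1B. DC is removed so the rate peaks stand out."""
    env = hilbert_envelope(waveform)
    env = env - env.mean()
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    amps = np.abs(np.fft.rfft(env)) / len(env)
    return freqs, amps

# Synthetic check: a 220 Hz carrier amplitude-modulated at the word rate.
fs = 1000.0
t = np.arange(0.0, 60.0, 1.0 / fs)
wave = (1.0 + 0.8 * np.sin(2 * np.pi * 2.67 * t)) * np.sin(2 * np.pi * 220.0 * t)
freqs, amps = envelope_spectrum(wave, fs)
peak_hz = freqs[np.argmax(amps)]   # lands near the 2.67 Hz modulation rate
```

With a real stimulus waveform in place of the synthetic signal, the same spectrum would show the word and symbol rate peaks the caption describes.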

Figure 2.

Neural response spectrum. The MEG response spectrum as a function of frequency for the four conditions. A: Single speaker sentences, B: Single speaker equations, C: Cocktail party attend sentences, D: Cocktail party attend equations. The amplitude spectrum, averaged over sensors and subjects, is shown with light shaded regions denoting the first-third quartile range across subjects. Clear peaks are seen at the sentence, equation, word, and symbol rates (indicated by the arrows under the x-axis). These responses were compared against neighboring bins (of width ∼0.01 Hz, not visible here) for statistical tests. Insets show the average responses at the four frequencies of interest for each subject, after subtracting the neighboring bins. The scale for the insets is standardized within each condition, with 0 indicating the baseline average activity of the neighboring bins. For the single speaker conditions, peaks appear only at the rates corresponding to the presented stimulus. For the cocktail party conditions, peaks appear at the symbol and word rates regardless of attention, while sentence and equation peaks appear only when the corresponding stream is attended; there are no analogous peaks under the opposite attention condition.
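The neighboring-bin baseline subtraction used for the insets can be sketched as follows. The toy spectrum and the number of flanking bins are illustrative assumptions; the study's per-bin statistical test itself is not reproduced here.

```python
import numpy as np

def peak_above_neighbors(freqs, amps, f0, n_neighbors=5):
    """Amplitude at the bin nearest f0 minus the mean of flanking bins.

    A sketch of the baseline correction described for the Figure 2
    insets: comparing a rate peak against neighboring frequency bins
    separates rate-locked activity from the broadband background.
    The flanking-bin count is an illustrative choice."""
    freqs = np.asarray(freqs, dtype=float)
    amps = np.asarray(amps, dtype=float)
    i = int(np.argmin(np.abs(freqs - f0)))
    flanks = np.concatenate([amps[max(i - n_neighbors, 0):i],
                             amps[i + 1:i + 1 + n_neighbors]])
    return amps[i] - flanks.mean()

# Toy spectrum: flat background of 1.0 with a peak of 3.0 at 2.0 Hz.
freqs = np.linspace(0.0, 5.0, 501)
amps = np.ones_like(freqs)
amps[np.argmin(np.abs(freqs - 2.0))] = 3.0
```

Applied at 2.0 Hz this returns the peak's excess over baseline; applied at a peak-free frequency it returns approximately zero, which is the sense in which the insets' zero line marks baseline activity.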

Figure 3.

Source localized responses at each frequency of interest. A: Single speaker sentences, B: Single speaker equations, C: Cocktail party attend sentences, D: Cocktail party attend equations. The source localized responses at critical frequencies, averaged over subjects and masked by significant increase over the noise model, are shown. Color scales are normalized within each condition to more clearly show the spatial patterns. The word and symbol rate responses are maximal in bilateral auditory cortical areas, while the sentence rate response is maximal in the left temporal lobe. The equation rate responses localize to bilateral parietal, temporal, and occipital areas, albeit with increased left hemispheric activity. Although the background sentence and equation rates also show significant activity, the amplitudes of these responses are much smaller than those at the corresponding attended rates.

Figure 4.

    Neural response correlations with behavior. A: Single speaker condition, B: Cocktail party condition. The source localized responses at the frequencies of interest were correlated with the corresponding deviant detection performance, across subjects. The areas of significant correlation are plotted here (same color scale for all plots). Sentence and equation rate responses are significantly correlated with behavior only if attended, while both attended and unattended word rate responses are significantly correlated with behavior. The sentence rate response is significantly correlated over regions in left temporal, parietal, and frontal areas, while significant correlation for the equation rate response is seen in left parietal and occipital regions.

Figure 5.

TRFs in the single speaker conditions. A: Sentence TRF, B: Equation TRF. Overlay plots of the amplitude of the TRF vectors for each voxel, averaged over subjects. For each TRF subfigure, the top axis shows vector amplitudes of voxels in the left hemisphere and the bottom axis correspondingly in the right hemisphere. Each trace is from the TRF of a single voxel; non-significant time points are shown in gray, while significant time points are shown in red (sentence TRF) or blue (equation TRF). The duration plotted corresponds to that of a sentence or equation, plus 350 ms; because of the fixed presentation rate, the first 350 ms (shown in gray) are identical to the last 350 ms. The large peak at the end (and beginning) of each TRF may be ascribed to processing of the completion of the sentence/equation, to the onset of the new sentence/equation, or both. Word and symbol onset times are shown in red and blue dashed lines, respectively; it can be seen that response contributions associated with them have been successfully regressed out. Volume source space distributions for several peaks in the TRF amplitudes are shown in the inset plots, with black arrows denoting current directions (peaks automatically selected as local maxima of the TRFs). Although most of the TRF activity is dominated by neural currents in the left temporal lobe, the equation TRFs show more bilateral activation.

Figure 6.

TRFs in the cocktail party conditions. A: Sentence TRF while attending to sentences, B: Equation TRF while attending to equations. Overlay plots of the TRF for each voxel, averaged over subjects, are shown as in Figure 5. Word and symbol onset times are shown in red and blue dashed lines, respectively, and are marked in both sentence and equation TRFs since both stimuli were present in the cocktail party conditions; again, it can be seen that response contributions associated with them have been successfully regressed out. Differences between sentence and equation TRFs arise at later time points, with sentence TRFs being predominantly near left temporal areas, while equation TRFs are in bilateral temporal, motor, and parietal regions.

Figure 7.

Decoding arithmetic and language processing. A, Performance of decoding attention condition (math vs language) at each time point using MEG sensors for the single speaker (purple) and cocktail party (brown) conditions. Prediction success is measured by AUC, which is plotted (mean and SEM across subjects); time points where predictions are significantly above chance are marked by the horizontal bars at the bottom (every time point is significantly above chance for the single speaker case). The word and symbol onsets are also shown; decoding performance increases toward the end of the time window. B, Decoding arithmetic operators from sensor topographies. The time window of the operator and the subsequent operand was used for the three types of decoders. Time intervals where predictions are significantly above chance are marked by the colored horizontal bars at the bottom: all three operator comparisons could be significantly decoded. C, Decoding math versus language based on the last word. During the single speaker conditions, most of the brain is significant. However, for the cocktail party conditions, more focal significant decoding is seen in IPS and superior parietal areas. Decoding based on the first word yielded similar results (data not shown). D, Decoding attention in the cocktail party conditions (AUC masked by significance across subjects). The sentence responses in foreground and background were decoded in left middle temporal and bilateral superior parietal areas. The equation responses in foreground and background were decoded in bilateral parietal areas.
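The AUC statistic used to summarize decoder performance in panel A can be computed directly from decoder scores via the rank-sum (Mann-Whitney U) identity. This is a generic sketch of the metric only; the labels and scores below are hypothetical stand-ins, and the decoders themselves are not reproduced.

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U identity:
    AUC = P(score of a random positive > score of a random negative).
    Ties receive average ranks, contributing 0.5 per tied pair."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    order = np.argsort(scores, kind="mergesort")
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    for s in np.unique(scores):          # average ranks over ties
        tie = scores == s
        ranks[tie] = ranks[tie].mean()
    n_pos = int(labels.sum())
    n_neg = len(labels) - n_pos
    u = ranks[labels].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

# Hypothetical decoder outputs: higher score = "math" trial.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
```

Chance performance corresponds to AUC = 0.5, which is the reference level against which the significance bars in panel A are drawn.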

Figure 8.

Schematic of cortical processing of sentences and equations. Exemplars of both foreground and background stimuli are shown at the bottom. The areas that were most consistent across all analysis methods (frequency domain, TRFs, and decoders) are shown.

Tables

Table 1.

    Experiment block structure

Foreground   Background   Speaker foreground (background)   Number of trials per block
Equations    —            Male                              10
Sentences    —            Male                              10
Equations    —            Female                            10
Sentences    —            Female                            10
Sentences    Equations    Female (male)                     6
Equations    Sentences    Female (male)                     6
Sentences    Equations    Male (female)                     6
Equations    Sentences    Male (female)                     6
Sentences    Equations    Female (male)                     6
Equations    Sentences    Female (male)                     6
Sentences    Equations    Male (female)                     6
Equations    Sentences    Male (female)                     6
The experiment consisted of four single speaker blocks followed by eight cocktail party blocks. Each trial was 18 s in duration and consisted of 10 equations (1.8 s × 10 = 18 s) or 12 sentences (1.5 s × 12 = 18 s). The speaker gender was counterbalanced across subjects (i.e., the order of column 3 was changed).
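The trial timing above, and the stimulus rates it implies, can be checked with a few lines of arithmetic. The per-item counts (four words per sentence, five symbols per equation) are assumptions inferred from the rates reported in Figure 1, not values stated in the table.

```python
# Timing arithmetic from Table 1, checked numerically.
sentence_dur, equation_dur = 1.5, 1.8   # seconds per item
trial_dur = 18.0                        # seconds per trial

assert 12 * sentence_dur == trial_dur   # 12 sentences fill a trial
assert 10 * equation_dur == trial_dur   # 10 equations fill a trial

# Presentation rates implied by the durations; per-item counts of
# 4 words/sentence and 5 symbols/equation are assumed (see lead-in).
sentence_rate = 1 / sentence_dur        # ≈ 0.67 Hz, as in Figure 1
equation_rate = 1 / equation_dur        # ≈ 0.55 Hz
word_rate = 4 / sentence_dur            # ≈ 2.67 Hz
symbol_rate = 5 / equation_dur          # ≈ 2.78 Hz
```

These four rates are exactly the frequencies at which Figure 2 reports neural response peaks, which is what makes the isochronous design analyzable in the frequency domain.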


Keywords

  • cocktail party
  • isochronous
  • MEG
  • selective attention
  • TRF


Copyright © 2023 by the Society for Neuroscience.
JNeurosci Online ISSN: 1529-2401
