Research Articles, Systems/Circuits

Integrating Visual Information into the Auditory Cortex Promotes Sound Discrimination through Choice-Related Multisensory Integration

Song Chang, Jinghong Xu, Mengyao Zheng, Les Keniston, Xiaoming Zhou, Jiping Zhang and Liping Yu
Journal of Neuroscience 9 November 2022, 42 (45) 8556-8568; DOI: https://doi.org/10.1523/JNEUROSCI.0793-22.2022
Author Affiliations

1 Key Laboratory of Brain Functional Genomics (Ministry of Education and Shanghai), East China Normal University, Shanghai 200062, China (Song Chang, Jinghong Xu, Mengyao Zheng, Xiaoming Zhou, Jiping Zhang, and Liping Yu)
2 Department of Biomedical Sciences, Kentucky College of Osteopathic Medicine, University of Pikeville, Pikeville, Kentucky 41501 (Les Keniston)

Abstract

An increasing number of studies have shown that cross-modal interactions can occur in early sensory cortices. Yet how neurons in sensory cortices integrate multisensory cues during perceptual tasks, and to what extent this influences behavior, remains largely unclear. To investigate, we examined visual modulation of auditory responses in the primary auditory cortex (A1) in a two-alternative forced-choice task. During the task, male rats were required to make a behavioral choice based on the pure-tone frequency (low vs high) of a self-triggered stimulus to obtain a water reward. We found that a noninformative visual cue did not uniformly influence auditory responses; rather, it frequently enhanced the response to only one of the two tones. Closely correlated with behavioral choice, the visual cue mainly enhanced responsiveness to the auditory cue that instructed a movement toward the side contralateral to the recorded A1. Operating in this fashion gave A1 neurons a superior capability to discriminate sounds during multisensory trials. Concomitantly, behavioral data and decoding analysis revealed that the presence of the visual cue could speed sound discrimination. We also observed this differential multisensory integration effect in well-trained rats tested with passive stimulation and under anesthesia, albeit to a much lesser extent. We did not see this differential integration effect when recording in A1 of a similar group of rats performing a free-choice task. These data suggest that the auditory cortex can engage in meaningful audiovisual processing and that perceptual learning can modify its multisensory integration mechanisms to meet task requirements.
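The decoding analysis mentioned above is typically framed as reading out tone identity from population activity in successive time windows and asking when discrimination becomes reliable. The sketch below is only an illustration of that general approach, not the authors' actual pipeline; the variable names, window scheme, and the choice of a cross-validated logistic-regression classifier are all assumptions made here for clarity.

```python
# Illustrative sketch (assumed, generic): sliding-window population decoding
# of tone identity. Not the authors' specific method or parameters.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def decoding_accuracy_over_time(spike_counts, tone_labels):
    """Cross-validated decoding accuracy per time window.

    spike_counts : array, shape (n_trials, n_neurons, n_windows)
        Binned spike counts for each trial, neuron, and time window.
    tone_labels  : array, shape (n_trials,)
        0 = low tone, 1 = high tone.
    """
    n_windows = spike_counts.shape[2]
    accuracy = np.zeros(n_windows)
    for w in range(n_windows):
        X = spike_counts[:, :, w]                  # population vector in window w
        clf = LogisticRegression(max_iter=1000)    # simple linear decoder
        accuracy[w] = cross_val_score(clf, X, tone_labels, cv=5).mean()
    return accuracy

# Comparing the window at which accuracy first exceeds a criterion
# (e.g., 75% correct) for auditory-only vs audiovisual trials is one way
# to quantify whether the visual cue speeds sound discrimination.
```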

SIGNIFICANCE STATEMENT In the natural environment, visual stimuli are frequently accompanied by auditory cues. Although multisensory integration has traditionally been seen as a feature of association cortices, recent studies have shown that cross-modal inputs can also influence neuronal activity in primary sensory cortices. However, exactly how neurons in sensory cortices integrate multisensory cues to guide behavioral choice is still unclear. Here, we describe a novel model of multisensory integration used by A1 neurons to shape auditory representations while rats performed a cue-guided task. We found that a task-irrelevant visual cue could specifically enhance neuronal responses to the sound guiding the contralateral choice. This differential integration facilitated sound discrimination and behavioral choice. These results indicate that task engagement can modulate multisensory integration.

Keywords

  • auditory cortex
  • behavioral task
  • cross-modal interaction
  • multisensory integration
  • task engagement
  • visual

SfN exclusive license.
