Research Articles, Behavioral/Cognitive

Rapid Ocular Responses Are Modulated by Bottom-up-Driven Auditory Salience

Sijia Zhao, Nga Wai Yum, Lucas Benjamin, Elia Benhamou, Makoto Yoneya, Shigeto Furukawa, Fred Dick, Malcolm Slaney and Maria Chait
Journal of Neuroscience 25 September 2019, 39 (39) 7703-7714; DOI: https://doi.org/10.1523/JNEUROSCI.0776-19.2019
Author affiliations:
1Ear Institute, University College London, London WC1X 8EE, United Kingdom (Sijia Zhao, Nga Wai Yum, Lucas Benjamin, Maria Chait)
2Dementia Research Centre, Department of Neurodegenerative Disease, University College London, London WC1N 3AR, United Kingdom (Elia Benhamou)
3NTT Communication Science Laboratories, NTT Corporation, Atsugi 243-0198, Japan (Makoto Yoneya, Shigeto Furukawa)
4Department of Psychological Sciences, Birkbeck College, London WC1E 7HX, United Kingdom (Fred Dick)
5Department of Experimental Psychology, University College London, London WC1H 0DS, United Kingdom (Fred Dick)
6Machine Hearing Research, Google, Mountain View, California 94043 (Malcolm Slaney)

Abstract

Despite the prevalent use of alerting sounds in alarms and human–machine interface systems and the long-hypothesized role of the auditory system as the brain's “early warning system,” we have only a rudimentary understanding of what determines auditory salience—the automatic attraction of attention by sound—and which brain mechanisms underlie this process. A major roadblock has been the lack of a robust, objective means of quantifying sound-driven attentional capture. Here we demonstrate that: (1) a reliable salience scale can be obtained from crowd-sourcing (N = 911), (2) acoustic roughness appears to be a driving feature behind this scaling, consistent with previous reports implicating roughness in the perceptual distinctiveness of sounds, and (3) crowd-sourced auditory salience correlates with objective autonomic measures. Specifically, we show that a salience ranking obtained from online raters correlated robustly with the superior colliculus-mediated ocular freezing response, microsaccadic inhibition (MSI), measured in naive, passively listening human participants (of either sex). More salient sounds evoked earlier and larger MSI, consistent with a faster orienting response. These results are consistent with the hypothesis that MSI reflects a general reorienting response that is evoked by potentially behaviorally important events regardless of their modality.
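The methods are not reproduced on this abstract page, but the core measurement, detecting microsaccades in fixational eye-tracking traces, is conventionally done with the velocity-threshold algorithm of Engbert and Kliegl (2003). The following is a minimal sketch of that standard approach, not the authors' actual pipeline; the function name and the sampling rate, threshold multiplier, and minimum-duration defaults are illustrative assumptions.

import numpy as np

def detect_microsaccades(x, y, fs=1000.0, lam=6.0, min_dur_ms=6):
    """Velocity-threshold microsaccade detection (Engbert & Kliegl, 2003).

    x, y : gaze position traces (degrees of visual angle) for one fixation epoch.
    fs   : sampling rate in Hz (1 kHz assumed here).
    lam  : threshold multiplier on the median-based velocity SD.
    Returns a list of (onset_sample, offset_sample) pairs.
    """
    # Smoothed velocity estimate over a 5-sample window (central difference).
    vx = fs * (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) / 6.0
    vy = fs * (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) / 6.0

    # Robust (median-based) velocity SD, one threshold per axis.
    sx = np.sqrt(np.median(vx**2) - np.median(vx)**2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy)**2)

    # Candidate samples: elliptic threshold in 2-D velocity space.
    crit = (vx / (lam * sx))**2 + (vy / (lam * sy))**2 > 1.0

    # Keep only supra-threshold runs that exceed the minimum duration.
    min_len = int(min_dur_ms * fs / 1000.0)
    events, start = [], None
    for i, above in enumerate(crit):
        if above and start is None:
            start = i
        elif not above and start is not None:
            if i - start >= min_len:
                events.append((start + 2, i + 2))  # +2 realigns velocity to position samples
            start = None
    if start is not None and len(crit) - start >= min_len:
        events.append((start + 2, len(crit) + 2))
    return events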

SIGNIFICANCE STATEMENT Microsaccades are small, rapid, fixational eye movements that are measurable with sensitive eye-tracking equipment. We reveal a novel, robust link between microsaccade dynamics and the subjective salience of brief sounds (salience rankings obtained from a large number of participants in an online experiment): Within 300 ms of sound onset, the eyes of naive, passively listening participants demonstrate different microsaccade patterns as a function of the sound's crowd-sourced salience. These results position the superior colliculus (hypothesized to underlie microsaccade generation) as an important brain area to investigate in the context of a putative multimodal salience hub. They also demonstrate an objective means for quantifying auditory salience.
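Given detected microsaccade onsets, MSI can be summarized from the peri-stimulus microsaccade rate around sound onset. The sketch below, a hypothetical readout rather than the paper's definition, bins microsaccade times relative to each trial's sound onset and quantifies inhibition depth and latency within the 0-300 ms window highlighted above; the bin width and baseline choice are assumptions.

import numpy as np

def msi_from_rate(saccade_times, sound_onsets, pre=0.5, post=1.0, bin_s=0.02):
    """Peri-stimulus microsaccade rate and a simple MSI readout.

    saccade_times : microsaccade onset times (s) over the whole recording.
    sound_onsets  : sound onset time (s) of each trial.
    Returns (bin_centres, rate_hz, msi_latency_s, msi_depth_hz).
    """
    edges = np.arange(-pre, post + bin_s, bin_s)
    counts = np.zeros(len(edges) - 1)
    for t0 in sound_onsets:
        counts += np.histogram(saccade_times - t0, bins=edges)[0]
    rate = counts / (len(sound_onsets) * bin_s)   # events per second
    centres = edges[:-1] + bin_s / 2

    # MSI readout: time and depth of the rate minimum within 0-300 ms of onset.
    win = (centres >= 0.0) & (centres <= 0.3)
    i_min = np.argmin(rate[win])
    msi_latency = centres[win][i_min]
    msi_depth = rate.mean() - rate[win][i_min]    # drop relative to mean rate (crude baseline)
    return centres, rate, msi_latency, msi_depth

Across a set of sounds, per-sound depth or latency values obtained this way could then be rank-correlated with the crowd-sourced salience ranking (e.g., with scipy.stats.spearmanr), mirroring the correlation reported in the abstract.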

  • attention
  • auditory scene analysis
  • microsaccades
  • pupil dilation
  • superior colliculus

This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution, and reproduction in any medium provided that the original work is properly attributed.
