Journal Club

Does Auditory Cortex Code Temporal Information from Acoustic and Cochlear Implant Stimulation in a Similar Way?

Charlotte Amalie Navntoft and Victor Adenis
Journal of Neuroscience 10 January 2018, 38 (2) 260-262; DOI: https://doi.org/10.1523/JNEUROSCI.2774-17.2017
Charlotte Amalie Navntoft, Department of Biomedicine, Basel University, 4056 Basel, Switzerland
Victor Adenis, Paris-Saclay Institute of Neuroscience, Université Paris-Sud, 91405 Orsay Cedex, France

A fundamental task of the auditory system is to encode time-varying sounds in the environment around us. For instance, when clicks are repeated in a train, we can discern each click as long as the presentation rate is below a certain threshold. Above that threshold, we perceive a continuous sound like a buzz, hum, or pitch. Along the auditory pathway, neurons progressively lose their temporal fidelity: the auditory nerve can phase lock, or synchronize, to stimulus presentation rates >1 kHz, but the synchronization rate of neurons in auditory cortex falls to ∼30–50 Hz. Auditory cortex therefore uses a combination of stimulus-synchronized and nonsynchronized (NS) population responses to encode temporal information: synchronized responses follow each stimulus event (a temporal code), whereas nonsynchronized responses increase their spike rate (a rate code) for temporal features that vary too fast to be represented by synchronized responses (Lu et al., 2001; Joris et al., 2004). Why we perceive slow, repeating acoustic events (<40 Hz), such as an idling engine, as distinct events, and fast ones, such as a revving motorbike, as a continuous sound, is thought to be a direct result of how the signals are represented by synchronized and nonsynchronized cortical neurons, respectively (Bendor and Wang, 2007).
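The degree of phase locking described above is commonly quantified with the vector strength metric, which is near 1 when spikes align to a single stimulus phase and near 0 when spikes are scattered across phases. As an illustrative sketch (not taken from the article; the spike trains, jitter value, and rates are invented), a few lines of Python make the synchronized/nonsynchronized distinction concrete:

```python
import numpy as np

def vector_strength(spike_times, rate_hz):
    """Vector strength: 1.0 = perfect phase locking, near 0 = no locking."""
    phases = 2 * np.pi * rate_hz * np.asarray(spike_times)
    return np.hypot(np.cos(phases).sum(), np.sin(phases).sum()) / len(spike_times)

rng = np.random.default_rng(0)

# A "synchronized" neuron: one spike per click of a 20 Hz train, 1 ms jitter
rate = 20.0
clicks = np.arange(0, 1.0, 1 / rate)
locked = clicks + rng.normal(0, 0.001, clicks.size)

# A "nonsynchronized" neuron: elevated firing, but at random times
random_spikes = rng.uniform(0, 1.0, 40)

print(vector_strength(locked, rate))        # high (near 1)
print(vector_strength(random_spikes, rate)) # low
```

A synchronized neuron yields a vector strength near 1 at the stimulus rate, whereas a nonsynchronized neuron fires at an elevated but unlocked rate and yields a value near 0, even though its overall spike count carries the rate code.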

Cochlear implants (CIs) can provide temporal information about sound through modulated electric pulse trains delivered to the auditory nerve. However, only synchronized cortical responses to slowly repeating CI stimulation have been reported so far (Schreiner and Urbas, 1988; Middlebrooks, 2008). In a recent study in The Journal of Neuroscience, Johnson et al. (2017) endeavored to determine how rapid modulations in CI stimulation are represented in primary auditory cortex (A1) and how this coding scheme relates to acoustic sound. The authors implanted a CI in one ear of marmosets and recorded the spike activity of single neurons in both hemispheres in response to either CI or acoustic stimulation delivered in an alternating manner to the CI-implanted ear (right ear) or the intact ear (left ear). In this way, they could compare the response of each neuron to time-varying acoustic and CI stimuli in A1 of the same animal. The acoustic stimuli were short Gaussian click trains at various sound levels and presentation rates, and the CI stimuli were trains of electric pulses at varying current levels and repetition rates. Extensive effort was made to identify stimulus parameters (e.g., frequency/electrode position, sound/current level, stimulus rate) that could drive single-unit firing. The reported averaged response is therefore the sum of individual neurons responding to different preferred stimuli.

Johnson et al. (2017) report two major results. First, they found A1 populations that showed nonsynchronized firing in response to rapid CI stimulus trains. This has not been reported in previous CI physiology studies. The fact that they used an awake rather than an anesthetized animal, as in earlier studies (Schreiner and Urbas, 1988; Middlebrooks, 2008), is the likely explanation for this finding. It is not surprising per se, but the absence of nonsynchronized populations in previous studies has puzzled the field because CI users can perceive pitch with an increasing repetition rate (described below). The second finding was that A1 neurons responded to time-varying acoustic and CI stimulation using the same coding scheme (Johnson et al., 2017). Across a wide range of temporal modulations, both stimulus modalities were represented by a combination of synchronized and nonsynchronized populations. Interestingly, the distribution and the response boundaries (the presentation rate at which synchronized responses transition to nonsynchronized responses) were similar for acoustic and CI stimuli within each population type. Furthermore, there were no differences in best frequency or laminar distribution. Based on this, the authors propose that A1 neurons process temporal auditory information independently of the nature of the stimulus (acoustic or electric).

There are at least two issues with the authors' second interpretation. First, it is difficult to draw conclusions about the population level, because the reported averaged activity is the sum of individual neurons responding to different preferred stimuli. This is analogous to the sound of individual instruments playing their favorite piece of music rather than the output of a symphony orchestra playing the same notes. The problem with such “custom-made” stimulation schemes was recently highlighted in an optogenetic study, which showed that the conclusions we draw about the role of a single neuron can be sensitive to how it was manipulated by various experimental parameters (Phillips and Hasenstaub, 2016). To support the findings by Johnson et al. (2017), it would be useful to record responses to each stimulus from many neurons at the same time and determine whether the general population and the single-unit responses are complementary. The second issue is that Johnson et al. (2017) report that time-varying acoustic and CI input engage A1 neurons in a similar way in terms of, for example, population distribution, firing rate, and cortical depth. In contrast, using the same four CI-implanted animals, the authors previously found that the two modalities yielded distinct responses: CI stimulation did not activate A1 neurons as efficiently as acoustic stimuli; CI-responsive neurons had different frequency response areas than CI-nonresponsive neurons; and CI-nonresponsive neurons were actively suppressed, rather than simply not being activated by the CI (Johnson et al., 2016). Surprisingly, the authors do not link the two articles. One explanation could be that Johnson et al. (2017) studied neurons that respond to CI stimulation and not those that do not. Alternatively, CI stimulation might engage A1 neurons differently in spectral (Johnson et al., 2016) versus temporal (Johnson et al., 2017) stimulation paradigms.

What is the neural mechanism underlying the generation of the synchronized and nonsynchronized responses and the transition from one to the other (at the temporal-to-rate response boundary)? And why would it be similar for acoustic and electric hearing? It is well established that a finely tuned balance between inhibition and excitation is needed to shape and refine cortical dynamics over time (Wehr and Zador, 2005). In computational models, strong excitation and delayed inhibition produced the synchronized responses, whereas weak net excitation due to concurrent excitation and inhibition could generate nonsynchronized responses when acoustic sound was played (Bendor, 2015; Gao et al., 2016). As acoustic and CI stimulation analogously engaged A1 neurons in the study by Johnson et al. (2017), it is likely that the brain uses a comparable excitatory–inhibitory interplay to interpret time-varying electric stimulation. Nonetheless, the role of inhibition in this process has so far not been demonstrated directly in vivo. Parvalbumin-positive interneurons are known for promoting temporal precision (Wehr and Zador, 2005). They might therefore be key controllers of, for instance, the temporal-to-rate code transition.
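The excitation–inhibition account can be caricatured in a deliberately minimal rate model. The sketch below is entirely our own (it is not the model of Bendor, 2015, or Gao et al., 2016, and all kernels, time constants, and weights are invented): each pulse of a train evokes an excitatory kernel and a slightly weaker inhibitory kernel, and only the lag between the two is varied.

```python
import numpy as np

def alpha_kernel(tau_ms, dt=1.0, length=200):
    """Standard alpha-function synaptic kernel, peak amplitude 1 at t = tau."""
    t = np.arange(0, length, dt)
    return (t / tau_ms) * np.exp(1 - t / tau_ms)

def response(pulse_times_ms, inh_delay_ms, dur_ms=500, dt=1.0):
    """Rectified net drive (excitation minus delayed inhibition) to a pulse train."""
    t = np.arange(0, dur_ms, dt)
    drive = np.zeros_like(t)
    exc = alpha_kernel(5.0)    # fast excitation
    inh = alpha_kernel(10.0)   # slower inhibition
    for p in pulse_times_ms:
        for kern, delay, sign in ((exc, 0.0, 1.0), (inh, inh_delay_ms, -0.9)):
            i0 = int((p + delay) / dt)
            seg = kern[: len(t) - i0]
            drive[i0 : i0 + len(seg)] += sign * seg
    return np.maximum(drive, 0.0)  # rectify into a nonnegative "firing rate"

pulses = np.arange(0, 400, 50.0)            # 20 Hz pulse train
locked = response(pulses, inh_delay_ms=10)  # delayed inhibition
flat = response(pulses, inh_delay_ms=0)     # concurrent inhibition

# Delayed inhibition leaves large stimulus-locked transients; concurrent
# inhibition cancels most of the excitation, leaving only weak net drive.
print(locked.max(), flat.max())
```

With delayed inhibition, each pulse's excitatory transient escapes before inhibition arrives, producing the stimulus-locked peaks of a synchronized response; with concurrent inhibition the transients are largely cancelled, mimicking the weak net excitation proposed to underlie nonsynchronized firing.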

Across auditory, visual, and somatosensory systems, the neural coding and perceptual boundary between repetition rates generating a sensation of discrete versus continuous events is ∼40–60 Hz. At least in the auditory system, this is thought to be a result of the ∼25 ms temporal integration window of A1 neurons (Bendor and Wang, 2007). With slower repetition rates, only one event occurs within this window and can thus be represented by a phase-locked spike to each event. Faster repetition rates produce multiple events in the integration window and are consequently represented by the firing rate. Matched neural–perceptual coding strategies across sensory systems are likely to be important for cross-modal integration, discrimination, and plasticity. For instance, subjects trained to discriminate tactile intervals also got better at discriminating sound intervals (Nagarajan et al., 1998). In light of this consistency across sensory systems, it makes sense that Johnson et al. (2017) would find similar coding schemes for acoustic and electric hearing.
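The arithmetic behind this boundary is simple: a 25 ms integration window holds at most one event only when the repetition rate is at or below 1/0.025 = 40 Hz. A toy illustration (the window length is the ~25 ms value cited above; the rest is our own sketch):

```python
# Events per 25 ms integration window as a function of repetition rate.
# One event per window -> each event can be marked by a phase-locked spike
# (temporal code); several events per window -> only the overall firing
# rate can distinguish stimulus rates (rate code).
WINDOW_S = 0.025  # ~25 ms A1 integration window (Bendor and Wang, 2007)

for rate_hz in (10, 20, 40, 100, 200):
    events_per_window = rate_hz * WINDOW_S
    scheme = "temporal (synchronized)" if events_per_window <= 1 else "rate (nonsynchronized)"
    print(f"{rate_hz:>4} Hz: {events_per_window:.2f} events/window -> {scheme}")
```

Running this places the temporal-to-rate crossover exactly at 40 Hz, matching the lower end of the ∼40–60 Hz perceptual boundary described above.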

People with normal hearing can perceive an increase in pitch either with an increase in the presentation rate of short acoustic clicks (rate pitch) or with an increase in frequency (place pitch). The relative importance of rate and place pitch is nonetheless still debated, because the rate of mechanical stimulation of the basilar membrane is strongly correlated with position: low presentation rates produce vibrations in the apical basilar membrane, where low frequencies are encoded, and high rates produce vibrations in the basal end, where high frequencies are encoded. In this way, the upper limit of temporal pitch in people with normal hearing is effectively the upper hearing range of 20 kHz. The two stimulus variables, rate and place, can, however, be controlled independently in a CI: different rates can be applied to different electrode positions along the basilar membrane. CI users can only perceive an increase in pitch with increasing stimulus rate up to 300 Hz, also known as the “300 Hz limit” (McKay et al., 1994). The neural basis for this ceiling effect has been unknown for decades. Interestingly, Johnson et al. (2017) observed that NS neuron firing increased with increasing CI stimulus rate and plateaued at ∼257 Hz. Combined with the fact that marmosets have similar pitch perception properties and organization of auditory cortex as humans (de la Mothe et al., 2006; Song et al., 2016), this led the authors to propose that NS population activity is a strong neural candidate to encode rate–pitch perception in CI users and to explain the long-unexplained 300 Hz limit (Johnson et al., 2017). Nevertheless, recent literature suggests that there is no fundamental 300 Hz limit in CI users; instead, the upper limit of pulse rate discrimination depends on the CI stimulation parameters used (Venter and Hanekom, 2014). For instance, stimulating auditory nerve fibers near the tip of the cochlea improved phase locking in cats (Middlebrooks and Snyder, 2010), and the upper limit of rate pitch in human CI users could be extended somewhat with asymmetric pulses compared with standard symmetric pulse shapes (Macherey et al., 2011). Also, stimulating multiple electrodes at the same time yields better rate discrimination than stimulating a single electrode (Venter and Hanekom, 2014). Thus, if NS neurons are the neural basis of rate coding in auditory cortex, any stimulation scheme that raises the 300 Hz limit should be mirrored by an elevated plateau in NS neuron firing. Consequently, NS neurons could be an attractive target for future repetition-rate-based stimulus designs that might lead to better pitch perception in human CI users.
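The proposed link between the NS plateau and the perceptual limit makes a simple testable prediction, which can be caricatured in a few lines. This sketch is entirely our own: the saturating-exponential form and all numbers except the ∼257 Hz plateau reported by Johnson et al. (2017) are invented. The idea is that rate-pitch information is only available where the firing-rate curve still has appreciable slope.

```python
import numpy as np

def ns_rate(pulse_rate_hz, plateau_hz=257.0, max_rate=30.0):
    """Cartoon NS firing rate vs CI pulse rate: grows, then saturates near
    the plateau. The functional form and constants are illustrative only."""
    return max_rate * (1 - np.exp(-pulse_rate_hz / (plateau_hz / 3)))

rates = np.array([50, 100, 200, 257, 400, 800], dtype=float)
firing = ns_rate(rates)

# Below the plateau, different pulse rates evoke clearly different firing
# (a readable rate code); above it the curve is nearly flat, so two high
# pulse rates become indistinguishable from the NS population's output.
print(firing[1] - firing[0])     # large: 50 vs 100 Hz are discriminable
print(firing[-1] - firing[-2])   # small: 400 vs 800 Hz are not
```

If asymmetric pulses or multi-electrode stimulation indeed raise the discrimination limit, the prediction is that the point where this curve flattens, the plateau, should shift upward accordingly.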

Until the two above-mentioned issues have been investigated, concluding that auditory cortex encodes temporal information from acoustic and cochlear implant stimulation in a similar way seems premature. Nonetheless, Johnson et al. (2017) provide valuable insight into how the brain processes temporal information from a CI and acoustic sound at the single-neuron level. Indeed, a dialogue between animal research and human psychophysics is needed to optimize the future design of CI processing strategies and eventually to improve the perception of speech and music in human CI users.

Footnotes

  • Editor's Note: These short reviews of recent JNeurosci articles, written exclusively by students or postdoctoral fellows, summarize the important findings of the paper and provide additional insight and commentary. If the authors of the highlighted article have written a response to the Journal Club, the response can be found by viewing the Journal Club at www.jneurosci.org. For more information on the format, review process, and purpose of Journal Club articles, please see http://jneurosci.org/content/preparing-manuscript#journalclub.

  • We thank Eva Meier Carlsen and Jean-Marc Edeline for critically reading the manuscript and providing helpful comments.

  • Correspondence should be addressed to Charlotte Amalie Navntoft, Department of Biomedicine, Basel University, Klingelbergstrasse 50–70, Room 7014, 4056 Basel, Switzerland. charlotte.navntoft@unibas.ch

References

  1. Bendor D (2015) The role of inhibition in a computational model of an auditory cortical neuron during the encoding of temporal information. PLoS Comput Biol 11:e1004197. doi:10.1371/journal.pcbi.1004197 pmid:25879843
  2. Bendor D, Wang X (2007) Differential neural coding of acoustic flutter within primate auditory cortex. Nat Neurosci 10:763–771. doi:10.1038/nn1888 pmid:17468752
  3. de la Mothe LA, Blumell S, Kajikawa Y, Hackett TA (2006) Cortical connections of the auditory cortex in marmoset monkeys: core and medial belt regions. J Comp Neurol 496:27–71. doi:10.1002/cne.20923 pmid:16528722
  4. Gao L, Kostlan K, Wang Y, Wang X (2016) Distinct subthreshold mechanisms underlying rate-coding principles in primate auditory cortex. Neuron 91:905–919. doi:10.1016/j.neuron.2016.07.004 pmid:27478016
  5. Johnson LA, Della Santina CC, Wang X (2016) Selective neuronal activation by cochlear implant stimulation in auditory cortex of awake primate. J Neurosci 36:12468–12484. doi:10.1523/JNEUROSCI.1699-16.2016 pmid:27927962
  6. Johnson LA, Della Santina CC, Wang X (2017) Representations of time-varying cochlear implant stimulation in auditory cortex of awake marmosets (Callithrix jacchus). J Neurosci 37:7008–7022. doi:10.1523/JNEUROSCI.0093-17.2017 pmid:28634306
  7. Joris PX, Schreiner CE, Rees A (2004) Neural processing of amplitude-modulated sounds. Physiol Rev 84:541–577. doi:10.1152/physrev.00029.2003 pmid:15044682
  8. Lu T, Liang L, Wang X (2001) Temporal and rate representations of time-varying signals in the auditory cortex of awake primates. Nat Neurosci 4:1131–1138. doi:10.1038/nn737 pmid:11593234
  9. Macherey O, Deeks JM, Carlyon RP (2011) Extending the limits of place and temporal pitch perception in cochlear implant users. J Assoc Res Otolaryngol 12:233–251. doi:10.1007/s10162-010-0248-x pmid:21116672
  10. McKay CM, McDermott HJ, Clark GM (1994) Pitch percepts associated with amplitude-modulated current pulse trains in cochlear implantees. J Acoust Soc Am 96:2664–2673. doi:10.1121/1.411377 pmid:7983272
  11. Middlebrooks JC (2008) Auditory cortex phase locking to amplitude-modulated cochlear implant pulse trains. J Neurophysiol 100:76–91. doi:10.1152/jn.01109.2007 pmid:18367697
  12. Middlebrooks JC, Snyder RL (2010) Selective electrical stimulation of the auditory nerve activates a pathway specialized for high temporal acuity. J Neurosci 30:1937–1946. doi:10.1523/JNEUROSCI.4949-09.2010 pmid:20130202
  13. Nagarajan SS, Blake DT, Wright BA, Byl N, Merzenich MM (1998) Practice-related improvements in somatosensory interval discrimination are temporally specific but generalize across skin location, hemisphere, and modality. J Neurosci 18:1559–1570. pmid:9454861
  14. Phillips EA, Hasenstaub AR (2016) Asymmetric effects of activating and inactivating cortical interneurons. Elife 5:e18383. doi:10.7554/eLife.18383 pmid:27719761
  15. Schreiner CE, Urbas JV (1988) Representation of amplitude modulation in the auditory cortex of the cat. II. Comparison between cortical fields. Hear Res 32:49–63. pmid:3350774
  16. Song X, Osmanski MS, Guo Y, Wang X (2016) Complex pitch perception mechanisms are shared by humans and a New World monkey. Proc Natl Acad Sci U S A 113:781–786. doi:10.1073/pnas.1516120113 pmid:26712015
  17. Venter PJ, Hanekom JJ (2014) Is there a fundamental 300 Hz limit to pulse rate discrimination in cochlear implants? J Assoc Res Otolaryngol 15:849–866. doi:10.1007/s10162-014-0468-6 pmid:24942704
  18. Wehr M, Zador AM (2005) Synaptic mechanisms of forward suppression in rat auditory cortex. Neuron 47:437–445. doi:10.1016/j.neuron.2005.06.009 pmid:16055066
Copyright © 2023 by the Society for Neuroscience.
JNeurosci Online ISSN: 1529-2401
