Journal Club

Delving Deep into Crossmodal Integration

N. Alex Cayco-Gajic and Yann Sweeney
Journal of Neuroscience 18 July 2018, 38 (29) 6442-6444; DOI: https://doi.org/10.1523/JNEUROSCI.0988-18.2018
N. Alex Cayco-Gajic, Department of Neuroscience, Physiology and Pharmacology, University College London, London WC1E 6BT, United Kingdom

Yann Sweeney, Department of Bioengineering, Imperial College London, London SW7 2AZ, United Kingdom

“Don't you wonder sometimes

'Bout sound and vision?”

—David Bowie

To create a coherent representation of the world, the brain must consolidate information across different sensory modalities. This process, called multisensory integration, is key for the meaningful perception of objects and experiences (Maddox et al., 2015). Consider, for instance, how disconcerting it is to watch a film in which the audio and video are slightly out of sync. Traditionally, it was believed that information from multiple senses was integrated in higher cortical areas, after being processed independently in primary sensory areas. This view has recently been challenged by anatomical and functional evidence of crossmodal integration at even the earliest stages of cortical processing (Ghazanfar and Schroeder, 2006).

What is the computational advantage of multisensory enhancement in primary sensory cortex? Recent imaging studies in mouse visual cortex have shown that concurrent auditory stimuli can enhance visual coding by sharpening tuning and modulating firing rates (Ibrahim et al., 2016; Meijer et al., 2017). Moreover, activating auditory and somatosensory cortices elicits similar responses in visual cortex, indicating that the mechanism behind crossmodal integration may be broadly similar across nonvisual modalities (Iurilli et al., 2012). There is also considerable evidence of visual and somaesthetic modulation of activity in auditory cortex (for review, see Ghazanfar and Schroeder, 2006). In this case, however, several basic questions remain unanswered: which nonauditory features are represented in auditory cortical neurons, how is that information integrated into local cortical circuits, and what effect does this have on auditory processing? To address these questions, and to understand the functional role of crossmodal integration more generally, further interrogation of the circuit mechanism is needed. In a recent article in The Journal of Neuroscience, Morrill and Hasenstaub (2018) took a step toward answering these questions by probing the laminar dependence of visual responsiveness in auditory cortex.

Morrill and Hasenstaub (2018) recorded extracellularly from the auditory cortex of awake mice while presenting either auditory (tone) or visual (flash) stimuli. They observed visually evoked increases in firing rate in 58% of recordings, in both primary and secondary cortical areas (as identified by frequency tuning and auditory response latencies). The use of laminar probes allowed the authors to isolate the effect in different layers, revealing that the vast majority of visual responses occurred in infragranular layers, with minimal responses in L1–L4.

These findings are timely, as they allow direct comparison with several recent experiments that, by contrast, investigated auditory responses in visual cortex. This comparison reveals a functional asymmetry in audiovisual integration between visual and auditory areas. In mouse primary visual cortex, tones and bursts of noise elicit responses in supragranular as well as infragranular layers (but not in L4; Iurilli et al., 2012). In particular, up to 10% of L2/3 neurons respond to tone presentation alone (Meijer et al., 2017). Conversely, Morrill and Hasenstaub (2018) report that <1% of multiunits in L2/3 of auditory cortex were visually responsive.

The strong functional asymmetry between visual and auditory cortex likely stems from a difference in crossmodal input. To understand the source of this difference, it is necessary to identify the main pathways by which visual information reaches auditory cortex. There are three candidate pathways: top-down connections from higher-order multisensory areas, connections from the thalamus (either visual or multisensory regions), and lateral connections from visual cortex. In rodents, anatomical connections have been observed for all three of these candidates (Ghazanfar and Schroeder, 2006; Banks et al., 2011; Tyll et al., 2011). In the case of lateral connections, there is a striking imbalance between auditory and visual corticocortical projections. For example, a recent tracing study in mice found that, despite significant projections from primary auditory cortex to primary visual cortex, projections in the reverse direction were absent (Ibrahim et al., 2016). Instead, auditory cortex receives input from secondary visual cortex (Banks et al., 2011). Even accounting for these areas, however, an overall asymmetry is apparent. A quick calculation from the Allen Mouse Brain Connectivity Atlas reveals that auditory cortical regions send a greater fraction (by an order of magnitude) of their outgoing projections to visual regions than the converse (Oh et al., 2014). Moreover, the timing of visual responses in auditory cortex is too slow to implicate direct projections from early visual cortex as the predominant channel for visual information. For example, Iurilli et al. (2012) reported that activation of auditory cortex elicited responses in visual cortex with a latency of 6 ms. In contrast, Morrill and Hasenstaub (2018) measured visually evoked response latencies of multiunits in both auditory cortex (90 ms) and visual cortex (40 ms). This 50 ms lag is considerably longer than expected for a monosynaptic connection, suggesting that visual information may come primarily from multisensory thalamic or higher cortical inputs, at least in mice.
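The "quick calculation" described above is, at its core, a comparison of normalized outgoing projection fractions rather than raw projection strengths. As a minimal sketch, the snippet below illustrates that comparison with entirely hypothetical placeholder numbers (they are not values from the Allen Mouse Brain Connectivity Atlas; the region labels and the order-of-magnitude ratio are chosen only to mirror the asymmetry the text describes):

```python
# Sketch of a projection-fraction comparison between cortical regions.
# All projection strengths below are HYPOTHETICAL placeholders, not
# values taken from the Allen Mouse Brain Connectivity Atlas.

# projection_strength[source][target] = summed projection strength
projection_strength = {
    "AUD": {"VIS": 50.0, "AUD": 800.0, "OTHER": 150.0},
    "VIS": {"AUD": 5.0, "VIS": 900.0, "OTHER": 95.0},
}

def outgoing_fraction(source: str, target: str) -> float:
    """Fraction of `source`'s total outgoing projection strength
    that terminates in `target`."""
    outgoing = projection_strength[source]
    return outgoing[target] / sum(outgoing.values())

aud_to_vis = outgoing_fraction("AUD", "VIS")  # 50 / 1000 = 0.05
vis_to_aud = outgoing_fraction("VIS", "AUD")  # 5 / 1000 = 0.005

# With these placeholder values the asymmetry is an order of magnitude.
print(f"AUD -> VIS fraction: {aud_to_vis:.3f}")
print(f"VIS -> AUD fraction: {vis_to_aud:.3f}")
print(f"asymmetry ratio: {aud_to_vis / vis_to_aud:.1f}x")
```

With real atlas data, the dictionary would be populated from the measured projection strengths of each source region; the point of the sketch is only that the asymmetry is a ratio of normalized outgoing fractions, so it is insensitive to overall differences in how strongly each region projects.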

To determine what kind of visual information is integrated in auditory cortex, Morrill and Hasenstaub (2018) repeated their recordings in auditory cortex while presenting drifting gratings of varying orientation. Visually responsive single units were significantly less orientation selective than units in visual cortex, suggesting that these units primarily signal the timing and presence of a visual stimulus rather than specific visual features. This finding supports the idea that timing is particularly important for crossmodal integration. Indeed, recent studies have demonstrated that temporally congruent auditory and visual stimuli (i.e., having the same temporal frequency) preferentially modulate activity in both ferret auditory cortex (Atilgan et al., 2018) and mouse visual cortex (Meijer et al., 2017) compared with incongruent stimuli. Furthermore, it has recently been demonstrated that projections from auditory cortex to primary visual cortex are dominated by neurons that encode the abrupt onset of sounds (Deneux et al., 2018). However, these recent studies contrast with classic electrophysiological studies, which found evidence of precise frequency and spatial information about auditory stimuli in the visual cortex of cats (Spinelli et al., 1968; Fishman and Michael, 1973). One explanation for this disparity may be that cats have more advanced visual processing than rodents. Another possibility is that visual responses in mouse auditory cortex contain information about more complex visual stimuli than the gratings tested by Morrill and Hasenstaub (2018). This may be expected, considering that mouse auditory cortex receives direct projections from secondary visual cortex (Banks et al., 2011). However, what visual features these regions represent in mice remains unknown (Glickfeld and Olsen, 2017).

Finally, a key result of Morrill and Hasenstaub (2018) is that visual information in auditory cortex was found almost exclusively in infragranular layers, especially L6. This finding shines a light on the mysterious role of deep-layer neurons. Compared with their more superficial counterparts, less is known about how L6 neurons contribute to sensory processing, due in part to the technical difficulty of accessing deep layers, as well as to the heterogeneous morphologies and unusual response properties of these neurons. Previous work in primary auditory cortex of rats (Sakata and Harris, 2009; Zhou et al., 2010) and cats (Atencio et al., 2009) found that L6 pyramidal cells are less feature selective than cells in superficial layers, exhibiting complex receptive fields that convey little stimulus information. These properties have made it difficult to understand the role of L6 neurons in representing auditory stimuli. Although there are likely to be cross-species differences, the findings of Morrill and Hasenstaub (2018) may explain these results by pointing to a more complex role for L6 beyond unimodal auditory processing.

The discovery of a subpopulation of visually responsive cells in L6 suggests that this layer may serve as a gateway for contextual information from other modalities. Two recent studies in primary visual cortex (V1) further support this hypothesis. Vélez-Fort et al. (2018) found that L6 pyramidal cells could convey head velocity signals inherited via a direct connection from retrosplenial cortex. Similarly, Leinweber et al. (2017) found that L6 received predictive signals about expected visual flow from motor cortex. Morrill and Hasenstaub (2018) complement these studies by showing that audiovisual integration also takes place in L6 of auditory cortex. Crossmodal integration in L6 could therefore be used to control auditory processing based on nonauditory contextual signals, as L6 of visual cortex has previously been shown to perform gain control on superficial populations without changing their preferred orientation (Olsen et al., 2012). More recently, it has been shown that optogenetic activation of L6 of auditory cortex modulates auditory tuning, and that this can control a tradeoff between sound detection and discrimination performance (Guo et al., 2017). Importantly, this behavioral enhancement was highly dependent on the timing between sensory stimulation and L6 spiking. Combined with the results of Morrill and Hasenstaub (2018), this suggests that visual timing information in L6 may enhance auditory processing. Intriguingly, these studies all target the same Ntsr1-Cre transgenic mouse line, in which Cre expression is limited to L6 corticothalamic neurons (Sundberg et al., 2018). Together, these findings suggest that a population of L6 pyramidal cells plays a crucial role in modulating early sensory processing to generate coherent sensory representations.

The recent burst of work on multisensory enhancement in sensory cortex and on L6 pyramidal cells makes this an exciting time for unraveling the circuit mechanisms underlying crossmodal integration. Morrill and Hasenstaub (2018) have made a key contribution by revealing the laminar specificity of visual information in auditory cortex. Future studies could help to tease apart the circuit mechanisms of crossmodal integration even further; for example, by dissecting the role of local deep-layer inhibitory circuits. New techniques for large-scale characterization of long-range projections will also clarify how crossmodal information is transmitted between regions (Han et al., 2018). Another open question is whether crossmodal signals can be enhanced by multimodal behavioral tasks. In particular, it would be valuable to investigate whether animals trained to detect specific audiovisual combinations develop tuned visual responses in auditory cortex. Evidence from sensory deprivation experiments hints that such a substrate for expressing crossmodal plasticity exists (Bavelier and Neville, 2002). Ultimately, determining the circuit mechanisms behind crossmodal integration will lead neuroscience further toward understanding naturalistic behavior in dynamic, multisensory environments.

Footnotes

  • Editor's Note: These short reviews of recent JNeurosci articles, written exclusively by students or postdoctoral fellows, summarize the important findings of the paper and provide additional insight and commentary. If the authors of the highlighted article have written a response to the Journal Club, the response can be found by viewing the Journal Club at www.jneurosci.org. For more information on the format, review process, and purpose of Journal Club articles, please see http://jneurosci.org/content/preparing-manuscript#journalclub.

  • The authors declare no competing financial interests.

  • Correspondence should be addressed to Dr. N. Alex Cayco-Gajic, Department of Neuroscience, Physiology and Pharmacology, University College London, Gower St, London WC1E 6BT, United Kingdom. natasha.gajic@ucl.ac.uk

This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.

References

  1. Atencio CA, Sharpee TO, Schreiner CE (2009) Hierarchical computation in the canonical auditory cortical circuit. Proc Natl Acad Sci U S A 106:21894–21899. doi:10.1073/pnas.0908383106 pmid:19918079
  2. Atilgan H, Town SM, Wood KC, Jones GP, Maddox RK, Lee AKC, Bizley JK (2018) Integration of visual information in auditory cortex promotes auditory scene analysis through multisensory binding. Neuron 97:640–655.e4. doi:10.1016/j.neuron.2017.12.034 pmid:29395914
  3. Banks MI, Uhlrich DJ, Smith PH, Krause BM, Manning KA (2011) Descending projections from extrastriate visual cortex modulate responses of cells in primary auditory cortex. Cereb Cortex 21:2620–2638. doi:10.1093/cercor/bhr048 pmid:21471557
  4. Bavelier D, Neville HJ (2002) Cross-modal plasticity: where and how? Nat Rev Neurosci 3:443–452. doi:10.1038/nrn848 pmid:12042879
  5. Deneux T, Kempf A, Bathellier B (2018) Context-dependent signaling of coincident auditory and visual events in primary visual cortex. bioRxiv. Advance online publication. Retrieved April 4, 2018. doi:10.1101/258970
  6. Fishman MC, Michael P (1973) Integration of auditory information in the cat visual cortex. Vision Res 13:1415–1419. doi:10.1016/0042-6989(73)90002-3 pmid:4719075
  7. Ghazanfar AA, Schroeder CE (2006) Is neocortex essentially multisensory? Trends Cogn Sci 10:278–285. doi:10.1016/j.tics.2006.04.008 pmid:16713325
  8. Glickfeld LL, Olsen SR (2017) Higher-order areas of the mouse visual cortex. Annu Rev Vis Sci 3:251–273. doi:10.1146/annurev-vision-102016-061331 pmid:28746815
  9. Guo W, Clause AR, Barth-Maron A, Polley DB (2017) A corticothalamic circuit for dynamic switching between feature detection and discrimination. Neuron 95:180–194.e5. doi:10.1016/j.neuron.2017.05.019 pmid:28625486
  10. Han Y, Kebschull JM, Campbell RA, Cowan D, Imhof F, Zador AM, Mrsic-Flogel TD (2018) The logic of single-cell projections from visual cortex. Nature 556:51–56. doi:10.1038/nature26159 pmid:29590093
  11. Ibrahim L, Mesik L, Ji XY, Fang Q, Li HF, Li YT, Zingg B, Zhang LI, Tao HW (2016) Cross-modality sharpening of visual cortical processing through layer-1-mediated inhibition and disinhibition. Neuron 89:1031–1045. doi:10.1016/j.neuron.2016.01.027 pmid:26898778
  12. Iurilli G, Ghezzi D, Olcese U, Lassi G, Nazzaro C, Tonini R, Tucci V, Benfenati F, Medini P (2012) Sound-driven synaptic inhibition in primary visual cortex. Neuron 73:814–828. doi:10.1016/j.neuron.2011.12.026 pmid:22365553
  13. Leinweber M, Ward DR, Sobczak JM, Attinger A, Keller GB (2017) A sensorimotor circuit in mouse cortex for visual flow predictions. Neuron 96:1204. doi:10.1016/j.neuron.2017.11.009 pmid:29216453
  14. Maddox RK, Atilgan H, Bizley JK, Lee AK (2015) Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners. eLife 4:e04995. doi:10.7554/eLife.04995 pmid:25654748
  15. Meijer GT, Montijn JS, Pennartz CM, Lansink CS (2017) Audiovisual modulation in mouse primary visual cortex depends on cross-modal stimulus configuration and congruency. J Neurosci 37:8783–8796. doi:10.1523/JNEUROSCI.0468-17.2017 pmid:28821672
  16. Morrill RJ, Hasenstaub AR (2018) Visual information present in infragranular layers of mouse auditory cortex. J Neurosci 38:2854–2862. doi:10.1523/JNEUROSCI.3102-17.2018 pmid:29440554
  17. Oh SW, Harris JA, Ng L, Winslow B, Cain N, Mihalas S, Wang Q, Lau C, Kuan L, Henry AM, Mortrud MT, Ouellette B, Nguyen TN, Sorensen SA, Slaughterbeck CR, Wakeman W, Li Y, Feng D, Ho A, Nicholas E, et al. (2014) A mesoscale connectome of the mouse brain. Nature 508:207–214. doi:10.1038/nature13186 pmid:24695228
  18. Olsen SR, Bortone DS, Adesnik H, Scanziani M (2012) Gain control by layer six in cortical circuits of vision. Nature 483:47–52. doi:10.1038/nature10835 pmid:22367547
  19. Sakata S, Harris KD (2009) Laminar structure of spontaneous and sensory-evoked population activity in auditory cortex. Neuron 64:404–418. doi:10.1016/j.neuron.2009.09.020 pmid:19914188
  20. Spinelli DN, Starr A, Barrett TW (1968) Auditory specificity in unit recordings from cat's visual cortex. Exp Neurol 22:75–84. doi:10.1016/0014-4886(68)90020-4 pmid:5687686
  21. Sundberg SC, Lindström SH, Sanchez GM, Granseth B (2018) Cre-expressing neurons in visual cortex of Ntsr1-cre GN220 mice are corticothalamic and are depolarized by acetylcholine. J Comp Neurol 526:120–132. doi:10.1002/cne.24323 pmid:28884467
  22. Tyll S, Budinger E, Noesselt T (2011) Thalamic influences on multisensory integration. Commun Integr Biol 4:378–381. doi:10.4161/cib.15222 pmid:21966551
  23. Vélez-Fort M, Bracey EF, Keshavarzi S, Rousseau CV, Cossell L, Lenzi SC, Strom M, Margrie TW (2018) A circuit for integration of head- and visual-motion signals in layer 6 of mouse primary visual cortex. Neuron 98:179–191.e6. doi:10.1016/j.neuron.2018.02.023 pmid:29551490
  24. Zhou Y, Liu BH, Wu GK, Kim YJ, Xiao Z, Tao HW, Zhang LI (2010) Preceding inhibition silences layer 6 neurons in auditory cortex. Neuron 65:706–717. doi:10.1016/j.neuron.2010.02.021 pmid:20223205