Research Articles, Systems/Circuits

Learning-Induced Odor Modulation of Neuronal Activity in Auditory Cortex

Omri David Gilday1 and Adi Mizrahi1,2
Journal of Neuroscience 22 February 2023, 43 (8) 1375-1386; DOI: https://doi.org/10.1523/JNEUROSCI.1398-22.2022

1The Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
2Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem 91904, Israel

Abstract

Sensory cortices, even of primary regions, are not purely unisensory. Rather, cortical neurons in sensory cortex show various forms of multisensory interactions. While some multisensory combinations naturally co-occur, others come to co-occur through experience. In real life, learning and experience forge conjunctions between seemingly disparate sensory information that ultimately becomes behaviorally relevant, impacting perception, cognition, and action. Here we describe a novel auditory discrimination task in mice, designed to manipulate the expectation of upcoming trials using olfactory cues. We show that, after learning, female mice display a transient period of several days during which they exploit odor-mediated expectations for making correct decisions. Using two-photon calcium imaging of single neurons in auditory cortex (ACx) during behavior, we found that the behavioral effects of odor-mediated expectations are accompanied by an odor-induced modulation of neuronal activity. Further, we find that these effects are manifested differentially, based on the response preference of individual cells. A significant portion of the effects, but not all, are consistent with a predictive coding framework. Our data show that learning novel odor–sound associations evokes changes in ACx. We suggest that behaviorally relevant multisensory environments mediate contextual effects as early as ACx.

SIGNIFICANCE STATEMENT Natural environments are composed of multisensory objects. It remains unclear whether and how animals learn the regularities of congruent multisensory associations and how these may impact behavior and neural activity. We tested how learned odor–sound associations affected single-neuron responses in auditory cortex. We introduce a novel auditory discrimination task for mice in which odors set different contexts of expectation for upcoming trials. We show that, although the task can be solved purely by sounds, odor-mediated expectation impacts performance. We further show that odors cause a modulation of neuronal activity in auditory cortex, which is correlated with behavior. These results suggest that learning prompts an interaction of odor and sound information as early as sensory cortex.

  • auditory cortex
  • behavior
  • expectation
  • multisensory
  • two-photon

Introduction

Natural environments are inherently multisensory. Through continuous exposure to natural scenes, our senses are constantly bombarded by stimuli from several modalities. Through repeated exposure or learning, multisensory information becomes behaviorally meaningful, whereby one stimulus provides contextual information about another. It is, therefore, not surprising that nervous systems of creatures as simple as insects and as complex as humans are endowed with mechanisms that integrate information from several senses, and are well adapted to perceive multisensory scenes (Stein and Meredith, 1993; Leonard and Masek, 2014; Currier and Nagel, 2020). How neural circuits compute and integrate information from the various senses in support of behavior remains a fundamental, yet underexplored, topic in neuroscience.

Multisensory (or cross-modal) interactions are evident in our everyday perception. Taste, for example, emerges from the synthesis of gustatory, olfactory, tactile, and visual cues (Spence, 2013), and speech perception is highly influenced by concomitantly observing lip movements (McGurk and MacDonald, 1976). Moreover, the benefits of cross-modal interactions extend beyond proper perception of multisensory objects. In some cases, associative information from one modality might generate context and thereby modulate perception by changing the way an animal attends to, expects, and interprets information from other modalities. Odors, for example, are a prime case for such modulation because olfactory signals can provide strong contextual cues about the current state of the environment. For example, scent markings are used as social communication signals to mark a territory or provide a signature of social dominance (Arakawa et al., 2008). An animal passing through an environment sprayed with the smell of a predator will become differentially attentive to environmental sounds to avoid predation. Similarly, conspecific urine smell conveys information on reproductive state and has been shown to impact reproductive success (Thonhauser et al., 2013; Coombes et al., 2018). Odors are potent contextual cues likely because they remain in the environment for relatively long periods of time (Doty, 1986), and because the activation of olfactory circuits is directly linked to the limbic system (Sokolowski and Corbin, 2012). In humans, odors have also been shown to induce context-dependent effects on long-term memory (Willander and Larsson, 2007).

Odors have been shown to have contextual effects that impact the auditory modality. Parental behavior toward newborns is one such example. In rodents, pup odors play a role in a multisensory behavior called “pup retrieval.” Pups emit ultrasonic vocalizations (USVs) when they are isolated from the nest. As a result, the mother retrieves the pups back to the nest (Ehret, 2005; Elyada and Mizrahi, 2015). This behavior is multisensory in nature. When pup vocalizations are synthetically played from a speaker, mothers approach the speaker primarily when it is in the presence of pup odor, suggesting that both odors and sounds are important for this behavior (Okabe et al., 2013). Moreover, responses of single neurons to USVs and other sounds in the mouse auditory cortex (ACx) have been shown to be modulated by the presence of pup odors (Cohen et al., 2011). Predator odors, too, have been shown to modulate neuronal responses to auditory stimuli in mice (Halene et al., 2009).

The above-mentioned examples of odors acting as context for other senses remain underexplored. Moreover, they are often restricted to behaviors and/or odors with innate valence, or to odors that naturally co-occur with other sensory stimuli as multisensory objects. It remains unclear whether and how animals learn the statistical dependencies of odors and sounds in the environment and how such associations may impact behavior. Even less is known about how odors change the underlying neural activity of the relevant circuits. To that end, we designed a sound discrimination task using odors as contextual expectation cues for trial identity. We hypothesized that different behavioral contexts associated with odors would modulate behavior and neuronal activity in the first cortical station of sound processing, the ACx. Our results show that learned odor–sound associations indeed modulate behavioral choices, which are correlated with changes in single-neuron responses in ACx. The nature of the modulation varied among neurons and was partially consistent with the theory of predictive coding.

Materials and Methods

Surgical procedures and imaging.

All experimental procedures were conducted in accordance with the Hebrew University Animal Care and Use Committee. We used a virus expressing a calcium indicator in the cytoplasm and a red fluorescent protein in the nucleus (AAV9-hsyn-GCaMP6s-P2A-nls-dTomato) that was produced at The Edmond and Lily Safra Center for Brain Sciences virus core facility (https://elsc.huji.ac.il/research-and-facilities/expertise-centers/elsc-vector-core-facility/) and was assessed at a titer of 10^12. The virus (200 nl) was injected using a NanoJect 2 into the ACx of the left hemisphere of female juvenile BALB/c mice [age range, postnatal day 21 (P21) to P24]. The injection site was sealed with bone wax. During the same procedure, a head bar was fixed to the top of the skull using dental cement. A 3 mm chronic glass window was implanted above the injection site according to published protocols (Goldey et al., 2014) at 21–28 d after virus injection. Both procedures were performed under 2% isoflurane anesthesia. Hair was initially removed from the surgical area using a commercial hair removal cream, and the area was rinsed with rubbing alcohol. Lidocaine was injected under the skin as a local analgesic. Mice were injected subcutaneously with carprofen (4 mg/kg) after each procedure.

Sound presentation.

Pure-tone sound stimuli were 100 ms in duration and presented through a free field speaker (model ES1, TDT) positioned 5 cm from the right ear of the animals. The speaker was driven at a 500 kHz sampling rate via a driver (model ED1, TDT). Sound intensity was calibrated to 75 ± 2 dB SPL for all presented sound frequencies.

Odor presentation.

We used the odors ethyl butyrate, α-pinene, and isoamyl acetate (Sigma-Aldrich) and diluted them with mineral oil to an equal vapor pressure of 10 ppm. These odors have been shown to have neutral valence (Root et al., 2014). Odors were delivered to the snout of the mouse through a custom-built olfactometer at a flow rate of 0.1 L/min using a mass flow controller (Vinograd et al., 2017). Odors were continuously removed from around the headspace by air suction. Odor delivery was calibrated with a photoionization detector (miniPID, Aurora Scientific) to verify that no odor trace remained in subsequent trials. Odors remained present during sound presentation, overlapping the sounds completely.

Intrinsic signal imaging.

To identify the primary auditory cortex, we imaged the brain at low resolution using a PhotonFocus CMOS camera, while directly illuminating its surface with an LED light (wavelength, 617 nm). We played 10 repetitions of tone clouds with 2 s duration and a center frequency of 4, 7, 13, or 24 kHz, consisting of 30 tones with 50 ms duration logarithmically spaced between ±10% of the center frequency.

Two-photon calcium imaging.

We imaged GCaMP6s-labeled neurons in layer 2/3 for 28 training sessions in four mice (of the five mice trained behaviorally) using a custom-built (Flickinger et al., 2010) galvo-mirror scanning two-photon microscope with a frame rate of 7.2 Hz. Two-photon excitation (950 nm) was delivered through a DeepSee femtosecond laser (Mai Tai, SpectraPhysics). Imaging was performed through a water-immersion objective (0.8 numerical aperture; model CF175, Nikon) and detected through GaAsP Photomultiplier Tubes (Hamamatsu). The imaging field size was set to 260 × 260 μm over a 512 × 210 pixel window. We used Scanimage (Pologruto et al., 2003) software for acquisition and online drift correction (using the red channel).

Sound selection.

Imaging fields were first selected based on the optical quality of imaging. Once chosen, for every mouse, we played a series of pure tones of frequencies ranging from 4 to 24 kHz and recorded neuronal responses from that field. Then, we chose two frequencies 0.5–1 octave apart such that a majority of neurons in the field were responsive to one or both frequencies.

Behavioral timeline.

Mice were initially trained on a Go-NoGo task without olfactory cues. Hit trials were rewarded with 5–10 μl of sweetened water (5% sucrose). Reward was delayed for at least 1 s after stimulus onset to allow sufficient neuronal recording time not driven by possible responses to the reward. False alarm (FA) trials were punished by 2 s of white noise, without delay. After mice reached high performance in the task (d′ > 2), either the Go or the NoGo frequency was shifted closer to the other frequency to increase task difficulty. We shifted the frequencies daily while continuing to probe performance. When performance stabilized at a moderate level (1 < d′ < 2; see Behavioral analysis), we considered the task difficult enough that additional information could benefit performance. Only after reaching this stage were odors added. Table 1 shows the final frequencies used as Go and NoGo for each mouse.

Table 1

The final frequencies used as Go and NoGo for each mouse

Mice initially tended to perform worse after the addition of odors and gradually improved. The first session when their performance exceeded a d′ value of 1 was considered the pre-odor bias session. From the next session onward, odor–sound probabilities were changed, as indicated in Table 2.

Table 2

The odor–sound probabilities used in the experiments

We used a mild coupling between odor and sound to prevent mice from ignoring sounds and using odors instead. The identities of the three odors were different for each mouse such that the odor for which the bias was highest on the pre-odor bias session was chosen as odor 3 (predicting NoGo with higher probability). Each mouse went through six odor bias behavioral sessions, not necessarily on consecutive calendar days.
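As a sketch of how such a session might be generated, the conditional probabilities in Table 2 amount to drawing an odor at random and then drawing the sound from p(sound|odor). The following Python snippet is purely illustrative (the authors' behavioral control code is not described in the text); the variable names and the uniform odor draw are our assumptions:

```python
import random

# p(Go | odor) during the odor bias stage (Table 2); odor 2 stays neutral.
# These labels and the uniform odor draw are illustrative assumptions.
P_GO_GIVEN_ODOR = {1: 2 / 3, 2: 1 / 2, 3: 1 / 3}

def draw_trial(rng=random):
    """Draw one (odor, sound) pair with the biased conditional probabilities."""
    odor = rng.choice([1, 2, 3])
    sound = "Go" if rng.random() < P_GO_GIVEN_ODOR[odor] else "NoGo"
    return odor, sound

# e.g., one session of ~404 trials, presented in random order
session = [draw_trial() for _ in range(404)]
```

Over many trials, odor 1 precedes a Go sound about two-thirds of the time and odor 3 about one-third of the time, matching the mild coupling described above.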

Behavioral analysis.

We excluded trials in which mice began licking during odor presentation, before sound onset. A session was stopped when the mouse had licked in <10% of the last 50 trials.

For each behavioral session, we calculated a hit rate (the response probability for Go trials) and a FA rate (the response probability for NoGo trials). To compensate for individual biases, we quantified accuracy using a measure of discriminability from signal detection theory, d′ (Nevin, 1969). d′ is defined as the difference between the normal inverse cumulative distributions of the hit and FA rates, d′ = z(hit) − z(FA). When calculating d′ in different conditions, different subsets of trials were considered as Go and NoGo (and therefore hit and FA rates differed); trials outside each pairing were excluded from that calculation:

  • d′ based on sounds (Fig. 1e): all Go trials versus all NoGo trials.
  • d′ based on odors (Fig. 1e): all trials preceded by odor 1 (considered Go) versus all trials preceded by odor 3 (considered NoGo).
  • d′ for expected trials (Fig. 2d): Go trials preceded by odor 1 versus NoGo trials preceded by odor 3.
  • d′ for neutral trials (Fig. 2d): Go trials preceded by odor 2 versus NoGo trials preceded by odor 2.
  • d′ for unexpected trials (Fig. 2d): Go trials preceded by odor 3 versus NoGo trials preceded by odor 1.

Since d′ is an unbounded measure, changes in d′ between the expected and unexpected conditions were quantified as a change index (d′ CI; Fig. 3e), calculated as follows: d′ CI = (d′E − d′UE) / (d′E + d′UE).
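In code, the d′ and change-index computations described above reduce to a few lines. This is an illustrative Python sketch (the authors used custom MATLAB code); z denotes the inverse normal cumulative distribution function:

```python
from statistics import NormalDist

def z(p):
    """Inverse cumulative distribution function of the standard normal."""
    return NormalDist().inv_cdf(p)

def d_prime(hit_rate, fa_rate):
    """d' = z(hit) - z(FA), the signal detection discriminability measure."""
    return z(hit_rate) - z(fa_rate)

def d_prime_ci(d_expected, d_unexpected):
    """Change index between the expected (E) and unexpected (UE) conditions."""
    return (d_expected - d_unexpected) / (d_expected + d_unexpected)
```

Note that d′ is undefined when the hit or FA rate is exactly 0 or 1; in practice such rates are typically clipped, though the text does not specify the authors' handling of this edge case.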

Figure 1.

Using odors for manipulation of expectation in an auditory discrimination task. a, The full experimental timeline started at P21–P24 and lasted 10–15 weeks. b, d′ values of all five mice before learning, after learning the easy task, and after task difficulty was increased to maximum, just before the pre-odor bias stage (see Materials and Methods). Each shape is data from a different mouse. The red cross is the mean. c, Behavioral setup. d, Trial structure and outcomes for the Go-NoGo task with odors as preceding cues. Odor duration, 500 ms; sound duration, 100 ms; response window, 2 s. e, Odor–sound probabilities for all six combinations. The pre-odor bias and odor bias stages are shown on the top and bottom, respectively. Odors 1, 2, and 3 are labeled blue, gray, and red, respectively. The expectation condition for each odor–sound pair is labeled as expected (E; turquoise, p(sound|odor) = 2/3), neutral (N; gray, p(sound|odor) = 1/2), or unexpected (UE; purple, p(sound|odor) = 1/3). f, A representative example of 30 consecutive trials from one mouse engaged in an odor bias session. Black dots are licks, green dots are rewards, and red dots are punishments. Trial outcome and the expected probability of the stimulus per trial are indicated on the right. g, d′ values (mean ± SEM) for all behavioral sessions across all mice. The data are sorted based on sounds alone (solid line; Go, lick; NoGo, no lick) or based on odors alone (odor 1, lick; odor 3, no lick). d′ for sound alone was significantly higher than for odor alone across all sessions (one-tailed signed-rank test: *p < 0.05, df = 4, es = 6.55, 3.56, 1.4, 1.26, 2.06, 5.08, 3.88).

Figure 2.

Odor cues affect sound discrimination performance. a, Hit rates for the different odors of one example mouse for the pre-odor bias session (left) and one of the odor bias sessions (session 3, right). b, Difference in hit rate between odors 1 and 2 (blue), or odors 3 and 2 (red), for all mice across all sessions (mean ± SEM; one-tailed signed-rank test: n.s., no significant difference; df = 4; es = 0.01, −0.75, 0.48, 0.29, 0.53, −1.09, −0.44). c, FA rates for the same mouse and sessions as in a. d, Same as b but for the difference in FA rates. Starting from odor bias session 2, the FA rate was significantly higher for odor 1 than for odor 3 (one-tailed signed-rank test: *p < 0.05; df = 4; es = 1.08, 0.19, 1.91, 1.88, 3.06, 1.23, 2.5). e, Lick bias for the same mouse and sessions as in a. f, Same as b but for lick bias. On odor bias sessions 2–4, lick bias was significantly higher for odor 1 than for odor 3 (one-tailed signed-rank test: *p < 0.05; df = 4; es = 0.72, 0.8, 1.55, 2.15, 3.9, 0.04, 0.98). Shaded gray area marks "effective biasing sessions." g, d′ for the same mouse and sessions as in a. In the pre-odor bias stage, all odors were neutral, though color codes match their future expectation condition. At the odor bias stage, odor 1 → Go and odor 3 → NoGo became expected (E; turquoise), odor 2 → NoGo and odor 2 → Go remained neutral (N; gray), and odor 1 → NoGo and odor 3 → Go became unexpected (UE; magenta). h, Difference in d′ between either the expected or the unexpected condition and the neutral condition for all mice across all sessions (mean ± SEM). On odor bias sessions 2–4, d′ was significantly higher for the expected condition (one-tailed signed-rank test: *p < 0.05; df = 4; es = 0.72, 0.8, 1.55, 2.15, 3.9, 0.04, 0.98). i, All "effective biasing sessions" and mice for (from left to right) hit rate, FA rate, bias, and d′ for the different expectation conditions (mean ± SEM; one-tailed paired t test with Bonferroni's correction: *p < 0.05, **p < 0.005, ***p < 0.0005; df = 14; es(Hit rate 1,2) = 0.42, es(Hit rate 2,3) = 0.23, es(Hit rate 1,3) = 0.65, es(FA 1,2) = 1.13, es(FA 2,3) = 1.16, es(FA 1,3) = 2.28, es(bias 1,2) = 1.58, es(bias 2,3) = 0.71, es(bias 1,3) = 2.27, es(d′ 1,2) = 1.58, es(d′ 2,3) = 0.71, es(d′ 1,3) = 2.27).

Figure 3.

Two-photon calcium imaging of the ACx of mice engaged in the task. a, Experimental setup. ISI, intrinsic signal imaging; 2P, two-photon calcium imaging. b, Top, Image of a blood vessel map from the chronic window. Bottom, Mean intrinsic signal response to a 4 kHz tone cloud. The dotted line marks the rough boundary of ACx in this mouse. c, Example 2P micrograph of a representative neuronal field of view. d, Mean calcium response traces (shaded area is the SEM) of 30 example neurons (rows) from the neuronal field shown in c to Go (left column) and NoGo (right column) following odor 1 (blue), odor 2 (gray), and odor 3 (red). Calibration: 0.1 ΔF/F; 1 s. Gray vertical lines indicate the time of sound presentation. The table on the top shows the odor–sound combination of each column. e, Mean calcium response traces (shaded area, SEM) of an example neuron with a stable response profile to Go (left column) and NoGo (right column) following odor 1 (blue), odor 2 (gray), and odor 3 (red) throughout all behavioral sessions. Y-scale, 0.1 ΔF/F. Green rectangle indicates odor presentation (duration, 500 ms). Black/white rectangle indicates Go/NoGo presentation, respectively (duration, 100 ms). Asterisks indicate a statistically significant response. f, Same as e for a neuron with an unstable response profile.

We used an additional measure from signal detection theory, the lick bias (Fig. 2e). Bias is the general tendency to respond, independent of d′, and is defined as bias = z(hit) + z(FA). Bias was calculated separately for the Go and NoGo trials of each odor, ignoring trials of other odors.
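A minimal sketch of the lick-bias measure in Python (illustrative, not the authors' MATLAB code):

```python
from statistics import NormalDist

def z(p):
    """Inverse cumulative distribution function of the standard normal."""
    return NormalDist().inv_cdf(p)

def lick_bias(hit_rate, fa_rate):
    """bias = z(hit) + z(FA): overall tendency to lick, independent of d'."""
    return z(hit_rate) + z(fa_rate)
```

A bias of 0 indicates no overall tendency (e.g., hit and FA rates both at 0.5); positive values indicate a tendency to lick regardless of the sound.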

Calcium imaging analysis.

Image stacks were corrected online for motion using the red channel as a reference. Regions of interest (ROIs) were selected manually for each cell in each session. Raw fluorescence time series F(t) were obtained for each cell by averaging across pixels within each ROI. Baseline fluorescence F0 was computed by taking the mean F(t) before each stimulus. The change in fluorescence relative to baseline, ΔF/F0, was computed by taking the difference between F and F0 and dividing it by F0. A neuronal response to a single trial was calculated as the average of the ΔF/F0 of that neuron, 1 s following sound onset. To avoid analyzing responses that follow a potential change in fluorescence resulting from premotor activity or any other non-task-related input, trials in which the relative SD of F0 was >10% were excluded. Neuron–session pairs that responded to five trials or less of some odor–sound pair were excluded from the analysis. A minority (22% of responsive neuron–session pairs) of neuron–session pairs responded with a decrease in fluorescence to some odor–sound pairs. These responses were excluded from the analysis of this work.
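The per-trial response computation above can be sketched as follows. This is an illustrative Python version (the authors' analysis was in MATLAB); the frame rate follows the text (7.2 Hz, so ~7 frames cover the 1 s response window), but the length of the baseline window and all names are our assumptions:

```python
FRAME_RATE_HZ = 7.2  # two-photon frame rate reported in the text

def trial_response(f_trace, sound_onset, baseline_frames=7):
    """Mean dF/F0 over ~1 s after sound onset.

    F0 is the mean fluorescence over the frames preceding the stimulus; the
    7-frame (~1 s) baseline default is our assumption, as the exact baseline
    window is not specified in the text.
    """
    baseline = f_trace[sound_onset - baseline_frames:sound_onset]
    f0 = sum(baseline) / len(baseline)
    window = round(FRAME_RATE_HZ)  # ~1 s of frames at 7.2 Hz
    post = f_trace[sound_onset:sound_onset + window]
    return sum((f - f0) / f0 for f in post) / len(post)
```

For a trace that sits at a baseline of 1.0 and steps to 1.2 at sound onset, the computed response is a ΔF/F0 of 0.2.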

Responsive neurons were classified as neurons with responses significantly >0 (signed-rank test, p < 0.05) to at least one odor–sound pair. Analysis was restricted to responsive neuron–session pairs and correct trials (Hit, CR), since Miss trials were scarce and neuronal responses on FA trials might be contaminated by the response to the noise punishment. Neuron–session pairs were classified as Go-preferring if their strongest response was to an odor–Go pair, and as NoGo-preferring if their strongest response was to an odor–NoGo pair. Of note, single-neuron responses to sounds were variable across sessions, such that neurons could even be classified as responsive in one session and unresponsive in another (see Fig. 3e,f for examples). Therefore, we avoided between-session comparisons of the same neurons and treated different sessions separately. In the analyses shown in Figures 4d and 5, in which we pooled all effective bias sessions together, the responses of neurons that were responsive in more than one session were taken only from the first session in which they responded in each population. This was done to avoid multiple sampling of the same neuron.

Figure 4.

Discrimination indices of ACx neurons correlate with behavioral performance. a, Mean calcium response traces (shaded area, SEM) of five example neurons to Go (left column) and NoGo (right column) following odor 1 (blue) and odor 3 (red). Y-scale, 0.1 ΔF/F. Green rectangle indicates odor presentation (500 ms). Black/white rectangle indicates Go/NoGo presentation, respectively (100 ms). DI for expected and unexpected trials is indicated in turquoise and purple text, respectively, for each neuron. b, Top, Mean (±SEM) expected (turquoise) and unexpected (purple) DI of all responsive neurons throughout all behavioral sessions (signed-rank test: **p < 0.005, ***p < 0.0005; df = 218, 233, 209, 234, 175, 180, 127; es = 0.06, 0.1, 0.31, 0.26, 0.23, 0.02, 0.04). Bottom, Mean difference between expected and unexpected DIs (ΔDI) from the top. c, Mean ΔDI over all responsive neurons within a neuronal field of view per session as a function of d′ CI of that session, for all behavioral sessions (Pearson correlation: R = −0.43, p < 0.05). Each dot is a mouse–session pair; red line, linear fit. d, Left, DI of Go-preferring neurons for the three expectation conditions (n = 213) for all “effective odor bias sessions” together (shaded area in ‘b’). Right, Same for NoGo-preferring neurons (n = 153). For each box plot, the central mark indicates the median, and the bottom and top edges of the box indicate the 25th and 75th percentiles, respectively. The whiskers extend to the most extreme data points not considered outliers, and the outliers are plotted individually using the “+” symbol (two-tailed signed-rank test with Bonferroni’s correction: ***p < 0.0005; df(Go) = 212; es(Go E,N) = 0.1, es(Go N,UE) = 0.25, es(Go E, UE) = 0.34; df(NoGo) = 212; es(NoGo E,N) = 0.1, es(NoGo N,UE) = 0.05, es(NoGo E, UE) = 0.14).

Figure 5.

Cortical responses are modulated by odor–cue expectations. a, Top, Mean calcium response traces (shaded area, SEM) of Go-preferring neurons (n = 213) to the six odor–sound stimuli on odor bias sessions 2–4. Y-scale, 0.05 ΔF/F. Green rectangle indicates odor presentation (500 ms). Black/white rectangle indicates Go/NoGo presentation, respectively (100 ms). Bottom, Individual mean responses of all neurons. Black line indicates the mean (two-tailed t test with Bonferroni's correction: *p < 0.05, **p < 0.005, ***p < 0.0005; df = 212; es(Go 1,2) = 0.02, es(Go 2,3) = 0.14, es(Go 1,3) = 0.13, es(NoGo 1,2) = 0.11, es(NoGo 2,3) = 0.18, es(NoGo 1,3) = 0.27). b, Same as a for NoGo-preferring neurons (n = 153; df = 152; es(Go 1,2) = 0.06, es(Go 2,3) = 0.13, es(Go 1,3) = 0.22, es(NoGo 1,2) = 0.23, es(NoGo 2,3) = 0.01, es(NoGo 1,3) = 0.23). c, Same as a for the pre-odor bias session (n = 167; df = 166; es(Go 1,2) = 0.23, es(Go 2,3) = 0.18, es(Go 1,3) = 0.07, es(NoGo 1,2) = 0.05, es(NoGo 2,3) = 0.13, es(NoGo 1,3) = 0.09). d, Same as b for the pre-odor bias session (n = 52; df = 51; es(Go 1,2) = 0.02, es(Go 2,3) = 0.22, es(Go 1,3) = 0.2, es(NoGo 1,2) = 0.02, es(NoGo 2,3) = 0.03, es(NoGo 1,3) = 0.092).

To measure how well neuronal populations could discriminate between Go and NoGo, we calculated a receiver operating characteristic curve between the distributions of the responses of each neuron to these sounds and calculated its area under the curve (AUC). AUC values close to 0.5 indicate low discrimination, whereas values away from 0.5 indicate high discrimination. To compensate for neuronal preferences for either sound, we calculated the discrimination index (DI) as follows: DI = 0.5 + |AUC − 0.5|.

Discrimination indices were calculated separately for Go–NoGo pairs in the expected, neutral, and unexpected conditions.
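The DI computation can be sketched as follows. This is an illustrative Python version (the authors' analysis was in MATLAB), and the rank-based ROC estimator shown here is our assumption about how the AUC was computed:

```python
def roc_auc(go_responses, nogo_responses):
    """Rank-based ROC AUC: the fraction of (Go, NoGo) response pairs in which
    the Go response is larger (ties count half); equivalent to the
    Mann-Whitney U statistic divided by n_go * n_nogo."""
    n_pairs = len(go_responses) * len(nogo_responses)
    score = sum(
        1.0 if g > n else 0.5 if g == n else 0.0
        for g in go_responses
        for n in nogo_responses
    )
    return score / n_pairs

def discrimination_index(go_responses, nogo_responses):
    """DI = 0.5 + |AUC - 0.5|: folds the AUC so that a preference for either
    sound maps to a DI of at least 0.5."""
    auc = roc_auc(go_responses, nogo_responses)
    return 0.5 + abs(auc - 0.5)
```

Folding the AUC around 0.5 means a neuron that responds more strongly to NoGo scores just as high as one that responds more strongly to Go, as the text's compensation for neuronal preference requires.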

Statistical analysis.

All analyses were performed with custom-written MATLAB code. We used either the Wilcoxon signed-rank test or Student's t test, as indicated in the text. When the comparison was hypothesis driven, we used a one-tailed test; otherwise, we used a two-tailed test. A multiple-comparison Bonferroni correction was applied when necessary. Since all tests are paired, degrees of freedom were always calculated as n − 1. Effect size (es) was calculated as Cohen's d [mean(Sample1 − Sample2)/SD(Sample1 − Sample2)].
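For reference, the paired effect-size calculation amounts to the following (illustrative Python, not the authors' MATLAB code; whether the SD of the differences used the sample or population formula is not stated, so the sample SD is our assumption):

```python
def cohens_d_paired(sample1, sample2):
    """Cohen's d for paired samples: mean(s1 - s2) / SD(s1 - s2).

    Uses the sample SD (ddof = 1); this normalization choice is our
    assumption, as the text does not specify it.
    """
    diffs = [a - b for a, b in zip(sample1, sample2)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / var ** 0.5
```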

Data availability.

All data and code used in this study are available from the corresponding author on request or can be downloaded from https://github.com/MizrahiTeam.

Results

A behavioral paradigm to induce odor-mediated expectation

To find out whether learned odor-mediated expectation can modulate sound perception, we began by designing an auditory discrimination task with preceding odors as cues. We trained head-fixed mice on an auditory Go–NoGo discrimination task with one of three possible odors preceding the sounds. Mice were rewarded with a sweet drop of water for licking the spout in response to one sound frequency (Go) and punished with a white noise sound for licking in response to a second sound frequency (NoGo). No-lick trials were neither punished nor rewarded [Fig. 1c,d; see also Materials and Methods (see Fig. 1a,b for the full training protocol)]. To keep mice engaged in a difficult task, we used sounds (pure tones) separated by a frequency difference just above the discrimination threshold of the mouse (Maor et al., 2020; see also Materials and Methods). As odors, we used monomolecular odorants with neutral valence (Root et al., 2014). Sound stimuli, whether Go or NoGo, were initially preceded by one of the odors with equal probabilities (Fig. 1e; p(sound|odor) = 1/2 for all six possible combinations). At this stage, which we called "pre-odor bias," the odors were merely cues signaling the initiation of a new trial. Thus, at this stage, odors did not convey any information about the identity of the upcoming trial (Fig. 1e, pre-odor bias). In the pre-odor bias stage, mice performed the sound discrimination based strictly on the sounds themselves, not the odors (Fig. 1g, pre-odor bias; d′sound_alone = 1.5 ± 0.1; d′odor_alone = 0 ± 0.1; mean ± SEM; n = 5 mice).

To manipulate expectations of trial types, we rearranged the probabilities of specific odor → sound pairs, a stage we called "odor bias." Odor → sound pairs were changed as follows: (1) odor 1 (blue in all figures) was more likely to be followed by a Go sound (p(Go|Odor1) = 2/3, p(NoGo|Odor1) = 1/3); (2) odor 2 (gray in all figures) was followed by either Go or NoGo sounds with equal probabilities (p(Go|Odor2) = p(NoGo|Odor2) = 1/2); and (3) odor 3 (red in all figures) was more likely to be followed by a NoGo sound (p(Go|Odor3) = 1/3, p(NoGo|Odor3) = 2/3; Fig. 1e, odor bias). We measured mouse performance on sound discrimination across six consecutive sessions in which odors were now potentially informative about the identity of an upcoming trial. In the odor bias stage, two of the odor → sound pairs were expected, two pairs were unexpected, and two remained neutral (Fig. 1e). On average, mice performed 404 ± 18 trials per session (mean ± SEM), allowing us to measure behavioral responses to all six odor → sound combinations, which were presented in random order (Fig. 1f, a representative snapshot of 30 consecutive trials). Despite odors now being informative, mice still used the sound information significantly more than the odors for solving the task (Fig. 1g, odor bias sessions 1–6).

We next asked to what extent the manipulated probabilities of odor → sound pairs changed the perceived expectation of trial type. To answer this question, we assessed mouse performance in the different stages (i.e., pre-odor bias vs odor bias) and compared behavioral responses in expected versus unexpected trials. Notably, by subtracting responses in neutral-odor trials, we balanced out any nonspecific behavioral effects.

First, we analyzed lick rates for Go and NoGo sounds (i.e., Hit rate and FA rate, respectively). Expectation did not affect Hit rates, which remained consistently high both in the pre-odor bias stage and throughout the odor bias sessions (Fig. 2a,b). In contrast, we found clear differences in FA rates contingent on the expectation of a Go trial. Specifically, FA rates increased for odor 1, while they concomitantly decreased for odor 3 (Fig. 2c,d). This change in FA rates was not evident in the first session after switching the odor → sound probabilities, but only from the second session onward (Fig. 2d). This finding suggests that mice became sensitive to the change in odor → sound probabilities only from session 2 of the odor bias stage.

Second, we measured lick bias and sound discriminability (d′). Lick bias is a general measure of the tendency to lick regardless of the sound (Nevin, 1969). Lick bias was higher for odor 1 and lower for odor 3, but only during odor bias sessions 2–4 (Fig. 2e,f). To test the role of expectation in behavioral performance, we compared d′ for expected, neutral, and unexpected Go–NoGo pairs (see Fig. 1e for the precise probabilities). d′ increased when sounds were expected (Fig. 2g, green bars) and decreased when sounds were unexpected (Fig. 2g, purple bars). Here, too, the significant differences were evident during odor bias sessions 2–4 (Fig. 2h). These results show that our manipulation had a strong yet transient effect on behavior, appearing at session 2, maintained for at least three sessions (sessions 2–4), and waning by sessions 5–6.
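The exact d′ and lick-bias formulas used here are given in the paper's Materials and Methods, which are not reproduced in this excerpt; the sketch below uses the conventional signal-detection-theory definitions (d′ = z(Hit) − z(FA), criterion c = −(z(Hit) + z(FA))/2), and the rate-clamping correction for rates of exactly 0 or 1 is our own assumption.

```python
from statistics import NormalDist

Z = NormalDist().inv_cdf  # inverse standard normal CDF, z(p)

def clamp(p, n_trials=100):
    """Avoid infinite z-scores at rates of exactly 0 or 1 by
    shrinking toward 1/(2n) (a common correction)."""
    lo = 1 / (2 * n_trials)
    return min(max(p, lo), 1 - lo)

def d_prime(hit_rate, fa_rate, n_trials=100):
    """Sensitivity: d' = z(Hit) - z(FA)."""
    return Z(clamp(hit_rate, n_trials)) - Z(clamp(fa_rate, n_trials))

def lick_bias(hit_rate, fa_rate, n_trials=100):
    """Criterion c = -(z(Hit) + z(FA)) / 2; more negative values
    indicate a stronger tendency to lick regardless of the sound."""
    return -(Z(clamp(hit_rate, n_trials)) + Z(clamp(fa_rate, n_trials))) / 2
```

Under these definitions, equal Hit and FA rates give d′ = 0, and lowering the FA rate at a fixed Hit rate raises d′, which is why the expectation-driven FA-rate changes described above translate directly into d′ differences.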

In summary, by manipulating the statistics of odor cues that precede specific sounds, we developed a behavioral paradigm that allowed us to test the effects of learned olfactory cues in different contexts during an auditory discrimination task. The behavioral effects were clear, yet transient. The late decline of the effect may reflect mice learning (by session 5) that the odors are in fact unnecessary for solving this task. Since we cannot rule out that mice learned to ignore the odor probabilities by session 5, we focused our analysis only on sessions 2–4 (Fig. 2f,h, shaded gray; collectively referred to herein as "effective biasing sessions"). During the effective biasing sessions, the behavioral effects were monotonic with odor → sound probabilities for FA rate, lick bias, and d′ (Fig. 2i).

Cortical neurons show increased discrimination when sounds are unexpected

To measure how the regularities of odor → sound probabilities modulate neuronal activity in ACx, we used two-photon calcium imaging in mice engaged in the task described above. We injected mice, unilaterally, with AAV9-hSyn-GCaMP6s-P2A-nls-dTomato into the left ACx and prepared them for imaging (see Materials and Methods). We located the rough borders of the ACx using intrinsic signal imaging of the cortical sheet (Fig. 3a,b; see Materials and Methods) and then zoomed in to image single-neuron responses (Fig. 3c). We imaged ACx at depths corresponding to L2/3 (range, 150–350 μm from the pial surface) and successfully imaged four of the five mice described above. We imaged calcium responses of single neurons during behavior both in the pre-odor bias stage and throughout all sessions of the odor bias stage (Movie 1, representative example of raw data in a behaving mouse).

Movie 1.

Two-photon calcium imaging during behavior. Left, Raw imaging data from the mouse shown on the right. Red, dTomato; green, GCaMP6s. Right, Movie of a head-fixed mouse engaged in the behavioral task while being imaged; top right, trial type; bottom left, information about odors, sounds, rewards, and punishments.

Each behavioral session lasted 45–75 min, during which we imaged neuronal responses to all six combinations of odor → sound stimuli. As expected from ACx (Rothschild et al., 2010; Feigin et al., 2021), neuronal responses to sounds were highly heterogeneous (Figs. 3d–f, 4a). Some neurons responded only to the Go sound (Fig. 4a, neuron 4), only to the NoGo sound (Fig. 4a, neuron 3), or to both sounds to different degrees (Fig. 4a, neurons 1, 2, and 5). Moreover, while some neurons responded stably to the same stimulus in every session, others were unstable and responsive to each stimulus only during some sessions. Notably, heterogeneous response patterns across days are consistent with what has been referred to as representational drift (Rule et al., 2019), which has also been shown to be evident in ACx (Aschauer et al., 2022; Suri and Rothschild, 2022). We therefore analyzed only responsive neurons, and only in those sessions in which they were responsive (i.e., neuron–session pairs). In total, we analyzed a dataset composed of 301 neurons in 1383 neuron–session pairs. Since miss trials were rare, and FA responses could have included a response to the white noise punishment, our primary analysis focused on comparing neuronal responses only between similar correct trials (i.e., Hit vs Hit and CR vs CR). This ensured that we analyzed how odor-mediated changes affect cortical responses when actions are similar. Central to our hypothesis about the role of odors as contextual cues, we found neurons that responded to a given sound distinctly based on the preceding odor stimulus (Figs. 3d–f, 4a). This shows that learned odor → sound associations modulated the representation of the same sound even when motor responses and behavioral outcomes were identical.

To evaluate how the different odor → sound probabilities affected the way neurons discriminated between the sounds during different choices, we calculated a DI between Go and NoGo in the expected and unexpected conditions (see Materials and Methods). Some neurons were more discriminative in expected trials (Fig. 4a: odor 1 → Go and odor 3 → NoGo, in neurons 4 and 5, respectively), while others were more discriminative in unexpected trials (Fig. 4a: odor 3 → Go and odor 1 → NoGo, neurons 1 and 3, respectively). The mean DI of all single neurons was significantly higher in the unexpected condition, but only during the effective biasing sessions. Strikingly, the sessions in which odor-mediated expectations had pronounced physiological effects corresponded exactly to the sessions with behavioral effects (Fig. 4b, shaded area). Notably, the changes in neuronal DI during the effective odor bias sessions were opposite in direction to the behavioral ones (compare Figs. 4b, 2h), which is partially explained by the fact that only correct trials were considered in the physiological analysis (see more on this issue in the Discussion). To test for a correlation between the behavioral and neuronal effects, in each mouse we plotted the mean DI difference (ΔDI) per session versus the difference in d′ (d′ CI; see Materials and Methods) between expected and unexpected trials. Plotting this relationship across all behavioral sessions individually revealed a negative correlation (Fig. 4c). This result suggests that ACx is involved in discriminating sounds during behavior, and particularly so when they are unexpected.
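The paper's DI is defined in its Materials and Methods, which this excerpt does not include. As an illustrative stand-in, a common choice for a single-neuron discrimination index is the area under the ROC curve between the trial-wise response distributions for the two sounds; the sketch below implements that choice, and both the auROC-based definition and the function names are our assumptions.

```python
def auroc(go_responses, nogo_responses):
    """Area under the ROC curve: the probability that a randomly
    drawn Go-trial response exceeds a randomly drawn NoGo-trial
    response (ties count 1/2). 0.5 = chance, 1.0 = perfect."""
    n_pairs = len(go_responses) * len(nogo_responses)
    wins = sum(
        1.0 if g > n else 0.5 if g == n else 0.0
        for g in go_responses
        for n in nogo_responses
    )
    return wins / n_pairs

def discrimination_index(go_responses, nogo_responses):
    """Rectified index in [0, 1]: distance of auROC from chance,
    doubled, so neurons preferring either sound score alike."""
    return abs(2 * auroc(go_responses, nogo_responses) - 1)
```

Computed separately on expected and unexpected trials of a given neuron–session pair, the difference between the two values corresponds to the per-neuron ΔDI compared across conditions in the text.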

We then tested whether the difference in DI is a feature of neurons with a specific response profile. We thus analyzed DI in single neurons based on their sound-frequency preference, classifying them as either Go-preferring or NoGo-preferring neurons. The mean DI of all neurons in the effective biasing sessions (computed such that, in each population, each neuron was sampled only once; see Materials and Methods) revealed a statistically significant effect on DI in Go-preferring neurons, but not in NoGo-preferring neurons (Fig. 4d). Furthermore, by adding the neutral expectation condition to the analysis, we found that only the unexpected condition in the Go-preferring neurons contributed to the effect (Fig. 4d, left graph). Note that in the behavior we found differences across all conditions (Fig. 2i), which argues that the neuronal responses we measured from ACx cannot explain the full behavioral manifestation induced by the odor → sound associations.

Since neuronal responses were affected by the preceding odors in various ways (Figs. 3d–f, 4a), we next sought to account for the differences in DI between the expected and unexpected conditions during the effective biasing sessions. We analyzed responses of Go-preferring and NoGo-preferring neurons separately. Interestingly, the average population response amplitudes differed following the different odors, and did so differentially between the two neuronal populations. Specifically, Go-preferring neurons responded more weakly to the NoGo sound when it was unexpected (Fig. 5a; also see Fig. 4a, neuron 2). This result readily explains the increased DI in Go-preferring neurons. NoGo-preferring neurons, however, responded more strongly to both stimuli when they were unexpected (Fig. 5b, bottom right). No such changes were evident during the pre-odor bias session (Fig. 5c,d). These results suggest that odor 1, which is associated with a Go trial (either the sound and/or its reward), on average increased the selectivity of both neuronal populations to their preferred stimulus.

Discussion

It is well established that so-called "unisensory" cortices are strongly modulated by stimuli from other sensory modalities (Schroeder and Foxe, 2005; Ghazanfar and Schroeder, 2006; Stein and Stanford, 2008). This is true for all sensory cortices, including their primary subregions (Morgan et al., 2008; Maier et al., 2015; Murray et al., 2016; Clemens et al., 2018). In ACx, several physiological studies have revealed that neurons integrate auditory–visual or auditory–somatosensory cues, which are conveyed through direct anatomic and functional connections among the regions (Murray et al., 2005; Bizley et al., 2007, 2016; Kayser et al., 2007, 2009; Lakatos et al., 2007). These forms of multisensory integration have been suggested to complement auditory processing and modulate the way an animal perceives its natural acoustic environment (Ghazanfar and Schroeder, 2006; Stein and Stanford, 2008). For example, in humans, who are highly visually guided, audiovisual integration has been linked to improved speech perception, localization accuracy, and reaction times to auditory cues (Schröger and Widmann, 1998; Sekiyama et al., 2003; Besle et al., 2008; Schroeder et al., 2008). ACx in humans and other animals has also been shown to be modulated by other senses, like touch (Kayser et al., 2005; Schürmann et al., 2006). In mouse ACx, multisensory interactions between sounds and touch have also been suggested to play a role in biologically meaningful contexts, such as social interactions (Rao et al., 2014). Here, we asked to what extent ACx is affected by contextual cues from senses that seem intuitively distant, such as olfaction.

Olfactory cues form cross-modal associations with all other senses (Deroy et al., 2013). However, most examples of such associations are related to natural contingencies, like those between odors and flavors. Multisensory interactions between odors and flavors are intuitively explained by the mere statistical regularities of the environment (e.g., caramel odor is always congruent with its sweet taste). But some cross-modal associations with odors are surprising, like those between odors and touch or odors and colors in humans (Demattè et al., 2006a,b). Such surprising congruencies are known from human psychology and are often explained as anecdotal associations arising from metaphorical/synesthetic transfers among the senses. Yet, real-life experience and learning offer rich substrates for forming associations between any stimuli that co-occur in the environment in a meaningful manner.

One example of a natural form of odor–sound contingency is one that develops during parenthood. By measuring single-neuron responses to sounds in the ACx of mouse dams, we previously showed that exposure to the body odor of pups has a modulatory effect on sound-evoked responses (Cohen et al., 2011; Cohen and Mizrahi, 2015). Although the neural circuit underlying those effects remains unknown, they likely involve innate circuits (e.g., the medial preoptic area and amygdala) that receive strong inputs from pup odors and directly shape maternal behavior (Dulac et al., 2014). Notably, maternal plasticity is arguably a different case from the one we studied here, as it involves innate circuits and behaviors, whereas our task is based purely on learned association. By artificially creating a learned association between specific odors and specific sounds with no innate preference, we ensured that contextual odor information was first learned during the task (Fig. 1). Given this experimental design and choice of odors and sounds, we hypothesized that any contingencies formed would likely arise from cortical association areas and engage cognitive processes such as attention and expectation (Talsma et al., 2010; Rohe and Noppeney, 2016).

Our behavioral design required fine-tuning such that the increased task difficulty would make behavior easier to bias with the predictive cues. Since mice rely heavily on olfaction (Howard et al., 1968; Rokni et al., 2014), we trained mice to their perceptual limit on a strictly auditory task and only then started introducing odors (Fig. 1a). In addition, we made sure that odors would be sufficiently less informative than sounds (i.e., while the Go sound predicts a reward with 100% certainty, odor 1 predicts it with 66% certainty). Fine-tuning these measures ensured that mice relied more on sounds than on odors (Fig. 1g). The fact that odors were not necessary for solving the task could explain, at least in part, the transient nature of the behavioral effect that we observed (Fig. 2f,h). Nonetheless, odor expectation cues had a clear behavioral effect in a time window of several days, which was also correlated with neuronal changes (Fig. 4b,c). We interpret this correlation as evidence that odor-cue contingency changed the learned auditory behavior and that neural changes in ACx might be informative for the task. But how?

A somewhat counterintuitive finding is that neurons showed increased discrimination during unexpected trials, whereas mice showed decreased performance in those trials (Figs. 2i, 4). One possibility is that the choice of the mouse is affected by the activity of other brain regions, in addition to ACx, that weigh the odor cues more heavily, and that choices were incorrect in those trials despite ACx being more discriminative. However, since we analyzed only correct trials, another possible explanation is that information in ACx is used for responding correctly even when the trial is unexpected. Specifically, increased discrimination was observed for Go-preferring neurons, which decreased their response to the NoGo sound when it was unexpected. Thus, the attenuated neural response on these (correct) trials might have contributed to the accurate behavioral choice.

One of our main physiological findings is an increased response of NoGo-preferring neurons in unexpected trials (Fig. 5b). This type of response is reminiscent of the response profile of ACx neurons in the oddball paradigm (Ulanovsky et al., 2003). In the classical auditory oddball paradigm, two sounds are repeated in sequence such that one of them (the standard) appears with higher probability than the other (the rare). ACx neurons tend to respond more strongly to the same sound when it is rare (and therefore less expected) than when it is standard. This is similar to the NoGo-preferring neurons in this work responding more strongly to the same sound when it is unexpected. Notably, however, we think that these two phenomena likely do not share a common mechanism, since the characteristic response of ACx neurons in the oddball paradigm is thought to result from a local feedforward computation (Mill et al., 2011, 2012; Taaseh et al., 2011). The responses described here, however, require that information about odor-mediated expectations arise from regions outside the ACx.

Interestingly, heightened responses to rare sounds in the oddball paradigm have been termed "prediction error signals" (Rubin et al., 2016), a term commonly associated with the theory of predictive processing (Rao and Ballard, 1999; Friston, 2005; Keller and Mrsic-Flogel, 2018). According to this theory, predictions of upcoming stimuli arrive at sensory cortices from high-order (top-down) cortical regions and are compared with bottom-up input. When the two do not match, neurons respond with a prediction error signal. Our finding of NoGo-preferring neurons responding more strongly to unexpected stimuli fits well with the theory of predictive processing, but the fact that the Go-preferring neurons did not show such a response pattern argues that computations in ACx are more diverse than simple prediction errors.

We speculate that the information of odor-cue expectation arises from brain regions that integrate information from both the auditory and olfactory modalities. One such candidate is the orbitofrontal cortex, which has been shown to respond to odors and sounds, as well as to innervate ACx (Rolls, 2004; Winkowski et al., 2018). In addition, the orbitofrontal cortex is generally thought to be involved in the assignment of value to sensory stimuli during associative learning (Padoa-Schioppa and Assad, 2006; Schoenbaum et al., 2009). Whether the orbitofrontal cortex, or any other brain region, is indeed involved warrants future investigation.

The effects we measured here may not be purely sensory. Indeed, odor-induced expectations are not necessarily limited to the Go/NoGo sounds but likely carry more general information, pertinent to the meaning of Go/NoGo as a whole. First, odor-cue expectations might carry information about the reward component as well (Schultz et al., 1998). Second, it is well established that cortical activity in sensory systems, including primary sensory cortices, includes motor components (Musall et al., 2019; Steinmetz et al., 2019). However, since odors modulated the responses to the NoGo sound, to which the mouse did not respond with a proactive motor action, an effect on sound processing itself remains probable, though not exclusive. Teasing apart the individual components of these effects warrants additional experiments that better isolate each component.

Footnotes

  • This work was supported by ERC (European Research Council) Consolidator Grant 616063 (to A.M.), Israeli Science Foundation Grant 2453/18 (to A.M.), and the Gatsby Charitable Foundation. Some elements in Figures 1 and 3 were created with graphical features from BioRender.com. This work is dedicated to the memory of Mrs. Lily Safra, a great supporter of brain research. We thank Eran Lottem, Leon Deouell, Ido Maor, and members of the Mizrahi laboratory for comments on the manuscript. We also thank Yishai Elyada for technical help in setting up a first version of the microscope and for providing other technical help. In addition, we thank Maya Sherman and Yishai Elyada for virus preparation and calibration.

  • The authors declare no competing financial interests.

  • Correspondence should be addressed to Adi Mizrahi at Mizrahi.adi@mail.huji.ac.il

SfN exclusive license.

References

  1. Arakawa H, Blanchard DC, Arakawa K, Dunlap C, Blanchard RJ (2008) Scent marking behavior as an odorant communication in mice. Neurosci Biobehav Rev 32:1236–1248. doi:10.1016/j.neubiorev.2008.05.012 pmid:18565582
  2. Aschauer DF, Eppler J-B, Ewig L, Chambers AR, Pokorny C, Kaschube M, Rumpel S (2022) Learning-induced biases in the ongoing dynamics of sensory representations predict stimulus generalization. Cell Rep 38:110340. doi:10.1016/j.celrep.2022.110340 pmid:35139386
  3. Besle J, Fischer C, Bidet-Caulet A, Lecaignard F, Bertrand O, Giard M-H (2008) Visual activation and audiovisual interactions in the auditory cortex during speech perception: intracranial recordings in humans. J Neurosci 28:14301–14310. doi:10.1523/JNEUROSCI.2875-08.2008 pmid:19109511
  4. Bizley JK, Nodal FR, Bajo VM, Nelken I, King AJ (2007) Physiological and anatomical evidence for multisensory interactions in auditory cortex. Cereb Cortex 17:2172–2189. doi:10.1093/cercor/bhl128 pmid:17135481
  5. Bizley JK, Maddox RK, Lee AK (2016) Defining auditory-visual objects: behavioral tests and physiological mechanisms. Trends Neurosci 39:74–85. doi:10.1016/j.tins.2015.12.007 pmid:26775728
  6. Clemens AM, Fernandez Delgado Y, Mehlman ML, Mishra P, Brecht M (2018) Multisensory and motor representations in rat oral somatosensory cortex. Sci Rep 8:13556. doi:10.1038/s41598-018-31710-0 pmid:30201995
  7. Cohen L, Mizrahi A (2015) Plasticity during motherhood: changes in excitatory and inhibitory layer 2/3 neurons in auditory cortex. J Neurosci 35:1806–1815. doi:10.1523/JNEUROSCI.1786-14.2015 pmid:25632153
  8. Cohen L, Rothschild G, Mizrahi A (2011) Multisensory integration of natural odors and sounds in the auditory cortex. Neuron 72:357–369. doi:10.1016/j.neuron.2011.08.019 pmid:22017993
  9. Coombes HA, Stockley P, Hurst JL (2018) Female chemical signalling underlying reproduction in mammals. J Chem Ecol 44:851–873. doi:10.1007/s10886-018-0981-x pmid:29992368
  10. Currier TA, Nagel KI (2020) Multisensory control of navigation in the fruit fly. Curr Opin Neurobiol 64:10–16. doi:10.1016/j.conb.2019.11.017 pmid:31841944
  11. Demattè ML, Sanabria D, Sugarman R, Spence C (2006a) Cross-modal interactions between olfaction and touch. Chem Senses 31:291–300. doi:10.1093/chemse/bjj031 pmid:16452454
  12. Demattè ML, Sanabria D, Spence C (2006b) Cross-modal associations between odors and colors. Chem Senses 31:531–538.
  13. Deroy O, Crisinel A-S, Spence C (2013) Crossmodal correspondences between odors and contingent features: odors, musical notes, and geometrical shapes. Psychon Bull Rev 20:878–896. doi:10.3758/s13423-013-0397-0 pmid:23463615
  14. Doty RL (1986) Odor-guided behavior in mammals. Experientia 42:257–271. doi:10.1007/BF01942506 pmid:3514263
  15. Dulac C, O'Connell LA, Wu Z (2014) Neural control of maternal and paternal behaviors. Science 345:765–770. doi:10.1126/science.1253291 pmid:25124430
  16. Ehret G (2005) Infant rodent ultrasounds—a gate to the understanding of sound communication. Behav Genet 35:19–29. doi:10.1007/s10519-004-0853-8 pmid:15674530
  17. Elyada YM, Mizrahi A (2015) Becoming a mother—circuit plasticity underlying maternal behavior. Curr Opin Neurobiol 35:49–56. doi:10.1016/j.conb.2015.06.007 pmid:26143475
  18. Feigin L, Tasaka G, Maor I, Mizrahi A (2021) Sparse coding in temporal association cortex improves complex sound discriminability. J Neurosci 41:7048–7064. doi:10.1523/JNEUROSCI.3167-20.2021 pmid:34244361
  19. Flickinger D, Iyer V, Huber D, O'Connor D, Peron S, Clack N, Chandrashekar J, Svoboda K (2010) MIMMS: a modular, open design microscopy platform for in vivo imaging of neural tissues. Soc Neurosci Abstr 36:816.12/NNN33.
  20. Friston K (2005) A theory of cortical responses. Philos Trans R Soc B Biol Sci 360:815–836. doi:10.1098/rstb.2005.1622 pmid:15937014
  21. Ghazanfar AA, Schroeder CE (2006) Is neocortex essentially multisensory? Trends Cogn Sci 10:278–285. doi:10.1016/j.tics.2006.04.008 pmid:16713325
  22. Goldey GJ, Roumis DK, Glickfeld LL, Kerlin AM, Reid RC, Bonin V, Schafer DP, Andermann ML (2014) Removable cranial windows for long-term imaging in awake mice. Nat Protoc 9:2515–2538. doi:10.1038/nprot.2014.165 pmid:25275789
  23. Halene TB, Talmud J, Jonak GJ, Schneider F, Siegel SJ (2009) Predator odor modulates auditory event-related potentials in mice. Neuroreport 20:1260–1264. doi:10.1097/WNR.0b013e3283300cde pmid:19625986
  24. Howard WE, Marsh RE, Cole RE (1968) Food detection by deer mice using olfactory rather than visual cues. Anim Behav 16:13–17. doi:10.1016/0003-3472(68)90100-0 pmid:5639893
  25. Kayser C, Petkov CI, Augath M, Logothetis NK (2005) Integration of touch and sound in auditory cortex. Neuron 48:373–384. doi:10.1016/j.neuron.2005.09.018 pmid:16242415
  26. Kayser C, Petkov CI, Augath M, Logothetis NK (2007) Functional imaging reveals visual modulation of specific fields in auditory cortex. J Neurosci 27:1824–1835. doi:10.1523/JNEUROSCI.4737-06.2007 pmid:17314280
  27. Kayser C, Petkov CI, Logothetis NK (2009) Multisensory interactions in primate auditory cortex: fMRI and electrophysiology. Hear Res 258:80–88. doi:10.1016/j.heares.2009.02.011 pmid:19269312
  28. Keller GB, Mrsic-Flogel TD (2018) Predictive processing: a canonical cortical computation. Neuron 100:424–435. doi:10.1016/j.neuron.2018.10.003 pmid:30359606
  29. Lakatos P, Chen C-M, O'Connell MN, Mills A, Schroeder CE (2007) Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron 53:279–292. doi:10.1016/j.neuron.2006.12.011 pmid:17224408
  30. Leonard AS, Masek P (2014) Multisensory integration of colors and scents: insights from bees and flowers. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 200:463–474. doi:10.1007/s00359-014-0904-4 pmid:24710696
  31. Maier JX, Blankenship ML, Li JX, Katz DB (2015) A multisensory network for olfactory processing. Curr Biol 25:2642–2650. doi:10.1016/j.cub.2015.08.060 pmid:26441351
  32. Maor I, Shwartz-Ziv R, Feigin L, Elyada Y, Sompolinsky H, Mizrahi A (2020) Neural correlates of learning pure tones or natural sounds in the auditory cortex. Front Neural Circuits 13:82. doi:10.3389/fncir.2019.00082 pmid:32047424
  33. McGurk H, MacDonald J (1976) Hearing lips and seeing voices. Nature 264:746–748. doi:10.1038/264746a0 pmid:1012311
  34. Mill R, Coath M, Wennekers T, Denham SL (2011) A neurocomputational model of stimulus-specific adaptation to oddball and Markov sequences. PLoS Comput Biol 7:e1002117. doi:10.1371/journal.pcbi.1002117 pmid:21876661
  35. Mill R, Coath M, Wennekers T, Denham SL (2012) Characterising stimulus-specific adaptation using a multi-layer field model. Brain Res 1434:178–188. doi:10.1016/j.brainres.2011.08.063 pmid:21955728
  36. Morgan ML, DeAngelis GC, Angelaki DE (2008) Multisensory integration in macaque visual cortex depends on cue reliability. Neuron 59:662–673. doi:10.1016/j.neuron.2008.06.024 pmid:18760701
  37. Murray MM, Molholm S, Michel CM, Heslenfeld DJ, Ritter W, Javitt DC, Schroeder CE, Foxe JJ (2005) Grabbing your ear: rapid auditory–somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb Cortex 15:963–974. doi:10.1093/cercor/bhh197 pmid:15537674
  38. Murray MM, Thelen A, Thut G, Romei V, Martuzzi R, Matusz PJ (2016) The multisensory function of the human primary visual cortex. Neuropsychologia 83:161–169. doi:10.1016/j.neuropsychologia.2015.08.011 pmid:26275965
  39. Musall S, Kaufman MT, Juavinett AL, Gluf S, Churchland AK (2019) Single-trial neural dynamics are dominated by richly varied movements. Nat Neurosci 22:1677–1686. doi:10.1038/s41593-019-0502-4 pmid:31551604
  40. Nevin JA (1969) Signal detection theory and operant behavior: a review of David M. Green and John A. Swets' Signal Detection Theory and Psychophysics. J Exp Anal Behav 12:475–480. doi:10.1901/jeab.1969.12-475
  41. Okabe S, Nagasawa M, Kihara T, Kato M, Harada T, Koshida N, Mogi K, Kikusui T (2013) Pup odor and ultrasonic vocalizations synergistically stimulate maternal attention in mice. Behav Neurosci 127:432–438. doi:10.1037/a0032395 pmid:23544596
  42. Padoa-Schioppa C, Assad JA (2006) Neurons in the orbitofrontal cortex encode economic value. Nature 441:223–226. doi:10.1038/nature04676 pmid:16633341
  43. Pologruto TA, Sabatini BL, Svoboda K (2003) ScanImage: flexible software for operating laser scanning microscopes. Biomed Eng Online 2:13. doi:10.1186/1475-925X-2-13 pmid:12801419
  44. Rao RP, Ballard DH (1999) Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects. Nat Neurosci 2:79–87. doi:10.1038/4580 pmid:10195184
  45. Rao RP, Mielke F, Bobrov E, Brecht M (2014) Vocalization–whisking coordination and multisensory integration of social signals in rat auditory cortex. Elife 3:e03185. doi:10.7554/eLife.03185
  46. Rohe T, Noppeney U (2016) Distinct computational principles govern multisensory integration in primary sensory and association cortices. Curr Biol 26:509–514. doi:10.1016/j.cub.2015.12.056 pmid:26853368
  47. Rokni D, Hemmelder V, Kapoor V, Murthy VN (2014) An olfactory cocktail party: figure-ground segregation of odorants in rodents. Nat Neurosci 17:1225–1232. doi:10.1038/nn.3775 pmid:25086608
  48. Rolls ET (2004) The functions of the orbitofrontal cortex. Brain Cogn 55:11–29. doi:10.1016/S0278-2626(03)00277-X pmid:15134840
  49. Root CM, Denny CA, Hen R, Axel R (2014) The participation of cortical amygdala in innate, odour-driven behaviour. Nature 515:269–273. doi:10.1038/nature13897 pmid:25383519
  50. Rothschild G, Nelken I, Mizrahi A (2010) Functional organization and population dynamics in the mouse primary auditory cortex. Nat Neurosci 13:353–360. doi:10.1038/nn.2484 pmid:20118927
  51. Rubin J, Ulanovsky N, Nelken I, Tishby N (2016) The representation of prediction error in auditory cortex. PLoS Comput Biol 12:e1005058. doi:10.1371/journal.pcbi.1005058 pmid:27490251
  52. Rule ME, O'Leary T, Harvey CD (2019) Causes and consequences of representational drift. Curr Opin Neurobiol 58:141–147. doi:10.1016/j.conb.2019.08.005 pmid:31569062
  53. Schoenbaum G, Roesch MR, Stalnaker TA, Takahashi YK (2009) A new perspective on the role of the orbitofrontal cortex in adaptive behaviour. Nat Rev Neurosci 10:885–892. doi:10.1038/nrn2753 pmid:19904278
  54. Schroeder CE, Foxe J (2005) Multisensory contributions to low-level, "unisensory" processing. Curr Opin Neurobiol 15:454–458. doi:10.1016/j.conb.2005.06.008 pmid:16019202
  55. Schroeder CE, Lakatos P, Kajikawa Y, Partan S, Puce A (2008) Neuronal oscillations and visual amplification of speech. Trends Cogn Sci 12:106–113. doi:10.1016/j.tics.2008.01.002 pmid:18280772
  56. Schröger E, Widmann A (1998) Speeded responses to audiovisual signal changes result from bimodal integration. Psychophysiology 35:755–759. pmid:9844437
  57. Schultz W, Tremblay L, Hollerman JR (1998) Reward prediction in primate basal ganglia and frontal cortex. Neuropharmacology 37:421–429. doi:10.1016/s0028-3908(98)00071-9 pmid:9704983
  58. Schürmann M, Caetano G, Hlushchuk Y,
    4. Jousmäki V,
    5. Hari R
    (2006) Touch activates human auditory cortex. Neuroimage 30:1325–1331. doi:10.1016/j.neuroimage.2005.11.020 pmid:16488157
    OpenUrlCrossRefPubMed
  59. ↵
    1. Sekiyama K,
    2. Kanno I,
    3. Miura S,
    4. Sugita Y
    (2003) Auditory-visual speech perception examined by fMRI and PET. Neurosci Res 47:277–287. doi:10.1016/s0168-0102(03)00214-1 pmid:14568109
    OpenUrlCrossRefPubMed
  60. ↵
    1. Sokolowski K,
    2. Corbin JG
    (2012) Wired for behaviors: from development to function of innate limbic system circuitry. Front Mol Neurosci 5:55. doi:10.3389/fnmol.2012.00055 pmid:22557946
    OpenUrlCrossRefPubMed
  61. ↵
    1. Spence C
    (2013) Multisensory flavour perception. Curr Biol 23:R365–R369. doi:10.1016/j.cub.2013.01.028 pmid:23660358
    OpenUrlCrossRefPubMed
  62. ↵
    1. Stein BE,
    2. Meredith MA
    (1993) The merging of the senses. Cambridge, MA: MIT.
  63. ↵
    1. Stein BE,
    2. Stanford TR
    (2008) Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci 9:255–266. doi:10.1038/nrn2331 pmid:18354398
    OpenUrlCrossRefPubMed
  64. ↵
    1. Steinmetz NA,
    2. Zatka-Haas P,
    3. Carandini M,
    4. Harris KD
    (2019) Distributed coding of choice, action and engagement across the mouse brain. Nature 576:266–273. doi:10.1038/s41586-019-1787-x pmid:31776518
    OpenUrlCrossRefPubMed
  65. ↵
    1. Suri H,
    2. Rothschild G
    (2022) Enhanced stability of complex sound representations relative to simple sounds in the auditory cortex. Eneuro 9:ENEURO.0031-22.2022. doi:10.1523/ENEURO.0031-22.2022
    OpenUrlAbstract/FREE Full Text
  66. ↵
    1. Taaseh N,
    2. Yaron A,
    3. Nelken I
    (2011) Stimulus-specific adaptation and deviance detection in the rat auditory cortex. PLoS One 6:e23369. doi:10.1371/journal.pone.0023369 pmid:21853120
    OpenUrlCrossRefPubMed
  67. ↵
    1. Talsma D,
    2. Senkowski D,
    3. Soto-Faraco S,
    4. Woldorff MG
    (2010) The multifaceted interplay between attention and multisensory integration. Trends Cogn Sci 14:400–410. doi:10.1016/j.tics.2010.06.008 pmid:20675182
    OpenUrlCrossRefPubMed
  68. ↵
    1. Thonhauser KE,
    2. Raveh S,
    3. Hettyey A,
    4. Beissmann H,
    5. Penn DJ
    (2013) Scent marking increases male reproductive success in wild house mice. Anim Behav 86:1013–1021. doi:10.1016/j.anbehav.2013.09.004 pmid:25554707
    OpenUrlCrossRefPubMed
  69. ↵
    1. Ulanovsky N,
    2. Las L,
    3. Nelken I
    (2003) Processing of low-probability sounds by cortical neurons. Nat Neurosci 6:391–398. doi:10.1038/nn1032 pmid:12652303
    OpenUrlCrossRefPubMed
  70. ↵
    1. Vinograd A,
    2. Livneh Y,
    3. Mizrahi A
    (2017) History-dependent odor processing in the mouse olfactory bulb. J Neurosci 37:12018–12030. doi:10.1523/JNEUROSCI.0755-17.2017 pmid:29109236
    OpenUrlAbstract/FREE Full Text
  71. ↵
    1. Willander J,
    2. Larsson M
    (2007) Olfaction and emotion: the case of autobiographical memory. Mem Cognit 35:1659–1663. doi:10.3758/bf03193499 pmid:18062543
    OpenUrlCrossRefPubMed
  72. ↵
    1. Winkowski DE,
    2. Nagode DA,
    3. Donaldson KJ,
    4. Yin P,
    5. Shamma SA,
    6. Fritz JB,
    7. Kanold PO
    (2018) Orbitofrontal cortex neurons respond to sound and activate primary auditory cortex neurons. Cereb Cortex 28:868–879. doi:10.1093/cercor/bhw409 pmid:28069762
    OpenUrlCrossRefPubMed
Learning-Induced Odor Modulation of Neuronal Activity in Auditory Cortex
Omri David Gilday, Adi Mizrahi
Journal of Neuroscience 22 February 2023, 43 (8) 1375-1386; DOI: 10.1523/JNEUROSCI.1398-22.2022
Keywords

  • auditory cortex
  • behavior
  • expectation
  • multisensory
  • two-photon


Copyright © 2023 by the Society for Neuroscience.
JNeurosci Online ISSN: 1529-2401
