Memory of music: Roles of right hippocampus and left inferior frontal gyrus
Introduction
Music offers a unique opportunity to better understand the organization of the human brain (Peretz and Zatorre, 2003). Studies of music memory have found that dorsolateral and inferior frontal regions are recruited when working memory load is high (Zatorre et al., 1994, Holcomb et al., 1998, Griffiths et al., 1999, Gaab et al., 2003). These findings fit within a broader context in which working memory for pitch may be seen as a specialized subsystem within the framework of general working memory (Marin and Perry, 1999). However, neural correlates of episodic memory for music are not well understood.
Several lines of evidence indicate that fronto-temporal regions are crucial for episodic memory for music (Zatorre, 2005). First, neuroimaging studies have found that the left middle and inferior frontal regions are associated with episodic retrieval of music (Platel et al., 2003). Second, lesion studies have shown that the right medial temporal lobe is involved in melody retrieval (Milner, 1962) and melody discrimination (Samson and Zatorre, 1988, Liegeois-Chauvel et al., 1998). In addition, the bilateral lateral temporal regions are associated with melodic processing (Peretz et al., 1994).
Although these previous studies revealed that fronto-temporal regions are recruited during music retrieval, the specific component of the retrieval process to which these fronto-temporal regions contribute remains unknown. By contrast, in studies of word memory, the cognitive operations supporting the retrieval of episodic memories have been divided into two broad classes, associated with retrieval effort and retrieval success (Kapur et al., 1995, Nyberg et al., 1995, Schacter et al., 1996, Rugg et al., 1996, Buckner et al., 1998). Retrieval success, defined by the comparison between Hits (targets correctly recognized as having been learned in the encoding phase) and Correct Rejections (CR; targets correctly identified as not having been learned), refers to processes engaged when a retrieval attempt succeeds and yields information about a past event (Rugg et al., 1998). Because the cognitive processing of music and words shares common components (Besson and Schon, 2001), we hypothesized that fronto-temporal regions are involved in retrieval success for music.
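The Hit/CR contrast described above follows the standard signal-detection classification of recognition responses. A minimal sketch of that classification (the function name and inputs are illustrative, not the study's actual scoring procedure):

```python
def classify_response(item_was_studied: bool, judged_old: bool) -> str:
    """Standard signal-detection labels for a single recognition trial."""
    if item_was_studied:
        return "Hit" if judged_old else "Miss"
    return "False Alarm" if judged_old else "Correct Rejection"

# Retrieval success is typically assessed by contrasting Hit trials
# (studied items correctly judged old) with Correct Rejection trials
# (unstudied items correctly judged new).
print(classify_response(True, True))    # → Hit
print(classify_response(False, False))  # → Correct Rejection
```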
Based on this hypothesis, we set two goals for this study. The first was to test our hypothesis within the conventional framework of memory retrieval studies, using event-related functional magnetic resonance imaging (fMRI) with a sparse temporal sampling technique. We identified the neural correlates of music retrieval success using whole-brain analyses and then directly compared retrieval-success responses between the left and right fronto-temporal regions using anatomically defined region-of-interest (ROI) analyses (Brett et al., 2002, Tzourio-Mazoyer et al., 2002). The second goal was to further classify the roles of the fronto-temporal retrieval-success regions using performance-based analyses, which categorized these regions according to the correlation between their hemodynamic responses and corrected recognition rates, thereby providing further information about their function.
In fMRI experiments with music, scanner noise can be a critical confounding factor. To minimize confounding activations generated by scanner noise, we used a sparse temporal sampling technique (Fig. 1), which has been shown to successfully delimit the “core” activated regions revealed by conventional continuous imaging (Hall et al., 1999, Nebel et al., 2005). Another confounding factor is familiarity with pre-existing music: if a piece of pre-existing music were used as a stimulus, subjects’ memory processing would be influenced by whether or not they knew the piece before the experiment. To avoid this confound, we employed auto-composing software to generate new musical stimuli that conformed to the Western musical system. These methods made it possible to detect regions with minimal influence of scanner noise and without retrieval of pre-existing music. The present study of music memory with these well-controlled methods should therefore contribute to a better understanding of the functional architecture of the human brain.
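The logic of sparse temporal sampling is that each stimulus is presented in scanner silence, and a single volume is acquired only after a delay chosen to catch the hemodynamic peak. A minimal sketch of such a trial schedule, with hypothetical timing values (not the parameters used in this study):

```python
def sparse_schedule(n_trials: int, stimulus_s: float = 6.0,
                    delay_s: float = 4.0, acquisition_s: float = 2.0):
    """Return (stimulus_onset, acquisition_onset) pairs in seconds.
    Each stimulus plays in silence; one volume is acquired delay_s
    after stimulus offset, near the peak of the hemodynamic response."""
    trial_len = stimulus_s + delay_s + acquisition_s
    schedule = []
    for i in range(n_trials):
        stim_on = i * trial_len
        acq_on = stim_on + stimulus_s + delay_s
        schedule.append((stim_on, acq_on))
    return schedule

for stim, acq in sparse_schedule(3):
    print(f"stimulus at {stim:.0f}s, scan at {acq:.0f}s")
```

Because the acquisition itself is brief and falls between stimuli, the slow hemodynamic response to the scanner's own noise has largely decayed before the next stimulus is heard.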
Subjects
Eighteen subjects (aged 22.4 ± 0.4 years, mean ± SEM; 6 females, 12 males) participated in this study. All were right-handed (Edinburgh score, 95 ± 2.1) (Oldfield, 1971) and had normal hearing. All were graduate or undergraduate students, and none had received professional musical education or had a special musical preference; that is, all fell into Wertheim and Botez’s class I (Wertheim and Botez, 1959, Platel et al., 2003). None had absolute pitch (absolute pitch test score, 1.4 ±
Behavioral results
The mean Hit and CR rates were 0.62 ± 0.03 and 0.86 ± 0.02 (mean ± SEM), respectively. In the original experiment, the numbers of Hit and CR responses per subject were 37.1 ± 1.6 and 25.9 ± 0.6, respectively. The mean corrected recognition rate, which was used to evaluate the accuracy of memory retrieval (Wagner et al., 1998, Henson et al., 1999, Shannon and Buckner, 2004), was 0.48 ± 0.04, suggesting good discrimination between new (not previously presented in the encoding phase) and old
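The corrected recognition rate is conventionally computed as the hit rate minus the false-alarm rate. A minimal sketch, assuming (as is consistent with the reported group means) that the false-alarm rate is one minus the correct-rejection rate:

```python
def corrected_recognition(hit_rate: float, cr_rate: float) -> float:
    """Corrected recognition = hit rate - false-alarm rate,
    where the false-alarm rate is 1 - correct-rejection rate."""
    false_alarm_rate = 1.0 - cr_rate
    return hit_rate - false_alarm_rate

# Reported group means: Hit rate = 0.62, CR rate = 0.86
print(round(corrected_recognition(0.62, 0.86), 2))  # → 0.48
```

This matches the reported mean corrected recognition rate of 0.48, since the implied false-alarm rate is 1 − 0.86 = 0.14.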
Discussion
In this study, we investigated the neural correlates of music retrieval success using fMRI with sparse temporal sampling. Whole-brain analyses identified the right hippocampus, left IFG, bilateral lateral temporal regions, and left precuneus as retrieval-success regions for music memory (Table 1). Using anatomically defined ROI analyses (Brett et al., 2002, Tzourio-Mazoyer et al., 2002), we demonstrated that the right hippocampus responded to retrieval success more strongly
Acknowledgments
This research was supported by a Grant-in-Aid for Scientific Research (C) (16500195) to H.K. We thank Takanori Hashimoto (Pittsburgh University), Christopher Holmes, and Joseph Green for their helpful comments on an earlier draft of this paper. The authors declare that they have no competing financial interests.
References (57)
- et al. Retrieval of relational information: a role for the left inferior prefrontal cortex. NeuroImage (2002)
- et al. Functional–anatomic study of episodic retrieval: I. Retrieval effort versus retrieval success. NeuroImage (1998)
- et al. Functional anatomy of pitch memory. An fMRI study with sparse temporal sampling. NeuroImage (2003)
- et al. Temporal lobe activations of “feeling-of-knowing” induced by face-name associations. NeuroImage (2004)
- et al. Neural correlates for feeling-of-knowing: an fMRI parametric analysis. Neuron (2002)
- et al. Neural correlates of episodic retrieval success. NeuroImage (2000)
- et al. Neurological aspects of music perception and performance. See Deutsch (1999)
- et al. Characterizing the hemodynamic response: effects of presentation rate, sampling procedure, and the possibility of ordering brain activity based on relative timing. NeuroImage (2000)
- The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia (1971)
- et al. Semantic and episodic memory of music are subserved by distinct neural networks. NeuroImage (2003)
- Neural correlates of memory retrieval during recognition memory and cued recall. NeuroImage
- Melodic and harmonic discrimination following unilateral cerebral excision. Brain Cogn.
- Functional segregation of the temporal lobes into highly differentiated subsystems for auditory perception: an auditory rapid event-related fMRI-task. NeuroImage
- Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. NeuroImage
- Recovering meaning: left prefrontal cortex guides controlled semantic retrieval. Neuron
- Patterns of music agnosia associated with middle cerebral artery infarcts. Brain
- MRI studies of brain activation: dynamic characteristics
- Comparison between language and music. Ann. N.Y. Acad. Sci.
- Dynamic mapping of the human visual cortex by high-speed magnetic resonance imaging. Proc. Natl. Acad. Sci. U. S. A.
- Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion. Proc. Natl. Acad. Sci. U. S. A.
- Emotional responses to pleasant and unpleasant music correlate with activity in paralimbic brain regions. Nat. Neurosci.
- Region of interest analysis using an SPM toolbox. NeuroImage
- Estimation and detection of event-related fMRI signals with temporally correlated noise: a statistically efficient and unbiased approach. Hum. Brain Mapp.
- Brainweb: online interface to a 3D MRI simulated brain database. NeuroImage
- Preserved learning and retention of pattern-analyzing skill in amnesia: dissociation of knowing how and knowing that. Science
- Hippocampal system and declarative (relational) memory: summarizing the data from functional neuroimaging studies. Hippocampus
- A common neural substrate for the analysis of pitch and duration pattern in segmented sound? NeuroReport
- “Sparse” temporal sampling in auditory fMRI. Hum. Brain Mapp.