TY  - JOUR
T1  - Signed Words in the Congenitally Deaf Evoke Typical Late Lexicosemantic Responses with No Early Visual Responses in Left Superior Temporal Cortex
JF  - The Journal of Neuroscience
JO  - J. Neurosci.
SP  - 9700
LP  - 9705
DO  - 10.1523/JNEUROSCI.1002-12.2012
VL  - 32
IS  - 28
AU  - Matthew K. Leonard
AU  - Naja Ferjan Ramirez
AU  - Christina Torres
AU  - Katherine E. Travis
AU  - Marla Hatrak
AU  - Rachel I. Mayberry
AU  - Eric Halgren
Y1  - 2012/07/11
UR  - http://www.jneurosci.org/content/32/28/9700.abstract
N2  - Congenitally deaf individuals receive little or no auditory input, and when raised by deaf parents, they acquire sign as their native and primary language. We asked two questions regarding how the deaf brain in humans adapts to sensory deprivation: (1) is meaning extracted and integrated from signs using the same classical left hemisphere frontotemporal network used for speech in hearing individuals, and (2) in deafness, is superior temporal cortex encompassing primary and secondary auditory regions reorganized to receive and process visual sensory information at short latencies? Using MEG constrained by individual cortical anatomy obtained with MRI, we examined an early time window associated with sensory processing and a late time window associated with lexicosemantic integration. We found that sign in deaf individuals and speech in hearing individuals activate a highly similar left frontotemporal network (including superior temporal regions surrounding auditory cortex) during lexicosemantic processing, but only speech in hearing individuals activates auditory regions during sensory processing. Thus, neural systems dedicated to processing high-level linguistic information are used for processing language regardless of modality or hearing status, and we do not find evidence for rewiring of afferent connections from visual systems to auditory cortex.
ER  - 