RT Journal Article
SR Electronic
T1 Signed Words in the Congenitally Deaf Evoke Typical Late Lexicosemantic Responses with No Early Visual Responses in Left Superior Temporal Cortex
JF The Journal of Neuroscience
JO J. Neurosci.
FD Society for Neuroscience
SP 9700
OP 9705
DO 10.1523/JNEUROSCI.1002-12.2012
VO 32
IS 28
A1 Leonard, Matthew K.
A1 Ferjan Ramirez, Naja
A1 Torres, Christina
A1 Travis, Katherine E.
A1 Hatrak, Marla
A1 Mayberry, Rachel I.
A1 Halgren, Eric
YR 2012
UL http://www.jneurosci.org/content/32/28/9700.abstract
AB Congenitally deaf individuals receive little or no auditory input, and when raised by deaf parents, they acquire sign as their native and primary language. We asked two questions regarding how the deaf brain in humans adapts to sensory deprivation: (1) is meaning extracted and integrated from signs using the same classical left hemisphere frontotemporal network used for speech in hearing individuals, and (2) in deafness, is superior temporal cortex encompassing primary and secondary auditory regions reorganized to receive and process visual sensory information at short latencies? Using MEG constrained by individual cortical anatomy obtained with MRI, we examined an early time window associated with sensory processing and a late time window associated with lexicosemantic integration. We found that sign in deaf individuals and speech in hearing individuals activate a highly similar left frontotemporal network (including superior temporal regions surrounding auditory cortex) during lexicosemantic processing, but only speech in hearing individuals activates auditory regions during sensory processing. Thus, neural systems dedicated to processing high-level linguistic information are used for processing language regardless of modality or hearing status, and we do not find evidence for rewiring of afferent connections from visual systems to auditory cortex.