Functional magnetic resonance imaging (fMRI) was used to compare cerebral organization during sentence processing in English and in American Sign Language (ASL). Classical language areas within the left hemisphere were recruited both by English in native speakers and by ASL in native signers. This suggests a bias of the left hemisphere to process natural languages independently of the modality through which language is perceived. Furthermore, in contrast to English, ASL strongly recruited right hemisphere structures, irrespective of whether the native signers were deaf or hearing. Thus, the specific processing requirements of the language also, in part, determine the organization of the language systems of the brain.

Type

Journal article

Journal

Neuroreport

Publication Date

11/05/1998

Volume

9

Pages

1537–1542

Keywords

Adult, Brain, Brain Mapping, Deafness, Functional Laterality, Hearing, Humans, Language, Learning, Magnetic Resonance Imaging, Sign Language, United States