
Humans' ability to process language in the abstract is somewhat linked to our ability to both produce and understand sounds (see Broca's area). With that in mind, is sign language handled similarly by the brain? Is there any fMRI research that could corroborate this?


1 Answer


For a broad meaning of "similarly", the answer is yes: the processing of sign language has substantial similarities with that of spoken language in terms of the brain areas involved. There are also some differences; these remain a vast and still somewhat controversial research area. From the conclusion of a 2007 review:

Where differences can be shown between sign and speech they can, on the whole, be ascribed to the different input modalities of the language system—auditory for spoken language, visuospatial for signed language. The specialization of cortical networks for language processing does not appear to be driven either by the acoustic requirements for hearing a spoken language or by the articulatory requirements for speaking. It seems likely, therefore, that it is the specialized requirements of language processing itself, including, for instance, compositionality, syntax, and the requirements of mapping coherent concepts onto a communicable form, that determine the final form of the specialized language circuits in the brain. Even the disagreements in the literature, especially those concerning the relative importance of the RH [right hemisphere] in processing sign, do not trouble this basic insight. It is probably more constructive to pose a somewhat different question: How might signed language use specific brain specializations that are unlikely to be involved in spoken language processing?

And as an illustration of the similarities and differences, the review includes this figure:

[Figure: color-rendered fMRI activation maps from the review]

Color-rendered images of the brain depicting (group) fMRI activation. (a) Regions activated by BSL [British Sign Language] perception in deaf and hearing native signers (first and second columns, respectively) and by audiovisual speech perception in hearing nonsigners (third column). For language in both modalities, and across all three groups, activation is greater in the left than the RH and perisylvian regions are engaged. [The rows are the hemispheres.]

And a more recent (2015) study addresses your question (of similar regions) perhaps more directly:

sign language constructions—even those similar to gesture—engage language-related brain systems and are not processed in the same ways that nonsigners interpret gesture. [...]

It is already established that a very similar, left-lateralized neural network is involved in the processing of many aspects of lexical and syntactic information in both spoken and signed languages. This includes the inferior frontal gyrus (IFG) (classically called Broca’s area), superior temporal sulcus (STS) and adjacent superior and middle temporal gyri, and the inferior parietal lobe (IPL) (classically called Wernicke’s area) including the angular (AG) and supramarginal gyri (SMG). Likewise, narrative and discourse-level aspects of signed language depend largely on right STS regions, as they do for spoken language.

