How the brain processes sign language

The ability to speak is one of the essential characteristics that distinguish humans from other animals. Many people would probably intuitively equate speech and language. However, cognitive science research on sign languages since the 1960s paints a different picture: today it is clear that sign languages are fully autonomous languages, with complex organization on several linguistic levels such as grammar and meaning. Previous studies on the processing of sign language in the human brain had already found both similarities and differences between sign languages and spoken languages. Until now, however, it has been difficult to derive a consistent picture of how both forms of language are processed in the brain. …

The Leipzig-based researchers were indeed able to confirm that spoken and signed language overlap in Broca’s area. They also succeeded in showing the role played by the right frontal lobe, the counterpart to Broca’s area on the left side of the brain. This region appeared repeatedly in many of the sign language studies evaluated because it processes non-linguistic aspects, such as spatial or social information about one’s conversation partner. This means that the movements of the hands, face and body, of which signs consist, are in principle perceived similarly by deaf and hearing people. In deaf people, however, these movements additionally activate the language network in the left hemisphere of the brain, including Broca’s area. Deaf people therefore perceive the gestures as carrying linguistic content, rather than as pure movement sequences, as would be the case with hearing people.

By Emiliano Zaccarella, Department of Neuropsychology at the MPI CBS, for Medical Xpress
