CORDIS - EU research results

Cross-modal plasticity and functional modularisation in the deaf

Periodic Reporting for period 1 - CP-FunMoD (Cross-modal plasticity and functional modularisation in the deaf)

Reporting period: 2015-10-01 to 2017-09-30

Previous studies have suggested that auditory cortex – a part of the brain that typically processes sound – is not ‘idle’ in Deaf individuals, but instead may respond to input from other sensory modalities such as vision and touch. Such cross-modal plasticity (CP) is traditionally thought to have a negative effect on clinical outcomes of cochlear implantation (CI). This has notably resulted in clinicians discouraging parents from teaching sign language to Deaf children, for fear that the use of a visual language will cause the ‘auditory’ cortex to respond to vision instead, and that this will hinder auditory cortex from responding to a spoken language following auditory restoration.

However, there is little scientific evidence to support this notion, and little evidence to support a direct causal link between sign language, CP, and CI outcomes. Moreover, there is evidence that supports the opposite view – that CP is actually beneficial to outcomes, and that learning sign language encourages the language parts of the brain to develop properly, resulting in better outcomes. Consequently, preventing Deaf children from learning sign language could be having the opposite of the intended effect, causing worse CI outcomes along with other outcomes associated with not being able to communicate efficiently, such as poor mental health and quality of life. It is therefore crucial to examine the link between sign language usage in Deaf individuals, CP, and CI outcomes in more detail, to determine how they are linked, if at all.

This project determines whether CP occurs in Deaf individuals who use sign language as their primary means of communication, and if so, in which parts of auditory cortex. Understanding the nature of CP in congenitally Deaf individuals is the first step to being able to understand: i) the relationship between CP and CI outcomes, which could form the basis of a predictor for CI outcomes; and ii) whether sign language is detrimental to such outcomes. Specifically, the project asks: 1) Does CP occur in the auditory cortex of congenitally Deaf individuals? 2) Does CP occur in early parts of auditory cortex (consequently having a knock-on effect on all auditory processing), or only in later parts of auditory cortex (i.e. in areas associated with language processing)? 3) To what extent are auditory and language areas involved in processing sign language in congenitally Deaf individuals?
Data was collected from 12 congenitally Deaf participants who were fluent in sign language, 12 hearing non-signers, and 12 hearing adults who were born to Deaf parents and exposed to sign language from birth. The key findings were:
(1) All participants showed differences in the response to the same and different written words in language regions, supporting previous studies showing these areas are involved in processing written words. Interestingly, Deaf signers and hearing signers, but not hearing non-signers, showed adaptation to signed words in both language regions, suggesting that these areas are also involved in processing sign language.
(2) The results indicated that distinct neuronal populations for words and signs may exist in both language regions in Deaf participants. Hearing signers showed evidence for shared representations of words and signs; however, one possibility is that hearing signers were silently rehearsing both the signed words and the written words. Further work is ongoing to determine whether this is a plausible explanation for this finding.
(3) There was no response to visual stimuli (words or signs) for Deaf or hearing participants in early auditory cortex. This suggests that sign language usage does not promote early auditory cortex to respond to vision in Deaf individuals.
(4) Additional exploratory analysis showed that individuals who were born into Deaf families and learnt sign language at an early age showed the most sensitivity to sign language in language regions of the brain. This could suggest that a delay in language exposure in childhood may have long-term consequences for the language areas of the brain. Further studies on a larger population will be required to elucidate this.

Data was collected from 12 congenitally Deaf participants fluent in sign language, and 12 hearing non-signers. The key findings were:
(1) A whole brain group analysis for all Deaf participants and all hearing participants revealed a response to both vision tasks and both touch tasks in the right auditory cortex of congenitally Deaf individuals, but no such responses in hearing participants.
(2) There was a response to tactile stimuli, but not visual stimuli, in the early auditory area of congenitally Deaf individuals. Instead, only later auditory areas (i.e. areas that coincide with language parts of the brain) responded to both vision and touch. This suggests that sign language usage does not promote early auditory cortex to respond to vision in Deaf individuals.

The findings from this project are yet to be published in peer-reviewed journals, and first drafts are expected by May 2018. However, the findings have been presented at various events, including two PubhD events (audience: public), the Lumesse Learning Lounge (audience: the learning industry), and St Andrews Healthcare (audience: psychiatrists/psychologists specialising in Deafness). Upcoming events include a ‘Sharing Good Practice’ day on 6th June (audience: psychiatrists/psychologists specialising in Deafness), and SciBar/Café Scientifique on 25th April (audience: public). Due to the success of this study, we recently received additional funding from the University of Nottingham Biomedical Innovation Funding Initiative to extend the visual-tactile study to individuals with acquired deafness – one of several planned phases.
The findings from this project bring into question the notion that sign language usage promotes early auditory cortex to respond to visual input. However, a remaining question is whether the response to visual stimuli in later auditory regions is linked to sign language usage or CI outcomes. The findings from this project lay the necessary groundwork for examining CP in individuals with acquired deafness. Indeed, as a direct result of the success of this study, we recently received additional funding (£30,000) to extend this study, and data collection is currently underway in acquired deaf individuals who do not know sign language. If the same pattern of CP is present in these participants, this would suggest that sign language is not causing CP, and would provide further information to clinicians who currently discourage parents from teaching sign language to deaf children. Further, if CP is present in acquired deaf participants, the next phase is to study individuals who are undergoing CI, to determine whether the extent of CP is linked to CI outcomes, and if so, whether the extent of CP can be used as a predictor for clinical outcomes. This will be hugely impactful for both patients and medical professionals: the decision to have a cochlear implant can be a difficult one for social, personal, and medical reasons, and patients will be able to make a more informed decision about whether to go ahead with the surgery. Such information will also aid decisions about where to direct limited financial resources.