Periodic Reporting for period 1 - CP-FunMoD (Cross-modal plasticity and functional modularisation in the deaf)
Reporting period: 2015-10-01 to 2017-09-30
However, there is little scientific evidence to support this notion, and little evidence of a direct causal link between sign language, cross-modal plasticity (CP), and cochlear implant (CI) outcomes. Moreover, there is evidence supporting the opposite view: that CP is actually beneficial to outcomes, and that learning sign language encourages the language areas of the brain to develop properly, resulting in better outcomes. Consequently, preventing Deaf children from learning sign language could be having the opposite of the intended effect, causing worse CI outcomes as well as other negative outcomes associated with an inability to communicate effectively, such as poor mental health and quality of life. It is therefore crucial to examine the link between sign language usage in Deaf individuals, CP, and CI outcomes in more detail, to determine how, if at all, they are linked.
This project determines whether CP occurs in Deaf individuals who use sign language as their primary means of communication and, if so, in which parts of the auditory cortex. Understanding the nature of CP in congenitally Deaf individuals is the first step towards understanding: i) the relationship between CP and CI outcomes, which could form the basis of a predictor of CI outcomes; and ii) whether sign language is detrimental to such outcomes. Specifically, the project asks: 1) Does CP occur in the auditory cortex of congenitally Deaf individuals? 2) Does CP occur in early parts of the auditory cortex (consequently having a knock-on effect on all auditory processing), or only in later parts (i.e. areas associated with language processing)? 3) To what extent are auditory and language areas involved in processing sign language in congenitally Deaf individuals?
STUDY 1: THE RESPONSE IN LANGUAGE AND AUDITORY REGIONS TO WRITTEN AND SIGNED WORDS
Data were collected from 12 congenitally Deaf participants who were fluent in sign language, 12 hearing non-signers, and 12 hearing adults who were born to Deaf parents and exposed to sign language from birth. The key findings were:
(1) All participants showed differences in the response to repeated (same) versus different written words in language regions, supporting previous studies showing that these areas are involved in processing written words. Interestingly, Deaf signers and hearing signers, but not hearing non-signers, showed adaptation to signed words in both language regions, suggesting that these areas are also involved in processing sign language.
(2) The results indicated that separate neuronal populations for words and signs may exist in both language regions in Deaf participants. Hearing signers, by contrast, showed evidence for shared representations of words and signs; however, one possibility is that hearing signers were silently rehearsing both the signed words and the written words. Further work is ongoing to determine whether this is a plausible explanation for the finding.
(3) There was no response to visual stimuli (words or signs) in early auditory cortex for either Deaf or hearing participants. This suggests that sign language usage does not cause early auditory cortex to become responsive to visual input in Deaf individuals.
(4) An additional exploratory analysis showed that individuals who were born into Deaf families and learnt sign language at an early age exhibited the greatest sensitivity to sign language in the language regions of the brain. This could suggest that a delay in language exposure in childhood may have long-term consequences for the language areas of the brain. Further studies on a larger population will be required to elucidate this.
STUDY 2: THE RESPONSE IN AUDITORY CORTEX TO VISION AND TOUCH
Data were collected from 12 congenitally Deaf participants fluent in sign language and 12 hearing non-signers. The key findings were:
(1) A whole-brain group analysis for all Deaf participants and all hearing participants revealed responses to both vision tasks and both touch tasks in the right auditory cortex of congenitally Deaf individuals, but no such responses in hearing participants.
(2) There was a response to tactile stimuli, but not to visual stimuli, in the early auditory area of congenitally Deaf individuals; only later auditory areas (i.e. areas that coincide with the language parts of the brain) responded to both vision and touch. This further suggests that sign language usage does not cause early auditory cortex to become responsive to visual input in Deaf individuals.
The findings from this project are yet to be published in peer-reviewed journals; first drafts are expected by May 2018. However, the findings have been presented at various events, including two PubhD events (audience: public), the Lumesse Learning Lounge (audience: the learning industry), and St Andrews Healthcare (audience: psychiatrists/psychologists specializing in Deafness). Upcoming events include a ‘Sharing Good Practice’ day on 6th June (audience: psychiatrists/psychologists specializing in Deafness) and a SciBar/Café Scientifique event on 25th April (audience: public). Due to the success of this study, we recently received additional funding from the University of Nottingham Biomedical Innovation Funding Initiative to extend the visual-tactile study to individuals with acquired deafness, one of several planned phases.