Periodic Reporting for period 2 - NovelExperiSENSE (How experience shapes brain specializations)
Reporting period: 2019-10-01 to 2021-03-31
The Nobel laureates Hubel & Wiesel showed that visual deprivation in early life causes a permanent deficit in the development of the visual system, in its division of labor into specialized brain areas, and in its function. They concluded that brain plasticity is limited by the closure of critical periods in development. Following their seminal studies, it became nearly a dogma in neuroscience that after the critical periods close in early childhood, brain plasticity is irreversibly and dramatically diminished, especially if the proper visual experience was absent during childhood. Our research project’s main goal is to better understand the relationship between critical periods of development, on the one hand, and experience-dependent plasticity in the adult brain, on the other. The research conducted with this grant tries to understand the principles driving human brain specializations and plasticity, and to explore how much the organization of the adult brain can be modified by novel sensory experiences. Another general goal of our research is to apply our understanding of the human brain’s plasticity to improve the rehabilitation of patients in the clinical setting. For example, we investigate how to help the hearing impaired better understand speech in noisy environments by providing a new supplementary sensory experience via vibrations on the skin.
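As a rough illustration of the kind of audio-to-touch translation involved, the sketch below maps a speech waveform onto a vibrotactile drive signal. The scheme is purely illustrative, an assumption rather than the project's actual algorithm: it extracts the slow amplitude envelope of the speech and uses it to modulate a fixed carrier near 250 Hz, a frequency range where the skin is highly sensitive to vibration.

```python
import numpy as np

def speech_to_vibration(audio, sample_rate, carrier_hz=250.0, env_cutoff_hz=16.0):
    """Illustrative speech-to-vibration mapping (not the project's algorithm)."""
    # Rectify the waveform, then smooth it with a one-pole low-pass
    # filter to obtain the slow amplitude envelope of the speech.
    rectified = np.abs(audio)
    alpha = 1.0 - np.exp(-2.0 * np.pi * env_cutoff_hz / sample_rate)
    env = np.empty_like(rectified)
    acc = 0.0
    for i, x in enumerate(rectified):
        acc += alpha * (x - acc)
        env[i] = acc
    # Use the envelope to modulate a carrier in a vibration-sensitive
    # frequency range of the skin (~250 Hz).
    t = np.arange(audio.size) / sample_rate
    return env * np.sin(2.0 * np.pi * carrier_hz * t)

# Synthetic "speech": silence, a 200 Hz burst, then silence again.
sr = 8000
audio = np.zeros(sr)
audio[2000:6000] = np.sin(2.0 * np.pi * 200.0 * np.arange(4000) / sr)
vibration = speech_to_vibration(audio, sr)
```

The vibration output is silent where the speech is silent and tracks the loudness of the speech elsewhere, which is the essential property such a supplementary tactile channel would need.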
Work performed from the beginning of the project to the end of the period covered by the report and main results achieved so far
Quite surprisingly, our research shows that brain areas thought to be dedicated exclusively to a certain sensory input (e.g. vision) can use inputs from other sensory modalities (e.g. sounds and touch) to develop their typical functional specialization. Moreover, this specialization can occur suddenly in the adult brain, even in adults who were never exposed to the relevant, supposedly critical, sensory input. Take, for example, a congenitally blind person who never saw a human face in his life. In his brain, the area usually dedicated to processing faces did not develop in early childhood, as it was completely deprived of visual input. We demonstrated that if you teach a congenitally blind person to “see” faces with sounds, the face-processing area in his brain will become active! We did this with the EyeMusic, a sensory substitution device that translates images into sequences of sounds (“soundscapes”): tones of a higher pitch denote higher locations in the image, whereas their ordering in time corresponds to position from left to right. We scanned the brain of our most proficient blind-from-birth user (MaMe) using functional magnetic resonance imaging, before the training and after he had successfully learned to interpret soundscapes. After learning, his brain was activated not only in its hearing parts but also in its seeing parts. Not only that, the activation followed a pattern of topographical maps, the highly ordered manner in which external stimuli are mapped in the sensory systems of the human brain. For example, in the visual cortex of sighted people, neighboring locations are “mapped” onto adjacent neurons, whereas places at a great distance from one another are represented by brain areas that are further apart. MaMe's scans revealed topographic maps tuned to pitch and time, or even to both concurrently, that had not existed in his brain before he started the training.
For instance, tones of a similar pitch were represented by adjacent brain areas, whereas those of radically different pitches were represented by brain areas distant from one another. This is the first time sensory topographic maps have been shown to emerge in an adult human brain. These findings suggest that the brain's sensory regions can be adapted to processing novel sensory experiences. In fact, past studies by other labs (e.g. the Sur lab at MIT) had shown in newborn laboratory animals that it is possible to “move” the vision function to the brain's hearing region by surgically rerouting nerve fibers carrying visual input into the auditory pathway. This new study supplies a proof of concept that a similar transition can be achieved noninvasively, and in an adult, rather than infant, brain. So critical periods are not permanent cut-off points for developing new sensory abilities; rather, in a way, we can give the brain a second chance at any point in life. We suggest that the EyeMusic can teach people to develop an ability similar to that of bats and dolphins: extracting information about geometric shapes from complex sounds. A major difference, of course, is that these animals developed their natural ability over hundreds of thousands of years of evolution, whereas in the lab it can be acquired after relatively short training. So with the right technology, one can induce a speeded-up evolution of sorts in the sensory brain. These exciting findings led us to propose a new theory of brain plasticity: the Reversible Plasticity Gradient. We suggest that with the right interventions, we can stop the general decline of brain plasticity over the lifespan and reignite brain plasticity in adulthood. We now further examine how transforming one sensory input into another, with the aid of wearable devices we develop, can help restore brain functionality in adults.
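The pitch-for-height, time-for-horizontal-position encoding described above can be sketched in a few lines of code. This is a minimal illustration of the principle, not the EyeMusic's actual implementation (the real device, for example, also conveys color through musical timbres); the function name and parameters here are our own, chosen for illustration.

```python
import numpy as np

def image_to_soundscape(image, duration=1.0, sample_rate=8000,
                        f_min=220.0, f_max=1760.0):
    """Convert a 2D binary image into a left-to-right audio sweep.

    Each column of the image becomes a short time slice of sound;
    within a column, every "on" pixel contributes a sine tone whose
    pitch rises with the pixel's height (row 0 = top = highest pitch).
    """
    n_rows, n_cols = image.shape
    # Log-spaced frequencies: top row -> f_max, bottom row -> f_min.
    freqs = np.geomspace(f_max, f_min, n_rows)
    samples_per_col = int(duration * sample_rate / n_cols)
    t = np.arange(samples_per_col) / sample_rate
    slices = []
    for col in range(n_cols):
        rows_on = np.flatnonzero(image[:, col])
        if rows_on.size == 0:
            slices.append(np.zeros(samples_per_col))
            continue
        # Mix one sine per active pixel, normalized to avoid clipping.
        tones = np.sin(2.0 * np.pi * freqs[rows_on, None] * t)
        slices.append(tones.sum(axis=0) / rows_on.size)
    return np.concatenate(slices)

# A diagonal line: the listener hears pitch falling as the sweep
# moves from left to right across the image.
img = np.eye(8, dtype=int)
wave = image_to_soundscape(img, duration=0.8, sample_rate=8000)
```

Played back repeatedly, such sweeps let a trained listener recover the shape drawn in the image from sound alone, which is the ability the training described above builds up.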
Moreover, by providing the brain with multisensory input (for example, matching sound and vibration), we can help improve the rehabilitation of patients with a wide variety of sensory deprivations. Another goal of our research is to ignite the brain plasticity of normal healthy adults and help them develop “new senses”. To achieve this ambitious, mind-boggling, nearly sci-fi goal, we develop new sensory experiences never actually experienced before by any person, and deliver them to our subjects’ existing senses with purpose-designed wearable devices. With sufficient training, we expect our subjects’ brains to adapt and seamlessly process these new experiences, creating additional “senses”.
Progress beyond the state of the art and expected potential impact (including the socio-economic impact and the wider societal implications of the project so far)
At this stage, our work is only half done. To create these new sensory experiences, we built at our lab a state-of-the-art multisensory room, equipped with dozens of speakers, special projectors and systems for generating haptic feedback, in which we can control sound, vision and touch with previously unattainable precision. We then “export” these unique new sensory experiences into wearable devices. Each device is designed to receive a certain input that is normally inaccessible to the person wearing it (either because that person suffers from some sort of sensory deprivation, or because the information being conveyed cannot be sensed by humans) and to “translate” this information into another sensory input the person is able to perceive. With sufficient training, which we also develop concurrently, we count on the reignited plasticity of the adult brain to incorporate this new input as if it were another sense. So far we have built several types of wearable devices, each capable of delivering a different experience; we have started testing them and demonstrated proofs of concept. But much work still lies ahead. We plan to recruit more participants; refine our new sensory experiences, as well as our wearable devices; polish our training programs; and above all, continue to conduct longitudinal fMRI brain scans (before, during and after participants learn to master each device) in our newly built imaging centre, and demonstrate that adult brains are much more plastic than previously understood. Who said you cannot teach old dogs new tricks? (For papers, clips etc. see https://www.idc.ac.il/en/research/bct/pages/main.aspx.)