
Functional organization of the multisensory motion system

Periodic Reporting for period 1 - ItsAllAboutMotion (Functional organization of the multisensory motion system)

Reporting period: 2021-02-01 to 2023-01-31

One of the most significant current discussions in the understanding of the human brain concerns the functional recruitment of certain cortical regions for specific tasks, regardless of the sensory modality (e.g. visual, tactile or auditory) in which the stimulus is received. The ability to perceive motion, among other visual properties, is a fundamental faculty of the human brain. Brain lesions that impair the detection and processing of motion have a profound impact on daily activity. Consequently, visual motion processing is one of the most fundamental and well-studied systems in the human brain, canonically thought to develop mainly for the purpose of visual perception. A great deal of research on multisensory responses to motion in the human brain has focused on the middle temporal complex and the superior temporal sulcus. Several studies using both neurophysiological and neuroimaging techniques have demonstrated the multisensory properties of these areas, showing their recruitment during both tactile and auditory motion stimulation. Despite the large body of work on the topic, it is still unclear whether the recruitment of these areas directly mediates the perception of motion through the different sensory inputs or instead regulates responses within the primary sensory areas involved in the task. The outcome of this MSCA will inspire novel research lines on how the brain reorganizes in the case of sensory loss and on clinical applications such as rehabilitation programs that aim to restore function through other sensory modalities.
WP1: Piloting and development of the experimental setup for the high-resolution 7T fMRI experiment in human subjects during visual, auditory and tactile motion stimulation. Three different sets of motion-direction stimuli, one per sensory modality, were designed and implemented in PsychoPy. For the visual condition, three sets of stimuli were employed (full-field random dots and two sine-wave gratings, at low and high spatial frequency respectively; see Figure 1). The same set of stimuli was reproduced in the auditory domain using a special audio setup developed in collaboration with Prof. Collignon, University of Louvain, Belgium. For the tactile stimulation a magneto-compatible tactile device was used. The device consists of a lubricated pad slowly moved by a slide. Three different pads were employed: two exhibited a pattern of parallel ridges and grooves, at high and low spatial frequency respectively, and the third a pattern of circular dots (see Figure 1).
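As an illustration of how such motion-direction stimuli can be set up in PsychoPy, a minimal sketch is given below. The stimulus classes (DotStim, GratingStim) are standard PsychoPy components, but all parameter values (dot count, spatial frequencies, drift rate, window settings, 10-s block duration aside) are illustrative assumptions rather than the values used in the study.

```python
from psychopy import visual, core

# Assumes a calibrated monitor profile; without one PsychoPy falls back to defaults.
win = visual.Window(size=(1024, 768), units='deg', color='grey', fullscr=False)

# Full-field random dots (100% coherent) and two sine-wave gratings at low/high
# spatial frequency. All parameter values here are illustrative placeholders.
dots = visual.DotStim(win, nDots=300, fieldSize=(20, 20), dotSize=4,
                      coherence=1.0, dotLife=-1, speed=0.1, dir=0)  # dir sets dot drift direction
grating_low = visual.GratingStim(win, tex='sin', size=20, sf=0.5, ori=90)
grating_high = visual.GratingStim(win, tex='sin', size=20, sf=4.0, ori=90)

def show_motion_block(stim, duration=10.0, drift_hz=1.0, rightward=True):
    """Present one 10-s motion block; the sign of the phase step sets direction."""
    clock = core.Clock()
    step = (drift_hz / 60.0) * (1 if rightward else -1)   # assumes ~60 Hz refresh
    while clock.getTime() < duration:
        if hasattr(stim, 'phase'):       # gratings drift by advancing their phase
            stim.phase += step
        stim.draw()                      # dots drift on their own via speed/dir
        win.flip()

show_motion_block(grating_low, rightward=True)
show_motion_block(dots)
win.close()
```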
Implementation of the 2D BOLD MRI sequence (2D-EPI, multiband factor R=3, spatial resolution 1.5x1.5x1.5 mm³, TR=1500 ms). The sequence is currently used in the CIBM AIT team and shared across scientists. Due to the coils available, we were not able to achieve 0.8 mm spatial resolution; we therefore implemented a sequence with the highest achievable spatial resolution, as described in the risk management plan section.

WP2: Experiments were conducted on 15 healthy human volunteers using a 7T MRI scanner (Siemens Healthineers). Each motion block presented either visual or auditory stimuli moving from left to right or from right to left for 10 s. For the visual condition, three sets of stimuli were employed (see WP1). Conditions were randomly interleaved within a single run, yielding 3 trials per condition per run. The second session consisted of 5 runs of tactile stimulation. Each run consisted of three 10-s motion blocks per pad, in which the pad moved up and down along the right palm of each participant. Preliminary results: during the visual motion localizer, subjects consistently exhibited positive BOLD responses in the human MT complex (hMT+) and early visual cortex. As expected, the random-dot condition of the visual stimulation task elicited strong activation in hMT+ and early visual cortices (Figure 2). Using the same motion stimulation in the auditory domain, we found two clusters of significant activation, one located in the primary auditory cortex (A1) and the second in visual areas V2/V3 (Figure 2). Using univariate analysis, we did not find an engagement of hMT+ during the auditory motion conditions. Further analyses are currently in progress. Preliminary results were presented at the LINE Retreat in Nendaz (October 2022), and an abstract for IMRF 2023 has been submitted.
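The randomly interleaved block structure described above could be generated along the lines of the sketch below. The condition labels, absence of baseline/rest blocks and seed handling are assumptions for illustration, not the actual presentation code; only the 10-s block duration and the 3 trials per condition per run come from the report.

```python
import random

# Hypothetical condition labels matching the three visual stimulus types (see WP1)
CONDITIONS = ["random_dots", "grating_low_sf", "grating_high_sf"]
DIRECTIONS = ["left_to_right", "right_to_left"]
BLOCK_DURATION_S = 10.0      # 10-s motion blocks, as described in WP2
TRIALS_PER_CONDITION = 3     # 3 trials per condition per run

def make_run_schedule(seed=None):
    """Return a randomly interleaved list of (condition, direction, duration) blocks."""
    rng = random.Random(seed)
    blocks = [(cond, rng.choice(DIRECTIONS), BLOCK_DURATION_S)
              for cond in CONDITIONS
              for _ in range(TRIALS_PER_CONDITION)]
    rng.shuffle(blocks)
    return blocks

if __name__ == "__main__":
    for block in make_run_schedule(seed=0):
        print(block)
```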
WP3: TMS training was completed at the University of Bern. Due to technical issues with the equipment, it was not possible to perform the TMS experiment at the host institution (CHUV). An agreement was therefore made with the EPFL Biotech research center and the Human Neuroscience Platform in Geneva to use their neuromodulation facility.
We applied double-pulse TMS over hMT+ in each subject (the same subjects who performed the MRI experiment) during visual or auditory motion stimulation at two different spatial frequencies. The functional visual localizer acquired at 7T was used to localize hMT+ in each subject. Subjects were asked to detect first the presence of motion and then the direction of motion (left to right or right to left). Results in the visual domain showed a reduction in the percentage of correct responses for motion versus no-motion stimulation as a consequence of the TMS pulse. The pilot did not show any differences in the auditory motion domain, which might be due to interference of the TMS pulse noise with the auditory stimulation. Further studies need to be implemented.
WP4: Due to Covid, no patients were available from the HUG hospital in Geneva. We therefore actively engaged in a collaboration with UMC Utrecht in the Netherlands. We used previously collected data from 4 patients who underwent electrocorticography (ECoG) measurements to investigate neuronal responses to visual motion at different spatial and temporal frequencies. We found that hMT+ encodes motion via two different mechanisms. The first allows distinct and independent selectivity for the spatial and temporal frequencies of the visual motion stimuli; the second implies pure tuning for the speed of motion. We showed that the two mechanisms occur in different neuronal groups within hMT+, with the largest subregion of the complex showing separable tuning for the spatial and temporal frequency of the visual stimuli. The paper was published in Human Brain Mapping last year: https://pubmed.ncbi.nlm.nih.gov/36637226/. Results were presented in several talks and at the Human Brain Mapping conferences in 2021 and 2022.
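To make the distinction between the two mechanisms concrete, the sketch below contrasts a separable model (independent Gaussian tuning for log spatial and log temporal frequency) with a speed-tuned model (Gaussian tuning for speed = temporal frequency / spatial frequency). This is a generic numerical illustration of the two tuning schemes, not the fitting procedure used in the published analysis; all parameter values are arbitrary.

```python
import numpy as np

# Grids of spatial frequency (cyc/deg) and temporal frequency (Hz), log-spaced
sf = np.logspace(-1, 1, 50)          # 0.1 .. 10 cyc/deg
tf = np.logspace(-0.5, 1.5, 50)      # ~0.3 .. ~30 Hz
SF, TF = np.meshgrid(sf, tf)

def separable_response(SF, TF, sf0=1.0, tf0=4.0, sigma=0.5):
    """Independent Gaussian tuning for log SF and log TF (separable mechanism)."""
    return (np.exp(-(np.log2(SF / sf0) ** 2) / (2 * sigma ** 2)) *
            np.exp(-(np.log2(TF / tf0) ** 2) / (2 * sigma ** 2)))

def speed_tuned_response(SF, TF, speed0=4.0, sigma=0.5):
    """Gaussian tuning for speed = TF / SF (deg/s), regardless of SF and TF per se."""
    speed = TF / SF
    return np.exp(-(np.log2(speed / speed0) ** 2) / (2 * sigma ** 2))

# A separable unit peaks at one (SF, TF) point; a speed-tuned unit responds along
# the diagonal TF = speed0 * SF in log-log space.
R_sep = separable_response(SF, TF)
R_speed = speed_tuned_response(SF, TF)
```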
This MSCA fellowship allowed me to lay the foundations for understanding the neural substrate underlying multisensory motion perception. We discovered that hMT+, an area mainly involved in visual motion processing, encodes motion largely via the spatial features of the stimulation rather than its intrinsic speed, and our preliminary results show that, together with other visual areas, it can decode speed from auditory and tactile motion stimulation, demonstrating its multisensory function.
The outcome of this MSCA will inspire novel research lines on how the brain reorganizes in the case of sensory loss and on clinical applications such as rehabilitation programs that aim to restore function through other sensory modalities. It therefore has a major impact on public health related to blindness, deafness and stroke rehabilitation.
Figure 1: Stimulation paradigm
Figure 2: 7T fMRI activation maps in response to visual and auditory stimulation in 15 healthy volunteers