Final Activity Report Summary - PERACT (Perception and action in space)
The findings of P2 suggest a contribution of the ventral 'what' stream, at an unconscious stage, to the control of fast motor actions. Visual illusions are an inherent part of our visual system, and they can be used to probe the limits of this system.
The study of P3 focused on the size illusion during a speed discrimination task. This illusion produces a misperception of stimulus velocity: a small stimulus is perceived as moving faster than a large one. The results of P3 indicate that neuronal activity tends to follow the illusory percept rather than the physical reality.
The focus of P4 was on the neuronal basis of the processing of social cues relevant for perception and action. In a first human fMRI study we identified a brain region that is specifically activated when we perceive and follow another person's gaze towards a target in space. Functionally equivalent brain areas were identified in the monkey brain when the animals perceived and followed the head direction of a pictured conspecific towards a target in space.
P5 mainly tested stroke patients who actively push towards their paralysed side and tend to fall towards that side, a disorder known as the "pusher syndrome". The syndrome is associated with posterior thalamic strokes, but lesions sparing the thalamus, located at the postcentral gyrus and insula, can provoke the disorder as well. Postural balance thus appears to depend on the normal functioning of these cortical and thalamic structures, and it breaks down when they are disconnected by thalamic and/or cortical lesions.

In biological systems, navigation is based on inputs provided mostly by the visual and proprioceptive senses. This input can be divided into positional information (panoramic vision) and egomotion (or odometry) information. Biological systems thus differ substantially from robot navigation systems, which may additionally rely on data provided by laser range finders, GPS, etc.
The goal of P6 was to develop a robotic navigation system relying on panoramic vision and odometry.
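As an illustration of how such a system might combine its two input streams, the following is a minimal sketch, not the project's actual implementation: a planar dead-reckoning estimate from wheel odometry is periodically corrected by a position fix assumed to come from matching the current panoramic view against stored views. All function names, parameters, and values are hypothetical.

    import math

    def odometry_step(pose, d_left, d_right, wheel_base):
        # Dead-reckoning update of a planar pose (x, y, heading) from
        # left/right wheel displacements of a differential-drive robot.
        x, y, theta = pose
        d_centre = (d_left + d_right) / 2.0
        d_theta = (d_right - d_left) / wheel_base
        x += d_centre * math.cos(theta + d_theta / 2.0)
        y += d_centre * math.sin(theta + d_theta / 2.0)
        return (x, y, theta + d_theta)

    def fuse_visual_fix(pose, visual_fix, weight=0.3):
        # Blend the odometry pose with an (x, y) fix assumed to come from
        # panoramic-image matching; a simple complementary filter stands in
        # for a full estimator.
        x, y, theta = pose
        vx, vy = visual_fix
        return ((1 - weight) * x + weight * vx,
                (1 - weight) * y + weight * vy,
                theta)

    # Hypothetical usage: integrate odometry each step, correct whenever a
    # panoramic match is available.
    pose = (0.0, 0.0, 0.0)
    pose = odometry_step(pose, d_left=0.10, d_right=0.12, wheel_base=0.4)
    pose = fuse_visual_fix(pose, visual_fix=(0.11, 0.02))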
Project P7 concentrated on bat echolocation, specifically the bats' ability to classify the echoes they perceive. We first targeted their ability to classify plant echoes and designed an algorithm that can reproduce this ability. We were inspired by research in the visual domain that has helped to explain processes of the visual system, hoping that our work would provide similar insights into the auditory system. In conclusion, we developed a new approach to classifying echoes and other acoustic signals; using this approach, we could learn about the characteristics of the signals and the behaviour of the bats.
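As a simplified illustration of echo classification in general, and not the algorithm developed in P7, the sketch below reduces each echo waveform to a coarse spectral-magnitude signature and assigns new echoes to the nearest class centroid; the class labels, signal lengths, and helper names are invented for the example.

    import numpy as np

    def spectral_signature(echo, n_bands=32):
        # Coarse signature of an echo waveform: FFT magnitude spectrum,
        # averaged into n_bands frequency bands and normalised to unit length.
        spectrum = np.abs(np.fft.rfft(echo))
        bands = np.array_split(spectrum, n_bands)
        sig = np.array([b.mean() for b in bands])
        return sig / (np.linalg.norm(sig) + 1e-12)

    def train_centroids(labelled_echoes):
        # labelled_echoes maps a class label to a list of echo waveforms.
        return {label: np.mean([spectral_signature(e) for e in echoes], axis=0)
                for label, echoes in labelled_echoes.items()}

    def classify(echo, centroids):
        # Assign an echo to the class whose centroid signature is nearest.
        sig = spectral_signature(echo)
        return min(centroids, key=lambda lbl: np.linalg.norm(sig - centroids[lbl]))

    # Hypothetical usage with synthetic waveforms standing in for recorded
    # plant echoes.
    rng = np.random.default_rng(0)
    training = {'conifer': [rng.normal(size=1024) for _ in range(5)],
                'broadleaf': [0.5 * rng.normal(size=1024) for _ in range(5)]}
    centroids = train_centroids(training)
    print(classify(rng.normal(size=1024), centroids))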
P8 carried out research on multi-camera calibration, object localisation, weakly supervised image categorisation, and the automatic discovery of visual taxonomies in data. The research on object localisation resulted in a system capable of localising instances of objects within images with high speed and accuracy. The techniques employed are also suitable for multisensory integration.
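To give a concrete, if simplified, picture of what object localisation involves, the following generic sliding-window sketch, which is not the system developed in P8, scores every fixed-size window of a per-pixel evidence map using an integral image and returns the best-scoring box; all names and values are illustrative.

    import numpy as np

    def best_window(score_map, win_h, win_w):
        # Score every win_h x win_w window of a per-pixel score map (e.g.
        # classifier evidence for the target object) via an integral image,
        # and return the top-scoring box as (row, col, win_h, win_w, score).
        ii = np.zeros((score_map.shape[0] + 1, score_map.shape[1] + 1))
        ii[1:, 1:] = np.cumsum(np.cumsum(score_map, axis=0), axis=1)
        best = None
        for r in range(score_map.shape[0] - win_h + 1):
            for c in range(score_map.shape[1] - win_w + 1):
                s = (ii[r + win_h, c + win_w] - ii[r, c + win_w]
                     - ii[r + win_h, c] + ii[r, c])
                if best is None or s > best[-1]:
                    best = (r, c, win_h, win_w, s)
        return best

    # Hypothetical usage: a random evidence map standing in for real
    # classifier output.
    rng = np.random.default_rng(1)
    print(best_window(rng.normal(size=(60, 80)), win_h=20, win_w=30))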