CORDIS - Results of EU-supported research

Human reaching and grasping - cognitive networks of visual action control

Final Report Summary - GRASP-CN (Human reaching and grasping - cognitive networks of visual action control)

Grasping and manipulating objects are capabilities that are not unique to humans or even primates. However, the range and the flexibility of these human competences are unmatched by any other species. This unique human condition is based upon a complex interaction of sensorimotor control and cognitive processes. Our work aimed at a description of the means and ways through which cognition and multiple sensory channels influence motor control. At what stages of information processing does the recognition of objects and object properties influence our movements? What are the respective cortical and subcortical brain structures that are involved in the synthesis of visual object recognition, semantic and procedural object knowledge, and movement control?
We addressed these questions in examinations of patients who had suffered brain damage due to stroke and in neuroimaging experiments in patients and healthy adults. Some of the patients demonstrated disorders that are directly related to different steps in the process of object manipulation: patients with visual agnosia are unable to recognise an object visually, patients with optic ataxia show spatial inaccuracies of grasping and reaching movements, and patients with apraxia have difficulties in using objects as tools. These disorders are rare, and the diagnostic criteria, especially for optic ataxia, were not well established. We therefore devised new instructions for the examination of optic ataxia and for the subsequent rating of the patient's performance. Screening a large patient group, we demonstrated a high reliability of our procedures and thereby provided, for the first time, common grounds for the diagnosis of optic ataxia.

From previous research we knew that humans produce very precise and stereotypical movements when asked to grasp objects of different sizes: the hand aperture usually scales perfectly with the size of the target object. Our experiments showed that this holds true only for meaningless cuboids. Confronted with everyday objects of different sizes, grasping movements were less accurately related to the actual object sizes. Against our intuitive expectations, semantic knowledge about the target objects seemed to degrade visuomotor pre-planning based on direct visuospatial processing. Our studies in patients with damage to the object-recognition systems indeed showed that these areas contribute to proper motor control, against earlier assumptions of an independence between movement implementation and object recognition. Our neuroimaging work showed that information about an object's identity can be found in dedicated sensorimotor areas of the human brain, far beyond the designated object-recognition systems.
Moreover, we identified a small-scale network of multiple cortical modules at the junction of the parietal and occipital cortex that controls reaching movements to objects in the peripheral visual field, i.e. movements that are far more common in everyday life than the highly controlled movements to targets in the central visual field that are usually examined in experimental studies of human motor control.

We not only found stronger interactions between object recognition and motor control than expected, but also much more crosstalk in the brain between the representations of our hands in the two hemispheres than currently assumed. A small number of primate studies in the 1990s showed that a considerable number of neurons in the primary sensory cortices responded to stimulation of the ipsilateral hand as much as to stimulation of the contralateral hand. However, these findings never found their way into the textbooks and were not followed up. We have now discovered that a small, isolated lesion of the sensory hand representation in one hemisphere affects the position sense of both hands in humans.

In conclusion, the results of our project show that there is much more crosstalk between object recognition and hand motor control than previously expected. Moreover, despite the strong lateralisation of sensory and motor representations of the hand, there is much more bilateral overlap already in early sensory cortex. Both findings show that our hands are not simply controlled by an autonomous motor controller that is fed only veridical physical information about sizes and distances; rather, the identity and meaning of visual objects are taken into account in movement programming and control.