
Sensory integration for limb localization and action

Final Report Summary - LOCANDMOTION (Sensory integration for limb localization and action)

The aim of this project was to gain further insight into the relations between vision and proprioception. Across the different experiments and papers we found that proprioception may not be very useful for identifying the position of the hand relative to a target; indeed, when people make multiple movements in the dark, their hands can drift far from the targets without the participants' knowledge. We showed that this drift can be explained by a participant's prior belief that their motor commands are very good (a belief gleaned, we propose, from daily experience with accurate full-vision movements); movement error accumulates until the discrepancy signalled by proprioception offsets trust in the motor command. In a separate study we asked participants to report the perceived position of their unseen hand after a movement and observed that their reports were attracted toward the target, supporting the notion that expectations about motor outcome may override sensory input.

These two studies addressed localizing the stationary hand (after movement). In two other studies we considered localization during the movement. In one study we examined how vision and proprioception may be used differently for online control, depending on the nature of the target: if the target was visual, visual information was more useful for online control; if the target was proprioceptive, people relied more on proprioception. In another study we outlined a model to explain why people tend to overestimate the position of their unseen hand during movement (and also why they tend to undershoot the target of their reach). The findings from the project advance our understanding of how people use proprioception, a sensory modality that is not as well understood as vision.
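To illustrate the drift explanation above, the following is a minimal sketch, not the project's fitted model, of how a reliability-weighted combination of a trusted motor prediction and noisy proprioception can let the real hand drift away from the target while the believed hand position stays close to it. The weight on the prediction, the motor bias, and the noise levels are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

target = 10.0          # target position (arbitrary units)
true_hand = 0.0        # actual hand position
believed_hand = 0.0    # participant's estimate of hand position

w_pred = 0.9           # assumed weight on the predicted movement outcome
                       # (high: participant trusts that commands land where intended)
motor_bias = -0.5      # assumed systematic undershoot per movement
prop_noise = 1.0       # assumed proprioceptive noise (s.d.)

for trial in range(20):
    # Plan a movement to cover the believed remaining distance to the target.
    command = target - believed_hand
    # The actual movement is biased and noisy; without vision this error is unseen.
    true_hand += command + motor_bias + rng.normal(0, 0.3)

    predicted = believed_hand + command               # "my command worked" prediction
    proprioceptive = true_hand + rng.normal(0, prop_noise)

    # Reliability-weighted combination: heavy trust in the prediction discounts
    # most of the proprioceptive discrepancy, so the estimate stays near the
    # target while the real hand drifts away from it.
    believed_hand = w_pred * predicted + (1 - w_pred) * proprioceptive

    print(f"trial {trial + 1:2d}: true hand {true_hand:6.2f}, believed {believed_hand:6.2f}")
```

In this sketch the drift saturates once the accumulated proprioceptive discrepancy, weighted by (1 - w_pred), becomes large enough to cancel the per-movement bias, mirroring the idea that error accumulates until proprioception offsets trust in the motor command.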
These results can have an impact on the future design of human-computer interfaces, especially in how people will have to deal with differential delays between different sensory modalities.