CORDIS - EU research results
Content archived on 2024-06-18

Body-Object Integration (BOI): The neurocognitive basis of integrating conceptual object knowledge in the body representation

Final Report Summary - BOI (Body-object integration (BOI): The neurocognitive basis of integrating conceptual object knowledge in the body representation)

A main objective of the Intra-European Fellowship (IEF) research project BOI was to identify the neurocognitive mechanisms underlying bodily interaction with objects at both a proximal and a distal level. Since the beginning of the project, several research lines have been pursued to accomplish this goal. A first line focused on bodily interaction with manipulable objects, a second line focused on bodily interaction with a special class of tools, namely surgical robotics, and a third line investigated the flexibility of the body representation more broadly.

First, a series of behavioural experiments found that manipulable objects (e.g. a cup, a comb), compared to non-manipulable objects (e.g. a television screen, a waste basket), facilitate the integration of multisensory information in peripersonal space (van Elk & Blanke, 2011a). Furthermore, in another series of behavioural experiments we showed that the multisensory integration of objects is differentially affected by proprioceptive and visual information for the upper and lower extremities (van Elk, Forget, & Blanke, in prep.). A third series of experiments investigated whether object familiarity affects multisensory integration in the body representation (van Elk & Blanke, in prep.). Together, these studies indicate that the ease with which the brain integrates object information is determined by several factors, related to both our motor and our visual experience of using these objects.

A second research line focused on the question of which factors facilitate interaction with a special class of objects, namely surgical robotics. When using a surgical robotic system, the user interacts through a haptic interface with a remote tool, thereby extending the body to a distal location. It was found that the integration of information related to the remote tool can be facilitated through active feedback about the movements, haptic feedback about the interaction of the tool with distal objects, and a mapping in which the degrees of freedom of the user's hand match the degrees of freedom of the remote tool (Sengul, van Elk, Aspell, et al., submitted; Sengul, van Elk, Rognini, et al., submitted).

A third line of research investigated the relation between our body representation and the perception of other bodies and external objects. It was found that vestibular stimulation of the body affects the perception of other bodies and objects, likely through a process of visuo-vestibular integration and embodied perspective taking (van Elk & Blanke, 2012). Another study found that the visual identity of an observed body facilitates the integration of multisensory information related to that body, suggesting an interaction between bottom-up stimulus information and top-down knowledge regarding the visual appearance of our body (Salomon, van Elk, Aspell, & Blanke, 2012). Finally, it was found that words referring to our body are strongly associated with a detailed visuospatial representation of our body, thereby supporting an embodied view of cognition (van Elk & Blanke, 2011b).

Together, these studies highlight several important principles underlying the representation of our body and our interaction with objects and other persons. The results of this project have been submitted to, or already published in, peer-reviewed scientific journals. In addition, these findings were presented at international scientific conferences, symposia and lab visits.