
Can humans embody augmentative robotics technology?

Periodic Reporting for period 4 - EmbodiedTech (Can humans embody augmentative robotics technology?)

Reporting period: 2021-08-01 to 2023-01-31

Wearable technology is redefining the boundaries of our own body. Wearable robotic (WR) fingers and arms are robots designed to free up or complement our hand actions and thereby enhance human abilities. While tremendous resources are being dedicated to the development of this groundbreaking technology, little notice is given to how the human brain might support it. The intuitive, though unfounded, view is that technology will fuse with our bodies, allowing our brains to seamlessly control it (i.e. embodied technology). This implies that our brain will share resources, originally devoted to controlling our body, to operate WRs. Understanding the requirements for a successful interface between brain and technology at the onset of its development is critical for realising the promise that WR technology offers.

The ability to control WRs could revolutionise the way we interact with our environment, both for increasing productivity and for improving healthcare. As such, successful implementation of this technology could have long-term commercial, healthcare and societal impact. But this vision is restricted by the human brain’s ability to successfully and safely control extra limbs. By identifying and pushing the boundaries of body representation, our project has laid a crucial cornerstone towards understanding opportunities and bottlenecks with regard to embodiment of wearable robotics. The tools developed for indexing embodiment will be invaluable for psychologists, clinicians, neuroscientists and engineers studying or utilising this phenomenon. By considering how to enhance prosthetic limb usage, the project will have direct consequences for amputee rehabilitation. Finally, studying the human brain through the lens of supernumerary robotic fingers also provides fresh insight into the fundamental mechanisms of plasticity in body representation.

Here we aimed to elucidate the conditions necessary for technological embodiment, using prosthetic limbs as a model. We build upon knowledge gained from rehabilitation, experimental psychology and neuroscience to characterise and extend the boundaries of body representation towards successful adoption of WRs. We combine behavioural, physiological and neuroimaging tools to address key questions that are currently obscuring the vision of embodied technology: What conditions are necessary for a person to experience an artificial limb as part of their body? Would the resources recruited to control an artificial limb be shared with, or rather conflict with, human body representation? Will the successful incorporation of WRs disorganise representations of the human limbs? Can new sensory experiences (touch) be intuitively inferred from WRs? Can the adult brain support the increased motor and cognitive demands associated with successful WR usage? We first focus on populations with congenital and acquired hand loss, who differ in brain resources due to plasticity, but experience similar daily-life challenges. We then test body representation in able-bodied people while they learn to use a WR finger – the Third Thumb (Dani Clode Design). Together, this research program provides the first foundation for guiding how to successfully incorporate technology into our body representation.

Although tremendous resources are being devoted to the development of ground-breaking WR technologies, little attention has been afforded to how the human brain might support them. To address the question of technological embodiment empirically and rigorously, we took a combined approach spanning substitution and augmentation.

Prosthetic limbs provide a strong test case for researching the embodiment of WR technologies. In our project we tested the relationship between successful prosthesis usage in individuals with congenital and acquired hand loss and a range of experimental parameters aimed at probing hand representation at multiple experiential, behavioural and neural levels (embodiment). On average, we find that prostheses do not feel like a body part (Maimon-Mor et al., 2020). Nevertheless, those individuals who use their prosthesis more productively in daily life also tended to report a stronger sense of embodiment. Moreover, individuals who use their prosthesis more in daily life show greater activity in hand-selective visual areas, as well as greater functional coupling between the visual and sensorimotor hand areas (van den Heiligenberg et al., 2018). However, despite benefiting from hand-selective cortical resources, the prosthesis is not necessarily ‘embodied’ into people’s body representation. Instead, prosthesis users represented their own prosthesis as more dissimilar to hands than controls did, challenging the traditional view of prosthesis embodiment (Maimon-Mor & Makin, 2020). This result highlights new and exciting opportunities for recycling neural resources devoted to the body to enable engineering solutions that are not strictly biomimetic (known as soft embodiment; Makin, de Vignemont, & Micera, 2020).
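
For readers less familiar with this kind of analysis, the sketch below illustrates how representational dissimilarity between an artificial limb and biological hands is often indexed in multivoxel fMRI work. It is a minimal illustration only, not the project’s actual analysis pipeline: the three conditions, the correlation-distance metric and the randomly generated ‘activity patterns’ are all assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

# Hypothetical multivoxel activity patterns (conditions x voxels), e.g. from
# beta estimates in a sensorimotor region of interest. Random data stand in
# for real patterns here.
patterns = np.random.randn(3, 200)
labels = ["prosthesis", "own_hand", "other_hand"]

# Representational dissimilarity matrix: correlation distance (1 - Pearson r)
# between each pair of condition patterns.
rdm = squareform(pdist(patterns, metric="correlation"))

# A simple index of how (dis)similar the prosthesis representation is to the
# own-hand representation, relative to a control hand.
prosthesis_vs_hand = rdm[labels.index("prosthesis"), labels.index("own_hand")]
control_vs_hand = rdm[labels.index("other_hand"), labels.index("own_hand")]
print(f"prosthesis-hand distance: {prosthesis_vs_hand:.2f}, "
      f"control-hand distance: {control_vs_hand:.2f}")
```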

We next studied how best to merge augmentation technology with the human body and mind. For this purpose, we teamed up with Dani Clode, the designer of the Third Thumb. We investigated whether successful and intuitive motor augmentation with an extra robotic thumb can be achieved, and what its implications are for the representation and function of the biological hand. We ran a series of longitudinal studies in which individuals were trained to develop hand-robot interactions, including both lab-based and unstructured daily usage. Our studies demonstrate that successful integration of motor augmentation can be readily achieved in healthy participants, with the potential for flexible use, reduced cognitive reliance and an increased sense of embodiment of the device (Kieliba et al., 2021). This was, in part, due to surrogate somatosensory information being recruited from the controlling toes to inform the motor command to move the Third Thumb (Amoruso et al., 2022). Importantly, we showed that motor integration of the device resulted in changed natural hand use at the behavioural level and an altered representation of the biological hand at the neural level (Kieliba et al., 2021). This latter finding is of major significance, as it indicates that successful human-robot integration may have consequences for certain aspects of biological body representation and motor control.
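
One way such a change in hand representation can be quantified is by comparing pairwise dissimilarities between finger activity patterns before and after training: a less differentiated hand map shows smaller inter-finger distances. The sketch below is purely illustrative and assumes per-finger patterns from a hand region of interest and a simple correlation distance, rather than the cross-validated measures a real study would rely on.

```python
import numpy as np
from scipy.spatial.distance import pdist

def interfinger_distances(patterns):
    """Pairwise dissimilarities between per-finger activity patterns
    (fingers x voxels), using correlation distance as an example metric."""
    return pdist(patterns, metric="correlation")

# Hypothetical pre- and post-training patterns for five fingers (random data
# standing in for real estimates from a hand region of interest).
pre = np.random.randn(5, 300)
post = np.random.randn(5, 300)

# A reduced mean inter-finger distance after training would indicate a less
# differentiated ("collapsed") representation of the biological hand.
print("mean inter-finger distance, pre :", round(float(interfinger_distances(pre).mean()), 3))
print("mean inter-finger distance, post:", round(float(interfinger_distances(post).mean()), 3))
```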

Together, our research demonstrates that embodiment of an artificial limb is not trivial. Although, in principle, opportunities exist for harnessing hand neural and cognitive resources to control artificial limbs, the brain does not assimilate neural representations for the artificial limb with those for the biological body, creating opportunities for nonbiomimetic technological interfaces.
We have pushed beyond the state of the art on multiple fronts, including developing new indices for measuring embodiment and intuitive use of WRs for augmentation, running the first fMRI studies with intuitive control of artificial limbs, conducting the first comprehensive studies of how motor augmentation impacts our own body representation, and creating new models for thinking about the challenges of human-machine interfaces.

The Third Thumb, designed by Dani Clode