
Can humans embody augmentative robotics technology?

Periodic Reporting for period 3 - EmbodiedTech (Can humans embody augmentative robotics technology?)

Reporting period: 2020-02-01 to 2021-07-31

The issues being addressed: Wearable technology is redefining the boundaries of our own body. Wearable robotic (WR) fingers and arms are robots designed to free up or complement our hand actions, enhancing human abilities. While tremendous resources are being dedicated to the development of this groundbreaking technology, little attention is given to how the human brain might support it. The intuitive, though unfounded, view is that technology will fuse with our bodies, allowing our brains to seamlessly control it (i.e. embodied technology). This implies that our brain will share resources, originally devoted to controlling our body, to operate WRs. Understanding the requirements for a successful interface between brain and technology at the onset of its development is critical for realising the promise that WR technology offers.

Importance for society: The ability to control WRs could revolutionise work environments that depend on manual cooperation between employees (e.g. surgical teams, construction workers). Ultimately, WR limbs could offer both greater dexterity and greater strength than human limbs. As such, successful implementation of this technology could profoundly change the way we interact with our environment, with incredible long-term commercial, healthcare and societal impact. But this vision is restricted by the human brain’s ability to successfully and safely control extra limbs. By identifying and pushing the boundaries of body representation, the project is laying a crucial cornerstone towards the embodiment of wearable robotics. The tools developed for indexing embodiment will be invaluable for psychologists, clinicians, neuroscientists and engineers studying or utilising this phenomenon. Importantly, by providing a program for enhancing prosthetic limb usage, the project will have direct consequences for the rehabilitation of tens of thousands of amputees living in Europe and many others across the world. Finally, studying the human brain through the lens of WR fingers and arms also provides fresh insight into the fundamental mechanisms of plasticity in body representation.

Overall objectives: Here we elucidate the conditions necessary for technological embodiment, using prosthetic limbs as a model. We build upon knowledge gained from rehabilitation, experimental psychology and neuroscience to characterise and extend the boundaries of body representation towards successful adoption of WRs. We combine behavioural, physiological and neuroimaging tools to address five key questions that currently obscure the vision of embodied technology: What conditions are necessary for a person to experience an artificial limb as part of their body? Would the resources recruited to control an artificial limb be shared, or rather conflict, with human body representation? Will the successful incorporation of WRs disorganise representations of the human limbs? Can new sensory experiences (touch) be intuitively inferred from WRs? Can the adult brain support the increased motor and cognitive demands associated with successful WR usage? We first focus on populations with congenital and acquired hand loss, who differ in brain resources due to plasticity, but experience similar daily-life challenges. We then test body representation in able-bodied people while they learn to use WR fingers and arms. Together, this research program provides the first foundation for guiding how to successfully incorporate technology into our body representation.
WP1: Indexing embodied technology. Here we test the relationship between successful prosthesis usage in individuals with congenital and acquired hand loss and a range of experimental parameters, aimed at probing hand representation at multiple sensory, motor and cognitive levels (embodiment). Hand representation is being probed in an array of sensorimotor and cognitive domains, using new tasks that we developed for this purpose. For example, we found that people who report feeling their prosthesis to be more like a part of them also tend to use the prosthesis during speech co-gesture (when you use your hands during speech to emphasise or elaborate on the meaning of your words). In addition, we utilise recent advances in fMRI analysis (brain decoding) to identify the extent to which hand-related brain areas represent a prosthetic limb. For example, we found that people who use their prosthesis more in daily life also show a representation of the prosthesis that is more distinct from that of a hand, and in fact also from that of a tool. Instead, we found that prosthesis users represent a prosthesis more like other artificial limbs, resulting in a new categorical representation in high-order visual cortex. These studies involve some of the largest cohorts of one-handed participants tested to date.
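
To make the decoding logic concrete, the minimal sketch below illustrates the kind of representational-distance analysis involved: comparing multivoxel activity patterns evoked by a hand, a prosthesis and a tool, where larger distances indicate more distinct representations. The data are simulated and all names are illustrative assumptions, not the project's actual analysis pipeline.

```python
# Minimal sketch of a representational-distance analysis of the kind alluded to
# above. The simulated data and variable names are hypothetical; the real study
# used fMRI response patterns from hand-related cortical areas.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)

# Simulated multivoxel response patterns: one vector of voxel responses per
# condition. In a real analysis these would come from GLM estimates.
conditions = ["hand", "prosthesis", "tool"]
n_voxels = 200
patterns = {c: rng.normal(size=n_voxels) for c in conditions}

# Pairwise correlation distances between condition patterns: larger values mean
# the region represents the two items more distinctly.
mat = np.vstack([patterns[c] for c in conditions])
rdm = squareform(pdist(mat, metric="correlation"))

for i, a in enumerate(conditions):
    for j, b in enumerate(conditions):
        if j > i:
            print(f"{a} vs {b}: distance = {rdm[i, j]:.2f}")
```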

WP2: Training to use a new arm. A fundamental barrier to the assimilation of any new technology is the difficulty of learning to operate it. We wish to understand how we can improve training to use a prosthesis, whether this differs from training to use a new tool, and whether we could harness the power of embodiment to enhance prosthesis training. One approach we have taken is studying individuals who have spontaneously developed incredible motor dexterity with their artificial limbs, to try to understand what makes them so successful at prosthesis usage. We also tested professional tool-users (litter pickers), and found that, similar to the one-handers, they tend to represent their tool (the litter picker) as more distinct from hands, relative to naïve controls. In collaboration with Coapt and the NIH, we are now testing two-handed individuals before and after training to use a robotic arm, which is placed in front of their own hand. The robotic arm is controlled via muscle recordings from the biological hand. While one group can control the arm intuitively, by producing similar movements with their biological and robotic hands, a second group needs to produce different movements to achieve the same goal. This allows us to test whether biomimetic interfaces can benefit prosthesis control.
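
The contrast between the two training groups can be illustrated with a small, hypothetical sketch of pattern-recognition control: a classifier decodes the intended hand movement from muscle-activity features, and the decoded movement is mapped to a robotic-arm command either biomimetically (same movement) or through an arbitrary re-mapping (different movement). The EMG features below are simulated and the mappings invented for illustration; this is not the actual control system used in the study.

```python
# Illustrative sketch (not the project's actual controller) of pattern-recognition
# control from muscle signals, contrasting a biomimetic mapping with a re-mapped,
# non-biomimetic one. All data here are simulated and names are hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
gestures = ["open", "close", "pinch"]

# Simulated EMG feature vectors (e.g. mean absolute value per electrode),
# 50 repetitions per gesture, 8 electrodes.
X_train = np.vstack([rng.normal(loc=i, size=(50, 8)) for i in range(3)])
y_train = np.repeat(gestures, 50)

decoder = LinearDiscriminantAnalysis().fit(X_train, y_train)

# Biomimetic mapping: the decoded gesture drives the same robotic-arm action.
biomimetic = {"open": "open", "close": "close", "pinch": "pinch"}
# Non-biomimetic mapping: the user must produce a different movement
# to achieve the same robotic action.
remapped = {"open": "pinch", "close": "open", "pinch": "close"}

sample = rng.normal(loc=2, size=(1, 8))   # a new, unseen EMG feature vector
decoded = decoder.predict(sample)[0]
print("decoded gesture:", decoded)
print("biomimetic command:", biomimetic[decoded])
print("re-mapped command:", remapped[decoded])
```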


WP3: Supernumerary robotic fingers. Wrist-mounted WR fingers are currently being developed to enhance the function and capability of the hand. For example, the Third Thumb, designed by Dani Clode, gives you an extra opposable thumb, making your right hand also a left hand. Beyond the exciting promise that robotic fingers hold for clinical and commercial utilisation, these devices provide a relatively simple model for studying the consequences of bio-robotic synergies on body representation. We studied detailed hand representation using neuroimaging to explore whether the changed human hand movements (due to the augmentative robotic fingers) would also induce a change in finger maps in the primary somatosensory cortex (due to altered schedules of coordinated sensory input). For this purpose, two-handed participants trained in and outside the lab to use the robotic finger over the course of five days. We tracked their performance and their ability to coordinate their finger movements with those of the robotic finger, and also asked participants how much they felt the robotic finger was embodied. We found that people break their natural finger synergies while working with the Third Thumb. We also found that natural hand representation was altered following intensive Third Thumb usage. This demonstrates that motor augmentation has real implications for natural body representation.
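
One common way to quantify finger synergies, sketched below under simplifying assumptions, is to ask how much of the variance in finger joint-angle recordings is captured by a small number of principal components; a drop in that value after training with the extra robotic digit would be consistent with participants breaking up their habitual synergies. The data are simulated and the analysis is illustrative, not the study's actual method.

```python
# Hedged sketch of a synergy index: the share of joint-angle variance captured
# by the first few principal components. Simulated data, hypothetical names.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Simulated joint angles: time samples x 5 fingers, built from two shared
# "synergy" signals plus noise, so the fingers co-vary strongly.
synergy_signals = rng.normal(size=(1000, 2))
mixing = rng.normal(size=(2, 5))
angles = synergy_signals @ mixing + 0.1 * rng.normal(size=(1000, 5))

pca = PCA().fit(angles)
explained = np.cumsum(pca.explained_variance_ratio_)
print("variance explained by first 2 components:", round(explained[1], 3))
# A lower value after training with the robotic digit would indicate weaker
# (i.e. broken-up) natural finger synergies.
```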

WP4: Tactile feedback. We wish to understand how best to provide tactile feedback from an artificial limb. Tactile feedback is essential for successful motor control, and is also considered a key ingredient for embodiment. Presently, none of the commercially available prostheses have purpose-designed sensory feedback. The challenge that engineers face is how best to “touch” a missing limb. A commonly stated issue is that people find the substitutionary feedback (from the prosthetic fingers to the forearm) to be unintuitive, and therefore difficult to implement. There is a need to improve the integration of substitutionary tactile feedback into the sensory and motor systems. First, we tried to train the central nervous system to associate substitutionary tactile feedback with normal hand function. This customised training was designed to increase shared processing between the fingers of the hand and the referral zone, thus providing the CNS with a new context by which to interpret subsequent substitutionary input on the referral zone. We found that people who became better at comparing tactile information across the two hands were also better at performing a related motor task requiring bimanual stabilisation of an object with an unknown centre of mass. In another study, which is still ongoing, we trained people to use the Third Thumb (see above) after undergoing a pharmacological nerve block (using either a local anaesthetic or, in a separate group, saline as a sham intervention). This meant that the blocked group was not receiving bodily feedback (in terms of tactile input and proprioception) from the effector controlling the robotic thumb. Preliminary results show that although both groups are able to train to use the robotic thumb similarly, the blocked group did not benefit as much from training, as evidenced by poorer performance on the second day of the study. This indicates that bodily feedback is important for training to use a prosthesis, even if it originates from a different body part, so long as this body part is relevant for controlling the device.
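
As a rough illustration of the substitutionary feedback idea discussed above, the toy sketch below re-routes pressure sensed at each prosthetic fingertip to a stimulation site on the forearm. The finger-to-site mapping, sensor range and function names are hypothetical, purely to make the concept concrete.

```python
# Toy sketch of sensory substitution: pressure measured at each prosthetic
# fingertip is re-routed to a stimulation site on the forearm. The mapping and
# scaling are hypothetical and purely illustrative.

# Which forearm stimulation site (index) "stands in" for each prosthetic finger.
finger_to_site = {"thumb": 0, "index": 1, "middle": 2, "ring": 3, "little": 4}

def forearm_stimulation(fingertip_pressure_newtons):
    """Map fingertip pressures (N) to normalised stimulation intensities (0-1)."""
    max_pressure = 10.0  # assumed sensor range
    return {
        finger_to_site[finger]: min(pressure / max_pressure, 1.0)
        for finger, pressure in fingertip_pressure_newtons.items()
    }

print(forearm_stimulation({"thumb": 2.5, "index": 8.0, "middle": 0.0,
                           "ring": 12.0, "little": 1.0}))
```
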
We have pushed beyond the state of the art on multiple fronts, including developing new indices for measuring embodiment (WP1), running the first fMRI studies with intuitive control (using pattern recognition of muscle activity) (WP2), running the first comprehensive studies of how motor augmentation impacts our own body representation (WP3), and creating new models for thinking about the challenges of invasive brain-machine interfaces (WP4). For the rest of the project we will use this newly gained knowledge for translational purposes, mainly working closely with patients who could benefit from the technologies we are exploring (e.g. arm fracture patients, amputees, individuals with congenital hand loss). We also plan a collaboration with the creators of the MetaLimb to study augmentation with an additional pair of arms.
The Third Thumb, designed by Dani Clode