
Physics-based Remote Touch

Periodic Reporting for period 1 - ReTouch (Physics-based Remote Touch)

Reporting period: 2023-06-01 to 2025-12-31

The ability to feel is central to how humans interact with the world. We rely on our sense of touch to hold a pen, find keys in a pocket, or judge the texture and softness of an object, often without needing to look. In contrast, most robotic systems still operate largely by vision alone. When humans operate robots remotely, such as during telesurgery or in hazardous environments, the lack of realistic touch feedback severely limits precision, safety, and intuitiveness. This project addresses that gap by developing a new generation of tactile technology that allows users to physically “feel” what a robot touches, no matter how far away it is.
The project’s goal is to create a system for real-time remote touch, where tactile information sensed by a robot is faithfully reproduced on a human operator’s skin. This requires compressing the rich, high-resolution data captured by advanced tactile sensors into meaningful cues that humans can intuitively interpret. To do this, the project draws inspiration from how the human nervous system processes touch. By analyzing large databases of skin deformation patterns recorded during contact with different materials and surfaces, the project identifies a compact set of “tactile primitives”—basic units of touch perception that can be combined to recreate complex sensations.
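The idea of compressing high-resolution tactile data into a compact set of primitives can be illustrated with a minimal sketch. This is not the project's actual method or data; it simply assumes, for illustration, that recorded skin-deformation frames are flattened into vectors and reduced with a standard principal-component decomposition, so that each contact can be described by a handful of coefficients.

```python
import numpy as np

# Placeholder data standing in for a database of recorded skin-deformation
# frames: each row is one flattened deformation map (here 32x32 samples).
rng = np.random.default_rng(0)
frames = rng.normal(size=(500, 32 * 32))

# Centre the data and take its SVD; the leading right singular vectors act
# as a compact basis of "tactile primitives".
mean = frames.mean(axis=0)
_, s, vt = np.linalg.svd(frames - mean, full_matrices=False)
k = 8                                       # number of primitives to keep
primitives = vt[:k]                         # (k, 1024) basis patterns

# Any new frame compresses to k coefficients and can be reconstructed:
coeffs = (frames[0] - mean) @ primitives.T  # compact code sent over the link
recon = mean + coeffs @ primitives          # approximate deformation map
print(primitives.shape, coeffs.shape)
```

In this toy setup a 1024-sample frame is transmitted as just 8 numbers; the project's actual primitives are derived from perceptual analysis rather than a purely statistical decomposition.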
These tactile primitives will be implemented in a new haptic device that deforms the user’s skin in precise patterns to simulate the feeling of touching real objects. For example, subtle lateral skin stretches may evoke sensations of slipperiness, while vertical pressure patterns may recreate a sense of hardness or softness. The project will test and refine these sensations in human-user experiments and integrate them into a full remote-touch system. This setup will link a robotic manipulator equipped with a tactile sensor to a haptic interface worn by a human user, enabling true bidirectional interaction through touch.
By combining insights from neuroscience, robotics, mechanical engineering, and computer science, the project takes a highly interdisciplinary approach to a pressing technological need. Its results will have significant scientific, societal, and industrial impacts. In the medical field, remote tactile feedback could improve the precision and safety of robotic surgery by enabling surgeons to feel tissue properties from a distance. In hazardous environments such as nuclear plants or space missions, operators could perform delicate manipulation tasks with much greater confidence. The technology also holds promise for prosthetics, helping users better perceive and control artificial limbs.
In addition to developing new hardware and software tools, the project will make all core results publicly accessible. A comprehensive, labeled database of tactile interactions will be released open-source, fostering collaboration and accelerating progress in tactile sensing research. The methods developed for compressing and reconstructing tactile data will also be made available, along with a tactile processing toolbox that others can build upon.
The project supports the EU’s strategic objectives in robotics, digitalization, and human-machine interaction. It will boost Europe’s leadership in haptics technologies and robotics, with clear paths to application in healthcare, assistive technologies, manufacturing, and beyond. By enabling robots to touch and humans to feel, this project opens new frontiers for safer, more intuitive, and more human-centered technology.
Over the course of the project, we explored how tactile feedback can be understood, replicated, and transmitted to support more intuitive and efficient human-robot interactions. The research focused on three core objectives: understanding tactile perception mechanisms, developing devices to recreate these sensations, and evaluating how much tactile information is necessary for manipulation tasks.
1. Investigating the Perception of Slipperiness through Radial Skin Deformation
The first study addressed the role of skin deformation in the perception of slipperiness—a key tactile property. We observed that when handling delicate or low-friction objects, the skin on the fingertips tends to expand radially. To test whether this deformation pattern actively causes the sensation of slipperiness, we developed a custom tactile device that deforms the skin of both the index finger and thumb using soft, pneumatically actuated membranes. By controlling the rate of radial expansion in proportion to the squeezing force applied, we were able to artificially induce the sensation of slipperiness. Psychophysical experiments confirmed that this specific deformation pattern can evoke the percept of low friction, suggesting a causal link between radial skin strain and friction perception. This represents a significant step toward the biomimetic reproduction of tactile sensations in haptic interfaces.
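The control principle described above, driving radial skin expansion at a rate proportional to the measured squeezing force, can be sketched as a simple integration loop. The gain, update rate, and membrane limit below are illustrative assumptions, not the device's actual parameters.

```python
# Hypothetical control law for the pneumatic membrane: the radial expansion
# rate is proportional to the measured grip force, dr/dt = gain * F, which
# is the pattern the study links to a percept of slipperiness.
def radial_expansion_step(radius_mm, grip_force_n, dt_s,
                          gain=0.4, max_radius_mm=4.0):
    """Integrate one control step and clamp to the membrane's travel limit."""
    radius_mm += gain * grip_force_n * dt_s
    return min(radius_mm, max_radius_mm)

r = 0.0
for _ in range(100):              # 1 s at 100 Hz with a constant 2 N grip
    r = radial_expansion_step(r, grip_force_n=2.0, dt_s=0.01)
print(round(r, 2))                # 0.4 * 2 N * 1 s = 0.8 mm of expansion
```

A stronger squeeze therefore produces faster radial expansion, mimicking how a slippery object escapes the grasp more quickly under higher grip force.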
2. Enabling Remote Softness Perception through Sensor-Actuator Integration
The second major activity focused on closing the tactile feedback loop between a robotic manipulator and a human operator. We integrated the GelSight tactile sensor, a high-resolution, camera-based sensor capable of capturing detailed surface deformations, with a custom-built haptic device called SORI, designed to reproduce softness and stiffness sensations on a user’s fingertip. The GelSight sensor was mounted on a robotic end-effector to collect contact data from manipulated objects. This information was then processed and mapped to corresponding deformations rendered by the SORI interface. The result is a teleoperation system where users can not only control the robot but also feel the mechanical properties of grasped objects in real time. This setup lays the groundwork for more intuitive and immersive remote manipulation, with promising applications in minimally invasive surgery and assistive robotics.
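The sensor-to-actuator pipeline can be summarised as: reduce the GelSight contact map to a stiffness estimate, then map that estimate onto a rendering command for the softness display. The sketch below is a simplification under stated assumptions; the function names, the noise threshold, and the logarithmic command mapping are all illustrative, not the project's or GelSight's actual API.

```python
import numpy as np

def estimate_stiffness(depth_map_mm, contact_force_n):
    """Crude stiffness proxy: force divided by mean indentation depth
    over the contact patch (pixels above a small noise threshold)."""
    contact = depth_map_mm > 0.05
    if not contact.any():
        return 0.0
    return contact_force_n / depth_map_mm[contact].mean()

def stiffness_to_command(k_n_per_mm, k_min=0.5, k_max=50.0):
    """Normalise stiffness to a 0..1 actuator command on a log scale,
    since perceived softness varies roughly logarithmically."""
    k = np.clip(k_n_per_mm, k_min, k_max)
    return (np.log(k) - np.log(k_min)) / (np.log(k_max) - np.log(k_min))

depth = np.full((10, 10), 0.5)    # 0.5 mm uniform indentation from the sensor
k_est = estimate_stiffness(depth, contact_force_n=5.0)     # 10 N/mm
cmd = stiffness_to_command(k_est)
print(round(float(cmd), 3))
```

In the real system this mapping runs in the loop between the robot's end-effector and the fingertip display, so the rendered softness tracks the grasped object in real time.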
3. Defining the Threshold of Useful Tactile Information for Dexterity
The final line of investigation sought to answer a fundamental question: how much tactile information does a person need to perform simple manipulation tasks effectively? To explore this, we designed a series of dexterity tests, such as peg transfers, while progressively degrading the participants’ sense of touch using finger cots of varying thickness and material. Each configuration was first evaluated with dedicated tactile perception tests to quantify the degree of sensory attenuation. Preliminary results indicate that while a moderate loss of tactile resolution does not significantly impact task performance, more severe reductions lead to a measurable decline in precision and speed. This suggests the existence of a perceptual threshold below which tactile information becomes insufficient for effective manipulation. These insights can guide the design of wearable haptic interfaces and robotic prosthetics by balancing the fidelity of feedback with practical constraints such as size, cost, and power consumption.
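The threshold analysis described above can be sketched as follows: given mean task-completion times under increasing tactile attenuation, find the first condition whose performance degrades beyond a tolerance relative to baseline. The numbers and the 15% tolerance are illustrative placeholders, not the study's data or criterion.

```python
# Hypothetical threshold detection for the dexterity experiments: scan
# attenuation conditions in order and return the first level at which the
# mean completion time exceeds baseline by more than `tolerance`.
def tactile_threshold(attenuation_levels, completion_times_s, tolerance=0.15):
    baseline = completion_times_s[0]
    for level, t in zip(attenuation_levels, completion_times_s):
        if t > baseline * (1 + tolerance):
            return level
    return None                      # no condition crossed the threshold

levels = [0.0, 0.2, 0.4, 0.6, 0.8]          # fraction of tactile acuity removed
times = [12.1, 12.4, 12.9, 15.2, 18.7]      # mean peg-transfer time (s)
print(tactile_threshold(levels, times))
```

With these placeholder numbers, moderate attenuation (up to 0.4) is tolerated while heavier attenuation crosses the threshold, mirroring the pattern reported in the preliminary results.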

Together, these three research tracks significantly advance our understanding of human tactile perception and how it can be harnessed in artificial systems. They also provide concrete technological building blocks toward the goal of enabling realistic and efficient remote touch in human-robot interaction.
The outcomes of the project have the potential to transform multiple sectors where touch plays a critical role:

- Medical robotics: Surgeons could receive real-time tactile feedback during minimally invasive operations, enhancing safety and precision.
- Assistive technologies: People with limb loss could regain a sense of touch through prosthetics that simulate mechanical interaction with objects.
- Remote operations: Operators in hazardous environments (e.g. nuclear plants, deep-sea exploration, or space missions) could gain enhanced tactile awareness during manipulation tasks.
- Consumer and virtual technologies: Haptic interfaces in gaming, AR/VR, and remote presence systems could achieve a new level of realism.

To support further uptake and success, several key enablers have been identified:

- Further research and validation: Broader psychophysical testing and long-term usability studies are needed to validate the generalizability of the results across users and tasks.
- Technology demonstration: Building more compact, integrated, and portable prototypes will be essential to move from lab-based setups to real-world use cases.
- Access to markets and finance: Partnerships with medical and robotics companies will be critical to transition from proof-of-concept to commercial-ready solutions.
- IPR support and exploitation strategy: Select components of the tactile encoding and rendering framework have clear potential for patent protection and licensing.
- Standardisation and interoperability: Contributing to haptic standards (e.g. for teleoperation interfaces) will help ensure wider compatibility and adoption.

In conclusion, the project has not only advanced fundamental understanding of touch perception but also delivered practical tools and design principles that will shape the next generation of human-machine interfaces. With continued development and collaboration, these results are poised to make a tangible impact across science, technology, and society.