CORDIS - EU research results

Multimodal haptic with touch devices


Giving digital devices a new touch

Making screens more accessible by embedding touch-based features in human-computer interfaces.

Digital Economy
Health

From smartphones to smart devices, touchscreens have become a ubiquitous part of our everyday lives. While these buttonless screens were designed to make technology easier to use and thus more accessible to more people, in doing so they can end up excluding the visually and hearing impaired. “We believe that by introducing more touch-based features into human-computer interfaces, we can improve the accessibility of new digital devices and applications for all users,” says Frederic Giraud, a researcher at the University of Lille (website in French). With the support of the EU-funded MULTITOUCH project, Giraud supervised an effort to do exactly that: integrate auditory, visual and tactile feedback into next-generation multisensory devices, resulting in an enriched user experience for everyone.

Training and experience in the science of touch

According to Giraud, compared to the extensive knowledge of how vision and audition interact, our understanding of how touch integrates with these other senses remains rather limited. That’s why the project, which received support from the Marie Skłodowska-Curie Actions programme, focused on filling this knowledge gap. To do so, the project provided in-depth training to young researchers in the field of haptics – the science of touch. “By learning how to combine the stimulation of touch, vision and hearing, we aimed to give these researchers the skills they need to design interfaces capable of reinforcing the information that comes from an unreliable sense,” explains Giraud.

This training included hands-on opportunities involving academic and industrial researchers, as well as a focus on a range of soft skills. It also offered researchers a chance to pursue their own research in such diverse areas as neuroscience, computer science, rehabilitation, human interfaces, multisensory tactile displays and virtual reality. For example, one researcher studied how the brain processes multisensory information, while another used imaging to investigate how the brain understands moving stimuli coming from multiple senses. Meanwhile, a third researcher translated these neuroscience findings into the specifications for a potential multisensory device. All this research has been compiled into the ‘Handbook of Best Practices for Mixed Tactile, Visual and Auditory Stimuli’.

Towards a next generation of multisensory devices

While Giraud is confident that the project’s Handbook will serve as a foundation for building a next generation of multisensory devices, what he’s really excited about is what lies ahead for the project’s researchers. “I am immensely proud of our research team and look forward to seeing how they take their newfound skills and knowledge to continue to advance this exciting field,” he adds. One area ripe for further investigation is using haptic features to make devices more accessible – and safer – for all users. For instance, instead of looking down at a visual display in a car, could a tactile dashboard provide the driver with the information they need without requiring them to take their eyes off the road? “It’s questions such as these that allow researchers to approach the design of digital devices with a new outlook and an appreciation for creating objects that work for everyone, and not just for a segment of the population,” concludes Giraud.

Keywords

MULTITOUCH, digital devices, human-computer interfaces, touchscreens, smartphones, visually impaired, hearing impaired, haptics, multisensory device
