CORDIS - EU research results

Multimodal haptic with touch devices

Periodic Reporting for period 1 - MULTITOUCH (Multimodal haptic with touch devices)

Reporting period: 2020-03-01 to 2022-02-28

The question of introducing more multimodal haptic feedback into consumer products is becoming crucial today, as society becomes increasingly focused on digital solutions. Smartphones and tablets, which rely entirely on touch screens for user interaction, are now commonly used to access the internet and, for example, to interact with public administration (34% of Europeans). Since these devices are cost-effective, future growth now relies mostly on the applications and software that can be developed for entertainment, education and technical training, and this conclusion also holds true in the rapidly expanding VR sector, which benefits from the recent development of head-mounted displays. Paradoxically, while these devices are now accessible to more people, a segment of the population is excluded from these digital developments: elderly individuals who struggle to use touch screens, and visually or hearing-impaired individuals. Indeed, computers and other devices provide information to users almost exclusively through visual and auditory feedback.

Compared with the extensive knowledge on how vision and audition interact, current knowledge on how touch integrates with the other senses is relatively scarce, especially in conditions of active touch, i.e. when tactile input is generated by active contact with the environment (e.g. tactile exploration of the surface of a display or of a VR environment). Through the very ambitious training delivered during the project, the objective of MULTITOUCH is to train a cohort of scientific researchers who can work in the R&D departments of companies of the digital economy, and who have the skills and the will to create devices and applications accessible to everyone.
During this first reporting period, the following actions were conducted in each work package:

WP1: ESR 3 built up her setup in the fMRI environment. She then conducted a study of how tactile and visual motion directions are represented in the brain region hMT+/V5, and of which frame of reference this region uses to encode tactile motion directions. Meanwhile, ESR 4 investigated the responses to changes in vibrotactile textures with a periodic oddball paradigm and EEG frequency-tagging. The results show that the responses originate at least partly from the primary somatosensory cortex (S1), and were submitted to a conference. ESR 5 studied the velocity discrimination threshold of blind and sighted individuals in passive and active touch conditions. The results show that, compared to passive touch, active touch significantly impairs the perception of blind participants but not of their sighted counterparts, suggesting that blind individuals might be more sensitive to the movement-related gating of sensory transmission to S1.
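
To illustrate the frequency-tagging logic behind the periodic oddball paradigm mentioned above, the sketch below simulates an EEG-like signal and reads out the amplitude at the oddball frequency and its harmonics. All parameters (4 Hz base rate, a deviant texture every fifth stimulus, 500 Hz sampling, 60 s of signal) are illustrative assumptions, not the settings actually used in the study.

```python
# Minimal sketch of a frequency-tagging readout in a periodic oddball design.
# All numbers here are illustrative assumptions, not the project's settings.
import numpy as np

fs = 500.0                   # sampling rate (Hz), assumed
base_f = 4.0                 # base stimulation frequency (Hz), assumed
oddball_f = base_f / 5.0     # deviant texture every 5th stimulus -> 0.8 Hz
duration = 60.0              # seconds of (simulated) signal
t = np.arange(0, duration, 1.0 / fs)

# Simulated EEG: a response at the base rate, a weaker response at the oddball
# rate, plus noise. In a real experiment this would be the recorded signal.
rng = np.random.default_rng(0)
eeg = (1.0 * np.sin(2 * np.pi * base_f * t)
       + 0.3 * np.sin(2 * np.pi * oddball_f * t)
       + rng.normal(0.0, 1.0, t.size))

# Amplitude spectrum: with a 60 s epoch, 0.8 Hz and 4 Hz fall on exact FFT bins.
spectrum = 2 * np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

def amp_at(f):
    """Amplitude at the FFT bin closest to frequency f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

# Oddball harmonics that coincide with the base rate (every 5th) are excluded,
# as is customary in frequency-tagging analyses.
oddball_harmonics = [k * oddball_f for k in range(1, 13) if k % 5 != 0]
print("base-rate response:", round(amp_at(base_f), 3))
print("oddball response (sum over harmonics):",
      round(sum(amp_at(f) for f in oddball_harmonics), 3))
```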

WP2: ESR 1 designed two experiments. In the first experiment, he investigated the influence of stereoscopy, surface deformation, and tactile feedback on the perception of texture roughness in an Active Touch (AT) condition. In the second experiment, he investigated the spatiotemporal detection threshold of audio-tactile delays under AT conditions. The results suggest that adding stereoscopic rendering and visual surface deformation (for smooth tactile textures) can modify the perceived tactile roughness. Moreover, the second experiment provides insights into the sensitivity of participants in perceiving spatio-temporal differences between tactile and auditory stimuli generated during AT exploration. ESR 2 developed a new eyes-free interaction technique in the presence of haptic feedback and conducted experiments to understand user preferences regarding this technique; this work will be presented at the HCI 2022 conference.
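
As a generic illustration of how such a detection threshold can be estimated (not the analysis actually used in the project), the sketch below fits a cumulative-Gaussian psychometric function to hypothetical detection rates for a set of audio-tactile delays and reads off the 75% detection point.

```python
# Minimal sketch of estimating a detection threshold for audio-tactile delays.
# The delays and response proportions are made-up illustrative data.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

delays = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 120.0, 160.0])       # ms, hypothetical
p_detected = np.array([0.05, 0.10, 0.30, 0.55, 0.75, 0.90, 0.97])    # hypothetical

def psychometric(x, mu, sigma):
    """Cumulative-Gaussian psychometric function: P(detect delay x)."""
    return norm.cdf(x, loc=mu, scale=sigma)

# Fit the 50% point (mu) and the slope parameter (sigma).
(mu, sigma), _ = curve_fit(psychometric, delays, p_detected, p0=[60.0, 30.0])

# A common convention: the threshold is the delay detected on 75% of trials.
threshold_75 = norm.ppf(0.75, loc=mu, scale=sigma)
print(f"mu = {mu:.1f} ms, sigma = {sigma:.1f} ms, 75% threshold ~ {threshold_75:.1f} ms")
```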

WP3: ESR 6 developed a web-based application for the graphical authoring of vibrotactile feedback for mobile and wearable devices, and designed an experiment to examine the User Experience (UX) of interactions with touchscreen displays.
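
As a rough illustration of the kind of signal such an authoring tool produces (the actual application is web-based and its export format is not described here), the sketch below renders a hand-drawn amplitude envelope onto a sine carrier to obtain a vibrotactile waveform; the 250 Hz carrier, keyframes and sample rate are assumptions.

```python
# Minimal sketch of rendering a designed vibrotactile pattern: keyframes of an
# amplitude envelope applied to a sine carrier. All values are illustrative.
import numpy as np

fs = 8000            # output sample rate (Hz), assumed
carrier_f = 250.0    # carrier near a typical actuator resonance (Hz), assumed
duration = 1.0       # pattern length in seconds

# Envelope keyframes as (time in s, amplitude 0..1), e.g. drawn in an authoring UI.
keyframes = [(0.0, 0.0), (0.1, 1.0), (0.4, 0.3), (0.7, 0.8), (1.0, 0.0)]

t = np.arange(0, duration, 1.0 / fs)
times, amps = zip(*keyframes)
envelope = np.interp(t, times, amps)                  # piecewise-linear envelope
waveform = envelope * np.sin(2 * np.pi * carrier_f * t)

# `waveform` could then be written to a WAV file or streamed to an actuator driver.
print(waveform.shape, float(waveform.min()), float(waveform.max()))
```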

WP4 (management): The supervisory board of MULTITOUCH has been set up, and 11 network meetings were organized. During these meetings, the quality of the research was assessed through online presentations by the ESRs and one poster session.

WP5 (training): ATCs 1 to 6 were given online, with the exception of ATC 3, which was given in Düsseldorf. Most of the ATCs are now completed. PCDPs were filled in by the ESRs and are updated each year. The PCDPs are reviewed by the training committee, which gives feedback and advice. Secondments started with ESR 1 at UCL and ESR 2 at USM.

WP6 (dissemination): Our application to ESOF 2022 was rejected.
The consortium organized a workshop at the IEEEWH21 conference (online), in which ESRs 1, 5 and 6 participated as speakers.
ESR 4 presented her work at the international conference NeuroCog 2021 in Brussels and at the "EURON PhD days" event, and organised a workshop at the "Brain Awareness Week" event (general public).
ESR 5 participated in two conferences: the national conference of the experimental section of the Association of Italian Psychology, with an oral presentation, and the international conference on spatial cognition, with a poster presentation.
ESR 6 organised a booth at the European Researchers’ Night in Suceava. Moreover, he took part in the organisation of a 25-hour hackathon for computer science students at the University of Suceava (general public). He gave a talk at the ICMI 2021 conference.

ESRs 1, 3 and 4 organized and ran two scientific workshops for pupils (general public) in Brussels.
A Twitter account is live for the project.

WP7 (ethics): The ethics applications were submitted and accepted by the local ethics committees.
So far, one external ethics advisor has been appointed to the project. The Data Management Plan has been set up and is updated on a regular basis.
We expect the project to deliver scientific results on:
- how multisensory couplings can be produced at the hardware level in order to create innovative multimodal devices,
- the understanding and analysis of the underlying mechanisms by which the human brain exploits and combines information conveyed by vision, audition and/or touch.
This will lead to publications in peer-reviewed journals and conferences. A multimodal haptic framework will make the work available to the scientific community as well as to the non-academic sector, while a mobile demonstration package will allow demonstrations aimed at the general public.