
Virtual Prototyping of Tactile Displays

Final Report Summary - PROTOTOUCH (Virtual Prototyping of Tactile Displays)

PROTOTOUCH PUBLISHABLE SUMMARY
The Marie Curie Initial Training Network Virtual Prototyping of Tactile Displays (PROTOTOUCH: http://www.prototouch.org/) is a multidisciplinary, intersectoral research programme that is training 11 Early Stage Researchers (ESRs) and 4 Experienced Researchers (ERs). It is coordinated by the University of Birmingham, UK, with 9 other academic and industrial lead partners: Université catholique de Louvain, Belgium; Goeteborgs Universiteit, Sweden; Swansea University, UK; C3M, Slovenia; Scuola Universitaria Professionale Della Svizzera Italiana, Switzerland; Université Pierre et Marie Curie 6, France; Université des Sciences et Technologies de Lille I, France; STMicroelectronics, France; and Metec Ingenieur AG, Germany. In addition, there are 5 Associated Partners in the UK: Unilever R&D, Redux, the European Space Agency, Skillstudio and The Department of Design.
The objectives of the research programme are as follows:
• Experimentally evaluate the performance and usability of current, optimised and novel tactile displays against reference standards through psychophysical, tribological, and peripheral and central neural measurements.
• Numerically simulate the interaction between a finger pad and the reference standards and tactile displays in order to compare their mechanical, tribological and peripheral neural responses.
• Link human and virtual data using information processing techniques in order to determine the critical design principles for optimising the designs of current displays, and for developing and optimising novel prototype designs based on user performance.

Flat screen displays with tactile feedback, for example for mobile phones, have been developed. Their design and control are based on modulating the friction: a step increase in the friction, for instance, provides the illusion of a step. The optimised devices incorporate both out-of-plane ultrasonic vibrations, which reduce the friction, and electrovibrations, which increase it. Numerical simulations have been developed to understand the mechanisms involved and thus to optimise the performance of these devices; for example, the effect of the ultrasonic vibrations was shown to increase with increasing frequency.

A flat screen device based on in-plane vibrations has also been developed that can record the frictional response to sliding a surface texture over a finger pad and replay the recording in order to reproduce the tactile experience. The device can visualise the micro-displacements of the fingerprint ridges so that a closed-loop feedback controller can cancel the relaxation oscillations, creating a neutral tactile surface onto which the target textures may be superimposed.

A compact, refreshable multi-pin braille device for visually impaired users has also been designed and implemented. In addition to displaying text, it renders graphical content at a tactile resolution of 76 × 48 = 3648 pins. The device can be connected to a tablet or a smartphone and interprets the full content of the screen at refresh rates achievable with piezoelectric actuation. Gestures such as zoom and swipe are supported on the major operating systems (iOS, Windows, and Android).
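
To illustrate the friction-modulation principle behind the flat screen displays described above, the following Python sketch maps a target friction profile (a step, giving the illusion of an edge) to an ultrasonic vibration amplitude and an electrovibration voltage. The baseline friction coefficient, the actuator gains, the linear actuation models and the function names are assumptions for illustration only, not the project's actual control scheme.

import numpy as np

# Illustrative (hypothetical) friction-modulation sketch: render a virtual "step"
# by lowering friction with ultrasonic vibration and raising it with electrovibration.
# Baseline friction and actuator gains are assumed values, not measured data.
MU_BASELINE = 0.8      # nominal finger/glass friction coefficient (assumed)
K_ULTRASONIC = 0.15    # friction reduction per micron of vibration amplitude (assumed)
K_ELECTRO = 0.004      # friction increase per volt of electrovibration (assumed)

def actuator_commands(mu_target):
    """Map a target friction coefficient to (ultrasonic amplitude [um], electrovibration [V])."""
    if mu_target < MU_BASELINE:
        amplitude = (MU_BASELINE - mu_target) / K_ULTRASONIC
        voltage = 0.0
    else:
        amplitude = 0.0
        voltage = (mu_target - MU_BASELINE) / K_ELECTRO
    return amplitude, voltage

def step_profile(x_mm, edge_mm=20.0, mu_low=0.3, mu_high=1.1):
    """Target friction along the finger path: a step increase at edge_mm gives the illusion of an edge."""
    return mu_low if x_mm < edge_mm else mu_high

for x in np.linspace(0.0, 40.0, 5):
    mu = step_profile(x)
    amp, volt = actuator_commands(mu)
    print(f"x = {x:5.1f} mm  mu_target = {mu:.2f}  ultrasonic = {amp:4.1f} um  electrovibration = {volt:5.1f} V")

In practice the two actuation modes would be driven continuously as the finger slides, but the same mapping from target friction to actuator command applies.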

In summary, considerable advances have been made in developing tactile displays, underpinned by an understanding of the psychophysical, neurophysiological (EEG and microneurography) and tribological responses of test subjects, and by the exploitation of finite element analysis to undertake virtual prototyping. For example, the human sense of touch has been studied by measuring the activity of peripheral low-threshold mechanoreceptors in the fingers using microneurography, in which the patterns of neural discharge from single, identified tactile afferent units are analysed under different experimental conditions. Receptor discharge during active tactile exploration of different types of surfaces has been investigated, together with receptor discharge during interaction with tactile displays and the neural encoding of friction. Mechanoreceptor data have also been used to define and validate finite element models of the mechanoreceptors in human skin, enabling a virtual finger with a sense of touch to be created. Machine learning was another critical aspect of the project, since it allowed the complex features of the massive data sets derived from human testing to be analysed in a way that assists the optimisation of the display designs.
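
As an indication of how machine learning can relate afferent responses to the surfaces that evoked them, the following Python sketch classifies three surface textures from simple firing-pattern features using a support vector machine. The features, the synthetic data and the classifier choice are hypothetical stand-ins for illustration; they do not reproduce the PROTOTOUCH data sets or analysis pipeline.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for microneurography-derived features: mean firing rate,
# spike-count variance and a burst index per recording, for three textures.
# All values are fabricated for illustration only.
n_per_class = 60
X, y = [], []
for label, (rate, var, burst) in enumerate([(20, 4, 0.1), (45, 9, 0.4), (70, 15, 0.7)]):
    features = np.column_stack([
        rng.normal(rate, 5, n_per_class),     # mean firing rate [Hz]
        rng.normal(var, 2, n_per_class),      # spike-count variance
        rng.normal(burst, 0.1, n_per_class),  # burst index
    ])
    X.append(features)
    y.append(np.full(n_per_class, label))
X, y = np.vstack(X), np.concatenate(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out texture classification accuracy: {clf.score(X_test, y_test):.2f}")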

The outcome of the project is the development of next-generation electronic user interfaces with high-definition haptic feedback that enhances user performance, ease of use and user experience. They have potential applications in e-shopping, by allowing products to be virtually touched, in assisting users with tactile or visual deficits, and in creating more realistic virtual reality environments for training, games and entertainment. Now that the project has been completed, the team are looking to the future and to how the results will support development in this field. Participants are developing innovative products and projects using this new knowledge about tactile displays. For example, Metec AG has launched a braille and graphic display (http://web.metec-ag.de/) based on critical inputs from a number of the researchers. HAP2U (www.hap2u.net) is a start-up company designing ultrasonic haptic displays that exploit the E-Vita technology developed in PROTOTOUCH. Another group of researchers has started a spin-off company (https://www.gotouchvr.com) that is developing a haptic wearable device for augmented and virtual reality systems.

Network Coordinator, Professor Michael Adams, +44 121 4145297
http://www.prototouch.org/