
Towards a Touching Presence: High-Definition Haptic Systems

Deliverables

The developed haptic displays cover a wide range of possible applications. The ViSHaRD10 is a hyper-redundant haptic display that offers a large workspace, high dexterity, and high payload. The Multilevel Haptic Display renders surfaces at high fidelity. Tactile displays such as the Shear Force Display and the Pin Actuator Display extend current technology by providing new stimuli and new actuator technologies. The Rheological Device deploys a completely new technique to haptically stimulate the whole hand. The Haptic Texture and Pattern Demonstrator renders environments such as textures and rigid patterns. Additionally, we developed sensors and actuators for tactile flow and for use in fMRI applications. The systems were designed around a modular concept, which was realized in the Combined Kinesthetic Tactile Display and the Multimodal High Fidelity Display.
The benchmark demonstrators developed at the end of the project hold great potential for further exploitation. Although the systems served as platforms to apply and evaluate research results obtained in the project, they represent state-of-the-art applications that could find further use in their respective problem areas. The I-TOUCH software developed for benchmarking haptic research is already operational and can be used to generate sample haptic applications with various commercial and non-commercial haptic devices. The interactive 3D data navigation system is also already being applied in a clinical setting: it has been installed at the Radiology Department of the University Hospital Zurich and has been applied to a number of patient cases to support the medical staff. To reduce purchase costs, the multi-modal segmentation system has been ported to the Linux OS and can be used with arbitrary haptic devices. Due to limited funds for the acquisition of haptic interfaces, and the limited space available for larger systems, the Radiology Department currently uses the system with a PHANToM Omni desktop device. Finally, it should be noted that the evaluated physical principles for the generation of force feedback could provide guidelines for future lines of haptic interfaces that outperform currently existing devices. The report evaluated the potential of these emerging possibilities for realizing practically and commercially usable haptic feedback systems, thus giving valuable input to device manufacturers.
Within the TH project, we investigated human haptic and multimodal perception. We verified that humans often integrate multimodal information in a statistically optimal fashion; this holds for spatial as well as temporal properties. To describe these integration results, we developed quantitative statistical models based on maximum-likelihood and Bayesian estimation (a standard formulation is sketched below).

Furthermore, we verified the hypothesis that object recognition, movement detection, and spatial localization might be commonly processed in the visual ventral and dorsal extrastriate cortical areas, independently of the sensory modality conveying the information to the brain. For instance, we used fMRI to measure the patterns of response evoked during visual and tactile recognition of faces and man-made objects in sighted subjects, and during tactile recognition in blind subjects. Visual and tactile recognition evoked category-related patterns of response that were correlated across modalities for man-made objects in the inferior temporal gyrus in both sighted and blind individuals. Blind adults also showed category-related patterns in the fusiform gyrus, indicating that these patterns are not due to visual imagery and do not require visual experience to develop. The dorsal extrastriate cortical areas and the MT/V5 cortex are likewise involved in visual and tactile spatial discrimination and movement detection tasks, in both sighted and congenitally blind individuals. Brain areas sharing such supramodal features should be able to integrate information conveyed by different sensory modalities (vision and touch), or by multiple receptor sites within the same modality (touch), to estimate the properties of an object (e.g., size, shape, position, motion).

We also studied the psychophysical and functional correlates of visual-haptic integration. The fMRI results showed correlations between neural activity and cue weights in areas involved in shape processing during bimodal, but not unimodal, sensory stimulation, which may indicate that these cortical areas are involved in the multisensory integration of shape information. In addition, we showed with psychophysical studies how unreliable haptic position and slope information presented to the fingers is integrated over space and time and, with imaging and neuropsychological studies, how posterior parietal regions are involved in these functions.
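To make the maximum-likelihood integration model mentioned above concrete, here is its standard textbook formulation; the symbols are illustrative, not the project's own notation. Given a visual estimate \hat{s}_V and a haptic estimate \hat{s}_H of the same object property, with noise variances \sigma_V^2 and \sigma_H^2, the statistically optimal combined estimate weights each cue by its relative reliability (inverse variance):

% Generic maximum-likelihood cue combination (illustrative symbols).
\hat{s} = w_V \,\hat{s}_V + w_H \,\hat{s}_H,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\qquad
w_H = 1 - w_V .

% Variance of the combined estimate:
\sigma_{VH}^2 = \frac{\sigma_V^2 \,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}
\;\le\; \min\!\left(\sigma_V^2,\ \sigma_H^2\right).

The combined estimate is thus never less reliable than either single cue; observing this variance reduction, together with cue weights matching the predicted reliability-based values, in human data is what "statistically optimal integration" refers to above.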
