Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities

Periodic Reporting for period 1 - H-Reality (Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities)

Reporting period: 2018-10-01 to 2019-09-30

Digital content today remains focused on visual and auditory stimulation. Even in the realm of VR and AR, sight and sound remain paramount. In contrast, methods for delivering haptic (sense of touch) feedback in commercial media are significantly less advanced than graphical and auditory feedback. Yet without a sense of touch, experiences ultimately feel hollow, virtual realities feel false, and Human-Computer Interaction (HCI) becomes unintuitive.

Our vision is to be the first to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality: H-Reality.

The ambition of H-Reality will be achieved by integrating ultrasonic “non-contact” haptics with state-of-the-art “contact” vibrotactile actuators. Such Mixed Haptic Interfaces (MHI) will be enabled through novel mathematical and tribological modelling of the skin and the mechanics of touch, as well as by working with experts in the psychophysical rendering of sensation. The result will be a sensory experience where digital 3D shapes and textures are made manifest in real space via modulated, focused ultrasound, ready for the untethered hand to feel; where next-generation wearable haptic actuators provide directional vibrotactile stimulation, informing users of an object's dynamics; and where computational renderings of specific materials can be distinguished via their surface properties.
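The focusing step behind modulated, focused ultrasound can be illustrated with a short sketch: a phased array forms a tactile pressure focus by shifting the phase of each emitter so that all wavefronts arrive in phase at the focal point. The constants and function below are illustrative, not taken from the project's codebase.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 °C (approximate)
FREQUENCY = 40_000.0     # Hz; 40 kHz is a common choice for airborne ultrasound

def focus_phases(emitters, focal_point):
    """Phase offset (radians) per emitter so that all waves arrive in
    phase at the focal point, forming a pressure focus the hand can feel."""
    wavelength = SPEED_OF_SOUND / FREQUENCY
    phases = []
    for emitter in emitters:
        distance = math.dist(emitter, focal_point)
        # A longer path needs a proportionally larger phase advance.
        phases.append((2 * math.pi * distance / wavelength) % (2 * math.pi))
    return phases

# Two emitters placed symmetrically about the focus receive identical phases.
phases = focus_phases([(-0.05, 0.0, 0.0), (0.05, 0.0, 0.0)], (0.0, 0.0, 0.2))
```

Steering the focus, or modulating it at vibrotactile frequencies, then reduces to recomputing these phases per frame.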

The implications of this technology will be far-reaching. The computer touch-screen will be brought into the third dimension, so that swipe gestures are augmented with instinctive rotational gestures, allowing intuitive manipulation of 3D data sets and letting users stroll about the desktop as a virtual landscape of icons, apps and files. H-Reality will transform online interactions; dangerous machinery will be operated virtually from the safety of the home, and surgeons will practise their skills on thin air.

The overall objectives of the project are three-fold:
1) Create MHI prototypes, and demonstrate how these can unlock the next generation of HCI applications.

2) Develop ergonomic interaction techniques and environment libraries for our MHI prototypes.

3) Provide scientific models and empirical analysis enabling the MHI prototypes.

The work performed during the first year of the project led to the following main results:

* A single API (application programming interface) has been developed for the simultaneous control of multiple non-contact ultrasonic devices, achieving seamless hand-overs from one device to another. This network architecture will also form a basis for the simultaneous input control of both contact and non-contact MHIs.
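A minimal sketch of how such a hand-over might be arbitrated (the class names and one-dimensional workspace model are hypothetical, not the project's API): route the focal point to whichever device's workspace contains it, and let the currently active device keep the point inside overlap regions so the focus does not flicker between devices at the boundary.

```python
from dataclasses import dataclass

@dataclass
class UltrasonicDevice:
    """Hypothetical stand-in for one ultrasonic array and its workspace."""
    name: str
    x_min: float
    x_max: float

    def covers(self, x):
        return self.x_min <= x <= self.x_max

class MultiDeviceController:
    """Routes a focal point to whichever device covers it; in overlap
    regions the previously active device keeps the point (hysteresis)."""
    def __init__(self, devices):
        self.devices = devices
        self.active = None

    def route(self, x):
        if self.active is not None and self.active.covers(x):
            return self.active.name
        for device in self.devices:
            if device.covers(x):
                self.active = device
                return device.name
        self.active = None
        return None

# Hand crossing from the left workspace, through the overlap, to the right:
left = UltrasonicDevice("left", 0.0, 0.6)
right = UltrasonicDevice("right", 0.4, 1.0)
ctrl = MultiDeviceController([left, right])
handover = [ctrl.route(x) for x in (0.2, 0.5, 0.7, 0.5)]
```

The hysteresis in `route` is the design choice that makes the hand-over feel seamless: the switch happens only once the focus has clearly left the active device's workspace.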

* A new family of contacting soft haptic interfaces has been developed that is compatible with ultrasonic non-contacting haptic devices, since it completely frees up the volar region of the hand where the airborne haptics operate.

* Perceptual limits for materials and objects have been determined for the contacting and non-contacting haptic prototypes, enabling device efficacy to be verified perceptually via absolute detection thresholds for the MHI.
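Absolute detection thresholds of this kind are commonly estimated with an adaptive staircase; the following 1-up-1-down sketch (illustrative, not the project's psychophysics code) converges around the 50% detection point by lowering the stimulus after each detection and raising it after each miss.

```python
def staircase_threshold(detects, start=1.0, step=0.1, reversals_needed=8):
    """Simple 1-up-1-down adaptive staircase. `detects(level)` reports
    whether the participant felt the stimulus at that intensity; the
    average of the reversal levels estimates the detection threshold."""
    level = start
    last_direction = None
    reversal_levels = []
    while len(reversal_levels) < reversals_needed:
        direction = -1 if detects(level) else +1
        if last_direction is not None and direction != last_direction:
            reversal_levels.append(level)   # the run just changed direction
        last_direction = direction
        level = max(0.0, level + direction * step)
    return sum(reversal_levels) / len(reversal_levels)

# Deterministic simulated observer whose true threshold is 0.35:
estimate = staircase_threshold(lambda level: level >= 0.35)
```

With a real participant, `detects` would present the ultrasonic or vibrotactile stimulus and record a yes/no response.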

* Integration techniques and paradigms for multi-modal contact/non-contact haptics are being developed to operate in immersive VR. This includes an algorithm that analyses different tangible and virtual objects to find the grasping strategy best matching the resultant haptic pinching sensations, as well as new techniques to use wearable contact haptic devices in combination with tangible objects in VR and AR.
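One way to read the grasp-matching idea above: score each pairing of a virtual object with a tangible prop by how closely their pinch apertures agree, and select the best pair. All names and widths below are hypothetical, and a real implementation would score full grasp poses rather than a single width.

```python
def best_grasp_match(virtual_widths, tangible_widths):
    """Return the (virtual object, tangible prop) pair whose pinch
    apertures differ least, so the felt width of the prop best matches
    the rendered virtual object. Widths are in metres."""
    best = None
    for v_name, v_width in virtual_widths.items():
        for t_name, t_width in tangible_widths.items():
            mismatch = abs(v_width - t_width)
            if best is None or mismatch < best[0]:
                best = (mismatch, v_name, t_name)
    return best[1], best[2]

virtual = {"virtual_mug": 0.080, "virtual_pen": 0.010}
props = {"foam_cube": 0.050, "wooden_stick": 0.012}
pair = best_grasp_match(virtual, props)
```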

* An analytical model of vibration propagation has been developed that describes the mechanism of vibrotaction and perceptual constancy. It will enable a Finite Element/Boundary Element computational model of the hand to be greatly simplified by capturing only the essential features of vibrotaction.
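For orientation only, a textbook form of an attenuated travelling wave on the skin surface (not the project's actual model) shows the kind of quantities such a model must capture:

```latex
% Illustrative only: displacement u at distance x from the actuator at time t,
% with amplitude A, attenuation \alpha, angular frequency \omega, wavenumber k.
u(x, t) = A \, e^{-\alpha x} \cos(\omega t - k x), \qquad c = \frac{\omega}{k}
```

Here $c$ is the propagation speed; identifying which of these parameters dominate perception is what allows the FE/BE hand model to be simplified.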

* An untethered sensor array for the hand has been designed that includes real-time orientation tracking. It will monitor the vibrations generated by vibrotaction, including wave propagation speeds, for comparison with the modelling.
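The speed comparison can be sketched as follows: with two sensors a known distance apart, the propagation speed follows from the sample lag that best aligns their recordings. The brute-force cross-correlation below is illustrative, not the project's processing pipeline.

```python
def wave_speed(sensor_spacing_m, sample_rate_hz, signal_a, signal_b):
    """Estimate propagation speed from the lag (in samples) that best
    aligns two sensor recordings, found by brute-force cross-correlation."""
    n = len(signal_a)
    best_lag, best_score = 1, float("-inf")
    for lag in range(1, n // 2):
        score = sum(signal_a[i] * signal_b[i + lag] for i in range(n - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return sensor_spacing_m * sample_rate_hz / best_lag

# A pulse arriving 4 samples later at a second sensor 2 cm away, at 1 kHz:
a = [0.0] * 30
b = [0.0] * 30
a[10] = 1.0
b[14] = 1.0
speed = wave_speed(0.02, 1000.0, a, b)   # 0.02 m in 4 ms, i.e. 5 m/s
```

Real skin vibrations would call for band-pass filtering and sub-sample interpolation, but the distance-over-lag principle is the same.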

* 7 conference papers published, 8 papers in press, and 4 papers submitted.

* 28 public events attended.

Our long-term vision is to enable a completely novel and natural way of interacting with digital content. We foresee a future where data processing is powered by ubiquitous computing and wireless connectivity, and where artificial intelligence (AI) and robotics will reshape the job market; a future where digital content will exist not only as big data sets, gigapixel images, and HD audio and video formats, but also in haptic formats. Today’s graphical user interfaces (GUIs) will be replaced by 3D VR/AR interfaces with portable interaction components that support expressive haptic feedback. Our mixed haptic interfaces (MHI) will address socio-economic and wider societal needs:

* Software programmability and universality: Virtual objects can take many shapes and forms. MHI will render these as well as their textural information.

* Efficient communication, training, and collaboration on complex concepts: Two-way input experiences in VR/AR and MHI will improve understanding, and can therefore enable e-learning simulations of all types of practical, hands-on training.

* Diminish the transition cost between novice and expert: The project will allow users to reach high performance levels for a small learning investment, since direct 3D manipulation relies on the innate human ability to interact with objects. It will also empower the manipulation of complex 3D representations of data.

* Work-from-anywhere: Rich haptic VR/AR interfaces can make remote working more efficient and increase user adoption for a variety of tasks (e.g. engineering site visits). Remote working has direct benefits for the environment, businesses, and employee income: the average EU employee can save 260 kg of CO2 and over €750 per year by working from home.

* Accessibility and inclusion: Some user groups can be better supported and included (e.g. people with visual or auditory impairments, or hand muscle and nerve disorders) by our software-programmable haptic interface (e.g. by using advanced sensory-supplementation or substitution techniques).

* Rehabilitation of stroke patients: Given the success of current force feedback devices, the untethered, ungrounded MHI will greatly improve patient outcomes.

* Facilitate creativity via increased engagement: Artists, designers, students, educators, and engineers will be able to take advantage of the increased engagement of MHI, which offers powerful new active-exploration techniques with tactile digital representations that have the potential to further stimulate reasoning, creative, and analytical skills.
Redefining digital content as something that can be touched and felt.
Demonstrating a touchable beating bio-hologram heart.