A Computational Design Approach to Haptic Synthesis

Periodic Reporting for period 1 - TouchDesign (A Computational Design Approach to Haptic Synthesis)

Reporting period: 2018-09-01 to 2020-02-29

We use touch constantly to explore, manipulate, and interact with the world around us, but also to feel and transmit affection. Haptic synthesis, i.e., the ability to design and control what we feel, whether in a computer application or with a consumer product, carries immense scientific, industrial, and social impact. However, touch is still poorly understood and underexploited in today’s digital era. The state of the art in computational haptic synthesis lags well behind the technological and scientific progress in additive manufacturing, computational design for fabrication, virtual reality displays, and cutaneous haptic interfaces.

TouchDesign will define a formal and comprehensive computational design methodology for haptic synthesis, applied both to tactile digital communication and to the computational design and fabrication of objects with desired tactile properties. Haptic synthesis will be formulated as an optimization problem, with the objective function defined based on haptic perceptual metrics, and with the design space defined by the high-dimensional parameter space of a fabrication process or a haptic interface.
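This optimization view can be illustrated with a deliberately simplified sketch. The forward model, the perceptual metric, and the search strategy below are all illustrative placeholders (none of them are the project's actual models or algorithms): a toy simulation maps design parameters to a tactile response, a squared-distance stand-in for a perceptual metric compares it to a target response, and a basic coordinate search explores the design space.

```python
import numpy as np

# Minimal sketch, NOT the project's formulation: haptic synthesis posed as
# optimization of a perceptual objective over a high-dimensional design space.
# All names and models here (simulate_response, perceptual_loss, the linear
# map A) are illustrative assumptions for this example only.

rng = np.random.default_rng(0)
D = 16                               # dimension of the design space
A = rng.standard_normal((8, D))      # toy forward model: design -> response

def simulate_response(x):
    # Stand-in for a simulation mapping design parameters (e.g., fabrication
    # or actuation parameters) to a mechanoreceptor-level response.
    return np.tanh(A @ x)

# Desired tactile response, generated from a hidden reference design.
target = simulate_response(rng.standard_normal(D))

def perceptual_loss(x):
    # Stand-in for a haptic perceptual metric: distance between the
    # simulated response and the target response.
    r = simulate_response(x)
    return float(np.sum((r - target) ** 2))

# Simple derivative-free coordinate search over the design space.
x = np.zeros(D)
step = 0.5
for _ in range(200):
    improved = False
    for i in range(D):
        for s in (step, -step):
            trial = x.copy()
            trial[i] += s
            if perceptual_loss(trial) < perceptual_loss(x):
                x = trial
                improved = True
    if not improved:
        step *= 0.5          # refine the search once no axis step helps

print("final loss:", perceptual_loss(x))
```

In the real problem, the forward model would be a biomechanics or rendering simulation, the metric would come from perceptual modeling, and the optimizer would need to be far more robust and efficient than this coordinate search; the sketch only shows how the three ingredients fit together.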

TouchDesign will introduce multiple breakthroughs in four major scientific pillars:
- Contact biomechanics: develop measurement-based and data-driven models that enable interactive evaluation of the deformations undergone by skin mechanoreceptors.
- Perceptual modeling: establish a connection between high-resolution biomechanics, mechanoreceptor activation fields, and psychophysics, through machine-learning analysis of exhaustive simulated and experimental data.
- Numerical optimization: design methods that optimize perceptual metrics through robust and efficient search of the high-dimensional design space of haptic fabrication and haptic display problems.
- Computational design: introduce methods and interfaces to visualize, explore, and define perceptual objective functions and haptic design spaces.

The project has advanced along the two main lines initially proposed: creation of a comprehensive model of touch, and application of this model (or preliminary versions) to various settings of haptic synthesis.

In the creation of the model of touch, the project has made progress in two directions. One is the collection of high-resolution biomechanical deformation data (still in progress, with publications pending), and the other is the design of a biomechanics model for full-body touch. This line of work has already produced published results, in the form of a full-body mechanical model with personalized parameter estimation, which can be integrated with parametric skeletal models [Romero et al. 2020], as well as learning-based models of personalized parametric full-body deformations [Santesteban et al. 2020]. These results constitute a ground-breaking contribution: the first full-body models that can be personalized both in body shape and in local biomechanics.

In the application of touch models to haptic synthesis problems, the project has also progressed in multiple directions. One direction of work is the integration of full-body touch in the ergonomic design of clothing. This work entails the design of cloth-body interaction models with accurate contact mechanics and a favorable trade-off between accuracy and performance [Casafranca et al. 2019], as well as machine-learning solutions that enable fast estimation of cloth fit and drape as a function of body shape and pose [Santesteban et al. 2019].

Another direction of work is the design of haptic rendering methods following computational optimization solutions. This approach has been applied to ultrasound-based haptic rendering, with two progressively more advanced methods [Barreiro et al. 2019; Barreiro et al. 2020], and to haptic rendering using underactuated devices [Lobo and Otaduy 2020].

Finally, we have explored extending the capabilities of touch models to applications beyond, but related to, haptic synthesis. This work has been carried out in collaboration with other labs. Specifically, with Disney Research we have explored the extension of computational design based on touch models to robotic sensing [Tapia et al. 2020], and with the Max Planck Institute we have explored the application to hand tracking [Mueller et al. 2019].

In the remainder of the project, the expected results include:
- Extract efficient measurement-based models of fine-scale contact biomechanics
- Develop computational models of haptic perception
- Develop optimization-based formulations and solutions for haptic synthesis problems
- Design intuitive techniques to express and explore the design space of touch synthesis problems

Project figures:
- Touch models applied to hand tracking [Mueller et al. ACM TOG 2019]
- Cloth models for ergonomics design of clothing [Casafranca et al. CGF 2020]
- Garment fit and drape for ergonomics design [Santesteban et al. CGF 2019]
- Learning-based full-body biomechanics [Santesteban et al. CGF 2020]
- Ultrasound-based haptic rendering [Barreiro et al. IEEE ToH 2020]
- Ultrasound-based haptic rendering [Barreiro et al. IEEE WHC 2019]
- Haptic rendering with underactuated devices [Lobo and Otaduy IEEE ToH 2020]
- Touch models applied to robot sensing [Tapia et al. Soft Robotics 2020]
- Measurement-based full-body biomechanics [Romero et al. CGF 2020]