Periodic Reporting for period 1 - TouchDesign (A Computational Design Approach to Haptic Synthesis)
Reporting period: 2018-09-01 to 2020-02-29
TouchDesign will define a formal and comprehensive computational design methodology for haptic synthesis, applied to both tactile digital communication and to computational design and fabrication of objects with desired tactile properties. Haptic synthesis will be formulated as an optimization problem, with the objective function defined based on haptic perceptual metrics, and with the design space defined by the high-dimensional parameter space of a fabrication process or a haptic interface.
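The formulation above — an objective built from perceptual metrics, searched over a high-dimensional design space — can be illustrated with a minimal sketch. All names here (the toy forward model, the perceptual metric, the search strategy) are illustrative placeholders, not the project's actual formulation:

```python
import numpy as np

def perceptual_metric(x, target):
    """Stand-in perceptual objective: distance between a simulated
    haptic response and a target percept. A real formulation would
    evaluate a biomechanics and perception model instead of tanh."""
    simulated_response = np.tanh(x)  # toy forward model
    return np.sum((simulated_response - target) ** 2)

def optimize_design(target, dim, iters=200, step=0.1, seed=0):
    """Greedy stochastic search over the design parameters: propose a
    perturbed design, keep it if the perceptual objective improves."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, dim)  # initial design parameters
    best = perceptual_metric(x, target)
    for _ in range(iters):
        candidate = x + step * rng.standard_normal(dim)
        value = perceptual_metric(candidate, target)
        if value < best:
            x, best = candidate, value
    return x, best
```

The point of the sketch is the structure of the problem, not the solver: the design space (here, `x`) would be the parameters of a fabrication process or haptic interface, and the objective would be a perceptual model rather than a geometric error.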
TouchDesign will introduce multiple breakthroughs in four major scientific pillars. (i) In contact biomechanics: develop measurement-based and data-driven models that enable interactive evaluation of the deformations undergone by skin mechanoreceptors. (ii) In perceptual modeling: establish a connection between high-resolution biomechanics, mechanoreceptor activation fields, and psychophysics, through machine-learning analysis of exhaustive simulated and experimental data. (iii) In numerical optimization: design methods that optimize perceptual metrics through robust and efficient search of the high-dimensional design space of haptic fabrication and haptic display problems. (iv) In computational design: introduce methods and interfaces to visualize, explore, and define perceptual objective functions and haptic design spaces.
In the creation of the model of touch, the project has made progress in two directions. One is the collection of high-resolution biomechanical deformation data (still in progress, with publications pending); the other is the design of a biomechanics model for full-body touch. This second line of work has already produced published results: a full-body mechanical model with personalized parameter estimation, which can be integrated with parametric skeletal models [Romero et al. 2020], as well as learning-based models of personalized parametric full-body deformations [Santesteban et al. 2020]. These results constitute a ground-breaking contribution: the first full-body models that can be personalized both in body shape and in local biomechanics.
In the application of touch models to haptic synthesis problems, the project has also progressed in multiple directions. One is the integration of full-body touch into the ergonomic design of clothing. This work entails the design of cloth-body interaction models that balance accurate contact mechanics with computational performance [Casafranca et al. 2019], as well as machine-learning solutions for fast estimation of cloth fit and drape as a function of body shape and pose [Santesteban et al. 2019].
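The learning-based idea — replacing a costly cloth simulation with a fast regressor from body parameters to cloth deformation — can be sketched minimally. The data, the linear model, and all names below are synthetic placeholders under that assumption, not the published method:

```python
import numpy as np

def fit_linear_cloth_model(shapes, deformations, reg=1e-3):
    """Ridge regression from body parameters (n, d_in) to cloth
    deformations (n, d_out). A bias column is appended so the model
    can represent a rest offset."""
    X = np.hstack([shapes, np.ones((shapes.shape[0], 1))])
    A = X.T @ X + reg * np.eye(X.shape[1])  # regularized normal equations
    return np.linalg.solve(A, X.T @ deformations)

def predict_cloth(W, shape):
    """Fast run-time estimate: one matrix-vector product per query,
    instead of a full physics simulation."""
    return np.append(shape, 1.0) @ W
```

A real system would use a nonlinear model trained on simulated cloth drapes over many body shapes and poses; the sketch only shows the train-once, predict-cheaply structure.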
Another direction of work is the design of haptic rendering methods following computational optimization solutions. This approach has been applied to ultrasound-based haptic rendering, with two progressively more advanced methods [Barreiro et al. 2019; Barreiro et al. 2020], and to haptic rendering using underactuated devices [Lobo and Otaduy 2020].
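For underactuated devices in particular, the rendering problem has an optimization flavor: the device has fewer actuators than the target stimulus has degrees of freedom, so each frame one solves for the actuation that best approximates the target. The following is a hedged sketch of that per-frame solve; the actuator-to-stimulus matrix `J` and the actuator limits are illustrative assumptions, not the published methods:

```python
import numpy as np

def solve_actuation(J, target, lo=0.0, hi=1.0):
    """Least-squares actuation for an underactuated device.

    J maps actuator commands (n) to stimulus components (m), with
    n < m, so the target is generally unreachable; we pick the
    closest reachable stimulus and clamp to actuator limits."""
    u, *_ = np.linalg.lstsq(J, target, rcond=None)
    return np.clip(u, lo, hi)
```

In practice the objective would be weighted by a perceptual metric rather than plain Euclidean error, and the clamp would be replaced by a properly constrained solve; the sketch only shows the least-squares core.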
Finally, we have explored extending touch models to applications beyond, but related to, haptic synthesis, in collaboration with other labs. Specifically, with Disney Research we have explored the extension of touch-model-based computational design to robotic sensing [Tapia et al. 2020]; with the Max Planck Institute, we have explored the application to hand tracking [Mueller et al. 2019].
- Extract efficient measurement-based models of fine-scale contact biomechanics
- Develop computational models of haptic perception
- Develop optimization-based formulations and solutions for haptic synthesis problems
- Design intuitive techniques to express and explore the design space of touch synthesis problems