Periodic Reporting for period 3 - TouchDesign (A Computational Design Approach to Haptic Synthesis)

Reporting period: 2021-09-01 to 2023-02-28

Summary of the context and overall objectives of the project

We use touch constantly to explore, manipulate, and interact with the world around us, and also to feel and transmit affection. Haptic synthesis, i.e., the ability to design and control what we feel, whether in a computer application or with a consumer product, carries immense scientific, industrial, and social impact. However, touch is still poorly understood and underexploited in today's digital era. The state of the art in computational haptic synthesis lags well behind the technological and scientific progress in additive manufacturing, computational design for fabrication, virtual-reality displays, and cutaneous haptic interfaces.

TouchDesign will define a formal and comprehensive computational design methodology for haptic synthesis, applied both to tactile digital communication and to the computational design and fabrication of objects with desired tactile properties. Haptic synthesis will be formulated as an optimization problem, with an objective function based on haptic perceptual metrics and a design space given by the high-dimensional parameter space of a fabrication process or a haptic interface.

TouchDesign will introduce multiple breakthroughs in four major scientific pillars:
(i) In contact biomechanics: develop measurement-based and data-driven models that enable interactive evaluation of the deformations undergone by skin mechanoreceptors.
(ii) In perceptual modelling: establish a connection between high-resolution biomechanics, mechanoreceptor activation fields, and psychophysics, through machine-learning analysis of exhaustive simulated and experimental data.
(iii) In numerical optimization: design methods that optimize perceptual metrics through robust and efficient search of the high-dimensional design space of haptic fabrication and haptic display problems.
(iv) In computational design: introduce methods and interfaces to visualize, explore, and define perceptual objective functions and haptic design spaces.

Work performed from the beginning of the project to the end of the period covered by the report and main results achieved so far

The project has advanced along the two main lines initially proposed: the creation of a comprehensive model of touch, and the application of this model (or preliminary versions of it) to various settings of haptic synthesis.

In the creation of the model of touch, the project has made major progress in the design of efficient yet accurate models of contact and deformation of skin. The results include: a full-body mechanical model with personalized parameter estimation, which can be integrated with parametric skeletal models [Romero et al. 2020]; reduced-order modelling of the combination of skeletal deformations and contact-induced deformations of skin [Tapia et al. 2021]; learning-based models of personalized parametric full-body deformations [Santesteban et al. 2020]; and learning-based modelling of contact-induced deformations [Romero et al. 2021].
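To give a concrete flavour of how such reduced-order and learning-based deformation models fit together, the sketch below combines a linear deformation subspace with a learned corrective term for contact-induced deformation. It is a minimal illustration only: the basis, the regressor, and all names and dimensions are placeholders assumed here, not the project's actual models or code.

```python
import numpy as np

# Illustrative dimensions (placeholders): a skin mesh with n_verts vertices,
# a subspace of n_modes deformation modes, and a few scalar contact descriptors.
n_verts, n_modes, n_contact = 5000, 30, 4

# Linear deformation subspace: full-space skin displacements are approximated
# as u ≈ U @ q, with U a (3*n_verts x n_modes) basis and q reduced coordinates.
# Here U is random; in practice it would come from PCA or a learned basis.
rng = np.random.default_rng(0)
U = rng.standard_normal((3 * n_verts, n_modes)) * 1e-2

# Stand-in for learned regressor weights (a small neural network in practice).
W_c = rng.standard_normal((n_contact, n_modes)) * 0.1


def corrective(q, contact_features):
    """Placeholder for a learned model of contact-induced corrections,
    expressed directly in the reduced coordinates."""
    return np.tanh(contact_features @ W_c) * (1.0 + q)


def skin_displacement(q, contact_features):
    """Reconstruct per-vertex skin displacements from reduced coordinates,
    adding the learned contact-induced corrective term."""
    q_total = q + corrective(q, contact_features)
    return (U @ q_total).reshape(n_verts, 3)


# Example query: neutral reduced pose plus simple contact descriptors
# (e.g., indenter position and pressure), all placeholder values.
q = np.zeros(n_modes)
contact = np.array([0.02, -0.01, 0.15, 5.0])
u = skin_displacement(q, contact)
print(u.shape)  # (5000, 3): displacement of each skin vertex
```

The appeal of this structure is that the expensive full-space simulation is replaced by a low-dimensional reconstruction, which is what makes interactive evaluation of skin deformations feasible.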
The project has introduced important breakthroughs in the design of subspaces for deformation and contact modelling, and this work was acknowledged with an invited plenary talk at the IEEE/CVF CVPR 2021 conference.

In the application of touch models to haptic synthesis problems, the project has also progressed in multiple directions:
- low-level modelling of contact interactions with fabric materials [Casafranca et al. 2020; Sánchez-Banderas et al. 2020; Pizana et al. 2020];
- learning-based solutions to garment interaction with the body [Santesteban et al. 2019; Santesteban et al. 2021];
- hand tracking methods that leverage computational models of touch [Mueller et al. 2019; Wang et al. 2020];
- ultrasound rendering of contact interactions with complex materials [Barreiro et al. 2019; Barreiro et al. 2020; Barreiro et al. 2021];
- optimization-based haptic rendering with wearable thimbles [Verschoor et al. 2020];
- haptic rendering using underactuated devices [Lobo and Otaduy 2020];
- computational design based on touch models for robotic sensing [Tapia et al. 2020];
- and personalization of hand and touch models for VR-based natural interaction [Sorli et al. 2021].

Progress beyond the state of the art and expected potential impact (including the socio-economic impact and the wider societal implications of the project so far)

In the remainder of the project, the expected results include:
- Extract efficient measurement-based models of fine-scale contact biomechanics
- Develop computational models of haptic perception
- Develop optimization-based formulations and solutions for haptic synthesis problems (a schematic version of this formulation is sketched at the end of this summary)
- Design intuitive techniques to express and explore the design space of touch synthesis problems

Figures accompanying this report:
- Touch models applied to hand tracking [Mueller et al. ACM TOG 2019]
- Cloth models for ergonomics design of clothing [Casafranca et al. CGF 2020]
- Hand tracking methods using computational models of touch [Wang et al. ACM TOG 2020]
- Garment fit and drape for ergonomics design [Santesteban et al. CGF 2019]
- Learning-based full-body biomechanics [Romero et al. CGF 2020]
- Ultrasound-based haptic rendering [Barreiro et al. IEEE ToH 2020]
- Learning-based modelling of contact-induced deformations [Romero et al. ACM TOG 2021]
- Low-level modelling of contact with fabric materials [Sánchez-Banderas et al. ACM TOG 2020]
- Learning-based solutions to garment interaction with the body [Santesteban et al. CVPR 2021]
- Reduced-order models of skeletal and contact-induced deformations [Tapia et al. CGF 2021]
- Optimization-based haptic rendering with wearable thimbles [Verschoor et al. ACM TOG 2020]
- Haptic rendering with underactuated devices [Lobo and Otaduy IEEE ToH 2020]
- Touch models applied to robot sensing [Tapia et al. Soft Robotics 2020]
- Measurement-based full-body biomechanics [Romero et al. CGF 2020]
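As referenced in the expected results above, the optimization-based view of haptic synthesis can be stated schematically as follows; the notation is chosen here purely for illustration and does not reproduce the project's own formulation.

\[
\theta^\star \;=\; \arg\min_{\theta \in \Theta} \; d\!\left( \Phi\big( B(\theta) \big),\, \Phi^{\mathrm{target}} \right)
\quad \text{subject to} \quad c(\theta) \le 0,
\]

where \(\theta\) collects the high-dimensional parameters of a fabrication process or haptic interface, \(B(\theta)\) denotes the simulated contact biomechanics (skin and mechanoreceptor deformations) produced by that design, \(\Phi\) maps the mechanoreceptor response to perceptual quantities, \(\Phi^{\mathrm{target}}\) is the desired tactile percept, \(d\) is a haptic perceptual metric, and \(c\) gathers fabrication or device constraints. In this sketch, the four scientific pillars map onto the formulation as follows: (i) provides \(B\), (ii) provides \(\Phi\) and \(d\), (iii) solves the minimization, and (iv) helps users express \(\Phi^{\mathrm{target}}\) and \(\Theta\).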