Sensory Experiences for Interactive Technologies

Periodic Reporting for period 4 - SenseX (Sensory Experiences for Interactive Technologies)

Reporting period: 2019-10-01 to 2020-06-30

In the 20th century, industrial demands for a controllable way to describe colours initiated intense research on colour description, substantially contributing to advances in computer graphics, image processing, photography, and cinematography. Similarly, the 21st century now demands an in-depth investigation of touch, taste, and smell as sensory interaction modalities for interactive systems.

Despite the fact that interactive technologies have permeated our environment (e.g. mobile, ubiquitous computing) and have become an essential part of our everyday life (e.g. work, leisure, education, health), the senses we call upon to interact with technology are still very limited, relying mostly on vision and hearing. Given the immediacy of touch, the ubiquity of taste and smell, and their importance to health, safety, leisure, work, and a person's emotional wellbeing, future multi-sensory experiences with interactive technologies can have a major impact on society and consumer markets, creating entirely new product, technology, and service opportunities. More importantly, multi-sensory experience research can make a step-change in our understanding of the human senses as interaction modalities and revolutionize existing interaction paradigms within Human-Computer Interaction (HCI).

The grand challenge and vision of this project is to gain a rich and integrated understanding of touch, taste, and smell experiences for interactive technologies. We aim to achieve this ambitious grand vision by 1) creating a 'sensory interaction framework' on the basis of a systematic empirical investigation of touch, taste, and smell experiences, 2) integrating the generated understanding of the three senses into meaningful and efficient experiential cross-sensory gamuts and interaction principles, and 3) demonstrating the added value of the created experiential understanding of touch, taste, and smell – the experiential gamuts – through their integration into the development of multi-sensory systems, verifying the short-, mid-, and long-term societal and scientific impact (short-term: multi-sensory media experiences; mid-term: interaction concepts for partially sensory-impaired people; long-term: multi-sensory interaction approaches for life beyond Earth).

This research will pioneer novel interaction concepts for interactive technologies in relation to essential components of multisensory experiences. This project will transform existing interaction paradigms in Human-Computer Interaction (HCI) and likewise impact other disciplines such as the sensory and cognitive sciences by delivering ground-breaking new insights into the experiential dimensions underlying neurological processes and human perception.
We perceive the world around us with all of our senses, that is, through what we see, hear, touch, taste, and smell. The senses, in turn, combine to form the countless experiences that we have in our lives. We humans are equipped with multiple sensory channels which, at any given moment, allow us to detect and process different kinds of information to form specific impressions about ourselves, others, and the world around us. Notably, most of our everyday experiences are multisensory in nature, that is, they involve many, if not all, of our senses. What is more, our senses have evolved to talk to one another while we sense and perceive the world; they do not work independently.

Multisensory experiences are a central part of our everyday lives. However, we often tend to take them for granted, at least when our different senses function normally (say, when our sight is intact) or are corrected-to-normal (when we use glasses). Closer inspection of any experience, even the most mundane, reveals the remarkable multisensory world in which we live. Our growing understanding of the senses and multisensory technologies, as well as the increased symbiosis between humans and technology, have facilitated the development of the concept of multisensory experiences (sometimes also referred to as multisensory experience design). This concept is based on the idea that we are not just passive receivers of, but instead can be active creators of, the multisensory worlds in which we live. In other words, we can carefully consider the different senses, the way in which they work together, and the available multisensory technologies to shape our experiences. We can explore sensory arrangements that can change the way we perceive and interact with our environment and ultimately form new experiences.

This ERC SenseX project contributed to this basic understanding of multisensory experiences and to the design and future development of multisensory technologies. Those emerging technologies do not just stimulate our eyes (think of screens) and ears (audio systems), but also consider how and what we touch, smell, and taste in our lives. Multisensory digital technology enables the creation of multisensory experiences that enrich and augment the way we interact with the worlds around us.

This project has so far resulted in:
- over 60 publications in peer-reviewed journals and conferences, which won several best paper and honourable mention awards (~ six awards)
- the organisation of various international events, including workshops, tutorials, and panels
- a range of keynote talks by the PI at renowned international conferences, and invited talks at globally leading research organisations and industries, including MIT, Stanford, and Facebook/Oculus
- a successful university spin-out company (OWidgets Ltd) that has been supported by an ERC PoC

Above all, this project has truly advanced our ability to understand, design, and develop multisensory interfaces in a broad range of application domains, from entertainment and immersive media to automotive and healthcare applications. In July 2020, the PI moved her research laboratory to a world-leading university, becoming Professor of Multisensory Interfaces at UCL (University College London), Department of Computer Science, and was further selected as Deputy Director (Digital Health) for the UCL Institute of Healthcare Engineering. She is also a Visiting Professor at the Material Science Research Centre at the Royal College of Art in London and was a Visiting Professor at the HCI Engineering Group at MIT CSAIL in summer 2019.

Most recently, she published 'Multisensory Experiences: Where the Senses Meet Technology' with Oxford University Press as a popular science book, making the research of this project accessible to the wider public.
The first achievements of the SenseX project are linked to the creation of a sensory interaction framework based on a systematic empirical investigation of touch, taste, and smell experiences (see publications above). Building on those sensory-specific investigations, we have now started investigating cross-sensory experiences that will enable us, for the first time, to establish an integrated understanding of the three senses alongside audio-visual stimuli. We will progress towards a design framework that will guide the meaningful and efficient integration of sensory stimuli into interactive systems and demonstrate their added value in three areas of societal and scientific interest covering different time-scales: short-term, multi-sensory media experiences in the realm of games, television, and online video production; mid-term, novel interaction approaches for people with sensory impairments, advancing research efforts in the development of sensory substitution devices; long-term, multi-sensory research preparing humanity for life beyond Earth (colonization of Mars and long-duration space flights).
Research Lab on Multisensory Experiences and Interfaces