Project description
For intuitive body-based user interfaces
Today, users increasingly interact with computers through body-based user interfaces (UIs). These UIs rely heavily on visual cues and require conscious, deliberate movements. Human-computer interaction (HCI) research, however, seeks to incorporate more intuitive body language. The EU-funded BODY-UI project aims to overcome current challenges and develop the next generation of body-based UIs. It will follow a new approach based on embodied cognition, drawing on psychology, neuroscience, robotics, and artificial intelligence. By recognising the integration of sensory, motor, and cognitive systems, this interdisciplinary project aims to revolutionise HCI by re-evaluating evaluation criteria and design principles, leading to more seamless interaction.
Objective
Recent advances in user interfaces (UIs) allow users to interact with computers using only their body: so-called body-based UIs. Instead of moving a mouse or tapping a touch surface, people can use whole-body movements to navigate in games, gesture in mid-air to interact with large displays, or scratch their forearm to control a mobile phone. Body-based UIs are attractive because they free users from having to hold or touch a device and because they allow always-on, eyes-free interaction. Currently, however, research on body-based UIs proceeds in an ad hoc fashion, and when body-based UIs are compared to device-based alternatives, they perform poorly. This is likely because little is known about the body as a user interface and because it is unclear whether theory and design principles from human-computer interaction (HCI) apply to body-based UIs. While body-based UIs may well be the next interaction paradigm for HCI, results so far are mixed.
This project aims to establish the scientific foundation for the next generations of body-based UIs. The main novelty in my approach is to use results and methods from research on embodied cognition. Embodied cognition suggests that thinking (including reasoning, memory, and emotion) is shaped by our bodies and, conversely, that our bodies reflect thinking. We use embodied cognition to study how body-based UIs affect users and to increase our understanding of the similarities and differences to device-based input. From those studies we develop new body-based UIs, both for input (e.g. gestures in mid-air) and output (e.g. stimulating users’ muscles to move their fingers), and evaluate users’ experience of interacting through their bodies. We also show how models, evaluation criteria, and design principles in HCI need to be adapted for embodied cognition and body-based UIs. If successful, the project will show how to create body-based UIs that are usable and orders of magnitude better than current UIs.
Fields of science
- social sciences > psychology
- engineering and technology > electrical engineering, electronic engineering, information engineering > electronic engineering > sensors
- engineering and technology > electrical engineering, electronic engineering, information engineering > information engineering > telecommunications > mobile phones
- natural sciences > computer and information sciences > software > software applications > virtual reality
- natural sciences > computer and information sciences > artificial intelligence > machine learning
Programme(s)
Funding Scheme
ERC-COG - Consolidator Grant
Host institution
1165 København
Denmark