Natural Interaction with Projected User Interfaces

Final Report Summary - NIPUI (Natural interaction with projected user interfaces)

The overall objective of NIPUI was the development of mixed reality interfaces using controllable projectors and their integration with multimodal interactive systems. The project had two major outputs: first, a mixed reality interface for the AMI content linking device; second, a system for projected virtual characters.

The AMI content linking device is a 'search without query' system, in which relevant documents are presented to meeting participants based on the current context of the meeting, obtained using real-time speech recognition. Although content linking is a powerful concept, presenting the linked content to users without causing cognitive overload is a serious challenge. We hypothesised that a projected interface creating a mixed reality smart table would be an engaging and efficient approach.

A prototype of such an interface to the AMI content linking device was designed, built, and evaluated. The system interfaces with the content linking device via the central hub of the AMI infrastructure, displaying the documents returned as linked content by the AMI system. As the content linking system currently does not specify for which user or role a document is relevant, all documents appear at the position of the user specified in the management application. Users can then move documents around using grabbing devices tracked by a marker-based video tracking system, both within their own projection space and to different projection units.
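The following minimal sketch (Python, with hypothetical names; it is not the actual AMI implementation) illustrates how linked documents could appear at a user's position and be handed over between projection units as a tracked grabbing device moves them across the table:

    # Sketch only: hypothetical data structures, not the project's code.
    from dataclasses import dataclass, field

    @dataclass
    class Document:
        doc_id: str
        x: float = 0.0
        y: float = 0.0

    @dataclass
    class ProjectionUnit:
        name: str
        bounds: tuple                      # (x_min, y_min, x_max, y_max) on the table
        documents: list = field(default_factory=list)

        def contains(self, x, y):
            x0, y0, x1, y1 = self.bounds
            return x0 <= x <= x1 and y0 <= y <= y1

    def place_linked_document(doc_id, user_position, units):
        """A newly linked document appears at the position assigned to the user."""
        doc = Document(doc_id, *user_position)
        for unit in units:
            if unit.contains(doc.x, doc.y):
                unit.documents.append(doc)
                return doc
        raise ValueError("user position lies outside every projection area")

    def move_document(doc, grabber_position, units):
        """Follow the tracked grabbing device and hand the document over to
        whichever projection unit now covers its position."""
        doc.x, doc.y = grabber_position    # marker tracker reports table coordinates
        for unit in units:
            inside = unit.contains(doc.x, doc.y)
            if doc in unit.documents and not inside:
                unit.documents.remove(doc)
            elif doc not in unit.documents and inside:
                unit.documents.append(doc)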

Once the first prototype was working, it quickly became clear that the resolution of the projectors used was not sufficient to project documents as virtual sheets of A4 paper at a readable size. To address this, a second prototype used a high-definition projector to demonstrate that readable documents could be produced on the tabletop.
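A rough back-of-envelope calculation makes the problem concrete (the table width and readability threshold below are illustrative assumptions, not measured project figures):

    # Effective pixel density delivered on the tabletop (illustrative numbers).
    MM_PER_INCH = 25.4
    READABLE_PPI = 100                      # rough minimum for readable body text

    def effective_ppi(horizontal_pixels, projected_width_mm):
        return horizontal_pixels / (projected_width_mm / MM_PER_INCH)

    # An XGA projector (1024 px wide) spread over a ~1.2 m wide table area:
    print(f"XGA over 1.2 m: {effective_ppi(1024, 1200):.0f} ppi")        # ~22 ppi

    # Widest projection at which a 1920 px wide HD projector still reaches
    # the assumed readability threshold:
    max_width_m = 1920 / READABLE_PPI * MM_PER_INCH / 1000
    print(f"HD remains readable up to ~{max_width_m:.2f} m projection width")  # ~0.49 m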

Dr Ehnes submitted papers on this system to three conferences (ICEIS, HSI, and HCII); all three were accepted, and the HSI paper won a best paper award.

The second period of the project was primarily concerned with the development of a controllable projector system that enables projected virtual characters to be used as a novel user interface. The system models the room as a set of surfaces representing the physical surfaces on which the characters live. We use cartoon animals that can crawl on walls, geckos in particular, as characters, because they are more believable as inhabitants of these room surfaces than human-like characters would be. To increase believability we simulate their motion on the surface, in particular that of their feet, so that a foot placed on the surface does not drift but stays fixed in place while the rest of the body moves. Even a character consisting of just footprints projected as crawling across the surface in this way achieved an astonishing level of presence: the footprints really came alive.
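The footprint behaviour can be summarised by a simple rule: once planted, a foot's surface coordinates are frozen until the body has moved past a stride limit, at which point the foot is lifted and re-planted. The sketch below (Python, with hypothetical names and stride values) illustrates this rule; it is not the project's implementation:

    import math

    class Foot:
        def __init__(self, offset, stride=0.12):
            self.offset = offset        # rest position relative to the body (u, v)
            self.stride = stride        # distance after which the foot is re-planted
            self.planted = None         # fixed surface coordinates while planted

        def update(self, body_pos):
            rest = (body_pos[0] + self.offset[0], body_pos[1] + self.offset[1])
            if self.planted is None or math.dist(self.planted, rest) > self.stride:
                self.planted = rest     # plant (or lift and re-plant) the foot
            return self.planted         # otherwise the foot never drifts

    # The body crawls across the surface; each foot stays put, then hops ahead.
    feet = [Foot((0.05, 0.03)), Foot((-0.05, -0.03))]
    for step in range(5):
        body = (0.04 * step, 0.0)
        print([f.update(body) for f in feet])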

As a next step we created a character with a body and legs that are moved using inverse kinematics driven by the feet's motion. This looks even more like an animal crawling on the wall or ceiling, although the body's motion still needs refinement to appear fully natural. To let the character talk, we use speech synthesis that also generates events used to animate the character's lips in sync with the speech output. A hypersonic sound speaker, a directional speaker system that can be described as a spotlight for sound, is mounted on a controllable gimbal so that the sound appears to emanate from the character's mouth.
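As one concrete illustration of the leg motion, the sketch below shows a standard two-bone inverse kinematics solve (law of cosines) in surface coordinates: given the hip position and the planted foot position, it places the knee so that both bone lengths are respected. The actual character rig may differ:

    import math

    def solve_leg_ik(hip, foot, upper_len, lower_len, bend_sign=1.0):
        """Return the knee position for a two-bone leg reaching from hip to foot."""
        dx, dy = foot[0] - hip[0], foot[1] - hip[1]
        dist = math.hypot(dx, dy)
        # Clamp so the target stays reachable (neither over-stretched nor folded flat).
        dist = max(abs(upper_len - lower_len) + 1e-6,
                   min(upper_len + lower_len - 1e-6, dist))
        # Angle between the hip->foot line and the upper leg bone (law of cosines).
        cos_hip = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
        hip_angle = math.atan2(dy, dx) + bend_sign * math.acos(cos_hip)
        return (hip[0] + upper_len * math.cos(hip_angle),
                hip[1] + upper_len * math.sin(hip_angle))

    # Example: the hip follows the body while the foot stays planted on the surface.
    print(solve_leg_ik(hip=(0.0, 0.0), foot=(0.15, -0.05),
                       upper_len=0.10, lower_len=0.10))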

The results of the project are applicable to business meetings (the interface to content linking) and to the creative industries, arts and exhibitions (projected virtual characters).