Multi-Modal and Cognition-Aware Systems

Final Report Summary - MACAS (Multi-Modal and Cognition-Aware Systems)

Our eyes are involved in nearly everything that we do in our daily lives. In addition to physical activity, eye movements are also strongly linked to several processes of visual cognition and attention. This suggests that eye movements carry additional information on a user’s context that is difficult – if not impossible – to infer using common modalities, such as body movements or physiological measurements. The first objective of the MACAS project was to develop new knowledge and technology for the automatic analysis and recognition of cognitive processes from human visual behaviour – a challenge at the core of ambient intelligence, towards systems and services that are proactive and adaptive to human needs. A second objective was to push the state of the art in using gaze for natural and efficient, i.e. calibration-free, human-computer interaction.

MACAS finished early because the fellow took up a new research position abroad. Nevertheless, the project made a number of contributions towards the above research objectives. Building on the fellow's earlier work, core contributions during the reporting period include: further investigations on the automatic inference of visual memory recall from visual behaviour; a (so far unpublished) study on inferring user expertise and another on inferring document types from visual behaviour; a robust method for detecting smooth pursuit eye movements and for using them for interaction and eye tracker calibration; a novel wearable EOG head cap geared particularly towards long-term eye movement recordings; and two prototype computer vision systems for eye gesture recognition and model-based gaze estimation on unmodified handheld devices. Through close collaborations with researchers in the UK, Switzerland, Germany and Japan, MACAS further contributed to closely related research efforts on calibration-free gaze-based interaction with situated displays, gaze-based object transfer between ambient and handheld devices, and physical behaviour modelling and qualitative activity recognition. Finally, through the MACAS project, we organised two workshops on pervasive eye tracking and mobile eye-based human-computer interaction at the leading conferences in ubiquitous computing (UbiComp) and eye movement research (ECEM).