CORDIS
EU research results


Emotionally Rich Man-Machine Interaction Systems

Project information

Grant agreement ID: IST-2000-29319

  • Start date

    1 January 2002

  • End date

    31 December 2004

Funded under:

FP5-IST

  • Overall budget:

    € 2 458 626

  • EU contribution

    € 1 550 000

Coordinated by:

ALTEC INFORMATION AND COMMUNICATION SYSTEMS S.A.

Greece

Objective

The main objective of the ERMIS project is to develop a prototype human-computer interaction system that can interpret its users' attitude or emotional state (e.g. activation/interest, boredom, anger) from their speech and/or their facial gestures and expressions. The adopted technologies include linguistic and paralinguistic speech analysis, robust speech recognition, facial expression analysis in accordance with the MPEG-4 standard, and interpretation of the user's emotional state using hybrid neurofuzzy techniques. Specific attention is given to evaluating the system's ability to improve effectiveness, user friendliness and user satisfaction, while examining and resolving related ethical issues. Real-life applications in which users interact with machines, in particular call/information centres and next-generation PC interfaces, have been selected to test the performance of the ERMIS system.

Objectives:
Recent developments indicate that emotion analysis is well placed to advance: a substantial body of prior knowledge already exists, along with an increasingly clear picture of what current techniques can achieve. Building on these results and on the expertise of the Consortium partners, the ERMIS project will conduct a systematic analysis of speech and facial input signals, both separately and in combination; the aim is to extract parameters and features that give human-computer interaction (HCI) systems the ability to recognise the basic emotional state of their users and interact with them in a more natural and user-friendly way. Testbed applications covering everyday interaction of users with their PCs and with service or information call centres have been selected for testing and evaluating the performance of the ERMIS system; successful developments would address a large potential market.

Work description:
Linguistic and paralinguistic analysis of speech will be investigated systematically, producing a prototype able to analyse and respond to its users' commands while taking into account cues about their emotional state. Analysis of facial expressions, particularly within the framework of the MPEG-4 standard, constitutes another input for retrieving cues about the user's emotional state. Facial expression analysis will be applied separately or combined with emotional speech analysis. The ERMIS system will be able to draw on prior knowledge of the emotional analysis of speech and/or facial expressions, and to accommodate the different expressive styles of humans. The continuity of the emotion space, the uncertainty involved in feature estimation, and the system's need both to use prior knowledge and to adapt its behaviour to its users' characteristics will be handled using intelligent hybrid neurofuzzy approaches.
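As a rough illustration of the kind of fuzzy-style multimodal fusion described above, the sketch below combines a speech cue and a facial cue into a degree of membership in a "high activation" class, tolerating a missing modality. The feature names, membership shapes and weights are hypothetical, chosen only for illustration; they are not taken from the ERMIS system itself.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuse_cues(speech_pitch_var, facial_brow_raise, w_speech=0.6, w_face=0.4):
    """Fuse normalised (0..1) speech and facial cues into a single degree of
    membership in a fuzzy 'high activation' class.

    Either cue may be None (modality unavailable); the remaining cues are
    re-weighted so the result stays in [0, 1]. All parameters are illustrative.
    """
    speech_deg = None if speech_pitch_var is None else triangular(speech_pitch_var, 0.3, 0.8, 1.0)
    face_deg = None if facial_brow_raise is None else triangular(facial_brow_raise, 0.2, 0.7, 1.0)
    available = [(w, d) for w, d in [(w_speech, speech_deg), (w_face, face_deg)] if d is not None]
    if not available:
        return 0.0  # no evidence in either modality
    total_w = sum(w for w, _ in available)
    return sum(w * d for w, d in available) / total_w
```

A learning component (the "neuro" side of neurofuzzy) would tune the membership parameters and weights from data rather than fixing them by hand, which is one way such a system could adapt to different expressive styles.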

A crucial criterion for the success of the project is the effectiveness and improved user friendliness achieved by the proposed system compared with the current state of human-computer interaction. User research and testing will be carried out by the project's industrial partners and subcontractors, evaluating user acceptance and related ethical issues.

The ERMIS Consortium provides the multidisciplinary expertise required to tackle the project objectives, bringing together major industrial IT and telecommunications partners with well-known research centres and universities in speech analysis, image analysis, psychological and computational emotion analysis, and intelligent multimedia systems. This expertise will serve, on the one hand, to achieve the project's technological goals and developments and, on the other, to perform reliable market and exploitation analysis.

Milestones:
The project milestones are:
- User requirements analysis and system specifications
- Facial feature and expression analysis
- An emotional representation space suitable for HCI
- Emotion analysis based on speech and facial cues
- A prototype system that analyses its users' attitude and responds accordingly
- A call centre understanding its users' commands and attitude towards offered services
- Evaluation of the system performance and exploitation
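The milestone on an "emotional representation space suitable for HCI" alludes to continuous emotion models such as a two-dimensional activation-evaluation plane. A minimal sketch of how a point in such a space might be mapped to a discrete label is shown below; the anchor labels and coordinates are toy values for illustration, not the project's actual representation.

```python
import math

# Hypothetical anchors in a 2-D (activation, evaluation) emotion space.
EMOTION_ANCHORS = {
    "anger":    (0.8, -0.6),
    "interest": (0.5,  0.5),
    "boredom":  (-0.6, -0.3),
    "neutral":  (0.0,  0.0),
}

def nearest_emotion(activation, evaluation):
    """Map a point in the continuous emotion space to its closest anchor label."""
    return min(EMOTION_ANCHORS,
               key=lambda label: math.dist((activation, evaluation),
                                           EMOTION_ANCHORS[label]))
```

Keeping the underlying representation continuous lets the system express gradations and uncertainty, collapsing to a discrete label only when an application (such as a call-centre dialogue manager) needs one.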


Coordinator

ALTEC INFORMATION AND COMMUNICATION SYSTEMS S.A.

Address

Patmou 12
15123 Maroussi - Athens

Greece

Participants (9)

BRITISH TELECOMMUNICATIONS PLC

United Kingdom

EYETRONICS N.V

Belgium

FRANCE TELECOM

France

INSTITUTE FOR LANGUAGE AND SPEECH PROCESSING

Greece

INSTITUTE OF COMMUNICATION AND COMPUTER SYSTEMS

Greece

KATHOLIEKE UNIVERSITEIT LEUVEN

Belgium

KING'S COLLEGE LONDON

United Kingdom

MIT-MANAGEMENT INTELLIGENTER TECHNOLOGIEN GESELLSCHAFT MIT BESCHR. HAFTUNG

Germany

THE QUEEN'S UNIVERSITY OF BELFAST

United Kingdom
