
FEEL, Interact, eXpress: a Global appRoach to develOpment With INterdisciplinary Grounding

Project description


Advanced Robotics
Robots in touch with your emotions

Robots that care for the sick or comfort the elderly are close to becoming a reality, but they are only likely to be accepted by society if they can respond to human emotions.

Creating robots that know when a person is sad, happy or angry is the goal of the Feelix Growing project, which is developing software that will allow a new generation of robots to detect and respond to a person’s emotional state.

Inspired by the human brain

The Feelix Growing researchers are focusing their work on the development of artificial neural networks – computational systems that mimic the way information is processed in the human brain and which can adapt to changing inputs.

In essence, the neural networks allow the robot to learn to interpret emotions and to behave accordingly.

Using cameras and sensors, the robots, which the researchers build mostly from off-the-shelf components, detect parameters such as a person’s facial expressions, voice, body movement, temperature and proximity to the robot, and use these signals to determine the person’s emotional state.
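As a rough illustration of this idea, the sketch below feeds a vector of fused sensor readings through a small feed-forward neural network and picks the most likely emotion. The feature names, label set and network shape are assumptions made for the example, and the untrained weights are random; this is not the project's actual software.

```python
# Illustrative only: a tiny feed-forward network mapping fused sensor features
# (facial expression, voice, movement, temperature, proximity) to an emotion
# label. Labels, features and architecture are assumed for demonstration.
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "afraid", "neutral"]  # assumed label set
rng = np.random.default_rng(0)

def init_network(n_in=5, n_hidden=8, n_out=len(EMOTIONS)):
    """Random small network; in practice the weights would be trained on labelled data."""
    return {
        "W1": rng.normal(scale=0.5, size=(n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(scale=0.5, size=(n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def classify(net, features):
    """Forward pass: features -> tanh hidden layer -> softmax over emotion labels."""
    h = np.tanh(features @ net["W1"] + net["b1"])
    logits = h @ net["W2"] + net["b2"]
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return EMOTIONS[int(np.argmax(probs))], probs

# Example reading: [smile score, voice pitch, body movement, skin temp., proximity]
reading = np.array([0.9, 0.4, 0.6, 0.5, 0.3])
label, probs = classify(init_network(), reading)
print(label, probs.round(2))
```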

The technology draws on research in developmental and comparative psychology, neuroscience, ethology (the science of animal behaviour) and robotics.

Much like a human child, the robot learns from experience how to respond to emotions displayed by the people around it.

Happy people, happy robot

If someone shows fear, for example, the robot may learn to change its behaviour to appear less threatening or back away. If someone is happy, the robot might also show joy.

The researchers’ goal is to build robots that are not only able to respond to common emotions in any person, but which can also fine-tune, from experience, their responses to the individuals around them.

In a family environment, for example, the robot could learn that if one person bursts into tears it should try to comfort them, while if another person does the same it is better to leave them alone.
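A minimal sketch of that kind of personalisation, using assumed names and an assumed update rule rather than the project's actual mechanism: the robot keeps a score for each person/emotion/response combination and reinforces whatever a particular person responds well to.

```python
# Illustrative sketch of per-person adaptation. The response repertoire and the
# simple reinforcement rule are assumptions made for this example.
from collections import defaultdict

RESPONSES = ["comfort", "leave_alone", "back_away", "show_joy"]  # assumed repertoire

scores = defaultdict(float)  # (person, emotion, response) -> learned preference

def choose_response(person, emotion):
    """Pick the response with the highest learned score for this person and emotion."""
    return max(RESPONSES, key=lambda r: scores[(person, emotion, r)])

def give_feedback(person, emotion, response, reward, lr=0.3):
    """Nudge the score towards the observed outcome (reward in [-1, 1])."""
    key = (person, emotion, response)
    scores[key] += lr * (reward - scores[key])

# One person responds well to being comforted, another prefers being left alone.
give_feedback("anna", "sad", "comfort", reward=1.0)
give_feedback("ben", "sad", "comfort", reward=-1.0)
give_feedback("ben", "sad", "leave_alone", reward=1.0)
print(choose_response("anna", "sad"))  # comfort
print(choose_response("ben", "sad"))   # leave_alone
```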

Following mother

Several demonstration robots are being developed by the project partners to showcase their work.

One mimics the behaviour of young animals that follow their mother around. Small wheeled robots follow a researcher, learning from the person’s movements whether to stay close or to trail at a distance.
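One way such following behaviour could be realised, purely as an assumed illustration rather than the project's implementation, is a proportional controller that drives the robot towards a preferred following distance and slowly adapts that distance according to how the person reacts.

```python
# Illustrative sketch of "follow the carer" behaviour. Gains, speeds and the
# adaptation rule are assumptions for demonstration only.
def follow_step(distance_to_person, preferred_distance, gain=0.8, max_speed=0.5):
    """Return a forward speed in m/s: positive closes the gap, negative backs off."""
    error = distance_to_person - preferred_distance
    return max(-max_speed, min(max_speed, gain * error))

def adapt_preference(preferred_distance, person_moved_away, step=0.05):
    """If the person keeps moving away, learn to trail at a larger distance."""
    if person_moved_away:
        return preferred_distance + step
    return max(0.3, preferred_distance - step)

preferred = 1.0  # metres
for measured, moved_away in [(2.0, False), (1.5, True), (1.2, True)]:
    speed = follow_step(measured, preferred)
    preferred = adapt_preference(preferred, moved_away)
    print(f"distance={measured:.1f} m -> speed={speed:+.2f} m/s, preferred={preferred:.2f} m")
```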

The researchers are also developing a robot face that can indicate different ‘emotions’.

By interpreting and responding to human emotions, such robots should be more readily accepted by the people they are meant to help.

A caring machine

Rather than being seen as cold, emotionless machines, the robots would at least be able to give people the impression of understanding how they feel, even though they cannot feel emotions themselves.

Such abilities are particularly important if robots are to be used to look after the sick or the elderly, to play a role as domestic helpers or even if they are simply intended for entertainment.

If robots are to be truly integrated in humans' everyday environment in order to provide services such as company, caregiving, entertainment, patient monitoring, aids in therapy, etc., they cannot be simply designed and taken "off the shelf" to be directly embedded into a real-life setting. Adaptation to incompletely known and changing environments and personalization to their human users and partners are necessary features to achieve successful long-term integration. This integration would require that, like children (but on a shorter time-scale), robots develop embedded in the social environment in which they will fulfil their roles.
The overall goal of this project is the interdisciplinary investigation of socially situated development from an integrated or "global" perspective, as a key paradigm towards achieving robots that interact with humans in their everyday environments in a rich, flexible, autonomous, and user-centred way. To achieve this general goal we set the following specific objectives:
1) Identification of scenarios presenting key issues and typologies of problems in the investigation of global socially situated development of autonomous (biological and robotic) agents.
2) Investigation of the roles of emotion, interaction, expression, and their interplays in bootstrapping and driving socially situated development, which includes implementation of robotic systems that improve existing work in each of those aspects, and their testing in the key identified scenarios.
3) Integration of (a) the above "capabilities" in at least 2 different robotic systems, and (b) feedback across the disciplines involved.
4) Identification of needs and key steps towards achieving standards in: (a) the design of scenarios and problem typologies, (b) evaluation metrics, (c) the design of robotic platforms and related technology that can be realistically integrated in people's everyday life.

Call for proposal

FP6-2005-IST-6

Coordinator

THE UNIVERSITY OF HERTFORDSHIRE
EU contribution
€ 661 436,00
Address
College Lane
AL10 9AB Hatfield, Herts
United Kingdom

Activity type
Higher or Secondary Education Establishments
Total cost
No data

Participants (8)