Community Research and Development Information Service - CORDIS

FP7

TURNTAKE Report Summary

Project ID: 301155
Funding source: FP7-PEOPLE
Country: Portugal

Final Report Summary - TURNTAKE (Turn-Taking in Human-Robot Interactions: a Developmental Robotics Approach)

Social coordination can be measured at a time scale of less than a second by the partners' synchronisation of sensorimotor activity, and at a larger time scale (a few seconds to minutes) by the oscillation of activity (turn-taking) and the modulation of the interaction rhythm, evident in the statistical contingencies between the two partners. The project's main goal was to advance the design and implementation of human-robot interfaces, specifically robots that can adjust to an individual user's interaction rhythm while engaged in a joint task. Robotic architectures capable of open-ended and adaptive interaction can be used in several contexts: for example, as learners with humans as instructors (e.g. learning by demonstration or by imitation), as teachers, and as assistants. Mounting evidence shows that the seamless timing of engaging and disengaging in a social interaction, and the capability for dynamic control of a turn-taking rhythm, are associated with perceived quality and warmth, and with learning outcomes, in adult-adult social exchanges. Finally, in infancy, turn-taking dynamics are predictors of developmental outcomes in the emotional, social and cognitive domains.
We conducted novel developmental psychology studies of turn-taking dynamics in children engaged in open-ended joint play with a parent. To this end, we developed a methodology, including the experimental apparatus and the data coding, processing, and analysis pipelines. We tested dyads in two tasks. The first required the participants to build an object from parts using a photo of the target object: a challenging copy task, clearly goal-oriented, that required the adult to assist the child. The second was to build a tower as tall as possible: a task with a goal, but less constrained and less challenging. We also adapted the two tasks to test adult-adult dyads. During execution, the project was supported by a novel collaboration at the host institution between a robotics laboratory and an experimental psychology laboratory.
Our main results include:
• Quantitative measurement of coordination is possible using the statistical contingencies between partners: how well the future behaviour of one partner can be predicted from the past behaviour of the other, beyond what can be predicted from that partner's own past alone. This measurement is possible using durations of turn-taking states in speech (replicating past literature), but also using analogues of these turn-taking states in head and hand motion, and using the motion velocity profile itself.
• We found that task complexity modulates the strength of the coordination: the most challenging task was associated with more tightly coupled social partners. This shows how modulation of the degree of coordination is important in joint tasks.
• The findings regarding the partners' body motion data, both in their similarities to and differences from turn-taking dynamics in speech, are very promising for Human-Robot Interaction (HRI). Taken together, the complete set of results suggests that a robot using relatively simple mechanisms, measuring velocity profiles and durations of activity/inactivity in the human partner's speech and motion, can use such signals to predict a time budget for its own activity and thus effectively modulate the degree of coordination with the human partner. This was one of our key conjectures motivating the project, and the empirical findings supported it.
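The report does not specify the exact estimator behind the contingency measure, but the description (predicting one partner's future from the other's past, beyond self-prediction) matches a Granger-causality-style comparison of autoregressive models. The sketch below is a minimal illustration of that idea under that assumption; the function names, the lag order, and the log-variance-ratio score are all illustrative choices, not the project's actual pipeline.

```python
import numpy as np

def residual_variance(y, X):
    """Least-squares regression of y on X; return the residual variance."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ coef)

def lagged(x, lag, n):
    """Matrix whose columns are x delayed by 1..lag steps, aligned to n targets."""
    return np.column_stack([x[lag - k - 1 : lag - k - 1 + n] for k in range(lag)])

def contingency(a, b, lag=3):
    """Granger-style contingency score (hypothetical formulation):
    how much partner A's past improves prediction of partner B's next
    value, beyond B's own past. Inputs are 1-D signals, e.g. motion
    velocity profiles or durations of activity/inactivity states."""
    n = len(b) - lag
    target = b[lag:]
    b_past = lagged(b, lag, n)
    var_self = residual_variance(target, b_past)                        # B's past only
    var_joint = residual_variance(target, np.hstack([b_past, lagged(a, lag, n)]))
    # Positive score => A's past carries predictive information about B.
    return np.log(var_self / max(var_joint, 1e-12))
```

With synthetic signals where B is driven by A's previous value, the score is clearly positive in the A-to-B direction and near zero in the reverse direction, mirroring the asymmetric coupling the measure is meant to capture.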
The studies have potentially far-reaching consequences that extend beyond human-robot interaction design and will be relevant to research on social, emotional, and cognitive development, including clinical applications.

Contact

Vasco Teixeira, (PRO-RECTOR)
Tel.: +351253601118
Fax: +351253601059
E-mail address
Record number: 184329 / Last updated: 2016-06-23