Community Research and Development Information Service - CORDIS

Final Activity Report Summary - GESTURE (The role of gesture speech synchronisation in interaction and cognition)

This project investigated how speakers coordinate their words and gestures in time. In conversation, speakers use hand gestures to convey prosodic, semantic, pragmatic and interactional information, and they coordinate these gestures temporally with their speech in specific ways. We investigated how different types of gestures are aligned with speech. Based on a corpus of video and motion-tracking data, we identified four distinct types of temporal relationship, providing a systematic account of how speech and gesture are timed relative to each other.

In addition, we generated animations from the motion-tracking corpus and used them as stimuli to investigate whether such animations can serve in psychological experiments on audio-visual speech perception. The study showed that the animations were interpreted in ways similar to the original video recordings. This provides a basis for exploiting motion-tracking data in experimental research on audio-visual speech perception.

Reported by

MAX-PLANCK-GESELLSCHAFT ZUR FOERDERUNG DER WISSENSCHAFT VERTRETEN DURCH DAS MAX-PLANCK-INSTITUT FUER PSYCHOLINGUISTIK
Wundtlaan 1
6500 AH NIJMEGEN
Netherlands