The role of gesture speech synchronisation in interaction and cognition

Final Activity Report Summary - GESTURE (The role of gesture speech synchronisation in interaction and cognition)

This project investigated how speakers coordinate their words and their gestures in time. In conversation, speakers use hand gestures to convey prosodic, semantic, pragmatic and interactional information, and they align these gestures temporally with their speech in specific ways. We investigated how different gestures are aligned with speech. Based on a corpus of video and motion-tracking data, we identified four distinct types of temporal relationship, providing a systematic account of how speech and gesture are coordinated in time.

In addition, we generated animations from the motion-tracking corpus and used them as stimuli to investigate whether such animations can serve in psychological experiments on audio-visual speech perception. The study showed that the animations were interpreted in ways similar to the original video recordings. This establishes a basis for exploiting motion-tracking data in experimental research on audio-visual speech perception.