Project description
Listening to our hands
When speaking, we often unconsciously use hand gestures in close synchrony with our speech. However, what this temporal coupling between hands and mouth contributes to spoken communication remains unknown. The EU-funded HearingHands project proposes that gesture-speech coupling is an audiovisual cue to prosody, potentially influencing the perception of lexical stress (e.g. OBject vs. obJECT in English) and lexical tone (e.g. in tone languages like Mandarin Chinese). This premise will be studied across a set of typologically different languages and across different populations, including individuals with autism, who are known to have trouble with spoken prosody. Using methods such as neuroimaging and virtual reality, the project will elucidate the communicative role of even the simplest up-and-down hand gestures.
Programme(s)
- HORIZON.1.1 - European Research Council (ERC) Main Programme
Funding Scheme
ERC - Support for frontier research (ERC)
Coordinator
6525 XZ Nijmegen
Netherlands