Language is fundamental to human sociality. While the last century has yielded many fundamental insights into how we use and understand language, core issues that we face when doing so in its natural environment, face-to-face conversation, remain unaddressed. When we speak, we also send signals with our head, eyes, face, hands, and torso. How do we orchestrate and integrate all this information into meaningful messages? CoAct will lead to a new model with in situ language processing at its core: the Contextualized Action and Language (CoALa) processing model. The defining characteristic of in situ language is its multimodal nature. Moreover, the essence of language use is social action; that is, we use language to do things: we question, offer, decline, and so on. These social actions are embedded in conversational structure, in which one speaking turn follows another at remarkable speed, with gaps of only milliseconds between them. Conversation thus confronts us with a significant psycholinguistic challenge. While one might expect the many co-speech bodily signals to exacerbate this challenge, CoAct proposes that they actually play a key role in meeting it. It tests this proposal in three subprojects that combine methods from a variety of disciplines and take the social actions performed by questions and responses as a uniting theme: (1) ProdAct uses conversational corpora to investigate the multimodal architecture of social actions, on the assumption that they differ in their ‘visual signatures’; (2) CompAct tests whether these bodily signatures contribute to social action comprehension, and whether they do so early and rapidly; (3) IntAct investigates whether bodily signals also play a facilitating role when comprehenders face the complex task of understanding while planning a next social action. Thus, CoAct aims to advance current psycholinguistic theory by developing a new model of language processing based on an integrative framework uniting aspects of psychology, philosophy and sociology.