Computational analysis of social interaction is an emerging field of research in several communities, such as human-computer interaction, machine learning, speech and language processing, and computer vision. Social psychologists have researched the dimensions of social interaction for decades, finding that nonverbal cues strongly shape human behavior and communication. One of the ultimate aims in computational analysis is to develop systems that automatically recognize, discover, or predict human behavior via sensing devices such as cameras and microphones. The scientific objective of this proposal is to develop new and principled computational methods to detect and analyze visual nonverbal cues for the
automatic analysis of social interaction in small group face-to-face conversations. Specifically, we will concentrate on hand gestures, head
gestures, and body posture. As nonverbal communication in social interactions includes not only visual but also aural cues, automatic analysis of interactions requires both modalities in modeling and recognition. Hence, our specific objectives are (1) the
automatic detection and analysis of visual nonverbal communication cues, and (2) the multimodal integration of audio and visual nonverbal cues.
We will concentrate on selected key research tasks in social interaction analysis, including the automatic estimation of dominance in a group conversation and of the level of interest of group members during their interaction.
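To make the multimodal-integration objective concrete, the sketch below ranks participants in a group conversation by a weighted late fusion of one audio cue (fraction of speaking time) and one visual cue (body-motion energy). The cue choices, the fusion weights, and the function name are illustrative assumptions for this sketch, not the proposal's actual method.

```python
import numpy as np

def dominance_scores(speaking_time, motion_energy, w_audio=0.6, w_visual=0.4):
    """Score each participant by fusing one audio and one visual cue.

    speaking_time: per-participant seconds spent talking (audio cue).
    motion_energy: per-participant body-motion energy (visual cue).
    Both cues are normalized to sum to 1 so the modalities are
    comparable before the weighted combination (illustrative weights).
    """
    audio = np.asarray(speaking_time, dtype=float)
    visual = np.asarray(motion_energy, dtype=float)
    audio = audio / audio.sum()
    visual = visual / visual.sum()
    return w_audio * audio + w_visual * visual

# Example: a hypothetical four-person meeting.
scores = dominance_scores(
    speaking_time=[120.0, 45.0, 30.0, 15.0],  # seconds talking
    motion_energy=[0.8, 0.3, 0.5, 0.2],       # arbitrary units
)
most_dominant = int(np.argmax(scores))
```

A real system would replace these hand-picked cues with features detected automatically from microphones and cameras, and learn the fusion from annotated interaction data rather than fixing the weights.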
Field of science
- /natural sciences/computer and information sciences/artificial intelligence/computer vision
- /natural sciences/computer and information sciences/artificial intelligence/machine learning