Project description
A closer look at facial behaviours across cultures
Across cultures, understanding facial expressions is crucial for effective communication. However, consistent cross-cultural disagreements challenge the universality hypothesis of facial expressions. While automatic facial expression analysis has made strides, understanding emotions in diverse, realistic environments remains limited. In this context, the MSCA-funded ACMod project will advance facial behaviour modelling across cultures. This addresses gaps in human-computer interaction, robotics and virtual reality. By combining psychology and computer science, ACMod aims to digitise emotions and develop 4D facial models. The project fosters collaboration between EU and East Asian partners, enhancing research and knowledge exchange. ACMod seeks to revolutionise communication with intelligent agents in both real and virtual settings.
Objective
Mind reading plays an indispensable role in human social interaction and communication, and one of its key components is facial expression analysis. It is thus crucial to understand and model the facial behaviours that reflect mental states, so as to enhance the communication experience with robots and other intelligent agents in both real social interaction and augmented virtual environments. Moreover, consistent cross-cultural disagreement about the emotion and intensity conveyed by gold-standard universal facial expressions calls the universality hypothesis into question. Although automatic facial expression analysis has made excellent progress in recent years, research on emotion understanding in realistic environments across cultures is still lacking. Substantial psychological work supports the use of appraisal theory to detect internal emotions through facial behaviours, whereas research in computer science focuses mainly on facial modelling while ignoring the underlying biologically driven mechanisms. We therefore plan to reconstruct facial macro- and micro-expressions and digitise emotion appraisal, collect a culturally diverse 4D (dynamic 3D) facial expression dataset, develop an autonomous 4D facial model based on the collected 4D data and cultural emotion appraisal theory, and push forward commercialisation.
The ACMod project aims to advance the state of the art in understanding and modelling naturalistic facial behaviours in social interaction across various cultures, which is essential to the next generation of human-computer interaction, robotics and augmented virtual reality techniques. The project will open up a new avenue for modelling facial behaviours by taking a fundamentally different, interdisciplinary approach, combining theories and techniques from psychology and computer science with an emphasis on social interaction applications. It will also promote knowledge exchange between the EU and East Asian partners and foster the development of the researchers involved.
Fields of science
- natural sciences > computer and information sciences > artificial intelligence
- social sciences > psychology
- engineering and technology > electrical engineering, electronic engineering, information engineering > electronic engineering > robotics
- natural sciences > computer and information sciences > software > software applications > virtual reality
Programme(s)
- HORIZON.1.2 - Marie Skłodowska-Curie Actions (MSCA) Main Programme
Funding Scheme
HORIZON-TMA-MSCA-SE - HORIZON TMA MSCA Staff Exchanges
Coordinator
90014 Oulu
Finland