Community Research and Development Information Service - CORDIS

Periodic Report Summary 1 - EMO (Happy or Sad? The Influence of Emotional Body Cues on Facial Expression Perception in Huntington’s Disease)

Successful emotion recognition is arguably one of the most central skills required for interpersonal communication. However, the vast majority of studies to date have focused on the perception of stereotypical facial expressions that bear little resemblance to the dynamic, ambiguous and contextualized facial expressions encountered in real life. This paucity of data on real-life emotion perception is mirrored in the literature on neuropsychological disorders characterized by poor emotion perception. With the support of the CIG, I have established the affective neuropsychology lab at the Hebrew University, equipped with state-of-the-art eye-tracking equipment and electrophysiological monitoring systems. The current studies constitute a first step toward examining the processes of emotion perception under more naturalistic conditions in healthy and neuropsychological populations.
In the first of these studies, we examined the processing of highly intense, ambiguous facial expressions (e.g., winners vs. losers in professional tennis). We found that although viewers fail to differentiate positive from negative expressions, the faces are in fact physically different, as revealed by an analysis of facial actions. Remarkably, exposing viewers to objectively valid information about how to differentiate the faces had virtually no effect on their performance (Aviezer et al., 2015). We next examined the prevalence of such ambiguous facial expressions in situations less controlled than televised tennis matches, and found that viewers often confuse extreme spontaneous expressions of joy and terror (Wenzler et al., 2016). Finally, recent work examining real-life expressions shows that fearful individuals clearly express fear with the body, while the isolated fearful face remains quite ambiguous (Abramson et al., under review). Together, these studies demonstrate that during intense situations the face is a poor source of information that requires body context in order to be deciphered.

Studies in our lab also aim to decipher the process by which information from the face and body is integrated. Specifically, using eye tracking we find that people read out information from the diagnostic body and then automatically read that information into the face (Semyonov et al., in prep). Indeed, a striking illusion emerges: viewers believe they are recognizing the emotion from the face rather than from the body! Importantly, intensity is not the only cause of ambiguity in the face; extremely subtle expressions may be even more common. As part of the CIG project, we produced a set of dynamic, extremely subtle expressions. Our findings show that while these expressions are recognizable, they are far more challenging for viewers than stereotypical expressions.
Indeed, while stereotypical expressions are equally recognizable in static and dynamic presentations, subtle expressions are recognized far better when presented dynamically (Yitzhak et al., under review). Together, these studies mark a new step forward in research on emotional expressions. Our next aim is to apply the stimuli and paradigms developed in this work to clinical populations, specifically individuals with Huntington's disease. We believe this will allow us to reveal subtle deficits of face processing and unique patterns of atypical face-body integration in Huntington's disease.
