Happy or Sad? The Influence of Emotional Body Cues on Facial Expression Perception in Huntington’s Disease

Final Report Summary - EMO (Happy or Sad? The Influence of Emotional Body Cues on Facial Expression Perception in Huntington’s Disease)

Successful emotion recognition is arguably one of the most central skills required for interpersonal communication. However, the vast majority of studies to date have focused on the perception of stereotypical facial expressions that bear little resemblance to the dynamic, ambiguous and contextualized facial expressions encountered in real life. This paucity of data on real-life emotion perception is mirrored in the literature on neuropsychological disorders characterized by poor emotion perception. With the support of the CIG, I have established the affective neuropsychology lab at the Hebrew University, equipped with state-of-the-art eye-tracking equipment and electrophysiological monitoring systems. The current studies constitute a first step toward examining the processes of emotion perception under more naturalistic conditions in healthy and neuropsychological populations.
In the first of these studies, we examined the processing of highly intense, ambiguous facial expressions (e.g. winners vs. losers in professional tennis). Although the emotional states of the expressers were quite different, viewers failed to differentiate positive from negative expressions. Nevertheless, a muscular analysis of the facial activity revealed that the faces of winners and losers actually differ physically. Remarkably, exposing viewers to objectively valid information about how to differentiate the faces had virtually no effect on their performance (Aviezer et al., 2015). We next examined the prevalence of such ambiguous facial expressions in situations less controlled (and less televised) than tennis matches. Indeed, we found that viewers often confuse extreme expressions of spontaneous joy and terror (Wenzler et al., 2016). Finally, recent work examining real-life expressions shows that fearful individuals clearly express fear in the body, while the isolated fearful face remains quite ambiguous (Abramson et al., 2017). In fact, even when people see dynamic videos of real-life intense emotions (people reacting to meeting homecoming soldiers), the isolated face is insufficient for deciphering the true emotions of the expresser, and context must be used (Israelashvili et al., 2018). Together, these studies demonstrate that during intense situations the face may be a poor source of information, one that requires context in order to be deciphered.

Studies in our lab are also aiming to uncover the process by which information from the face and body is integrated. Specifically, using eye tracking we find that viewers read out information from the diagnostic body and then automatically read that information into the face (Semyonov et al., in prep). In fact, a striking illusion emerges: viewers believe they are recognizing the emotion from the face, not from the body! Importantly, intensity is not the only cause of ambiguity in the face.
Extremely subtle expressions may be even more common. As part of the CIG project, we produced a set of dynamic, extremely subtle expressions. Our findings show that while these expressions are recognizable, they are more challenging for viewers than the stereotypical expressions often used in research, yet they are also rated as more natural and lifelike (Yitzhak et al., 2017). Using subtle expressions has been extremely useful in testing neuropsychological populations, from developmental prosopagnosia (Yitzhak et al., 2018) to Huntington's disease (Yitzhak et al., in prep). Our studies show that dynamic information is critical for emotion recognition, especially when faces are natural-looking and non-stereotypical. Such tests are far more sensitive than those using standard caricature-like facial expressions and may help reveal early signs of social deficits in Huntington's disease and many other neuropsychological disorders.