
Revealing the neurocognitive mechanisms underlying the visual perception of social interactions

Periodic Reporting for period 3 - POINTS (Revealing the neurocognitive mechanisms underlying the visual perception of social interactions)

Reporting period: 2020-09-01 to 2022-02-28

“Social perception” has emerged as an umbrella term for research addressing the visual perception of other people. To date, this discipline has focused on the perception of individuals; for example, researchers have examined the visual processing of faces and facial expressions, body shapes and postures, actions and kinematics. The resulting body of research suggests that social stimuli are detected by specialised mechanisms, recruit dedicated perceptual modules, and are processed by specialised neural substrates. However, while social perception research has made considerable progress in elucidating the visual perception of individuals, virtually nothing is known about the visual perception of social interactions: how we detect, encode, and interpret social interactions viewed from third-person perspectives. Given the adaptive value of accurate interaction interpretation, this paucity of knowledge represents a remarkable gap in our understanding of social perception.

POINTS will develop a battery of original behavioural and neuroimaging paradigms to reveal the neurocognitive mechanisms that mediate this essential form of social perception. The overarching objective of POINTS will be achieved by addressing five research questions:

1) Do observers exhibit evidence of interaction detection mechanisms?
2) How do we process fleeting visual displays of social interactions?
3) How do we represent dynamic interaction change?
4) Do specialised neural substrates mediate interaction perception?
5) Do some observers exhibit impaired interaction perception?

The POINTS project represents a significant step-change in social perception research. Understanding the mechanisms of interaction perception will provide new insights into the organisation of the human visual system, and will inform attempts to remediate socio-cognitive and perceptual deficits in neurodevelopmental populations, including those with Autism Spectrum Disorder.
The results from our early experiments indicate that participants are faster to find a target dyad amongst non-interacting individuals when the individuals in the target dyad are shown face-to-face (suggestive of a social interaction) than when they are presented back-to-back. The strong directional cues contained within face-to-face arrangements appear to create an attentional ‘hot-spot’. As a result, participants’ attention is drawn to the correct location earlier in a serial search of the items presented. Similarly, participants appear to remember face-to-face dyads as being closer together in space than pairs of individuals shown back-to-back. One possibility is that the remembered positions of the individuals are attracted towards a prototypical interaction distance in visual working memory.
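To make the prototype-attraction idea concrete, the sketch below illustrates one simple way such a memory bias could be modelled: remembered distance as a weighted average of the observed distance and a prototypical interaction distance, with a stronger pull for face-to-face dyads. The function, the prototype distance, and the attraction weights are hypothetical choices for illustration, not parameters or results from the POINTS experiments.

```python
# Illustrative sketch only: a weighted-average account of the memory bias
# described above. The prototype distance and attraction weights are
# hypothetical values chosen for illustration, not estimates from POINTS.

def remembered_distance(observed_cm: float, facing: bool,
                        prototype_cm: float = 90.0,
                        w_facing: float = 0.3,
                        w_nonfacing: float = 0.05) -> float:
    """Distance (cm) an observer is assumed to recall.

    The remembered distance is pulled towards a prototypical interaction
    distance, more strongly for face-to-face (facing) dyads than for
    back-to-back (non-facing) pairs.
    """
    w = w_facing if facing else w_nonfacing
    return (1.0 - w) * observed_cm + w * prototype_cm


if __name__ == "__main__":
    observed = 150.0  # dyad shown 150 cm apart (hypothetical)
    print(remembered_distance(observed, facing=True))   # 132.0: recalled as closer
    print(remembered_distance(observed, facing=False))  # 147.0: little bias
```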
We are currently conducting fMRI experiments that will reveal more about how co-ordinated spatiotemporal dynamics are represented when we view dynamic social interaction displays. In a parallel line of behavioural psychophysical research, we are investigating how the distance between two interactants is coded within the human visual system. In future studies, we hope to compare the visual processing of social interaction stimuli in typical observers and in those with autism spectrum disorder.