Revealing the neurocognitive mechanisms underlying the visual perception of social interactions

Periodic Reporting for period 5 - POINTS (Revealing the neurocognitive mechanisms underlying the visual perception of social interactions)

Reporting period: 2023-01-01 to 2023-08-31

“Social perception” has emerged as an umbrella term to describe research addressing the visual perception of others. To date, this discipline has focused on the perception of individuals; for example, the visual processing of faces and bodies has been studied extensively. In comparison, virtually nothing is known about the visual perception of social interactions. Given the adaptive value of accurate interaction interpretation, this paucity represents a remarkable gap in our understanding. POINTS sought to reveal the neurocognitive mechanisms that mediate this essential form of social perception, and to examine whether these mechanisms differ in autistic and non-autistic individuals.

Our results suggest three key conclusions. First, our findings indicate that pairs of individuals arranged face-to-face are processed differently within the visual system relative to pairs of individuals arranged back-to-back. Faces and bodies viewed in profile cue observers’ attention leftwards or rightwards according to their orientation. The differential configuration of these attention cues in face-to-face and back-to-back dyads appears to be a key determinant of the visual processing engaged in different arrangement conditions. Second, key hubs in the social perception network, including extrastriate body area (EBA) and posterior superior temporal sulcus (pSTS), represent information about interpersonal synchrony. A previously unknown region of right fusiform cortex also responds more strongly when observers view synchronous than asynchronous social interactions. Third, the mechanisms of social interaction perception appear to be intact in autism.

First, we examined whether social interactions capture observers’ attention. In visual search experiments, we found that participants locate pairs of facing individuals in search displays faster than pairs of non-facing individuals. We discovered that this search advantage is attributable to the arrangement of attention cues within facing and non-facing dyads. When viewed in profile, faces and bodies cue observers’ attention leftwards or rightwards, according to their orientation. The configuration of these attention cues in facing dyads creates an attention trap whereby the left-hand actor directs observers’ attention towards the right-hand actor and vice versa. Over a series of experiments, we demonstrated that this attention trap is responsible for the search advantage.
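
As a concrete illustration of how such a search advantage can be quantified, the sketch below runs a paired comparison of simulated per-participant reaction times for facing versus non-facing displays. The sample size, reaction times, and effect size are hypothetical assumptions for illustration, not POINTS data.

```python
# Illustrative sketch: quantifying a visual-search advantage as a paired
# reaction-time difference. All numbers are simulated assumptions, not
# data from the POINTS project.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_participants = 24  # hypothetical sample size

# Simulated mean reaction times (seconds) per participant; facing dyads
# are found ~100 ms faster, mimicking the reported search advantage.
rt_nonfacing = rng.normal(loc=1.60, scale=0.15, size=n_participants)
rt_facing = rt_nonfacing - rng.normal(loc=0.10, scale=0.05, size=n_participants)

# Paired comparison within participants.
advantage_ms = (rt_nonfacing - rt_facing).mean() * 1000
t_stat, p_val = ttest_rel(rt_nonfacing, rt_facing)
print(f"search advantage: {advantage_ms:.0f} ms (t = {t_stat:.2f}, p = {p_val:.4f})")
```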

Second, we sought to elucidate the visual processing engaged by static interaction stimuli. We showed that the perception of interpersonal distance is affected by the Müller-Lyer illusion. As a result, the space between two facing people viewed in profile (i.e. the nose-to-nose distance) appears expanded. Similarly, we found that the perceptual representations of dyads formed by observers afford greater feature migration (e.g. participants mistakenly report that the face of the left-hand actor appeared with the body of the right-hand actor) when the individuals are arranged face-to-face than when they are arranged back-to-back.

Third, we explored the visual processing engaged by dynamic interaction stimuli. It has been proposed that pairs of individuals shown upright and face-to-face recruit configural visual processing that aids the detection and interpretation of social interactions. Back-to-back or upside-down dyadic arrangements are not thought to engage this processing. According to this account, participants should exhibit disproportionate sensitivity to changes in interpersonal distance when dyads are shown upright and face-to-face. However, across four highly powered experiments we showed that participants exhibit similar sensitivity to changes in interpersonal distance regardless of whether dyads are presented upright or inverted, face-to-face or back-to-back.
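
For illustration, sensitivity in such a task is often summarised with the signal-detection index d′; the sketch below computes d′ from hypothetical hit and false-alarm rates for three arrangement conditions. The change-detection framing and all numbers are assumptions, not the POINTS procedure.

```python
# Illustrative sketch: indexing sensitivity to interpersonal-distance
# changes with d' (z(hit rate) - z(false-alarm rate)). The conditions and
# rates below are hypothetical, not POINTS results.
from scipy.stats import norm

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal-detection sensitivity: z(hits) minus z(false alarms)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Hypothetical hit/false-alarm rates per dyadic arrangement; similar d'
# values across conditions would mirror the pattern described above.
conditions = {
    "upright, face-to-face": (0.82, 0.20),
    "upright, back-to-back": (0.80, 0.21),
    "inverted, face-to-face": (0.79, 0.22),
}
for label, (hit, fa) in conditions.items():
    print(f"{label}: d' = {d_prime(hit, fa):.2f}")
```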

Fourth, we examined how social interactions are represented within the visual brain. The social perception network comprises a series of brain regions within the human visual system that are each thought to play a key role in the perception of faces, bodies, and actions. Across two neuroimaging experiments, we revealed that regions of the social perception network – notably EBA – represent information about observed interpersonal synchrony. A previously unknown region of right fusiform cortex also responds more strongly to synchronous than to asynchronous interaction kinematics.

Finally, we examined whether autistic participants exhibit impaired perception of social interactions. Across two studies, we found no evidence that participants with autism exhibit aberrant processing of social interactions. In one study, we found that autistic participants showed typical sensitivity to changes in interpersonal distance when viewing pairs of facing individuals from third-person perspectives. In a second study, we found that autistic participants exhibit a typical search advantage for facing dyads: like non-autistic controls, autistic participants located pairs of facing individuals in search displays faster than pairs of non-facing individuals.

Our findings have been published in leading international journals (e.g. Journal of Experimental Psychology: General). Our work has also been presented at national (e.g. meetings of the Experimental Psychology Society) and international (e.g. the European Conference on Visual Perception) conferences.

To isolate interaction-selective processing, previous studies have sought to compare neural responses to dyads shown face-to-face and back-to-back. The logic here is that non-facing arrangements contain similar face, body, and action cues to those seen in face-to-face arrangements but, unlike face-to-face arrangements, do not imply social interaction. This approach has identified EBA and pSTS as potential sources of interaction processing within the human visual system.

These effects may reflect attentional modulation of single-actor processing, however. Faces and bodies viewed in profile cue participants’ visuospatial attention leftwards or rightwards, according to their orientation. When arranged face-to-face, the directionality of the left-hand actor cues the observer’s attention towards the right-hand actor, and vice versa, creating an ‘attention trap’. In back-to-back arrangements, however, the same cues direct observers’ attention towards the periphery. If participants simply attend more closely to the actors shown in the face-to-face condition, this may explain stronger neural responses in regions implicated in body (EBA) and face (pSTS) perception.

Our approach was different: we manipulated interpersonal synchrony without altering the basic dyadic configuration (actors were always presented face-to-face). Critically, however, the representation of relative phase requires the extraction and integration of dynamic information from multiple people. As such, the representation of synchrony cannot be attributed to augmented processing of individuals. Our findings therefore provide crucial new evidence of multi-actor visual processing within the social perception network.
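
To make the notion of relative phase concrete, the sketch below estimates the moment-to-moment phase difference between two hypothetical movement signals via the Hilbert transform, one standard way of quantifying interpersonal synchrony. The signals, sampling rate, and phase lag are illustrative assumptions, not the POINTS stimuli.

```python
# Illustrative sketch: relative phase between two actors' movements,
# estimated from the analytic (Hilbert) signal. All signals and
# parameters are hypothetical, not the POINTS stimuli.
import numpy as np
from scipy.signal import hilbert

fs = 60.0                      # assumed sampling rate (frames per second)
t = np.arange(0, 10, 1 / fs)   # 10 s of simulated motion

# Two oscillatory movement signals (e.g. wrist displacement over time),
# with actor B lagging actor A by a quarter cycle.
actor_a = np.sin(2 * np.pi * 1.0 * t)
actor_b = np.sin(2 * np.pi * 1.0 * t - np.pi / 2)

# Instantaneous phase of each signal.
phase_a = np.angle(hilbert(actor_a))
phase_b = np.angle(hilbert(actor_b))

# Relative phase, wrapped to (-pi, pi]; values near 0 indicate in-phase
# (synchronous) movement.
rel_phase = np.angle(np.exp(1j * (phase_a - phase_b)))
print(f"mean relative phase: {np.degrees(rel_phase).mean():.1f} deg")
```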

Our approach also revealed a new area of right fusiform cortex that responds more to synchronous than to asynchronous social interactions. More research is required to understand the profile of this new region. However, the possibility that this region mediates dedicated visual processing of synchronous social interactions is hugely exciting.
[Image: social-binding.jpg]