Periodic Reporting for period 5 - POINTS (Revealing the neurocognitive mechanisms underlying the visual perception of social interactions)
Reporting period: 2023-01-01 to 2023-08-31
Our results suggest three key conclusions. First, our findings indicate that pairs of individuals arranged face-to-face are processed differently within the visual system relative to pairs of individuals arranged back-to-back. Faces and bodies viewed in profile cue observers’ attention leftwards or rightwards according to their orientation. The differential configuration of these attention cues in face-to-face and back-to-back dyads appears to be a key determinant of the visual processing engaged in different arrangement conditions. Second, key hubs in the social perception network, including the extrastriate body area (EBA) and posterior superior temporal sulcus (pSTS), represent information about interpersonal synchrony. A previously unknown region of right fusiform cortex also responds more strongly when observers view synchronous social interactions than when they view asynchronous ones. Third, the mechanisms of social interaction perception appear to be intact in autism.
Second, we sought to elucidate the visual processing engaged by static interaction stimuli. We showed that the perception of interpersonal distance is affected by the Müller-Lyer illusion. As a result, the space between two facing people viewed in profile (i.e. the nose-to-nose distance) appears to be expanded. Similarly, we found that the perceptual representations of dyads formed by observers afford greater feature migration (e.g. participants mistakenly report that the face of the left-hand actor appeared with the body of the right-hand actor) when the individuals are arranged face-to-face than when arranged back-to-back.
Third, we explored the visual processing engaged by dynamic interaction stimuli. It has been proposed that pairs of individuals shown upright and face-to-face recruit configural visual processing that aids the detection and interpretation of social interactions. Back-to-back or upside-down dyadic arrangements are not thought to engage this processing. According to this account, participants should exhibit disproportionate sensitivity to changes in interpersonal distance when dyads are shown upright and face-to-face. However, across four highly powered experiments we showed that participants exhibit similar sensitivity to changes in interpersonal distance regardless of whether dyads are presented upright or inverted, face-to-face or back-to-back.
Fourth, we examined how social interactions are represented within the visual brain. The social perception network comprises a series of brain regions within the human visual system that are each thought to play a key role in the perception of faces, bodies, and actions. Across two neuroimaging experiments, we revealed that regions of the social perception network – notably the EBA – represent information about observed interpersonal synchrony. A previously unknown region of right fusiform cortex also responds more strongly to synchronous than to asynchronous interaction kinematics.
Finally, we examined whether autistic participants exhibit impaired perception of social interactions. Across two studies, we found no evidence that participants with autism exhibit aberrant processing of social interactions. In one study, we found that autistic participants showed typical sensitivity to changes in interpersonal distance when viewing pairs of facing individuals from third-person perspectives. In a second study, we found that autistic participants exhibit a typical search advantage for facing dyads: like non-autistic controls, autistic participants find pairs of facing individuals faster in search displays than pairs of non-facing individuals.
Our findings have been published in leading international journals (e.g. Journal of Experimental Psychology: General). Our work has also been presented at national (e.g. meetings of the Experimental Psychology Society) and international (e.g. the European Conference on Visual Perception) conferences.
These effects may reflect attentional modulation of single-actor processing, however. Faces and bodies viewed in profile cue participants’ visuospatial attention leftwards or rightwards, according to their orientation. When arranged face-to-face, the directionality of the left-hand actor cues the observer’s attention towards the right-hand actor, and vice versa, creating an ‘attention trap’. In back-to-back arrangements, however, the same cues direct observers’ attention towards the periphery. If participants simply attend more closely to the actors shown in the face-to-face condition, this may explain stronger neural responses in regions implicated in body (EBA) and face (pSTS) perception.
Our approach was different: we manipulated interpersonal synchrony without altering the basic dyadic configuration (actors were always presented face-to-face). Critically, however, the representation of relative phase requires the extraction and integration of dynamic information from multiple people. As such, the representation of synchrony cannot be attributed to augmented processing of individuals. Our findings therefore provide crucial new evidence of multi-actor visual processing within the social perception network.
Our approach also revealed a new area of right fusiform cortex that responds more strongly to synchronous than to asynchronous social interactions. More research is required to understand the profile of this new region. However, the possibility that this region mediates dedicated visual processing of synchronous social interactions is hugely exciting.