How the brain creates Mutual Understanding during Social InteraCtion

Periodic Reporting for period 2 - MUSIC (How the brain creates Mutual Understanding during Social InteraCtion)

Reporting period: 2018-12-31 to 2019-12-30

Intuitively, humans understand one another because they share the same set of communicative signals, such as words and gestures. However, that intuition neglects the extreme flexibility with which we employ our communicative signals in everyday social interaction. Nor can it account for our evolutionarily anomalous ability to instantly converge on the joint meaning of new signals in the first place. By combining neuroscience and psychiatry with controlled studies of live human social interaction, this research project found evidence for the notion that human communicators share not signals but a fleeting cognitive space. This shared cognitive space provides the context for selecting and interpreting communicative signals that can be mutually understood. The shared cognitive space is jointly coordinated and updated during social interaction, and that coordinated updating is altered in individuals diagnosed with Autism Spectrum Disorder. The findings characterize a fundamental and evolutionarily unique ability of our species, and open the way to understanding and treating disorders of human social interaction.
The project has produced a testable mechanistic account of human mutual understanding abilities with implications for several academic fields (Stolk et al., TiCS 2016; Wheatley et al., Neuron 2019). This account makes predictions about the neuronal mechanisms supporting core interpersonal processes, and about how alterations of these mechanisms affect observable communicative behavior. Building bridges to the clinical domain, the project has tested some of these predictions by investigating social conduct deficits in patients with medial prefrontal lesions (Stolk et al., Current Biology 2015) and sources of misunderstanding in individuals on the autism spectrum (Wadge et al., Cortex 2019). The project has also laid the groundwork for investigating the neuronal mechanisms supporting the creation of mutual understanding during social interaction. We have acquired scalp EEG simultaneously from interacting participants, and we have developed a new methodological framework that improves the precision and reproducibility of intracranial EEG studies (Stolk et al., Nature Protocols 2018; Holdgraf et al., Nature Scientific Data 2019; Stolk et al., eLife 2019). The latter studies involve electrodes implanted in individuals undergoing presurgical monitoring for refractory epilepsy, and they provide rare but precise access to the neurocognitive mechanisms underlying human mutual understanding.
As recently outlined in a perspective paper targeting a broad neuroscientific audience, the project's findings call for a shift of focus in the field: from how individual brains respond to social signals to how two or more brains converge on a shared understanding of a signal (Wheatley et al., Neuron 2019). Taking advantage of our new methodological framework and of rare intracranial data obtained in humans, we aim to deepen our insights into the neuronal mechanisms supporting our mutual understanding abilities. Defining these fundamental neuronal mechanisms is also relevant for a number of generative models of human higher-order cognition (e.g. reasoning and decision-making). Furthermore, we plan to build more bridges to the clinical domain and to further characterize communicative deficits as seen in Autism Spectrum Disorder and frontotemporal dementia. This approach could have important societal consequences, providing testable mechanistic accounts of communicative alterations in a range of psychiatric disorders. More generally, the findings might also prove instrumental in defining the cognitive architecture and algorithmic implementation of communicative abilities in artificial agents, a necessary but currently unsatisfactory feature of human-robot interaction.