Prediction and Anticipation of Actions: Modelling How We Foresee the Others

Periodic Reporting for period 1 - PREDACTION (Prediction and Anticipation of Actions: Modelling How We Foresee the Others)

Reporting period: 2016-02-01 to 2018-01-31

The unparalleled social abilities of human beings are mostly due to our capacity to understand each other and attribute meaning to simple actions.
This faculty develops in parallel with our motor and cognitive skills and is deeply grounded in them.
Repeated exposure to a certain action and its consequence creates a bond between the two that is very hard to change. As a result, we sometimes do not need to see an action performed to the end to read the actual intention of its actor. However, we experience a strong surprise effect when the action develops in an 'unpredictable' way, meaning its outcome does not match what we expected to see.
Two important questions that would add valuable insights to the understanding of social cognition are: (1) what information is actually used to predict someone else's action? And (2) how do humans deal with surprise, and what is this surprise effect caused by?
Answers to these questions can find important and multidisciplinary practical applications. The first question will inspire technological development, namely brain-computer interfaces and robotics. In a digital era like the one we live in now, machines and technology are expected to be predictive rather than merely reactive to human commands. This will shorten their response time, making them more efficient and human-machine interaction more fluid and 'natural'. Social robots are nowadays a reality: they are mostly used for the elderly, to keep them company and help them with basic errands around the house. Making the interaction smoother also makes people more willing to accept this new form of help. The second question may be of great importance in the clinical domain. A large part of the literature links conditions such as autism to an inherent difficulty in relating to other people. In particular, people with autism may have difficulty understanding the meaning behind other people's actions. This project will give new insights into the possibility that predictive skills are actually compromised in this population.
This said, the literature does not yet provide a clear explanation of the cortical mechanisms behind our predictive ability, or of this surprise effect. Shedding light on them is the main objective of this project.
The overall aims of the study are:
- to investigate how different features impact the ability to predict someone’s action
- to unravel the mechanisms behind the surprise effect and what triggers it
The first aim was pursued by considering two main cues commonly related to action prediction and understanding: familiarity with the object and the attention orientation of the actor. When we are familiar with an object and its use, it is easy to assume why someone is using it. The same is true when someone orients their attention towards an object: this is an important signal that they intend to interact with it.
The second aim was pursued by creating a learning paradigm in which originally unpredictable actions become more predictable by repeatedly exposing participants to them.
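The report does not give the computational details of this learning process; as a minimal sketch of the general idea, a standard delta rule (the learning rate and outcome coding below are my own illustrative assumptions, not the project's model) shows how repeated exposure shrinks the prediction error until an initially unpredictable outcome becomes expected:

```python
def delta_rule(expectation: float, outcome: float, alpha: float = 0.2):
    """One learning step: the expectation moves toward the observed outcome
    by a fraction (alpha) of the prediction error (a delta rule)."""
    prediction_error = outcome - expectation
    return expectation + alpha * prediction_error, prediction_error

# An initially unpredictable action outcome (coded 1.0) is shown repeatedly.
expectation = 0.0
for trial in range(20):
    expectation, pe = delta_rule(expectation, 1.0)

# After repeated exposure the outcome is largely expected: the remaining
# prediction error has shrunk from 1.0 to about 0.014.
print(round(expectation, 3), round(pe, 3))  # → 0.988 0.014
```

On this view, the surprise effect studied in the project corresponds to a large prediction error early in exposure, which decays as the action-outcome association consolidates.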
- The project started with the selection of suitable object stimuli. I first collected a list of objects associated with hand actions or foot actions. I ran a first online survey asking a separate sample of participants to rate these objects based on familiarity and willingness to use them with either the hand or the foot. Through the survey I selected eight final objects (four per category).
- I prepared video clips in which one of two actors (a male and a female) was shown behind one of the eight objects, ready to act on it. In this first experiment I investigated the role of object familiarity in action prediction: the face of the actor was not visible, and the only valuable information to predict the upcoming action was the object itself.
- I prepared a first script and version of the paradigm and ran it on a small pilot sample of 10 participants, recording only behavioural data.
- The initial paradigm produced overly polarised behaviour, so we changed the videos to a more neutral scenario.
- I recorded and edited new videos and tested them on a small sample of 10 participants
- The study was run on a larger sample of 25 participants, together with the recording of their electroencephalographic (EEG) signal.
- At this stage I supervised two master's students who helped with the recording and preprocessing of the EEG data
- I started the implementation of the control experiment, with the aim of removing the social aspect introduced by the actor and by meaningful/meaningless actions
- I ran a first pilot of the experiment on 10 participants
- I implemented the control and ran it alongside the main experiment, collecting 25 participants
- Based on preliminary results I decided to extend the experiment to another technique: eye tracking
- The same behavioural paradigm was implemented while recording participants' eye movements and pupil dilation. This part of the study was carried out in collaboration with laboratories at the University of Amsterdam
- I collected 25 participants for the main experiment and 25 participants for the control with the eye tracker
- At this stage I supervised a master's student who helped with the recording and preprocessing of the eye-tracking and behavioural data
- Analysis of the EEG and eye-tracking data
- Implementation of a behavioural study to investigate other cues in action prediction, in collaboration with a PhD student in the lab
- Recording of new videos in which the actors pointed to one of two objects presented in front of them
- Scripting and implementation of the study using the online platform Gorilla, to reach as many participants as possible
- Pilot of this new study and analysis of the initial results.
In 2017 I presented the results at two conferences: Neural Control of Movement in Dublin and Social and Affective Neuroscience in Los Angeles.
In this project I have used and combined multiple analysis techniques. I moved beyond the standard procedures for analysing ERP and EEG data and correlated them directly with behavioural information. This methodology is not yet widely used in social neuroscience and might offer new insights to other researchers working on integrating different sources of information.
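The report does not include the analysis code; as a minimal sketch of the general approach (the simulated amplitudes, reaction times, and noise levels below are my own assumptions, not the project's data), a single-trial neural measure can be correlated directly with a behavioural measure across trials:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in data: one single-trial ERP amplitude and one
# behavioural measure (here, a reaction time in ms) per trial.
n_trials = 100
erp_amplitude = rng.normal(0.0, 1.0, n_trials)
reaction_time = 400 + 30 * erp_amplitude + rng.normal(0.0, 10.0, n_trials)

# Trial-wise Pearson correlation between the neural and behavioural measures,
# instead of averaging the EEG signal across trials before any comparison.
r = np.corrcoef(erp_amplitude, reaction_time)[0, 1]
print(f"trial-wise correlation r = {r:.2f}")
```

The key design choice is that trials are kept as individual observations, so trial-to-trial fluctuations in the evoked response can be related to trial-to-trial fluctuations in behaviour, rather than being averaged away.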
The main result of the project is the link between the prediction error and one specific evoked component of the EEG. The prediction error is a numerical measure of the amount of surprise we experience when our expectation and reality do not match. The more certain we were about an outcome, the bigger the prediction error when it does not occur. With this study, for the first time, we link this parameter to specific late cortical activity in the occipito-temporal cortex. This result is in line with action prediction theories stating that when what we observe is at odds with what we expected, a reactivation of visual areas occurs, as the new information also needs to be processed at a lower perceptual level.
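The report does not state which formal definition of prediction error was used; a common formalisation of this kind of surprise (an assumption on my part, not the project's stated model) is Shannon information, which grows as the prior probability of the observed outcome shrinks:

```python
import math

def shannon_surprise(p_outcome: float) -> float:
    """Surprise (in bits) for an outcome that occurred with prior probability p."""
    return -math.log2(p_outcome)

# The more confident we were in the expected outcome, the less probable the
# unexpected one, and the larger the surprise when it actually occurs.
weak_expectation = shannon_surprise(0.5)     # unexpected outcome had p = 0.5
strong_expectation = shannon_surprise(0.05)  # unexpected outcome had p = 0.05
assert strong_expectation > weak_expectation
```

Under this definition, an outcome we considered nearly impossible produces a much larger surprise value than one we merely thought unlikely, matching the intuition described above.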
What we imply here is that this mechanism is also responsible for one of the most important components ever studied in the field of EEG, a cortical response associated with language: the P600. With this project, a new way to interpret this component might emerge. It may no longer be related purely to syntactic processing but to expectancy. This would further connect two important functions, language and action, which have already been shown to share multiple features.
Paradigm example