
Intention-from-MOVEment Understanding: from moving bodies to interacting minds.

Final Report Summary - I.MOVE.U (Intention-from-MOVEment Understanding: from moving bodies to interacting minds.)

How do we understand the intentions of other people? Are intentions observable in others’ movements? Debate on these questions has primarily focused on untestable theoretical considerations about the observability of mental states. The main achievement of the ERC I.MOVE.U project has been the development of an experimental strategy for measuring the observability of mental states and articulating the conditions under which mental states are observable. The strategy, shown in Figure 1, combines rigorous kinematic and quantitative behavioural techniques with modelling and classification analysis of neuroimaging data. First, we record the kinematics of movements performed with different intents and use statistics and machine learning to quantify the intention information they carry. Next, using videos of the same movements, we measure and manipulate the usefulness of this information for the detection of mental states. Finally, by combining modelling techniques such as classification and regression tree analysis with neuroimaging techniques (e.g. fMRI, TMS), we investigate the neurofunctional architecture supporting the pickup of advance information from observed movement patterns.
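
As a concrete illustration of the first step, the sketch below shows how cross-validated classification can be used to quantify the intention information carried by movement kinematics. The dataset, feature set and labels are hypothetical placeholders, not the project's actual recordings or analysis code.

```python
# Illustrative sketch (hypothetical data): testing whether an intention
# (e.g. "pour" vs. "drink") can be decoded from kinematic features recorded
# before object contact. Decoding accuracy above chance is taken as a measure
# of how much intention-specifying information the kinematics convey.

import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical dataset: one row per recorded grasp, columns are kinematic
# features (e.g. grip aperture, wrist velocity, wrist height at successive
# time points during the reach).
n_trials, n_features = 200, 12
X = rng.normal(size=(n_trials, n_features))   # kinematic features
y = rng.integers(0, 2, size=n_trials)         # intention label: 0 = pour, 1 = drink

# Cross-validated decoding accuracy; the margin above chance (0.50) quantifies
# the intention information available in the movement.
clf = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(clf, X, y,
                         cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f} (chance = 0.50)")
```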
This experimental strategy has proven productive in revealing the significance of movement kinematics for decoding action intention at multiple levels of a potential representational hierarchy: from anticipation of the properties of the object to be grasped (e.g. whether the object is large or small, light or heavy) to detection of an agent’s prior intention in grasping an object (e.g. whether a bottle is grasped with the intent to pour or drink). Highlights of this work include:
• The demonstration that kinematics convey intention-specifying information [1];
• The demonstration that observers are sensitive to this information and can use it to discriminate intentions [2, 3];
• The identification of the specific features that observers use to detect intentions (the kinematic determinants of intention ascription), and the establishment of a measurable relationship between those features and the observability of mental states (see the sketch after this list) [4];
• The demonstration that the observability of mental states can be manipulated by modifying the parameters of the observed movements [3, 5];
• The finding that mirror-neuron regions within the action observation network encode intention information conveyed by movement kinematics [6];
• The demonstration that movement kinematics drive chain selection towards intention detection [7];
• The finding that the influence of expectations on intention ascription is modulated by movement informativeness [5];
• The demonstration of idiosyncratic abnormalities in the prospective motor control of self- and other-actions in autism [8].
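
As an illustration of the classification and regression tree analysis mentioned above (cf. [4]), the sketch below relates hypothetical single-trial kinematic features to observers' intention judgements in order to identify which features drive intention ascription. Feature names, data and model settings are assumptions made for the example, not the project's actual analysis.

```python
# Illustrative sketch of a CART-style analysis (hypothetical data): a shallow
# decision tree relating kinematic features of observed movements to the
# intention reported by the observer on each trial (0 = pour, 1 = drink),
# yielding interpretable kinematic determinants of intention ascription.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
feature_names = ["grip_aperture", "wrist_velocity", "wrist_height", "index_abduction"]

# Hypothetical trials: kinematic features and the observer's judgement,
# here simulated so that wrist height and grip aperture carry the signal.
X = rng.normal(size=(300, len(feature_names)))
judgement = (X[:, 2] + 0.5 * X[:, 0] + rng.normal(scale=0.5, size=300) > 0).astype(int)

# A shallow tree keeps the resulting decision rules interpretable.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, judgement)

# Which features the tree relies on, and the rules it learned.
for name, importance in zip(feature_names, tree.feature_importances_):
    print(f"{name:16s} importance = {importance:.2f}")
print(export_text(tree, feature_names=feature_names))
```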

References
[1] Ansuini C., Cavallo A., Bertone C., Becchio C. (2015). Intentions in the brain: The unveiling of Mister Hyde. The Neuroscientist, 21, 126-135.
[2] Ansuini C., Cavallo A., Koul A., D’Ausilio A., Taverna L., Becchio C. (2016). Grasping others’ movements: rapid discrimination of object size from observed hand movements. Journal of Experimental Psychology: Human Perception and Performance, 42, 918-929.
[3] Cavallo A., Koul A., Ansuini C., Capozzi F., Becchio C. (2016). Decoding intentions from movement kinematics. Scientific Reports, 6, 37036.
[4] Becchio C., Koul A., Ansuini C., Bertone C., Cavallo A. (2018). Seeing mental states: An experimental strategy for measuring the observability of other minds. Physics of Life Reviews, 24, 67-80.
[5] Koul A., Soriano M., Tversky B., Becchio C., Cavallo A. (2019). Integrating prior information and kinematics towards intention choice. Cognition, 182, 213-219.
[6] Koul A., Cavallo A., Cauda F., Costa T., Diano M., Pontil M., Becchio C. (2018). Action observation areas represent intentions from subtle kinematic features. Cerebral Cortex, 28, 2647-2654.
[7] Soriano M., Cavallo A., D’Ausilio A., Becchio C., Fadiga L. (2018). Movement kinematics drive chain selection towards intention detection. Proceedings of the National Academy of Sciences of the United States of America, 115, 10452-10457.
[8] Cavallo A., Romeo L., Ansuini C., Podda J., Battaglia F., Veneselli E., Pontil M., Becchio C. (2018). Prospective motor control obeys to idiosyncratic strategies in autism. Scientific Reports, 8, 13717.