Eye-hand coordination in space and time

Final Report Summary - EYEHAND (Eye-hand coordination in space and time)

EYEHAND examined interceptive actions such as catching and hitting. The key question is whether humans use explicit predictions of the time and location of interception to get to the right place (positioning) at the right time (timing). This is often assumed, but definitive evidence is lacking. EYEHAND studied interceptive positioning through a task forcing participants to predict where they would intercept an object. In this task, interceptive performance was no different from that in normal interception, which may have been due to the available visual feedback. Data collection for a follow-up experiment – including catch trials without visual feedback – is currently being finalised (18 of 24 participants completed) and analysed. We examined how participants intercept occluded targets moving along both straight and curved trajectories; our findings suggested that perception of how a target is moving at any moment in time is biased. This explains many other published instances of spatially biased interceptive actions.
EYEHAND focused on positioning and timing in two separate projects, which should be of interest to behavioural scientists studying interception and neuroscientists studying motor control, including those working on neural prosthetics. EYEHAND examined the role of the superior parietal occipital cortex and medial intraparietal sulcus in interceptive positioning. Movement-related activity in these brain areas depends on the visual field in which a static target is shown. By observing hemisphere-specific effects of Transcranial Magnetic Stimulation (TMS) when the target is in one visual field but moving towards the other, we can determine whether these brain areas code for the current or predicted target position. This study has been pre-registered at the Open Science Framework; it involves a large sample size (n=24), and 17 of 48 TMS sessions have been completed so far. EYEHAND examined interceptive timing by measuring variations in motor cortical excitability (probed using TMS) just before movement initiation. If humans use explicit temporal predictions, this excitability should change as a function of hand movement parameters, whereas it should change as a function of target motion if interception involves non-predictive continuous control. This experiment is scheduled to start shortly.
EYEHAND opened up new avenues for the fellow, resulting in projects unforeseen in the proposal but directly related to it. We found that extrapolation of occluded target motion differs from extrapolation of a static line, owing to the differing biases associated with the perception of motion direction and of line orientation. We also captured eye-hand coordination in 20 expert jugglers; data analysis for this study is currently underway. We created a state-of-the-art virtual reality cricket simulator to study expert cricket batsmen, and used virtual reality to study expert goalkeeper behaviour during free kicks with visual occlusion by a wall of players. EYEHAND has allowed the fellow to develop a research lab in which eye-hand coordination can be studied both at a fundamental level and in real-life applications. He has supervised a PhD student, several research assistants, and many undergraduate and MSc students. The fellow is now well placed to pursue funding for further research projects, ensuring the long-term impact of his research within Europe.