Temporally controlling our movements to perform an action successfully (e.g. directing our eyes to read this text, picking up a pen, drinking from a cup) is something we do thousands of times a day without thinking twice. Given that many of these actions are self-paced (i.e. no perceptual timing cues are provided by the environment), prospective control must be determined by intrinsic neural mechanisms. How is prospective information for temporal control represented in the brain? What happens when the neural temporal control of the movement system breaks down, as in Parkinson’s disease? Is it possible to provide an artificial sensory guide that supplies prospective information extrinsically, so that it can be monitored through perceptual systems to regulate movement?

This project will examine in detail the theoretical underpinnings of the temporal control of movement and how temporal information may be represented in the brain. More specifically, the project will test the idea that coupling between temporal information provided extrinsically (e.g. the trajectory of a ball) or intrinsically (e.g. hitting a stationary object) and the ensuing movement employs the same mechanisms of control. Working alongside engineers, temporal sensory substitutes will be generated artificially so that the prospective sensory information necessary to guide movement can be picked up through the visual, acoustic or haptic domains.

The litmus test for the project will be assessing the utility of these sensory guides in two areas: i) skill acquisition and ii) movement facilitation for two patient populations (stroke and Parkinson’s disease). Movement performance and stability with and without these temporal guides will be measured, analysed and compared across groups. The findings will then be fed back into movement timing theory to see how they can improve our understanding of the spatio-temporal guidance of movement.