
Neural bases of temporal processing in the human brain

Final Report Summary - NEUROTIME (Neural bases of temporal processing in the human brain)

Humans continuously rely on temporal information in almost every aspect of their lives, at time scales ranging from sound localisation (microseconds) to memories (years). Temporal information on the order of seconds can be processed explicitly and used in tasks requiring awareness (e.g. time estimation, as in judging temporal intervals or ascertaining rhythm). Humans show great flexibility in using such temporal information, which arises from different sensory domains (audition, vision, somatosensation) and spans short or long durations, and they can put it towards common goals (e.g. judging the relative duration of visual and auditory information). However, comparatively little is known about the neural bases of these phenomena. On the one hand, dedicated generalised amodal timing mechanisms have been proposed to represent time in the brain, in which specific neural structures provide the chronotopy for diverse tasks regardless of the input modality (i.e. serve as a 'neural ticking clock' whether the duration of a tone or of a visual stimulus is measured). On the other hand, under local modality-specific mechanisms, the duration of visual stimuli would arise from visual cortex dynamics, whereas the duration of tones would rely on auditory cortex dynamics.

Behavioural research on temporal perception, neuropsychological studies of patients with temporal perception deficits and related disorders, theoretical and modelling approaches, and recent neuroimaging studies have all tried to pinpoint the brain structures and mechanisms underlying temporal perception. Yet there is no consensus on which brain mechanisms enable us to perceive and process time. Several factors contribute to this: many studies focus on the sub-second time scale, which is rather automatic and associated with motor timing; many use unimodal stimuli rather than varying the input modality; and the evidence emerging from patient data is ambiguous.

In this project, we proposed to examine the neural mechanisms underlying human temporal processing across different sensory modalities and durations, using converging evidence from multiple methodologies such as structural MRI, fMRI, and TMS to reveal the neural structures involved. We also proposed to exploit behavioural measures, and specifically individual differences in time perception, to infer the underlying mechanisms supporting it. We hypothesised that, for time scales of up to 1-2 seconds, the underlying mechanisms comprise generalised amodal timing mechanisms (residing in subcortical regions such as the cerebellum and basal ganglia) that send inputs to, and receive inputs from, modality-specific cortical regions. For time scales longer than a few seconds, however, we hypothesised the involvement of amodal timing mechanisms that also include cortical regions (e.g. prefrontal and parietal cortices), which 'register' temporal events and information into cognitive memory.

To achieve these objectives we used converging approaches. First, we used behavioural paradigms with 31 participants to determine whether time estimation is consistent across different sensory modalities (vision and audition), which would indicate an amodal temporal mechanism. This was done for short (2 s) and long (12 s) durations. We found that individuals are highly consistent in their time estimations across modalities, and less so across short and long durations. Additional control experiments confirmed that our results reflected temporal abilities and that our measures were sufficiently sensitive.
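The individual-differences logic above can be sketched in code: if timing relies on a shared amodal mechanism, a participant's duration estimates in audition should predict their estimates in vision. The sketch below is illustrative only; the data, the participant count, and the use of a simple Pearson correlation are assumptions, not the project's actual analysis pipeline.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic per-participant mean estimates (in seconds) of a 2 s
# interval, in the auditory and visual modalities. A high correlation
# across participants would suggest a shared (amodal) timing mechanism.
auditory = [1.8, 2.1, 2.4, 1.9, 2.2, 2.6, 1.7, 2.0]
visual   = [1.9, 2.0, 2.5, 1.8, 2.3, 2.7, 1.6, 2.1]

r = pearson_r(auditory, visual)
print(f"cross-modal consistency r = {r:.2f}")
```

With real data, the same comparison would be repeated for the 2 s and 12 s conditions to test whether consistency holds across time scales as well as across modalities.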

Second, at the Wellcome Trust Centre for Neuroimaging at UCL, we collected structural MRI brain images of the same participants and investigated whether brain structure is reliably associated with temporal ability, using voxel-based morphometry (VBM).
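The mass-univariate logic behind a VBM analysis of this kind can be sketched as follows: for every voxel, correlate grey-matter volume across participants with the behavioural timing score, then inspect where the correlations are strongest. This is a toy illustration under invented data and a bare Pearson correlation; real VBM involves segmentation, spatial normalisation, smoothing, and corrected statistics.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic grey-matter volumes: 6 participants x 4 voxels.
gm = [
    [0.52, 0.61, 0.48, 0.55],
    [0.58, 0.63, 0.50, 0.54],
    [0.49, 0.60, 0.47, 0.56],
    [0.61, 0.66, 0.52, 0.53],
    [0.55, 0.62, 0.49, 0.57],
    [0.63, 0.68, 0.51, 0.52],
]
# Synthetic behavioural timing-ability score per participant.
timing_score = [0.71, 0.80, 0.65, 0.88, 0.74, 0.93]

# Correlate each voxel's volume with the behavioural score.
r_map = [pearson_r([subj[v] for subj in gm], timing_score)
         for v in range(len(gm[0]))]
best = max(range(len(r_map)), key=lambda v: abs(r_map[v]))
print(f"voxel correlations: {[round(r, 2) for r in r_map]}")
print(f"strongest structural correlate: voxel {best}")
```

Voxels whose correlation survives appropriate multiple-comparison correction would then be interpreted as candidate structural correlates of temporal ability.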