The ability to consciously recognise faces, objects, or sounds is crucial for adaptive behaviour and survival. Yet how our conscious experience of the world emerges in the brain remains unknown. The overall aim of the START programme is to fill an important gap in our understanding of consciousness by elucidating the neural underpinnings of conscious access. How does the brain select relevant information among distractors and keep this information in mind? Why does our ability to consciously recognise salient objects sometimes fail under pressure and vary across days and individuals? Current theories of conscious access rely on EEG recordings or fMRI measurements, which in isolation offer limited insight into the precise neural dynamics of conscious access. A further shortcoming of current models is their limited ability to explain inter-trial and inter-individual variation in conscious access thresholds. START will tackle these challenges using attentional blink tasks involving the perception and recollection of visual objects. First, it will precisely track where in the brain and when in time the representations critical for conscious access are established, using a novel approach that combines the strengths of EEG, fMRI, and Deep Convolutional Neural Networks. Second, START will reveal how activity patterns are amplified by the brain and encoded in working memory, using multivariate pattern analyses of EEG data and novel experimental designs. Finally, START will carefully model variability in task performance across trials and participants using a new approach of representational sampling. In summary, START will provide new insights into the precise spatio-temporal dynamics of conscious access, the mechanisms governing it, and the idiosyncratic subtleties behind the meanderings of consciousness. This has deep implications for our understanding of brain health and disease, especially in clinical cases where consciousness is impaired.