Community Research and Development Information Service - CORDIS


TEMPLATE 2.0 Report Summary

Project ID: 615423
Funded under: FP7-IDEAS-ERC
Country: Netherlands

Periodic Report Summary 2 - TEMPLATE 2.0 (Template 2.0: Depicting the picture in your head)

Whether we are looking for our child in the school yard or our keys on the kitchen table, we continuously filter incoming visual information for what is relevant to us. This project investigates these visual filters. We try to uncover the mental image of what we are looking for, rather than of what we see. To this end, we ask observers to look for a particular object in a soon-to-appear display. We then take measurements during the period before this display appears, while there is still nothing on the screen. This allows us to isolate the image observers have in mind from the actual image displayed. This mental image is typically represented in short-term memory (allowing for flexibility), but is also influenced by long-term memory (representing our inherent biases).

We use both behavioural and brain imaging techniques to uncover the nature and functioning of this mental filter, and the memory systems that support it. Our initial findings show that the code that the brain uses to represent the filter differs depending on the anticipated task, even when people need to remember the same thing for those tasks. In other words, the brain not only codes what you need to remember, but also what you need it for.

We also find that the mental image differs depending on whether you need it now, for a current perceptual task, or later, for a task in the near future. It is furthermore distinct from memories that were once relevant but no longer are. In other words, we find that memory distinguishes between the past, the present and the future.

The techniques also enable us to determine the mechanisms underlying how we change our mind, and what our capacity is for holding information. For example, we ask people to load two objects into memory, then first look for one, and only after they have found it, look for the other. We thus force people to set up two mental filters in sequence, with one having priority, and with a subsequent need to switch. Using electroencephalography (EEG) to measure brain waves, we can track online which of the two mental filters is currently active, and how observers switch from one to the other.

The eyes also turn out to be revealing. For example, we are developing eye-movement measures of learning. From the eye data we can tell whether someone is looking for a novel object or an object they know well, even before they start searching a scene. The eye movements also reveal how many things people can remember or look for at the same time, and what type of information they rely on when looking for something (e.g. the visual appearance of an object, or its meaning).
