CORDIS - EU research results

Perception with New Sensory Signals

Periodic Reporting for period 3 - NewSense (Perception with New Sensory Signals)

Reporting period: 2022-05-01 to 2023-10-31

From birth, we are like little scientists – using all our senses to learn about our surroundings. This raises the question of whether we could learn to use new technologies such as wearables and electronic devices to "see" in similarly intuitive ways. The EU-funded NewSense project tests the possibility of enriching human experience and perceptual abilities with new signals. Using virtual reality training, neuroimaging, and computational modelling, the project asks to what extent new perceptual skills can become similar to typical perception. The information gained is expected to lead to breakthrough discoveries about the mechanisms that guide human perception.
We are studying people’s abilities to learn to sense the environment in new ways in two main settings. In the first setting (Work Package 1), we are studying abilities to sense the distance to an object using a new auditory (sound) signal. Besides basic abilities to use the signal at all, we are interested in how well the new sense becomes integrated alongside natural everyday perception. To study this, we are measuring (1) the degree to which judgments with the new signal are combined with those from vision, (2) the degree to which this happens “automatically” (e.g. without conscious effort), and (3) the degree to which this is supported by the same brain areas involved in everyday distance perception.

Our results so far show that (1) even after less than 10 hours of training, signals are combined in key ways similar to everyday multisensory perception. This enables people to perceive better with the new signal and vision together than with either on its own. (2) We also see some signs of “automatic” processing – for example, people can use the new signal while doing another verbal task at the same time, suggesting that they are not talking themselves through using the new signal. (3) In work in progress we are developing longer-term training protocols with visual and auditory objects, designed for a neuroimaging study in which we will be able to determine which brain areas are involved in the newly learned skill.
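The claim that people perceive better with the new signal and vision together than with either alone follows from the standard maximum-likelihood (reliability-weighted) cue-combination model widely used as a benchmark in this literature. The sketch below illustrates that model with made-up numbers; it is not the project's actual model, data, or code, and the example values are purely hypothetical.

```python
# Illustrative sketch of standard reliability-weighted cue combination.
# Assumes two independent Gaussian cue estimates; not the project's own code.

def combine(mu_a, var_a, mu_b, var_b):
    """Optimally combine two independent Gaussian cue estimates.

    Each cue is weighted by its reliability (inverse variance). The
    combined variance is always lower than that of either cue alone,
    which is why two cues together outperform either one on its own.
    """
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)  # weight on cue A
    w_b = 1 - w_a                                # weight on cue B
    mu = w_a * mu_a + w_b * mu_b                 # combined estimate
    var = (var_a * var_b) / (var_a + var_b)      # combined variance
    return mu, var

# Hypothetical example: vision estimates a distance of 2.0 m (variance 0.04);
# the new auditory signal estimates 2.4 m (variance 0.12).
mu, var = combine(2.0, 0.04, 2.4, 0.12)
print(mu, var)  # the combined estimate is pulled toward the more reliable cue
```

Under this model, finding that combined-cue judgments are more precise than single-cue judgments is the signature of integration that the project's measurements in (1) look for.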

In the second setting (Work Package 2), we are studying abilities to sense the material of which an object is made using a new auditory signal. Specifically, we are training people to sense how dense (and therefore how heavy) an object is. As in WP 1, besides basic abilities to use the signal at all, we are interested in how well the new sense becomes integrated alongside natural everyday perception. To study this, we are measuring (1) the degree to which judgments with the new signal are combined with those from vision and (2) the degree to which this happens “automatically” (without conscious effort). Work on this WP is at an early stage, but so far we see a mixed picture: there are some indications of combination and automaticity (e.g. people’s grip forces as they pick objects up are calibrated to both the visual cue and the new auditory cue), but not others (e.g. people do not experience an illusion linked with automatic processing of visual cues to weight when they experience the auditory cue).

In additional work underpinning both WP1 and WP2, we have also (1) shown an important role of mis-calibration of sensory signals in limiting efficient perception in children and adults, even with familiar everyday multisensory signals, (2) developed and published new methods for analysing data on multisensory perception, and (3) developed new computational models of changes in multisensory space perception with learning and development.
By the end of the project we will have advanced our knowledge of the potential to adopt new sensory signals, and of the mechanisms underlying this, substantially beyond the current state of the art. We hope to understand which characteristics are and are not shared between familiar, highly practised perceptual skills and more newly gained ones, from three perspectives: (i) which skills are gained; (ii) computationally, how these skills are supported and in which ways they are limited; and (iii) which brain mechanisms support them. We will also test hypotheses about factors that can promote better integration of new perceptual skills, including testing for parallels between children learning to use their everyday senses for the first time and people of all ages learning to tune in to, and effectively use, new senses.