Project description
From a lab animal perspective: real-time smart cameras for head and eye tracking
As popular models of cognitive function, rodents are widely used to study abilities such as perception, memory and decision-making. Studying these abilities properly requires recording from conscious animals engaged in the corresponding behaviours. Furthermore, relating behaviour to the underlying brain activity requires tracking the animal's motor behaviour in real time. The EU-funded SMARTLABCAM project will develop innovative, minimally invasive and user-friendly head- and eye-tracking devices. It will migrate the tracking algorithms developed for the LEARN2SEE project to real-time smart cameras that process head/eye video streams on board. The project ultimately aims to turn these tracking devices into a commercial product serving the growing market of neurophysiology laboratories that use rodents as model organisms.
Objective
Rodents have become very popular models of cognitive functions such as perception, memory and decision-making. Understanding the neuronal substrates of these processes requires performing neurophysiological recordings in awake animals engaged in complex behavioral tasks. This, in turn, calls for tracking the motor behavior of the animal in real time, so that it can be related to the underlying brain activity. Custom as well as commercial solutions exist that allow tracking basic behaviors in rodents, e.g. the 2D position and orientation of an animal over the arena where it navigates. However, while carrying out the research of the ERC project LEARN2SEE, we realized that methods for 3D reconstruction of some key behavioral features (e.g. pose of the head and gaze direction) are not commercially available, and that the custom solutions found in the literature lack generality and are hard to implement and replicate. This prompted us to develop innovative, minimally invasive and user-friendly solutions for head/eye tracking in rodents.

Yet we soon faced another challenge: to perform high-throughput experiments by testing many subjects in parallel (one of the advantages of rodent models over larger laboratory animals), our head/eye trackers would need to be re-engineered to make them easily replicable for series production. Critically, this would also allow transforming them into commercial devices for the ever-growing market of neurophysiology laboratories that, especially in the field of vision science, are employing rodents as model organisms.

Here, we present our plan to achieve this goal by migrating the tracking algorithms we developed for the LEARN2SEE project to smart cameras engineered for real-time, on-board processing of head/eye video streams. These cameras will become a new product line of CyNexo, a recently established SISSA startup that operates in the market of laboratory equipment and is looking to expand its product portfolio with videography tools.
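As a purely illustrative aside, the snippet below sketches the kind of processing loop such a smart camera might run on board: grab a frame from the eye camera, isolate the pupil as the darkest blob, and report its centre and size in real time. It is a minimal sketch under assumed conditions (Python with OpenCV, dark-pupil infrared imaging, a hand-tuned intensity threshold, a generic camera index) and does not reproduce the actual LEARN2SEE/SMARTLABCAM tracking algorithms.

    import cv2

    # Minimal, hypothetical dark-pupil tracking loop (not the project's algorithm).
    PUPIL_THRESHOLD = 40   # assumed intensity cutoff for the dark pupil; tune per setup
    CAMERA_INDEX = 0       # assumed index of the attached eye camera

    cap = cv2.VideoCapture(CAMERA_INDEX)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (7, 7), 0)
        # Under IR illumination the pupil is the darkest region of the image.
        _, mask = cv2.threshold(gray, PUPIL_THRESHOLD, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            pupil = max(contours, key=cv2.contourArea)   # largest dark blob
            if len(pupil) >= 5:                          # fitEllipse needs >= 5 points
                (cx, cy), (w, h), angle = cv2.fitEllipse(pupil)
                print(f"pupil centre: ({cx:.1f}, {cy:.1f})  size: {w:.1f} x {h:.1f} px")
        cv2.imshow("pupil mask", mask)
        if cv2.waitKey(1) & 0xFF == ord("q"):            # press 'q' to stop
            break
    cap.release()
    cv2.destroyAllWindows()

On an actual smart camera, a loop of this kind would run on the on-board processor and stream out only the extracted pupil coordinates, rather than the full video, which is what makes real-time, multi-subject setups tractable.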
Fields of science
CORDIS classifies projects with EuroSciVoc, a multilingual taxonomy of fields of science, through a semi-automatic process based on NLP techniques.
Programme(s)
Funding Scheme
ERC-POC-LS - ERC Proof of Concept Lump Sum Pilot
Host institution
34136 Trieste
Italy