Rodents have become very popular models of cognitive functions over the past 20 years, including perception, memory and decision-making. Understanding the neuronal substrates of these processes requires performing neurophysiological recordings in awake animals engaged in complex behavioral tasks. This, in turn, calls for tracking the motor behavior of the animal in real time, so as to relate it to the underlying brain activity.

The goal of the PoC project SMARTLABCAM was to build, validate and bring as close as possible to the market two tracking systems whose development had started during the previous ERC Consolidator Grant LEARN2SEE: 1) a head-tracking system that tracks both the position and the pose of the head of laboratory rodents in 3D; and 2) an eye-tracking system that monitors in real time the gaze direction of small rodents (body-constrained but not necessarily head-fixed) in a calibrated way, i.e., by inferring the point on a stimulus display where the eye is pointing.

With regard to the first device, a fully working system had already been developed during the LEARN2SEE project. The goal of the SMARTLABCAM project was to migrate this system from a regular workstation to a more compact, embedded platform, paving the way to a self-standing unit that is far more practical to use in laboratory settings. With regard to the second device, it had been developed only at the algorithmic level during the LEARN2SEE project: it had been neither validated nor applied to a real-world scenario (i.e., consistently and reliably tracking the gaze of partially restrained rodents). The goal of the SMARTLABCAM project was to carry out this validation and systematic tests in the field.

The project was successful in achieving both goals. The head tracker was migrated to an embedded system based on an NVIDIA® Jetson Nano™ 2GB Developer Kit, yielding a compact, self-standing unit that can track the head of a rodent in real time at 21 frames per second (fps). Given its small size and limited cost, it can be especially attractive for high-throughput applications: for example, multiple units could be installed on a high-throughput rig for behavioral tests, one unit per testing box.

The eye tracker was successfully implemented using a stereo imaging system made of two Basler cameras and an infrared (IR) illuminator, synchronized by an external microcontroller (an Arduino Uno board). The tracker is based on a custom algorithm that infers the plane where the pupil lies and, after finding the pupil's center, estimates the gaze vector (or line of sight) of the eye as the line orthogonal to the pupil's plane passing through that center. A number of experiments were carried out with a 3D-printed artificial eye to validate the system, showing that it can estimate the point on a stimulus display where the eye is pointing with an error as small as 6-7 mm, while running in real time at 15 fps. Tests on rats were also performed, showing that the system can correctly track a well-known oculomotor reflex, the optokinetic nystagmus (OKN): a slow drifting movement of the eye, followed by a fast corrective saccade in the opposite direction, that is elicited when the animal is presented with a drifting grating.

Overall, the work carried out with the support of the PoC grant brought both the head and eye trackers to the level of fully validated systems, working in laboratory settings.
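The head-tracking pipeline itself is not described in this report. Purely as an illustration of the kind of embedded real-time loop involved, the sketch below (in Python, assuming an OpenCV-readable camera and a placeholder per-frame tracking step; none of these names come from the actual system) measures the frame rate such a loop can sustain, the figure that the Jetson Nano unit reaches at about 21 fps.

```python
import time
import cv2  # OpenCV, commonly available on NVIDIA Jetson images


def measure_tracking_fps(track_frame, device_index=0, n_frames=200):
    """Run a capture-and-track loop and report the sustained frame rate.

    `track_frame` is a stand-in for the per-frame head-tracking step
    (the actual SMARTLABCAM algorithm is not public).
    """
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("camera not available")
    t0, processed = time.perf_counter(), 0
    while processed < n_frames:
        ok, frame = cap.read()
        if not ok:
            break
        track_frame(frame)  # e.g. locate the head, estimate 3D position and pose
        processed += 1
    cap.release()
    return processed / (time.perf_counter() - t0)  # fps actually sustained


# Example with a trivial stand-in tracking step (grayscale conversion only):
# fps = measure_tracking_fps(lambda f: cv2.cvtColor(f, cv2.COLOR_BGR2GRAY))
# print(f"sustained {fps:.1f} fps")
```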
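The gaze-estimation step described above reduces to a ray-plane intersection: the line of sight is the line through the pupil center orthogonal to the pupil's plane, and the gaze point is where that line meets the display plane. The following minimal sketch shows this geometry; the variable names, coordinate frame and numbers are illustrative assumptions, not values taken from the actual system.

```python
import numpy as np


def gaze_point_on_display(pupil_center, pupil_normal, display_point, display_normal):
    """Intersect the line of sight with the stimulus display.

    The line of sight is modeled, as in the report, as the line through the
    pupil center orthogonal to the pupil's plane. All vectors are 3D, in a
    common (e.g. camera-centered) reference frame.
    """
    n = pupil_normal / np.linalg.norm(pupil_normal)      # gaze direction
    m = display_normal / np.linalg.norm(display_normal)  # display orientation
    denom = m @ n
    if abs(denom) < 1e-9:
        raise ValueError("line of sight is parallel to the display")
    t = m @ (display_point - pupil_center) / denom
    return pupil_center + t * n  # 3D point where the gaze meets the display


# Toy example: display plane at z = 300 mm, eye at the origin, slightly off-axis gaze.
eye_center = np.array([0.0, 0.0, 0.0])
gaze_dir = np.array([0.05, -0.02, 1.0])  # normal to the pupil's plane
hit = gaze_point_on_display(eye_center, gaze_dir,
                            display_point=np.array([0.0, 0.0, 300.0]),
                            display_normal=np.array([0.0, 0.0, 1.0]))
print(hit)  # -> [ 15.  -6. 300.]  (mm)
```

Comparing such an estimated point against the known pointing direction of the 3D-printed artificial eye is what yields the 6-7 mm error figure quoted above.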
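As for the OKN test, the alternation of slow drift and fast corrective saccades can be separated in an eye-position trace with a simple velocity threshold. The sketch below applies this idea to a synthetic OKN-like trace sampled at the tracker's 15 fps; the trace and the threshold value are illustrative assumptions, not the analysis used in the actual experiments.

```python
import numpy as np


def detect_fast_phases(eye_pos, fs, vel_thresh=50.0):
    """Flag fast-phase (saccadic) samples in an eye-position trace.

    `eye_pos` is eye position in degrees, sampled at `fs` Hz; samples whose
    velocity exceeds `vel_thresh` (deg/s, an illustrative value) are taken
    as the fast corrective phases, the rest as the slow OKN drift.
    """
    velocity = np.diff(eye_pos) * fs  # deg/s, one sample shorter than eye_pos
    return np.abs(velocity) > vel_thresh


# Synthetic OKN-like trace: 10 s at 15 Hz, slow 5 deg/s drift with a
# corrective reset once per second (a sawtooth in eye position).
fs = 15.0
t = np.arange(150) / fs
eye_pos = 5.0 * (t % 1.0)  # degrees
fast_phase = detect_fast_phases(eye_pos, fs)
print(int(fast_phase.sum()))  # -> 9 (one flagged sample per corrective fast phase)
```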
As the next step, we are carrying out a preliminary market survey, contacting investigators who study visual processing in rodents to present our systems to them.