
NextPerception - Next generation smart perception sensors and distributed intelligence for proactive human monitoring in health, wellbeing, and automotive systems

Periodic Reporting for period 1 - NextPerception

Reporting period: 2020-05-01 to 2021-04-30

We increasingly put our lives in the hands of complex smart systems that make decisions directly affecting our health and wellbeing. This is very evident in healthcare – where systems watch over your health – as well as in traffic, where autonomous driving solutions are gradually taking over control of the car. The accuracy and timeliness of these decisions depend on the system’s ability to build a good understanding of both you and your environment, which in turn relies on observations and the ability to reason about them.
This project will bring perception sensing technologies such as radar, LiDAR, and time-of-flight cameras to the next level, enhancing their features to allow more accurate detection of human behaviour and physiological parameters. Besides more accurate automotive solutions that ensure driver vigilance and pedestrian and cyclist safety, this innovation will open up new opportunities in health and wellbeing, such as monitoring elderly people at home or unobtrusively assessing health status.
To facilitate building the complex smart sensing systems envisioned and to ensure their secure and reliable operation, the new Distributed Intelligence paradigm will be embraced, enhanced, and supported by tools. This paradigm leverages the advantages of edge and cloud computing, building on the distributed computational resources increasingly available in sensors and edge components to distribute the intelligence as well.

The goal of this project is to develop next generation smart perception sensors and enhance the distributed intelligence paradigm to build versatile, secure, reliable, and proactive human monitoring solutions for the health, wellbeing, and automotive domains.

The project brings together 43 partners from 7 countries, including major industrial players and prominent research centres and universities. Together they address top challenges in the health, wellbeing, and automotive domains through three use cases: integral vitality monitoring for the elderly and for exercise, driver monitoring, and safety and comfort for vulnerable road users at intersections.

NextPerception aims at the exploitation of enhanced sensing and sensor networks in the application domains of health, wellbeing, and automotive. The starting point is to acquire quality information from sensor nodes (O1). The next step is to transform the data provided by collections of networked sensors into information suitable for proactive decision making, covering among others clinical decision making, wellbeing coaching, and vehicle control decisions (O2). The provision of a reference platform to support the design, implementation, and management of distributed sensing and intelligence solutions (O3) greatly facilitates the use and implementation of distributed sensor networks, enabling their practical application in the primary domains of healthcare, wellbeing, and automotive, as well as in cross-sector applications. The demonstration of these applications (O4) is key to arriving at potential products and services that can ultimately contribute to solving the Global Challenges in Health and Wellbeing, and Transport and Smart Mobility.
The project is organised around three use cases, and the project outcomes have been produced largely with their needs in mind. The current status of the use cases is:
- UC1 Integral Vitality Monitoring – This use case focuses on elderly and activity monitoring and has four demonstrators: Continuous Activity Monitoring (UC1D1), Episodic Health Gate (UC1D2), Wearable Activity Monitor (UC1D3) and Sleep Monitoring (UC1D4). The project has been working on the proof-of-concept sensors and analytics methods needed to implement the foreseen UC1 demonstrator functionality. This includes indoor positioning, radar-based vital sign monitoring (an illustrative processing sketch follows this list), and spectrum-camera-assisted activity monitoring for the elderly monitoring solutions in UC1D1, a radar-based vital sign monitoring and weight sensing solution in UC1D2, a wrist-based exercise monitoring solution including sweat analysis for UC1D3, and a radar-based sleep monitoring solution for UC1D4.
- UC2 Driver Monitoring – This use case addresses driver monitoring in the context of partially automated driving. It includes three demonstrators operating in a driving simulator (UC2D1), a passenger car (UC2D2), and a heavy-duty vehicle (UC2D3). UC2D1 focuses on the definition of a Driver Complex State (DCS) combining cognitive, behavioural, and emotional states. System implementation has started to acquire various parameters from different unobtrusive sensors (cameras, biofeedback sensors, driving data) and to develop a fusion algorithm that defines a “fitness to drive index” (a simplified fusion sketch follows this list). Demonstrators UC2D2 and UC2D3 focus on the development, validation, and integration of unobtrusive sensors for driver state monitoring. In particular, an FMCW radar has been installed to continuously monitor the driver’s vital signs. The system also includes a driver monitoring camera to detect the driver’s gaze, and a LiDAR for external monitoring.
- UC3 Safety and Comfort at Intersections – UC3 comprises a large number of demonstrators (14), so the work was clustered into three main topics: “Vehicle” (UC3a), “Communication + GNSS” (UC3b) and “Infrastructure” (UC3c). UC3a has developed sensing solutions and intelligent algorithms for improving ego-localisation and vehicle environment monitoring – particularly for vulnerable road user (VRU) detection – by means of IMUs, LiDAR, cameras, and vehicle data. UC3b worked on a UWB-based positioning system with accurate synchronisation and highly secure solutions to improve the robustness and reliability of GNSS-based positioning (a minimal positioning sketch follows this list). UC3c focused on the development of road-side sensing solutions utilising radar, camera, thermal camera, and time-of-flight camera for the detection of VRUs and their behaviour.
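To make the radar-based vital sign monitoring referenced in UC1 and UC2 more concrete, the following is a minimal, generic sketch of how breathing and heart rate can be estimated from the slow-time phase of an FMCW radar range bin. The sampling rate, frequency bands, and synthetic input are illustrative assumptions, not the project's actual implementation; a real system would take the phase from the selected bin of the radar's range FFT.

```python
# Minimal sketch: estimating breathing and heart rate from the phase of an
# FMCW radar range bin. Synthetic data stands in for real radar measurements.
import numpy as np

FS = 20.0        # slow-time sampling rate in Hz (assumption)
DURATION = 60.0  # observation window in seconds

def dominant_rate(signal, fs, f_lo, f_hi):
    """Return the dominant frequency (Hz) of `signal` within [f_lo, f_hi]."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return freqs[band][np.argmax(spectrum[band])]

# Synthetic chest-displacement phase: 0.25 Hz breathing (15 breaths/min)
# plus a weaker 1.2 Hz heartbeat (72 beats/min) and measurement noise.
t = np.arange(0.0, DURATION, 1.0 / FS)
phase = (1.0 * np.sin(2 * np.pi * 0.25 * t)
         + 0.1 * np.sin(2 * np.pi * 1.2 * t)
         + 0.05 * np.random.randn(t.size))

breathing_hz = dominant_rate(phase, FS, 0.1, 0.5)  # typical respiration band
heart_hz = dominant_rate(phase, FS, 0.8, 2.0)      # typical heart-rate band
print(f"breathing: {breathing_hz * 60:.0f} /min, heart: {heart_hz * 60:.0f} /min")
```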
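As an illustration of the UC2D1 fusion step, the sketch below combines normalised cognitive, behavioural, and emotional scores into a single index by a weighted average. The channel weights, score semantics, and alert threshold are assumptions chosen for illustration; the project's actual DCS fusion algorithm is not specified here.

```python
# Minimal sketch of fusing per-channel driver-state scores into a single
# "fitness to drive" index. Channels, weights, and threshold are illustrative
# assumptions, not the project's actual DCS fusion algorithm.
from dataclasses import dataclass

@dataclass
class DriverState:
    cognitive: float    # 0 = overloaded/distracted, 1 = fully attentive
    behavioural: float  # 0 = erratic driving behaviour, 1 = nominal
    emotional: float    # 0 = highly stressed, 1 = calm

WEIGHTS = {"cognitive": 0.5, "behavioural": 0.3, "emotional": 0.2}

def fitness_to_drive(state: DriverState) -> float:
    """Weighted combination of normalised state scores into one index in [0, 1]."""
    return (WEIGHTS["cognitive"] * state.cognitive
            + WEIGHTS["behavioural"] * state.behavioural
            + WEIGHTS["emotional"] * state.emotional)

state = DriverState(cognitive=0.6, behavioural=0.9, emotional=0.8)
index = fitness_to_drive(state)
print(f"fitness to drive: {index:.2f}")
if index < 0.5:  # illustrative alert threshold
    print("warning: driver may not be fit to drive")
```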
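Finally, a minimal sketch of the anchor-based positioning underlying UWB systems such as the one developed in UC3b: given measured ranges to anchors at known positions, the tag position is recovered by linearised least squares. The anchor layout, noise level, and two-dimensional setting are illustrative assumptions.

```python
# Minimal sketch of UWB anchor-based positioning: given time-of-flight ranges
# to anchors at known positions, solve for the tag position by linearised
# least squares. Anchor layout and measurements are illustrative assumptions.
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 4.0])

# Ranges as a UWB two-way-ranging exchange might report them, with noise.
ranges = np.linalg.norm(anchors - true_pos, axis=1) + 0.05 * np.random.randn(4)

# Subtracting the first anchor's range equation from the others removes the
# quadratic terms, leaving a linear system A x = b in the unknown position x.
x0, r0 = anchors[0], ranges[0]
A = 2.0 * (anchors[1:] - x0)
b = (r0**2 - ranges[1:]**2
     + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"estimated position: {estimate.round(2)}")
```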
The results will include:
- A set of use cases providing insight into user and technical requirements and real-world scenarios in the healthcare and automotive domains.
- A set of methodologies and tools to support design of intelligent distributed systems.
- A set of smart perception sensor modules featuring multiple and switchable sensing modalities and modes, embedded intelligence, and provisions for integration into the framework.
- Analytics and explainable AI for the processing and fusion of sensor data as well as reasoning for higher level information extraction and decision making.
- A set of evaluated demonstrators addressing the specified use case challenges and featuring proof-of-concept implementations of the newly developed concepts in the healthcare, wellbeing, and automotive domains.

The project results will have an immediate impact on the addressed domains of healthcare and mobility – particularly in the addressed use cases – as well as an indirect impact on any domain deploying smart sensing systems. The distributed sensing methodology will deploy innovative sensor technologies for human sensing and facilitate the design of complex multi-sensor systems and the development and deployment of the related intelligent algorithms, decreasing development time. Systems will become more reliable and robust, and will enable improved insight into the user and environment. The project will be instrumental in leveraging these technologies for important future challenges in the health and automotive domains.