CORDIS - EU research results
NextPerception - Next generation smart perception sensors and distributed intelligence for proactive human monitoring in health, wellbeing, and automotive systems

Periodic Reporting for period 3 - NextPerception (NextPerception - Next generation smart perception sensors and distributed intelligence for proactive human monitoring in health, wellbeing, and automotive systems)

Reporting period: 2022-05-01 to 2023-07-31

We increasingly put our lives in the hands of complex smart systems that make decisions directly affecting our health and wellbeing. This is very evident in healthcare – where systems watch over your health – as well as in traffic – where autonomous driving solutions are gradually taking over control of the car. The accuracy and timeliness of these decisions depend on the systems’ ability to build a good understanding of both you and your environment, which relies on observations and the ability to reason about them.
This project brought perception sensing technologies like Radar, LiDAR and cameras to the next level, enhancing their features to allow for more accurate detection of human behaviour and physiological parameters. Besides more accurate automotive solutions ensuring driver vigilance and pedestrian and cyclist safety, this innovation will open up new opportunities in health and wellbeing to monitor elderly people at home or unobtrusively assess health state.
The project embraced the Distributed Intelligence paradigm for integrating the use cases, utilising the distributed computational resources increasingly available in sensors and edge components to distribute the intelligence as well.
The goal of this project was to develop next generation smart perception sensors and enhance the distributed intelligence paradigm to build versatile, secure, reliable, and proactive human monitoring solutions for the health, wellbeing, and automotive domains.
NextPerception aimed at the exploitation of enhanced sensing and sensor networks, focusing on the application domains of Health, Wellbeing, and Automotive. The starting point was to acquire quality information from sensor nodes (O1). The next step was to transform the data provided by collections of networked sensors into information suitable for proactive decision making, covering among others clinical decision making, wellbeing coaching, and vehicle control decisions (O2). The provision of a reference platform to support the design, implementation, and management of distributed sensing and intelligence solutions (O3) greatly facilitated the use and implementation of distributed sensor networks, enabling their practical application in the primary domains of healthcare, wellbeing, and automotive as well as in cross-sector applications. The demonstration of these applications (O4) was key to achieving potential products and services that can ultimately contribute to solving the Global Challenges in Health and Wellbeing, and Transport and Smart Mobility.
The project brought together 43 partners from 7 countries, including major industrial players and prominent research centres and universities, to address top challenges in the health, wellbeing, and automotive domains through three use cases: integral vitality monitoring for the elderly and exercise, driver monitoring, and providing safety and comfort for vulnerable road users at intersections.
The project successfully worked on the four objectives during two cycles. Each cycle consisted of requirements definition, development, integration, and piloting phases, concentrating on technology development at the start and on validation of the proofs of concept at the end. The objectives were realised as follows:
O1: Perception sensors and low-level sensor data processing were developed, including advancements in FMCW and UWB radar technology for human vital sign monitoring, LiDAR for environment detection, IMUs for both automotive and human motion detection, and diaphoresis sensors.
O2: Data processing and ML algorithms were developed for the processing and fusion of sensor data to derive the features required by the use cases. The algorithms provided, for example, improved vital sign extraction from radar and foil-based sensors, behaviour extraction and classification from a combination of remote and motion sensors, improved positioning accuracy for road users from IMU-, camera-, and radar-based measurements, detection of driver emotion and distraction, and improved estimation of exertion level.
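One common building block for fusing position estimates from heterogeneous sensors (for instance a radar track and a camera track) is inverse-variance weighting, where more certain sensors receive more weight. The sketch below is a generic textbook illustration under assumed noise variances, not the project's actual fusion algorithm:

```python
import numpy as np

def fuse_estimates(means, variances):
    """Inverse-variance weighted fusion of independent sensor estimates.

    Each sensor reports a mean and a noise variance; the fused estimate
    weights each sensor by 1/variance, so more certain sensors dominate.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused_var = 1.0 / weights.sum()          # combined uncertainty shrinks
    fused_mean = fused_var * (weights * means).sum()
    return fused_mean, fused_var

# Hypothetical example: a radar position at 10.0 m (variance 1.0) fused
# with a camera position at 12.0 m (variance 3.0) lands nearer the radar.
pos, var = fuse_estimates([10.0, 12.0], [1.0, 3.0])
```

With equal variances this reduces to a plain average; a full system would typically embed such a step inside a Kalman or Bayesian filter that also models motion over time.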
O3: A reference architecture was developed to capture the components, functionality, and connectivity necessary to realise distributed sensing and computation. System integration exemplifying this architecture was done for each of the use cases resulting in functioning demonstrator systems.
O4: Demonstrators were developed for each of the use cases and piloted in lab or real environments. An overview of the use cases is given below:
UC1 Integral Vitality Monitoring – This use case focuses on elderly and activity monitoring and has four demonstrators validated in six pilots:
- Continuous activity monitoring - indoor positioning, radar-based vital sign monitoring, and camera-assisted activity monitoring solutions for the elderly, piloted in the laboratory and the Lleida living lab in Spain.
- Episodic Health Gate - a radar-based vital sign monitoring and weight-sensing chair for frailty detection in the elderly, piloted in a care facility in Kuopio, Finland.
- Wearable Activity Monitoring - a wearable devices-based exercise monitoring solution including sweat analysis piloted in outdoor exercises.
- Sleep Monitoring - a radar-based sleep monitoring solution piloted in Kempenhaeghe, The Netherlands.
UC2 Driver monitoring – This use case addresses the topic of driver monitoring in the context of partially automated driving. It includes three demonstrators and two pilots:
- Driver simulator – an integrated system for assessing the Driver Complex State (DCS, a combination of cognitive, behavioural, and emotional states) and deriving a “fitness to drive index” by means of different unobtrusive sensors (cameras, biofeedback sensors, driving data).
- Passenger car and heavy-duty vehicle – both car and truck feature an FMCW radar to monitor the driver’s vital signs as well as a gaze detection system. Pilot validation was done in real traffic situations.
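A “fitness to drive index” derived from a combination of cognitive, behavioural, and emotional states could, in the simplest case, be a weighted sum of normalised impairment scores. The weights and scoring below are purely hypothetical and sketched for illustration only; the project's actual index computation is not described here:

```python
def fitness_to_drive(cognitive, behavioural, emotional,
                     weights=(0.4, 0.3, 0.3)):
    """Combine per-dimension impairment scores into a single index.

    Each score lies in [0, 1], where 1 means fully impaired on that
    dimension; the returned index lies in [0, 1], where 1 means fully
    fit to drive. The weights are hypothetical and sum to 1.
    """
    scores = (cognitive, behavioural, emotional)
    if not all(0.0 <= s <= 1.0 for s in scores):
        raise ValueError("impairment scores must lie in [0, 1]")
    impairment = sum(w * s for w, s in zip(weights, scores))
    return 1.0 - impairment

# A drowsy but calm driver: high cognitive impairment, low emotional load.
index = fitness_to_drive(cognitive=0.8, behavioural=0.4, emotional=0.1)
```

A deployed system would estimate the three scores from the sensors listed above (cameras, biofeedback, driving data) and would likely use a learned rather than fixed weighting.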
UC3 Safety and Comfort at Intersections – comprises three main topics:
- Vehicle - developed sensing solutions and intelligent algorithms for improving ego-localisation and vehicle environment monitoring – particularly for vulnerable road user (VRU) detection – by means of IMUs, LiDAR, cameras, and vehicle data.
- Communication + GNSS - worked on a UWB-based positioning system with accurate synchronisation, and on highly secure solutions to improve the robustness and reliability of GNSS-based positioning.
- Infrastructure - focused on the development of road-side sensing solutions utilising radar, camera, thermal camera, and time-of-flight camera for the detection of VRUs and their behaviour.
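Anchor-based positioning of the kind UWB enables can be illustrated with multilateration: given ranges from a tag to fixed anchors at known positions, the range equations can be linearised and solved by least squares. The coordinates below are made up for illustration; this is a generic sketch, not the project's positioning pipeline:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares position estimate from ranges to known anchors.

    Linearises the range equations ||p - a_i|| = d_i by subtracting the
    first anchor's equation, which cancels the quadratic ||p||^2 term:
        2 (a_i - a_0) . p = d_0^2 - d_i^2 + ||a_i||^2 - ||a_0||^2
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Hypothetical 2D setup: three roadside anchors and a tag at (1, 1).
p = trilaterate([[0, 0], [4, 0], [0, 4]], [2 ** 0.5, 10 ** 0.5, 10 ** 0.5])
```

With more anchors than unknowns the same least-squares solve averages out ranging noise; real deployments additionally need the tight clock synchronisation mentioned above.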
The proof-of-concept technologies and demonstrators developed in the project reached, and often surpassed, the state of the art, while being applied to innovative use cases.
The project results will have immediate impact on the addressed domains of healthcare and mobility - particularly in the addressed use cases - as well as indirect impact on any domain deploying smart sensing systems. The distributed sensing methodology provides a pathway for utilising innovative sensor technologies for human sensing and facilitating the design of complex multi-sensor systems and related intelligent algorithm development and deployment, decreasing development time. The most important impact will be the improved health and quality of life of people, and safer traffic, especially for pedestrians and cyclists.