CORDIS - EU research results

Robust Automated Driving in Extreme Weather

Periodic Reporting for period 2 - ROADVIEW (Robust Automated Driving in Extreme Weather)

Reporting period: 2024-03-01 to 2025-08-31

The primary goals of ROADVIEW are:
• Define complex environmental conditions and use case specifications
• Design and validate the physical system architecture and define the system requirements
• Create digital models to support system development and testing
• Optimise data processing performance using the new concept of ‘data readiness levels’
• Develop an improved in-vehicle perception system that is robust under harsh weather conditions and that can handle a wide range of traffic scenarios
• Develop a weather-aware decision-making system that complies with explainable AI concepts
• Ensure reliability of the perception and decision-making systems by X-in-the-loop testing and validation
• Integrate the ROADVIEW-developed solutions in OEM platforms to reach TRL7
The ROADVIEW consortium introduced Operational Design Domain (ODD) definitions for five different use cases, detailing in particular traffic density, drivable areas and harsh weather conditions, with a specific focus on rain, fog, and snow. These ODD definitions were specified by extending the ODD taxonomy defined in ISO 34503 to include the ROADVIEW use cases and the environmental conditions relevant to different modalities, such as RGB cameras, LiDAR, RADAR, and thermal cameras. To design and validate autonomous systems in adverse weather, ROADVIEW introduced the following items: a) generation of high-fidelity digital twins of some of the testing/demo facilities; b) creation of a paired harsh-weather dataset including data from most automotive perception sensor modalities (i.e. LiDAR, RADAR, thermal and colour cameras); and c) validated sensor noise models for the above-mentioned sensors, using the collected data and novel metrics to evaluate the performance of sensor models in the context of assisted and automated driving.
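An ODD definition like the ones described above can be thought of as a machine-checkable envelope of conditions. The sketch below is purely illustrative: the field names, thresholds and the `within_odd` check are invented for this example and are not the actual ISO 34503 taxonomy or the ROADVIEW extension of it.

```python
# Hypothetical ODD specification for one urban use case under snow.
# All field names and limit values are invented for illustration;
# ISO 34503 defines the real taxonomy that ROADVIEW extends.

def within_odd(odd: dict, conditions: dict) -> bool:
    """Return True if the observed conditions fall inside the ODD."""
    if conditions["precipitation_mm_h"] > odd["weather"]["max_rain_mm_h"]:
        return False
    if conditions["visibility_m"] < odd["weather"]["min_visibility_m"]:
        return False
    return conditions["road_type"] in odd["drivable_area"]["road_types"]

ODD_URBAN_SNOW = {
    "drivable_area": {"road_types": ["urban", "suburban"],
                      "max_speed_kph": 50},
    "weather": {"max_rain_mm_h": 8.0, "min_visibility_m": 40.0,
                "snowfall_allowed": True},
    "traffic": {"density": "medium"},
}

inside = within_odd(ODD_URBAN_SNOW,
                    {"precipitation_mm_h": 2.0, "visibility_m": 120.0,
                     "road_type": "urban"})
```

A runtime monitor structured this way lets the vehicle detect when, for example, visibility drops below the ODD floor and a fallback behaviour must be triggered.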

ROADVIEW improved its data processing performance by using the concept of Data Readiness Levels (DRL) to quantify where images from RGB and thermal cameras are unsharp, distorted, blurred, or over-/under-exposed, and where noise is present in the LiDAR and RADAR point clouds. We started with 43 image-quality tools and five different point-cloud quality tools.
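Two of the classical image-quality indicators such a DRL score could aggregate are sharpness (variance of a Laplacian response, which drops for blurred or fog-washed images) and the fraction of clipped pixels. The sketch below is a hedged, minimal illustration; the thresholds and function names are invented and do not correspond to the 43 tools used in the project.

```python
import numpy as np

# Minimal sketch of two image-quality indicators that a Data Readiness
# Level (DRL) score could combine. Thresholds are illustrative only.

def laplacian_sharpness(gray: np.ndarray) -> float:
    """Variance of a 4-neighbour Laplacian; low values suggest blur."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def exposure_fraction(gray: np.ndarray, lo=0.02, hi=0.98) -> float:
    """Fraction of pixels that are under- or over-exposed (clipped)."""
    return float(np.mean((gray < lo) | (gray > hi)))

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))       # high-frequency texture
flat = np.full((64, 64), 0.5)      # featureless frame (e.g. dense fog)
```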

ROADVIEW developed an enhanced in-vehicle perception system that integrates multiple sensor modalities to enable robust operation under harsh weather conditions and in a wide range of traffic scenarios. The perception system fuses a set of LiDAR, RADAR, and colour cameras to build an integrated world view around the vehicle in various traffic and weather conditions. The selection of sensors is based on the best findings from the existing setups of multiple partners (AVL, FGI, FORD, THI, VTT, and S4) developing their own autonomous driving platforms within the consortium. The ROADVIEW perception stack also includes the methods and algorithms for sensor fusion, object detection, free space detection and weather type detection.
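One building block of any multi-sensor stack is associating detections of the same object seen by different sensors. The toy example below shows a greedy late-fusion step on 2D boxes by intersection-over-union; it is only an illustration of the association idea, not the ROADVIEW fusion method, which operates on calibrated 3D data. The threshold and averaging rule are invented.

```python
# Toy late-fusion sketch: merge object detections from two sensor
# streams by greedy IoU matching. Illustrative only; thresholds and
# the box-averaging rule are invented for this example.

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def fuse(camera_dets, lidar_dets, thr=0.5):
    """Merge matched pairs once; pass unmatched detections through."""
    fused, used = [], set()
    for c in camera_dets:
        j = max(range(len(lidar_dets)), default=None,
                key=lambda k: iou(c, lidar_dets[k]))
        if j is not None and j not in used and iou(c, lidar_dets[j]) >= thr:
            used.add(j)
            fused.append([(x + y) / 2 for x, y in zip(c, lidar_dets[j])])
        else:
            fused.append(list(c))
    fused += [list(l) for k, l in enumerate(lidar_dets) if k not in used]
    return fused
```

Passing unmatched detections through unchanged is what makes fusion robust in bad weather: when one modality is blinded (e.g. a camera in fog), the other sensor's detections still survive.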

Furthermore, the perception system includes in-vehicle visibility and slipperiness estimators, which use the vehicle's own sensors to estimate the visibility conditions around the vehicle and the grip on the road surface ahead, enabling robust operation in harsh weather conditions. The positioning capabilities of the vehicle in difficult weather are improved using the novel environment-aware normal distribution transform (EA-NDT) based HD-mapping method.
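The physical relation underlying any slipperiness estimate is that measured longitudinal deceleration during braking, divided by g, approximates the tyre-road friction coefficient mu when the wheels operate near the adhesion limit. The sketch below only illustrates that relation; the signal names, the slip-ratio validity window, and the reference mu values are assumptions, not the ROADVIEW estimator.

```python
# Hedged sketch of a grip (friction coefficient) estimate from braking
# deceleration: mu ~= a_x / g near the adhesion limit. Signal names and
# the slip validity window are invented for illustration.

G = 9.81  # gravitational acceleration, m/s^2

def grip_from_deceleration(decel_ms2: float, wheel_slip: float) -> float:
    """Rough mu estimate during a braking event.

    Only meaningful while wheel slip is moderate (near the adhesion
    limit); at low slip the measurement underestimates mu.
    """
    if not 0.05 <= wheel_slip <= 0.25:
        raise ValueError("estimate valid only near the adhesion limit")
    return decel_ms2 / G

# Dry asphalt typically allows mu around 0.9; packed snow around 0.25.
mu_snow = grip_from_deceleration(2.3, wheel_slip=0.12)
```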

ROADVIEW developed a weather-aware decision-making system that incorporates the state estimates in the decision-making process to adjust the vehicle behaviour. The consortium developed the decision-making system, which includes a weather-aware navigation system and velocity controller as well as infrastructure-based manoeuvre cooperation software, following the reference architecture for the control and decision-making system. The software implementations of these systems were successfully tested and evaluated following the test scenarios. Evaluation was conducted with simulations and real-world tests in winter conditions.
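A core idea behind a weather-aware velocity controller is to cap speed so that the stopping distance (reaction distance plus braking at mu*g) never exceeds the estimated visibility range. The sketch below is a toy stand-in under those stated assumptions; the parameter values and reaction time are illustrative, not ROADVIEW's controller.

```python
import math

# Toy weather-aware speed cap: choose the largest v such that
#   t_react * v + v^2 / (2 * mu * g) <= visibility_m
# i.e. the vehicle can always stop within the visible range.
# Parameter values are illustrative only.

G = 9.81  # m/s^2

def safe_speed(visibility_m: float, mu: float,
               t_react: float = 1.0) -> float:
    """Largest v (m/s) whose stopping distance fits the visibility."""
    a = 1.0 / (2.0 * mu * G)
    # Positive root of a*v^2 + t_react*v - visibility_m = 0.
    return (-t_react + math.sqrt(t_react**2 + 4*a*visibility_m)) / (2*a)

v_clear = safe_speed(150.0, mu=0.9)    # clear road: higher limit
v_harsh = safe_speed(40.0, mu=0.25)    # fog + snow: much lower limit
```

Note how the cap reacts to both estimator outputs at once: reduced visibility shrinks the allowed stopping distance, while reduced grip stretches the braking distance, and either alone forces a lower speed.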

ROADVIEW integrated advanced XiL (X-in-the-Loop) test methods, incorporating fine-tuned camera and LiDAR sensor models as well as high-fidelity vehicle dynamics models. Noise models for adverse weather conditions were also implemented in simulation environments. Supported by high-precision digital twins and stimulation technologies based on Over-the-Air (OTA) and Direct Data Injection (DDI) interfaces, both Hardware-in-the-Loop (HiL) and Vehicle-in-the-Loop (ViL) test setups were successfully integrated and are now operational. These stimulation methods enable ROADVIEW partners to interface simulation environments with vehicle hardware in the same manner as with real sensors, allowing for a seamless transition from XiL-based testing to real-world validation.
ROADVIEW developed an innovative 3D point cloud filtering framework designed to detect, classify, and eliminate noisy points in point clouds, particularly those degraded by rain or snow. The ROADVIEW filtering method is available in supervised and unsupervised variants, both of which were extensively benchmarked against traditional statistical techniques and advanced AI-based filtering approaches from the literature on multiple public datasets.
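Unsupervised snow filtering commonly exploits a density cue: snowflake returns are isolated, while points on real surfaces have many close neighbours, and the neighbourhood radius must grow with range because LiDAR point density falls off with distance. The brute-force sketch below illustrates that idea in the spirit of dynamic-radius outlier removal; it is not the ROADVIEW method, and all parameter values are invented.

```python
import numpy as np

# Unsupervised snow-noise filter sketch: keep a point only if enough
# neighbours fall within a search radius that grows with range.
# Brute-force O(n^2) distances; illustration only, parameters invented.

def filter_snow(points: np.ndarray, alpha=0.05, r_min=0.3, k=3):
    """points: (n, 3) array. Returns a boolean keep-mask."""
    rng_dist = np.linalg.norm(points[:, :2], axis=1)     # range in xy
    radius = np.maximum(r_min, alpha * rng_dist)         # per-point radius
    d = np.linalg.norm(points[None, :, :] - points[:, None, :], axis=2)
    neighbours = (d < radius[:, None]).sum(axis=1) - 1   # exclude self
    return neighbours >= k

rng = np.random.default_rng(1)
# Dense line of returns from a wall at x = 10 m ...
wall = np.c_[np.full(50, 10.0), np.linspace(-1, 1, 50), np.zeros(50)]
# ... plus isolated points scattered through the air, like snowflakes.
snow = rng.uniform(-8, 8, size=(10, 3))
mask = filter_snow(np.vstack([wall, snow]))
```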

ROADVIEW introduced a new dataset, named REHEARSE (adveRse wEatHEr datAset for sensoRy noiSe modEls), which is strictly focused on clear-weather, rain, fog, and snow conditions. Data collection with automotive sensors under rain and fog was conducted in controlled environments using a test chamber and an outdoor test track. A new version of REHEARSE, named REHEARSE-3D, was introduced as a unique multi-modal emulated rain dataset for 3D point cloud de-raining, as it provides fused 3D LiDAR and 4D RADAR point clouds.

ROADVIEW also introduced a novel multimodal dataset (logged in Finland using LiDAR, RGB, and thermal cameras) annotated for multiple perception tasks: 3D object detection, 3D semantic segmentation and 3D point cloud denoising. ROADVIEW developed a compressed HD map representation that can cope with seasonal changes and be kept robustly updated while excluding dynamic objects. ROADVIEW's map compression strategy is based on an environment-aware representation that leverages the semantic information of the scene.
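The two ingredients of such a compression strategy, dropping points labelled as dynamic classes and then thinning what remains, can be sketched as below. This is a minimal illustration using invented label ids and a simple one-point-per-voxel reduction; the actual ROADVIEW representation (EA-NDT) is more sophisticated, storing normal-distribution cells rather than raw points.

```python
import numpy as np

# Sketch of semantic map compression: discard points with dynamic
# labels, then keep one representative point per voxel. Label ids and
# the voxel size are hypothetical.

DYNAMIC = {0, 1}  # e.g. 0 = vehicle, 1 = pedestrian (invented ids)

def compress(points: np.ndarray, labels: np.ndarray, voxel=0.5):
    """points: (n, 3), labels: (n,). Returns the compressed map."""
    keep = ~np.isin(labels, list(DYNAMIC))
    pts = points[keep]                                  # static only
    keys = np.floor(pts / voxel).astype(np.int64)       # voxel indices
    _, idx = np.unique(keys, axis=0, return_index=True) # one per voxel
    return pts[np.sort(idx)]

pts = np.array([[0.1, 0.1, 0.1],   # same voxel as the next point
                [0.2, 0.2, 0.2],
                [5.0, 5.0, 5.0],   # distinct static point
                [9.0, 9.0, 9.0]])  # labelled dynamic -> dropped
labels = np.array([2, 2, 2, 0])
compressed = compress(pts, labels)
```

Filtering on semantics before downsampling is what keeps the map stable across revisits: parked cars and pedestrians never enter the map, so updates only need to reconcile genuinely static structure.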

ROADVIEW also developed novel physics-based weather noise models for cameras, LiDAR, and 4D RADAR.
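A common physical starting point for fog models of this kind is Beer-Lambert extinction over the two-way optical path, with the extinction coefficient tied to meteorological visibility via the Koschmieder relation (beta ~= 3.912 / V). The sketch below illustrates only that attenuation term; ROADVIEW's validated noise models also cover backscatter and other effects not shown here.

```python
import math

# Toy physics-based fog effect on a LiDAR return: Beer-Lambert
# attenuation over the two-way path, with extinction coefficient from
# the Koschmieder relation. Illustration only, not the project's model.

def fog_attenuation(intensity: float, range_m: float,
                    visibility_m: float) -> float:
    """Attenuated return intensity for a target at range_m in fog."""
    beta = 3.912 / visibility_m               # extinction coefficient
    return intensity * math.exp(-2.0 * beta * range_m)  # out and back

# A target at 30 m in 100 m visibility fog returns under 10% of its
# clear-weather intensity.
attenuated = fog_attenuation(1.0, 30.0, 100.0)
```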
Figure captions:
• Sample RGB camera images (left), corresponding LiDAR point clouds (middle), confusion matrix (right)
• Annotated REHEARSE LiDAR point cloud data with the corresponding RGB and thermal camera images
• Enhanced 4D RADAR model simulator data
• Grip prediction model takes RGB
• Example of images obtained by the data-driven numerical simulation model on camera images
• Braking test with VTT vehicle on 12% downhill
• ROADVIEW collaborative perception solutions
• HiL (left) and ViL (right) integrated at THI
• Different elements considered to describe the ODD; red boxes indicate the focus of ROADVIEW
• Sample image with fog noise model added. Left: image captured at CEREMA; right: ROADVIEW fog model
• Car detection on the CADC dataset
• Example image from Snowy Scenes showing the labelled LiDAR point cloud and front camera image