
Robust Automated Driving in Extreme Weather

Periodic Reporting for period 1 - ROADVIEW (Robust Automated Driving in Extreme Weather)

Reporting period: 2022-09-01 to 2024-02-29

The primary goals of ROADVIEW are:
• Define complex environmental conditions and use case specifications
• Design and validate the physical system architecture and define the system requirements
• Create digital models to support system development and testing
• Optimise data processing performance using the new concept of ‘data readiness levels’
• Develop an improved in-vehicle perception system that is robust under harsh weather conditions and that can handle a wide range of traffic scenarios
• Develop a weather-aware decision-making system that complies with explainable AI concepts
• Ensure reliability of the perception and decision-making systems by X-in-the-loop testing and validation
• Integrate the ROADVIEW-developed solutions into OEM platforms to reach TRL 7
The ROADVIEW consortium has successfully created Operational Design Domain (ODD) definitions for five different use cases, detailing in particular traffic density (e.g. low, moderate, high), drivable areas (such as highways, urban traffic, and rural roads), and harsh weather conditions, with a specific focus on rain, fog, and snow. The ODD definitions were specified by extending the ODD taxonomy defined in ISO 34503 to cover the ROADVIEW use cases and the environmental conditions relevant to the sensor types investigated in the project: RGB cameras, LiDARs, RADARs, and thermal cameras.

The consortium has described the ROADVIEW system reference architecture as a blueprint for the specification of the entire ROADVIEW system. This includes all perception (e.g. low-level sensor fusion, object detection, and weather-type detection, among others), planning, and control functionalities that rely on various sensing modalities such as RGB cameras, LiDARs, RADARs, thermal cameras, Inertial Measurement Units (IMU), and Global Navigation Satellite System (GNSS) modules.

To design and validate autonomous systems in adverse weather, ROADVIEW introduces the following items: a) high-fidelity digital twins of some of the testing and demonstration facilities; b) a paired harsh-weather dataset covering most automotive perception sensor modalities (i.e. LiDAR, RADAR, thermal and colour cameras); and c) sensor noise models for the above-mentioned sensors, validated against the collected data using novel metrics that evaluate the performance of sensor models in the context of assisted and automated driving.

ROADVIEW improves data processing performance using the concept of Data Readiness Level (DRL), which quantitatively shows where images from RGB and thermal cameras are unsharp, distorted, blurred, or over-/under-exposed, and where noise is present in the LiDAR and RADAR point clouds. To compute the DRL, scores from several tested open-source tools are averaged per image and scaled to 0-100; these camera scores are then averaged with similarly scaled LiDAR and RADAR noise levels over the three modalities; finally, the median of the resulting frame scores over time is mapped onto a 1-9 scale.
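As an illustration of this aggregation only, the pipeline could be sketched as follows; the helper names, the example numbers, and the linear 1-9 mapping are assumptions made for this sketch, not the project's actual implementation:

```python
import numpy as np

def frame_drl(camera_tool_scores, lidar_noise, radar_noise):
    """Combine per-frame quality indicators into a single 0-100 score.

    camera_tool_scores: 0-100 scores from several open-source image-quality
        tools (sharpness, blur, exposure, ...), averaged across tools.
    lidar_noise, radar_noise: 0-100 scores derived from point-cloud noise
        levels (100 = clean, 0 = fully degraded).
    """
    camera_score = float(np.mean(camera_tool_scores))
    # Average the three modalities (camera, LiDAR, RADAR) into one frame score.
    return float(np.mean([camera_score, lidar_noise, radar_noise]))

def sequence_drl(frame_scores):
    """Map the median 0-100 frame score of a sequence onto a 1-9 DRL scale."""
    median_score = float(np.median(frame_scores))
    # Linear 0-100 -> 1-9 mapping; the project may use a different binning.
    return 1 + round(median_score / 100 * 8)

# Example: three frames of a short drive in light snow (made-up numbers).
frames = [frame_drl([72, 65, 80], lidar_noise=55, radar_noise=90) for _ in range(3)]
print(sequence_drl(frames))  # -> 7
```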
ROADVIEW is developing an improved in-vehicle perception system that integrates multiple sensor modalities to enable robust operation under harsh weather conditions and across a wide range of traffic scenarios. The perception system is planned to fuse LiDAR, RADAR, thermal and colour cameras into an integrated world view around the vehicle in all traffic and weather conditions. The ROADVIEW perception stack also includes the methods and algorithms for sensor fusion, object detection, free-space detection and weather-type detection.

ROADVIEW is developing methods for estimating the environmental and weather conditions surrounding the vehicle as well as the traction and drivable-area conditions on the road surface. The development work has produced initial results from on-road training data collection and from neural network design, training and testing. The first version of the visibility range estimation method has been developed. Additionally, results for predicting road grip or slipperiness in front of the vehicle are available and will be published at an academic conference.

The consortium has described the reference architecture for the control and decision-making system, as well as more implementation-focused adaptations for the vehicle demonstrators. The implementation of software modules for the weather-aware decision-making system has already delivered the first concrete results, such as the Minimum Risk Manoeuvre functionality, V2X message format definitions, and the implementation of the Manoeuvre Coordination Module.

Metrics and testing processes for benchmarking the different test methods were defined. Innovative test rigs will be developed for testing automated driving functions and perception systems, including the stimulation of LiDAR, RADAR, and camera sensors using Direct Data Injection (DDI) as well as Over-the-Air (OTA) methodologies. Comparing and evaluating the different test methods will enable an effective distribution of test cases among the different levels of reality and simulation, improving the use of testing resources and accelerating the development of automated driving systems.
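To make the multi-modal fusion idea concrete, below is a minimal, hypothetical sketch of weather-aware late fusion; the weather weights, gating distance, and score threshold are invented for illustration and do not represent the ROADVIEW algorithms:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "lidar", "radar", "thermal"
    position: tuple    # (x, y) in vehicle coordinates, metres
    confidence: float  # detector score in [0, 1]

# Illustrative per-weather reliability of each modality, e.g. cameras degrade
# in fog while RADAR is largely unaffected. All values are assumptions.
WEATHER_WEIGHTS = {
    "clear": {"camera": 1.0, "lidar": 1.0, "radar": 1.0, "thermal": 1.0},
    "fog":   {"camera": 0.4, "lidar": 0.6, "radar": 1.0, "thermal": 0.8},
    "snow":  {"camera": 0.5, "lidar": 0.5, "radar": 0.9, "thermal": 0.7},
}

def fuse(detections, weather, gate=1.5, threshold=0.5):
    """Weather-aware late fusion: group detections lying within `gate` metres
    of each other, then keep groups whose summed weather-weighted score passes
    `threshold`. Returns the fused (x, y) object positions."""
    clusters = []
    for det in detections:
        w = WEATHER_WEIGHTS[weather][det.sensor]
        for cluster in clusters:
            cx, cy, _ = cluster[-1]
            if abs(det.position[0] - cx) < gate and abs(det.position[1] - cy) < gate:
                cluster.append((*det.position, w * det.confidence))
                break
        else:
            clusters.append([(*det.position, w * det.confidence)])
    fused = []
    for cluster in clusters:
        if sum(s for _, _, s in cluster) >= threshold:
            xs, ys, _ = zip(*cluster)
            fused.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return fused

# Usage: in fog, the RADAR return carries the camera's weak detection along.
dets = [Detection("radar", (20.1, 3.0), 0.9), Detection("camera", (20.4, 3.2), 0.6)]
print(fuse(dets, weather="fog"))  # one fused object near (20.25, 3.1)
```

The design point of such a scheme is that a single weather estimate (itself a perception output, as described above) rebalances how much each modality is trusted before detections are merged.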
ROADVIEW proposes a new sensor denoising approach. The first version of the AI-based snow removal model, which filters out falling snowflakes in 3D LiDAR data, has already been implemented. This model is a ROADVIEW innovation that significantly reduces memory consumption, the number of operations, and the execution time per point cloud on real LiDAR data: specifically, it reduces memory consumption by 99.92%, the number of operations by 96.87%, and the execution time by 82.84%. ROADVIEW also introduces a new dataset, named the REHEARSE (adveRse wEatHEr datAset for sensoRy noiSe modEls) Dataset. The dataset is strictly focused on clear weather, rain, fog, and snow conditions. Data collections with automotive sensors under rain and fog are conducted in controlled environments using the CEREMA test chamber and on the outdoor test track at THI.
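ROADVIEW's snow removal model itself is learning-based (3D-OutDet, shown in the figures below); as a point of reference for the task, a well-known classical baseline is the Dynamic Radius Outlier Removal (DROR) filter, sketched here with illustrative parameter values:

```python
import numpy as np
from scipy.spatial import cKDTree

def dror_filter(points, angular_res_deg=0.4, beta=3.0,
                min_radius=0.04, min_neighbors=3):
    """Dynamic Radius Outlier Removal: falling snowflakes appear as sparse,
    isolated returns, so a point is kept only if enough neighbours lie within
    a search radius that grows with range (LiDAR point density drops with
    distance, so a fixed radius would over-filter far-away objects).

    points: (N, 3) array of x, y, z coordinates in metres.
    Returns a boolean mask of points classified as non-snow.
    """
    tree = cKDTree(points)
    ranges = np.linalg.norm(points[:, :2], axis=1)  # horizontal range
    radii = np.maximum(min_radius, beta * ranges * np.radians(angular_res_deg))
    keep = np.zeros(len(points), dtype=bool)
    for i, (p, r) in enumerate(zip(points, radii)):
        # Count neighbours within the dynamic radius, excluding the point itself.
        keep[i] = len(tree.query_ball_point(p, r)) - 1 >= min_neighbors
    return keep

# Usage with a hypothetical KITTI-style binary scan file:
# cloud = np.fromfile("scan.bin", dtype=np.float32).reshape(-1, 4)[:, :3]
# denoised = cloud[dror_filter(cloud)]
```

Unlike this geometric baseline, a learned model can exploit appearance cues beyond local density, which is where the large efficiency and accuracy gains reported above come from.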
Figures:
• Sample RGB camera images (left), corresponding LiDAR point clouds (middle), confusion matrix (right)
• Annotated REHEARSE LiDAR point cloud data with the corresponding RGB and thermal camera images
• Denoising of a snowy LiDAR point cloud from the real-world WADS dataset with 3D-OutDet
• Grip prediction model
• Images from the FGI urban journey (credit: Finnish Geospatial Research Institute, fgi.fi)
• Braking test with the VTT vehicle on a 12% downhill
• A use case within a specific ODD is expanded into multiple scenarios and test cases
• Different elements considered to describe the ODD; red boxes indicate the focus of ROADVIEW
• ROADVIEW digital twins
• Sample image with the fog noise model added; left: image captured at CEREMA, right: ROADVIEW fog model
• Hardware-in-the-loop environment for testing the ROADVIEW system