
aDverse wEather eNvironmental Sensing systEm


First full fusion sensor solution enables all-weather assisted driving

For truly autonomous vehicles to ever hit the roads, their environmental perception will have to meet safety targets regardless of location, weather or time of day. For the first time, the DENSE project integrates three sensing technologies whose output is processed by deep neural networks.

Digital Economy

Most current advanced driver-assistance systems (ADASs) were developed for ‘normal’ or average driving conditions, such as during daylight and in good weather. Under tougher conditions – at night or during fog, rain or snow – these systems can fail. In response, the EU-supported DENSE project combined three specialised sensors: a gated camera, a short-wave infrared (SWIR) lidar and a high-resolution multiple-input multiple-output (MIMO) radar. These were complemented with a processing unit based on deep neural networks. “Our approach significantly improves driver assistance systems, taking an important step towards making them suitable 24/7 for all weather conditions,” says project coordinator Werner Ritter from Mercedes-Benz, which hosted the project. “Currently such environmental perception is only possible worldwide with our method!”
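To give a rough idea of how such a deep-neural-network processing unit might fuse the three data streams, the sketch below uses PyTorch to concatenate features extracted from a gated-camera image, a lidar depth map and a radar map before classifying obstacles. All module names, input shapes and layer sizes are illustrative assumptions; they are not the project’s published architecture.

```python
import torch
import torch.nn as nn

class SensorEncoder(nn.Module):
    """Small convolutional encoder; one instance per sensing modality (assumed design)."""
    def __init__(self, in_channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to one feature vector per input
            nn.Flatten(),
        )

    def forward(self, x):
        return self.net(x)

class FusionNet(nn.Module):
    """Concatenates gated-camera, lidar and radar features, then classifies."""
    def __init__(self, num_classes=4):
        super().__init__()
        self.camera_enc = SensorEncoder(in_channels=1)  # gated intensity image
        self.lidar_enc = SensorEncoder(in_channels=1)   # lidar depth map
        self.radar_enc = SensorEncoder(in_channels=2)   # radar range/velocity map
        self.head = nn.Sequential(
            nn.Linear(32 * 3, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, camera, lidar, radar):
        fused = torch.cat(
            [self.camera_enc(camera), self.lidar_enc(lidar), self.radar_enc(radar)],
            dim=1,
        )
        return self.head(fused)

# Quick check with random stand-in tensors (batch of one 128 x 128 grid per sensor).
model = FusionNet()
logits = model(
    torch.rand(1, 1, 128, 128),  # gated camera
    torch.rand(1, 1, 128, 128),  # lidar
    torch.rand(1, 2, 128, 128),  # radar
)
print(logits.shape)  # torch.Size([1, 4])
```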

The sensor suite and processing unit

DENSE’s architecture, comprising a camera, lidar and radar along with the processing unit, was designed after a study with sensor and component suppliers. In contrast to standard passive cameras, DENSE’s gated camera is active, synchronising its aperture with laser illumination to record defined near, middle and distant depth ranges. The aperture remains closed for areas outside these ranges, avoiding visual interference and producing clearer pictures. An image covering all the depth ranges is generated by overlaying the individual depth images. Gated cameras also provide distance information for each pixel to within 5 % (i.e. 5-metre accuracy at a distance of 100 metres). “With the combination of our gated cameras’ image and depth maps, at a rate of 60 per second, small obstacles at great distances can be reliably detected even in poor visibility. To our knowledge, this has not been possible with any other system so far!” explains Ritter.

Standard lidar operates at near-infrared (NIR) wavelengths, which are not considered eye-safe, and current ADASs have already reached the resolution and range limits of NIR lidar. SWIR lidar, by contrast, operates in an eye-safe band, allowing the luminance of a single beam (in simpler systems) to be increased by a factor of at least 1 000. Lastly, radar systems in current use lack the resolution to easily discriminate stationary objects from the background; DENSE’s high-resolution MIMO radar, fused with the other sensors, increases reliability and classification accuracy.

A prototype incorporating all three sensors was constructed and tested in project partner CEREMA’s 30-metre weather chamber in France, in which fog and rain conditions were simulated. “Our system demonstrated significant improvement for the environmental perception of driver assistance systems, especially in bad weather, compared to conventional systems,” notes Ritter. Sensor performance was further improved by the processing unit, whose deep neural networks were trained with sensor-specific bad-weather algorithms. Fine-tuning and testing used data obtained from the CEREMA weather chamber and from longer drives in northern Europe.
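As a back-of-the-envelope illustration of the gated-camera principle described above, the short Python sketch below overlays hypothetical depth slices into a single image and computes the stated 5 % range uncertainty. The compositing rule and array sizes are assumptions made purely for illustration.

```python
import numpy as np

def composite_gated_slices(slices):
    """Overlay the per-slice exposures by keeping, for each pixel, the
    strongest return (a simple stand-in for the real compositing step)."""
    return np.stack(slices).max(axis=0)

def range_uncertainty(distance_m, relative_error=0.05):
    """Roughly 5 % relative range accuracy, e.g. about 5 m at 100 m."""
    return distance_m * relative_error

# Three hypothetical depth slices (near, middle, far) of a small 8 x 8 scene.
rng = np.random.default_rng(0)
near, middle, far = (rng.random((8, 8)) for _ in range(3))

full_image = composite_gated_slices([near, middle, far])
print(full_image.shape)          # (8, 8)
print(range_uncertainty(100.0))  # 5.0 (metres)
```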

From assisted driving to autonomous vehicles

The DENSE project’s innovation contributes to assisted driving that seeks to avoid costly accidents and save lives. It also helps pave the way for autonomous vehicles, for which all-weather, 24/7 availability and reliability are essential, benefiting the elderly and disabled, amongst others. Before industrial production, necessary improvements to DENSE will include: access to more bad-weather simulation data; better automatic weather recognition and adaptation (using AI); and increased sensor hardware performance. “Current sensors could already be replaced with ours, improving the environmental perception of ADASs, but this first requires hardware developments by suppliers. Our neural networks can be introduced directly into existing architectures, improving existing sensors, but production changes usually take at least 3 years,” says Ritter.

Keywords

DENSE, advanced driver-assistance systems, autonomous vehicles, bad weather, lidar, radar, gated cameras, SWIR, MIMO, sensors, neural networks
