The project will develop a sensing system for driver assistance, with the aim of advancing the development of Advanced Driver Assistance Systems (ADAS) for complex situations, initially at low speeds. The system will be based on image processing, radar and laser technology. Sensor information will be fused in order to achieve a good perception of the vehicle environment. Building on experience from other projects, such as UDC or AC-ASSIST, the programme will focus on the definition of characteristic scenarios for low-speed driving, the improvement of sensors to meet the specifications, interface harmonisation and data bus definition, data fusion with visualisation of the results, and a test vehicle.
Building on previous research activities, Adaptive Cruise Control (ACC) was developed as a first Advanced Driver Assistance System and introduced to the market in 1999. Surveys and experimental assessments have shown high user interest in, and acceptance of, such systems. This is only the beginning of the development towards more advanced functions: future Advanced Driver Assistance Systems (ADAS) may help the driver with increasingly complex driving tasks, and can partly take over control of the car in traffic situations in which the driver hands over control to the assistance system. However, commercially available ADAS are today based on single-sensor approaches using either radar or laser sensors, and their use is at present largely limited to motorways or urban expressways without crossings. There, traffic consists of other vehicles (cars, trucks), traffic scenarios are rather simple, and processing can be focused on a few well-defined detected objects. Nevertheless, even in these relatively simple situations, these first systems cannot cope reliably with fixed obstacles. They can also surprise the driver, for example in some cut-in situations, when another vehicle moves into the detection beam close to the ego vehicle: the width of the sensor beams does not cover the area directly in front of the vehicle. With wider deployment of such systems, it will be necessary to extend their operation to more complex situations in dense traffic environments around or inside urban areas. There, traffic is characterised by lower speeds, traffic jams, tight curves, traffic signs, crossings and vulnerable road users such as motorcyclists or cyclists. Road scenarios quickly become very complex, and it is increasingly difficult to operate an ADAS reliably.
The project will start with a definition of the sensing requirements. The partners will then carry out sensing system design and subsystem specification work. The composition of the consortium, involving vehicle manufacturers, sensor manufacturers and technology groups, brings together sound knowledge of both the operational requirements (application) and the technological feasibility, allowing convergence towards an optimal system design.
Adaptation and development of sensors (radar, laser, video) will target the enhancement of the detection and perception capabilities of sensor technology, reaching the level of performance specified by the group of partners:
Video sensor: improved processing of complex, multi-target road scenarios.
Radar sensor: improved short-range detection to handle cut-ins and enhance the fixed-obstacle detection capabilities of the present sensor technology.
Laser sensor: improvement of existing laser scanning technology for use in urban areas, in order to analyse complex situations under varying environmental conditions.
Experience from the partners in terms of automotive feasibility (vehicle manufacturers), industrial constraints (sensor manufacturers) and technological options (research institutes) will be combined in order to find the right compromise between the perfect sensor and the sensor actually needed for the optimal system design.
The optimal sensing system for a driver assistance system is likely not a single sensor but a combination of different sensors. The fusion of the available sensor data will therefore lead to a new, innovative system, which will form the basis for the later manufacture of a competitive system. The partners will build a test vehicle that will be used to validate the system and to demonstrate its capabilities. The system will be evaluated in open-loop operation, and the environment perception will be visualised.
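As a minimal illustration of why fusing several sensors improves perception, the sketch below combines range estimates from radar, laser and video by inverse-variance weighting, a standard elementary fusion technique. The function name, the example readings and their uncertainties are assumptions for illustration only; the project's actual fusion algorithm is not specified here.

```python
# Illustrative sketch: inverse-variance weighted fusion of range estimates.
# Each sensor reports a (range in metres, variance in m^2) pair; the fused
# estimate weights each reading by 1/variance, so more certain sensors
# dominate, and the fused variance is smaller than any single sensor's.

def fuse_ranges(measurements):
    """Fuse (range_m, variance) pairs from several sensors.

    Returns the inverse-variance weighted mean and the fused variance.
    """
    weights = [1.0 / var for _, var in measurements]
    fused = sum(r * w for (r, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Hypothetical readings for one target ahead of the vehicle:
radar = (10.0, 0.04)   # accurate range
laser = (10.4, 0.25)
video = (9.8, 1.0)     # least accurate range here
fused_range, fused_var = fuse_ranges([radar, laser, video])
```

Note that the fused variance (1/30 m²) is below the best single sensor's variance (0.04 m²), which is the quantitative motivation for merging sensors rather than relying on any one of them.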
Reports on scenarios, on test specifications and system specifications. Recommendations for data bus.
Datalogger: Data files with sequences of existing sensors. One test vehicle first with existing sensors, then with new sensors and the fusion unit.
New sensors: laser (short range, wide angle; weather conditions, visibility, automotive packaging); radar (short range, wide field of view, fixed obstacles); video (high dynamics, stereo vision, automotive packaging).
A fusion unit to integrate all sensor information and deduce reference values for vehicle control including visualisation tool.
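As one illustration of how a fusion unit might deduce a reference value for vehicle control, the sketch below derives a commanded speed from a fused gap and relative speed using a constant time-gap following policy. The function name, gains and the control law itself are assumptions for illustration, not the project's specification.

```python
# Illustrative (assumed) derivation of a control reference value from fused
# perception data: a constant time-gap policy turns the fused gap to the
# vehicle ahead into a commanded speed for the longitudinal controller.

def acc_speed_command(own_speed, gap, rel_speed,
                      time_gap=1.8, standstill_gap=2.0,
                      k_gap=0.2, k_rel=0.5):
    """Return a commanded speed (m/s).

    own_speed -- ego vehicle speed (m/s)
    gap       -- fused distance to the vehicle ahead (m)
    rel_speed -- relative speed (m/s), positive when the gap is opening
    """
    desired_gap = standstill_gap + time_gap * own_speed
    gap_error = gap - desired_gap          # positive when the gap is too large
    return own_speed + k_gap * gap_error + k_rel * rel_speed

# At 20 m/s with a 50 m gap and no closing speed, the desired gap is 38 m,
# so the commanded speed rises slightly; with a 30 m gap it falls.
```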
Reports on deliverables, test results and evaluation of the test vehicle.
Final report (semi-public overview).
Funding Scheme: CSC - Cost-sharing contracts
Participant locations: 75272 Paris 6; 10043 Orbassano (TO); 78153 Le Chesnay; 92100 Boulogne-Billancourt; B90 4LA Solihull.