
The empathetic car

Driver assistance systems are popular, but they can sometimes be rather complicated to operate. In future, intelligent human-machine interfaces will ensure that drivers receive relevant information in manageable quantities in any situation.

Driver assistance and navigation systems are ‘in’. Yet the more buttons and display fields grace the dashboard, the greater the likelihood that the driver’s attention will be drawn away from other traffic on the road. The swelling number of automatic helpers in the car could easily become a curse rather than a blessing, over-taxing drivers and leading them into precarious situations.

To prevent this from happening, psychologist Claus Marberger and engineer Günter Wenzel of the Fraunhofer Institute for Industrial Engineering IAO in Stuttgart are developing assistance and information systems that are capable of adapting to the driver’s physical condition and the traffic situation. In the Vehicle Interaction Lab at the IAO, the researchers are working in car simulators to devise a concept for controlling numerous different functions with a single input element. One such proposal would be to operate a navigation system, an MP3 player and a cell phone via a single shared menu. That menu must be designed so that every function can be triggered with as few commands as possible. The scientists aim to find out which type of menu requires the least amount of concentration on the part of the driver. Their work is part of an EU-wide project known as AIDE (Adaptive Integrated Driver-vehicle InterfacE).

However, it takes more than just the right menu to adapt human-machine interaction to specific situations. A whole gamut of data is therefore fed into the systems: sensors observe the surroundings of the vehicle, the car itself, and the physical condition of the driver. Cameras even monitor the driver’s eyelid movements; if these grow slower, the system concludes that the driver is getting tired. Together, all these data shape the interaction between the car and the driver, which may use sounds or voice prompts, vibrate the steering wheel, or cause displays to flash.
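The eyelid-monitoring idea can be sketched in a few lines. This is an illustrative example, not the IAO system: it assumes the camera delivers per-frame eyelid-closure values in [0, 1], and it flags drowsiness when the eyes are closed too often over an observation window, in the spirit of the widely used PERCLOS measure. The function names and thresholds are assumptions chosen for the sketch.

```python
def perclos(closure_samples, closed_threshold=0.8):
    """Fraction of samples in which the eyelid is mostly closed.

    closure_samples: per-frame eyelid-closure values in [0, 1],
    where 1.0 means fully closed. Returns a value in [0, 1].
    """
    if not closure_samples:
        return 0.0
    closed = sum(1 for c in closure_samples if c >= closed_threshold)
    return closed / len(closure_samples)


def is_drowsy(closure_samples, perclos_limit=0.15):
    """Flag the driver as drowsy if the eyes were closed in more
    than perclos_limit of the observed frames."""
    return perclos(closure_samples) >= perclos_limit
```

In a real system the samples would stream in continuously and the window would slide, but the core decision — a threshold on the proportion of eye closure — is the same.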
“Many of these assistance functions are ready to go into production straight away,” says Wenzel, “but in view of the growing number of functions it is not yet evident how they can best interact with the human user.” It may be advisable to suppress non-essential information during hazardous road situations, for instance by automatically redirecting a phone call to the voice mailbox. On the other hand, the eyelid movement sensor could trigger an alarm to protect the driver from microsleep episodes. The researchers are now testing different scenarios and menu-guided operations in simulators, which comprise real cars surrounded by screens displaying a virtual roadscape, to come up with a sophisticated human-machine system.
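The suppression logic described above — hold back non-essential information when the road situation is hazardous, for instance by redirecting a phone call to the voice mailbox — amounts to a simple priority filter. The sketch below illustrates that idea; the priority levels and routing labels are assumptions for this example, not the AIDE project's actual design.

```python
# Assumed priority levels, highest first.
SAFETY = 2      # collision warnings, microsleep alarms
NAVIGATION = 1  # turn instructions
COMFORT = 0     # phone calls, MP3-player notifications


def route_message(priority, hazardous):
    """Decide where a message goes given the assessed traffic situation.

    In a hazardous situation only safety-critical messages reach the
    driver; anything else is deferred (e.g. a call goes to voicemail).
    """
    if hazardous and priority < SAFETY:
        return "deferred"
    return "driver"
```

So an incoming call (`COMFORT`) is deferred while the situation is hazardous but delivered normally otherwise, whereas a safety warning always gets through.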
