
Spatial-temporal information processing for collision detection in dynamic environments

Insects that avoid looming threats inspire the next generation of robots

Collision avoidance features in robotics could find numerous applications. A European project drew on insects’ visual pathways to develop innovative anti-collision sensor technology.

Digital Economy
Transport and Mobility

Insects have evolved heightened motion perception to cope with complex dynamic environments. They have a unique ability to process visual motion, discriminate small moving targets and detect looming stimuli, allowing them to react swiftly and avoid collisions even against cluttered, chaotic backgrounds. Although the underlying mechanisms remain partly elusive, these visual systems serve as a rich source of inspiration for developing artificial motion perception. Importantly, the visual pathways of insects lend themselves to computational modelling, and such models could inform the design of innovative sensors for future intelligent, fully automated robots.

Modelling insect visual neural systems

The STEP2DYNA project brought together neurobiologists, neural system modellers, robotics researchers and engineers to study the visual neural system of insects and model it for robotics applications. The research was undertaken with the support of the Marie Skłodowska-Curie Actions programme and focused on modelling movement detection in insects. Researchers worked on two movement detectors, LGMD1 and LGMD2, found in the visual system of the locust brain, which enable locusts to respond robustly to looming objects. The consortium developed insect-inspired algorithms based on the LGMD mechanisms, enabling ground robots and unmanned aerial vehicles (UAVs) to detect imminent collisions while moving or flying in complex dynamic environments. “Unlike LGMD1, which responds to approaching objects against a bright or dark background, LGMD2 exhibits selectivity to dark objects moving in depth against a bright background, a situation similar to the one ground vehicles and robots are facing,” explains project coordinator Shigang Yue. Moreover, the team combined information from direction-selective neurons and small target motion detector neurons of fruit flies to generate a synthetic neural network.
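To give a flavour of the kind of processing involved, the sketch below implements a simplified LGMD1-style looming detector in the spirit of widely published excitation/inhibition layer models. The layer structure, the lateral-inhibition spread and all numeric constants (inhibition weight, sigmoid gain, spike threshold) are illustrative assumptions, not the actual STEP2DYNA parameters.

```python
# A minimal sketch of an LGMD1-style looming detector (assumed layer
# structure and constants; not the project's actual model).
import numpy as np

class LGMD1Detector:
    def __init__(self, shape, w_inhib=0.4, threshold=0.85):
        self.prev_frame = np.zeros(shape, dtype=np.float32)
        self.prev_inhib = np.zeros(shape, dtype=np.float32)
        self.w_inhib = w_inhib      # inhibition weight (assumed value)
        self.threshold = threshold  # spiking threshold (assumed value)

    @staticmethod
    def _spread(x):
        """Average activity over the 8 neighbours of each pixel
        (lateral inhibition field; wrap-around edges kept for brevity)."""
        out = np.zeros_like(x)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy == 0 and dx == 0:
                    continue
                out += np.roll(np.roll(x, dy, axis=0), dx, axis=1)
        return out / 8.0

    def step(self, frame):
        """Process one greyscale frame in [0, 1]; return (activation, spike)."""
        frame = frame.astype(np.float32)
        # P layer: absolute luminance change between consecutive frames.
        p = np.abs(frame - self.prev_frame)
        self.prev_frame = frame
        # E layer passes excitation straight through; the I layer is the
        # previous frame's excitation spread laterally, i.e. inhibition
        # arrives one frame late and slightly out of place.
        excitation = p
        inhibition = self._spread(self.prev_inhib)
        self.prev_inhib = p
        # S layer: excitation minus weighted inhibition, rectified.
        s = np.maximum(excitation - self.w_inhib * inhibition, 0.0)
        # LGMD membrane potential, squashed to (0, 1) with a sigmoid.
        k = s.sum() / s.size
        activation = 1.0 / (1.0 + np.exp(-k * 50.0))  # gain is an assumed value
        return activation, activation > self.threshold
```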

Exploiting bio-inspired algorithms in robots

The STEP2DYNA insect-inspired algorithms are relatively simple yet highly efficient. They focus only on key motion cues, computing the expanding edges of approaching objects, so the colour, shape and other physical characteristics of those objects matter little. “These features significantly reduce computational cost, rendering them suitable for real-time applications,” highlights Yue. The team built a multi-sensor platform, with particular emphasis on hardware and software design. Flight experiments with drones validated the ability of the LGMD vision-based collision detector to operate effectively in dynamic environments with complex backgrounds. Further testing on ground micro-robots and UAVs was also successful, attesting to the potential of these methodologies for human-made flying machines that are restricted in size, energy consumption and computational power. According to Yue: “The next step will be to explore the mechanisms of binocular vision in insects and how it contributes to depth perception with such a limited number of neurons, to detect potential threats more accurately than by just using one eye.” New funding will be needed to continue down that path.
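As a hypothetical usage of the sketch above, the loop below feeds the detector a synthetic looming stimulus: a dark square expanding against a bright background, the contrast polarity LGMD2 favours. Only the newly darkened ring of pixels changes between frames, so the detector’s activation tracks exactly the expanding edge described here; the image size and growth rate are arbitrary.

```python
import numpy as np

detector = LGMD1Detector(shape=(64, 64))

# Prime with two static background frames so start-up transients die out.
for _ in range(2):
    detector.step(np.ones((64, 64), dtype=np.float32))

for size in range(2, 30, 2):                # square grows each frame
    frame = np.ones((64, 64), dtype=np.float32)        # bright background
    c = 32
    frame[c - size:c + size, c - size:c + size] = 0.0  # dark looming object
    activation, spike = detector.step(frame)
    print(f"size={size:2d}  activation={activation:.3f}  spike={spike}")
```

Because only the expanding edge contributes excitation, activation grows with the object’s apparent size until the spike threshold is crossed, regardless of the object’s colour or shape.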

Keywords

STEP2DYNA, insects, robots, LGMD, visual system, UAV, unmanned aerial vehicle
