CORDIS - EU research results

Low-latency Perception and Action for Agile Vision-based Flight

Project description

Towards autonomous drone navigation on a par with human pilots

Drones are still far from being able to navigate complex environments as well as human pilots. Making robots more agile requires faster sensors and low-latency processing. The EU-funded AGILEFLIGHT project aims to develop novel scientific methods to demonstrate autonomous, vision-based, agile quadrotor navigation in unknown, GPS-denied environments cluttered with possibly moving obstacles. The goal is navigation as effective, in terms of maneuverability and agility, as that of professional drone pilots. To this end, the project will develop algorithms that combine the advantages of standard cameras and event cameras. AGILEFLIGHT will also develop new methods enabling agile maneuvers in cluttered, unknown and dynamic environments. This work could prove useful for disaster response, aerial delivery and inspection tasks.

Objective

Drones are disrupting industries such as agriculture, package delivery, inspection, and search and rescue. However, they are still either controlled by a human pilot or heavily reliant on GPS for navigating autonomously. The alternative to GPS is onboard sensors, such as cameras: from the raw data, a local 3D map of the environment is built, which is then used to plan a safe trajectory to the goal. While the underlying algorithms are well understood, we are still far from having autonomous drones that can navigate through complex environments as well as human pilots. State-of-the-art perception and control algorithms are mature but not robust: coping with unreliable state estimation, low-latency perception, real-time planning in dynamic environments, and tight coupling of perception and action under severe resource constraints are all still unsolved research problems. Another issue is that, because battery energy density is increasing at a very slow rate, drones need to navigate faster in order to accomplish more within their limited flight time. To obtain more agile robots, we need faster sensors and low-latency processing.
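The map-then-plan pipeline described above (build a local map from sensor data, then plan a safe trajectory through it) can be illustrated with a minimal sketch. This is not the project's method, only a toy version of the idea: a 2D occupancy grid standing in for the local 3D map, and breadth-first search standing in for the trajectory planner. The function name and grid format are illustrative assumptions.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid: list of lists; 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples. Returns a list of cells or None.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the path by walking parents back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in parent):
                parent[nxt] = cell
                queue.append(nxt)
    return None  # goal unreachable

# A toy 4x4 map with a wall across the second row; plan corner to corner.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
path = plan_path(grid, (0, 0), (3, 3))
```

Real systems replace the grid with a 3D map built from camera data and BFS with trajectory optimization that respects the quadrotor's dynamics, but the structure (perceive, map, plan) is the same.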

The goal of this project is to develop novel scientific methods that would allow me to demonstrate autonomous, vision-based, agile quadrotor navigation in unknown, GPS-denied, and cluttered environments with possibly moving obstacles, which can be as effective in terms of maneuverability and agility as that of professional drone pilots. The outcome would not only be beneficial for disaster response scenarios, but also for other scenarios, such as aerial delivery or inspection. To achieve this ambitious goal, I will first develop robust, low-latency, multimodal perception algorithms that combine the advantages of standard cameras with event cameras. Then, I will develop novel methods that unify perception and state estimation together with planning and control to enable agile maneuvers through cluttered, unknown, and dynamic environments.
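Event cameras, mentioned above as a complement to standard cameras, do not output frames: each pixel asynchronously reports a brightness change (an "event") with its position and polarity, giving microsecond latency and high dynamic range. A common first step when combining them with frame-based processing is to accumulate a batch of events into an image. The sketch below is a hypothetical minimal example of that accumulation step, not the project's algorithm; the function name and event tuple format are assumptions.

```python
import numpy as np

def events_to_frame(events, shape):
    """Accumulate a batch of events into a signed event image.

    events: iterable of (x, y, polarity) tuples, where polarity is True
            for a brightness increase and False for a decrease.
    shape:  (height, width) of the output image.
    """
    frame = np.zeros(shape, dtype=np.float32)
    for x, y, polarity in events:
        # Each event adds +1 or -1 at its pixel location.
        frame[y, x] += 1.0 if polarity else -1.0
    return frame

# Three hypothetical events on a 3x3 sensor: (x, y, polarity).
events = [(1, 2, True), (1, 2, True), (0, 0, False)]
frame = events_to_frame(events, (3, 3))
```

Such an event image can then be processed alongside a standard camera frame, pairing the event camera's low latency with the standard camera's rich absolute intensity information.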

Funding scheme

ERC-COG - Consolidator Grant

Host institution

UNIVERSITAT ZURICH
Net EU contribution
€ 2 000 000,00
Address
RAMISTRASSE 71
8006 Zurich
Switzerland


Region
Schweiz/Suisse/Svizzera Zürich Zürich
Activity type
Higher or Secondary Education Establishments
Total cost
€ 2 000 000,00

Beneficiaries (1)