
Low-latency Perception and Action for Agile Vision-based Flight

Periodic Reporting for period 2 - AGILEFLIGHT (Low-latency Perception and Action for Agile Vision-based Flight)

Reporting period: 2022-03-01 to 2023-08-31

Drones are disrupting industries ranging from inspection of bridges and power lines to videography, agriculture, law enforcement, delivery, inventory management, and search and rescue. According to a report by Grand View Research published in September 2021 [1], the global commercial drone market is valued at $24 billion and is expected to reach $500 billion by 2028. However, commercial drones are still operated by expert human pilots. Autonomous drones that can match human pilots' maneuverability and agility in unknown, GPS-denied, complex, and potentially dynamic environments do not yet exist. Such autonomy is crucial for several reasons: it minimizes the risk of injuries and damage, maximizes the efficiency and effectiveness of the operation [2], and gives access to remote areas beyond the remote controller's communication range. The overarching goal of this project is to develop scientific methods to fly autonomous, vision-based drones as well as or even better than human pilots (i.e. at human-level or super-human performance) using onboard standard or neuromorphic event cameras and onboard computation. We argue that the main reason traditional model-based approaches have not achieved human-level performance is that they are slow and sensitive to imperfect perception and unmodeled effects (notably in perception and aerodynamics). We argue that combining model-based approaches with deep networks, together with the low latency and high temporal resolution of neuromorphic event cameras, can overcome these problems.

[1] Grand View Research, Commercial drone market size, share & trends analysis, 2021: https://www.grandviewresearch.com/industry-analysis/global-commercial-drones-market
[2] L. Bauersfeld, D. Scaramuzza, Range, Endurance, and Optimal Speed Estimates for Multicopters, Robotics and Automation Letters (RAL), 2022: https://rpg.ifi.uzh.ch/docs/RAL22_Bauersfeld.pdf
In the first 2.5 years of the project, we have made significant progress on the autonomous navigation of agile vision-based quadcopters and on event cameras. The major achievements in each area are listed below, from oldest to newest:

**Autonomous Navigation of Agile Vision-based Quadcopters**

We focused on learning agile end-to-end control policies for quadcopters, trained entirely in simulation with either zero or minimal fine-tuning in the real world. Two applications were targeted: 1) agile navigation in unknown environments (e.g. forests, search-and-rescue scenarios) and 2) autonomous drone racing:

- In [1], we showed for the first time that we could fly faster than ever before (up to 40 km/h) in cluttered forests, snowy terrains, and search-and-rescue environments.

- In [2, 3, 4], we presented methods for time-optimal flight (i.e. time-optimal planning and control) applied to autonomous drone racing using classic controllers: differential flatness (DF), nonlinear model predictive control (MPC), and a combination of both.

- In [5], we showed how to use Gaussian Processes to learn residual aerodynamic forces that first-principles methods cannot explain (a minimal code sketch of this idea follows this list).

- In [6, 7], we presented a neural-network controller trained via deep reinforcement learning rather than classic control. In later publications ([16, 17] below), we show that RL beats MPC and DF and, for the first time, allows an autonomous drone to outfly the world champions of drone racing: https://spectrum.ieee.org/zurich-autonomous-drone-race

- In [13], we proposed the first learning-based controller that can be conditioned on an auxiliary input from a user. This allows for regulating the aggressiveness (i.e. risk-taking) of the drone during deployment.

- In [14], we released Agilicious, our open-source and open-hardware agile quadrotor for vision-based flight, which has been in the making since the start of this project. Agilicious supports both model-based and neural-network-based controllers. It also provides high thrust-to-weight and torque-to-inertia ratios for agility, onboard vision sensors, GPU-accelerated compute hardware for real-time perception and neural-network inference, a real-time flight controller, and a versatile software stack.

- In [15], we proposed a method based on a hybrid model-based and learning-based dynamics model for simultaneous state estimation and force estimation. In real-world experiments performed on a quadrotor flying in strong winds, our method improved state and force estimation accuracy by up to 30% and 40%, respectively.

- In [16], we presented a neural-network controller, trained via deep reinforcement learning rather than classic control, that races vision-based autonomous quadcopters at speeds competitive with human world champions and even outflies them. The paper was published in Nature.

- In [17], we analyze why reinforcement learning outperforms optimal control in drone racing. The paper was published in Science Robotics.
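
As a minimal illustration of the residual-force learning in [5], the sketch below fits one Gaussian Process per body axis to synthetic velocity/force-residual data and adds the prediction back onto a nominal model. All names, hyperparameters, and data here are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic flight log: body-frame velocities (inputs) and the force residuals
# left over after subtracting the nominal first-principles prediction.
v_body = rng.uniform(-15.0, 15.0, size=(400, 3))                 # m/s
drag_coeff = np.array([0.3, 0.3, 0.15])                          # toy values
f_residual = (-drag_coeff * v_body * np.abs(v_body)
              + 0.05 * rng.standard_normal((400, 3)))            # N

# One GP per body axis: the RBF kernel captures smooth, speed-dependent drag,
# while the white-noise term absorbs measurement noise.
kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=0.01)
gps = [GaussianProcessRegressor(kernel=kernel, normalize_y=True)
       .fit(v_body, f_residual[:, i]) for i in range(3)]

def residual_force(v):
    """GP-predicted aerodynamic residual (N) at body velocity v; in [5] this
    kind of correction term augments the dynamics used by the MPC."""
    v = np.asarray(v, dtype=float).reshape(1, -1)
    return np.array([gp.predict(v)[0] for gp in gps])

print(residual_force([10.0, 0.0, 0.0]))  # residual at 10 m/s forward flight
```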


**Event Cameras**

- In [8], we presented a method that reduces the computational complexity of computer vision algorithms for neuromorphic event cameras by a factor of 200, thanks to an algorithm we coined Asynchronous Event-based Graph Neural Networks (AEGNN); a sketch of the underlying event-graph construction follows this list.

- In [9], we addressed the current shortage of datasets to train deep networks for event cameras by using unsupervised domain adaptation to transfer labels from standard images to events.

- In [10], we proposed the first recurrent vision transformers for object detection with event cameras, achieving for the first time an object-detection latency below 10 ms with accuracy comparable to the state of the art.

- In [11], we proposed the first data-driven feature tracker for event cameras. We showed that, thanks to deep learning, feature tracks are up to twice as long as those achieved with model-based approaches and exhibit lower latency. The paper was selected as an award candidate at the IEEE Conference on Computer Vision and Pattern Recognition (award-candidate selection rate: 0.5%).

- In [12], we demonstrated that thanks to event cameras, we could make a quadruped robot catch objects tossed from 4 m with relative speeds up to 15 m/s.
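
To make the event-graph idea behind [8] concrete, the sketch below builds a spatiotemporal graph in which events are nodes and edges connect only events that are close in space-time, so each new event triggers only a local update. The sensor resolution, radius, and time scaling are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic event stream: each event is (x, y, t, polarity).
n = 200
events = np.column_stack([
    rng.integers(0, 240, n),             # x pixel (illustrative 240x180 sensor)
    rng.integers(0, 180, n),             # y pixel
    np.sort(rng.uniform(0.0, 0.05, n)),  # timestamp in seconds
    rng.choice([-1, 1], n),              # polarity (brightness up / down)
]).astype(float)

def event_graph(events, radius=0.05, t_scale=10.0):
    """Connect events that are close in normalized (x, y, scaled-t) space.
    'radius' and 't_scale' are illustrative hyperparameters."""
    pos = np.column_stack([events[:, 0] / 240.0,
                           events[:, 1] / 180.0,
                           events[:, 2] * t_scale])
    dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    src, dst = np.nonzero((dist < radius) & (dist > 0.0))
    return pos, np.column_stack([src, dst])

pos, edges = event_graph(events)
# Because a new event only adds nodes and edges inside its radius neighborhood,
# an asynchronous GNN can update its output without reprocessing the whole graph.
print(f"{len(events)} nodes, {len(edges)} directed edges")
```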

These results were published in top robotics journals, e.g. Science Robotics and IEEE Transactions on Robotics, and have received worldwide media coverage, including The Economist, Forbes, and IEEE Spectrum (see the press list below).

References (from oldest to newest):

[01] A. Loquercio*, E. Kaufmann*, R. Ranftl, M. Müller, V. Koltun, D. Scaramuzza, Learning High-Speed Flight in the Wild, Science Robotics, 2021.
[02] S. Sun, A. Romero, P. Foehn, E. Kaufmann, D. Scaramuzza, A Comparative Study of Nonlinear MPC and Differential-Flatness-Based Control for Quadrotor Agile Flight, IEEE Transactions on Robotics, 2022.
[03] P. Foehn, A. Romero, D. Scaramuzza, Time-Optimal Planning for Quadrotor Waypoint Flight, Science Robotics, 2021.
[04] A. Romero, S. Sun, P. Foehn, D. Scaramuzza, Model Predictive Contouring Control for Time-Optimal Quadrotor Flight, IEEE Transactions on Robotics, 2022.
[05] G. Torrente*, E. Kaufmann*, P. Foehn, D. Scaramuzza, Data-Driven MPC for Quadrotors, IEEE Robotics and Automation Letters (RA-L), 2021.
[06] Y. Song, D. Scaramuzza, Policy Search for Model Predictive Control with Application to Agile Drone Flight, IEEE Transactions on Robotics, 2022.
[07] Y. Song*, M. Steinweg*, E. Kaufmann, D. Scaramuzza, Autonomous Drone Racing with Deep Reinforcement Learning, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021.
[08] S. Schaefer*, D. Gehrig*, D. Scaramuzza, AEGNN: Asynchronous Event-based Graph Neural Networks, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2022.
[09] N. Messikommer, D. Gehrig, M. Gehrig, D. Scaramuzza, Bridging the Gap between Events and Frames through Unsupervised Domain Adaptation, IEEE Robotics and Automation Letters (RA-L), 2022.
[10] M. Gehrig, D. Scaramuzza, Recurrent Vision Transformers for Object Detection with Event Cameras, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2023.
[11] N. Messikommer*, C. Fang*, M. Gehrig, D. Scaramuzza, Data-Driven Feature Tracking for Event Cameras, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2023. Award candidate.
[12] B. Forrai*, T. Miki*, D. Gehrig*, M. Hutter, D. Scaramuzza, Event-Based Agile Object Catching with a Quadrupedal Robot, IEEE International Conference on Robotics and Automation (ICRA), 2023.
[13] L. Bauersfeld, E. Kaufmann, D. Scaramuzza, User-Conditioned Neural Control Policies for Mobile Robotics, IEEE International Conference on Robotics and Automation (ICRA), 2023.
[14] P. Foehn, E. Kaufmann, A. Romero, R. Penicka, S. Sun, L. Bauersfeld, T. Laengle, G. Cioffi, Y. Song, A. Loquercio, D. Scaramuzza, Agilicious: Open-Source and Open-Hardware Agile Quadrotor for Vision-Based Flight, Science Robotics, 2022.
[15] G. Cioffi*, L. Bauersfeld*, D. Scaramuzza, HDVIO: Improving Localization and Disturbance Estimation with Hybrid Dynamics VIO, Robotics: Science and Systems (RSS), 2023.
[16] E. Kaufmann, L. Bauersfeld, A. Loquercio, M. Müller, V. Koltun, D. Scaramuzza, Champion-Level Drone Racing Using Deep Reinforcement Learning, Nature, 2023.
[17] Y. Song, A. Romero, M. Müller, V. Koltun, D. Scaramuzza, Reaching the Limit in Autonomous Drone Racing: Optimal Control versus Reinforcement Learning, Science Robotics, 2023.



Main press coverage:

- 2023.09.01 - Our Nature paper on beating the best human pilots in drone racing has received significant media attention:
IEEE Spectrum: https://spectrum.ieee.org/ai-drone-racing
The Guardian: https://www.theguardian.com/technology/2023/aug/30/ai-powered-drone-beats-human-champion-pilots
Spiegel: https://www.spiegel.de/wissenschaft/technik/ki-haengt-weltklasse-drohnenpiloten-ab-a-c7fdcdc8-e087-4d1d-8937-99a52b23acf0
Heise: https://www.heise.de/news/Autonome-Drohne-fliegt-Piloten-davon-9291088.html
New Scientist: https://www.newscientist.com/article/2389071-ai-beats-champion-human-pilots-in-head-to-head-drone-races/

- 2023.04.18 - IEEE Spectrum talks about our paper on agile catching of objects: https://spectrum.ieee.org/quadrupedal-robot

- 2022.12.29 - IEEE Spectrum places our achievement in the AI vs. Human drone race in the top 10 news of 2022: https://spectrum.ieee.org/top-robotics-stories-2022

- 2022.07.13 - TV appearance on RAI1 SuperQuark: Our research on autonomous drones, from drone racing to search and rescue and from standard to event cameras, was featured in a video report on the prestigious Italian science program SuperQuark, aired on RAI1: https://youtu.be/TlKX4TUS9qI

- 2022.07.07 - IEEE Spectrum reports on the world's first AI vs. Human Drone Race organized by my lab: https://spectrum.ieee.org/zurich-autonomous-drone-race

- 2022.01.29 - The Economist reports on our research on event cameras: https://www.economist.com/science-and-technology/a-new-type-of-camera/21807384

- 2021.10.26 - Forbes reports on our Science Robotics paper on teaching drones to fly in the wild: https://www.forbes.com/sites/davidhambling/2021/10/25/omniscient-master-teaches-ai-drone-real-flying-skills-in-virtual-world/

- 2021.07.23 - Forbes reports on our 2021 papers on time-optimal quadrotor flight: https://www.forbes.com/sites/davidhambling/2021/07/23/swiss-ai-drone-racer-is-faster-than-human-pilots/



Awards:

- 2023.10.22 - IEEE Technical Field Award: Davide Scaramuzza, PI of the ERC project, receives the 2024 IEEE Kiyo Tomiyasu Award, an IEEE Technical Field Award, for outstanding early- to mid-career contributions to technologies holding the promise of innovative applications. Davide Scaramuzza was awarded for his "contributions to agile visual navigation of micro drones and low-latency robust perception with event cameras." Achievements made during this ERC grant contributed to this prize.

- 2023.09.28 - 2023 IEEE/RSJ IROS Best Paper Award: Our work "Autonomous Power Line Inspection with Drones via Perception-Aware MPC" wins the Best Paper Award at IEEE/RSJ IROS 2023. Congratulations to all collaborators!

- 2023.07.28 - 2023 Frontier of Science Award: Our Science Robotics 2021 paper wins the prestigious Frontier of Science Award in the category "Robotics, Science and Systems". Congratulations to Antonio Loquercio and his co-authors!

- 2023.04.25 - CVPR Best Paper Award Candidate: Our 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) paper "Data-Driven Feature Tracking for Event Cameras" was selected as one of the 12 award candidates out of 2,359 accepted papers. The paper was presented as an oral (oral presentation acceptance rate: 0.5%). This paper was written as part of the ERC project.

- 2023.04.13 - UZH Best Master Thesis Award: former master student Ms. Asude Aydin wins the UZH Best Master Thesis Award for her thesis on "A Hybrid ANN-SNN Architecture for Low-Power and Low-Latency Visual Perception." The thesis was completed as part of the ERC Project. The thesis has been submitted to an international conference.

- 2023.03.07 - ETH Medal and Willi Studer Prize for Best Master Thesis: former master student Mr. Fang Nan wins the ETH Medal and the Willi Studer Prize for his master thesis on "Nonlinear MPC for Quadrotor Fault-Tolerant Control." The thesis was completed as part of the ERC project and has been published in IEEE Robotics and Automation Letters (RA-L), 2022.

- 2022.07.04 - George Giralt Ph.D. Award – Former Ph.D. student Antonio Loquercio, funded by the ERC, won the 2022 George Giralt Ph.D. Award, the most prestigious award for Ph.D. dissertations in robotics in Europe, for his work on learning vision-based high-speed drone flight.

- 2022.05.27 - RAL Best Paper Award – Our paper "Autonomous Quadrotor Flight Despite Rotor Failure With Onboard Vision Sensors: Frames vs. Events" wins the prestigious IEEE Robotics and Automation Letters Best Paper Award.

- 2021.09.26 - NASA Tech Briefs Award - Our work on controlling a quadrotor after motor failure with only onboard vision sensors wins the 2021 NASA Tech Briefs "Create the Future" contest out of over 700 participants worldwide.

- 2021.09.13 - IEEE Transactions on Robotics Best Paper Award Honorable Mention - Our paper "Deep Drone Racing: From Simulation to Reality with Domain Randomization" wins the prestigious IEEE Transactions on Robotics Best Paper Award Honorable Mention.

For the progress beyond the state of the art, please see the previous section. In terms of expected results until the end of the project, I plan on:
- working on vision-based reinforcement-learning approaches that reason directly from low-level abstractions of the sensory inputs (e.g. features) instead of high-level abstractions (e.g. explicit state estimation, as done so far);
- working on learning-based approaches that generalize better to different environments and tasks (e.g. see GATO);
- developing efficient event-based computer vision algorithms that are computationally significantly cheaper than current ones without sacrificing performance; we are also looking at spiking neural networks and spiking chips.