MULTIple DRONE platform for media production

Periodic Reporting for period 1 - MULTIDRONE (MULTIple DRONE platform for media production)

Reporting period: 2017-01-01 to 2018-06-30

The aim of MULTIDRONE is to develop an innovative intelligent multi-drone team platform for media production, covering outdoor events (e.g. sports) that are typically distributed over large expanses. The drone team, to be managed by the production director, will have: a) increased multiple drone decisional autonomy, minimizing production crew load and required interventions, and b) improved multiple drone robustness and safety mechanisms (e.g. communication robustness/safety, embedded flight regulation compliance, enhanced crowd avoidance and emergency landing mechanisms), enabling it to carry out its mission despite errors or crew inaction and to handle emergencies. Such robustness is particularly important, as the drone team will operate close to crowds and/or may face environmental hazards. The overall multiple drone system will be built to serve identified end-user (i.e. broadcaster) needs. Thus, its innovative multiple drone audiovisual (AV) shooting will provide novel media production functionalities.
User requirements. Broadcasters (DW, RAI) cooperated with the technical partners towards finalizing the user requirements for sports shooting. Three different shooting scenarios have been detailed. The requirements for audiovisual shooting with autonomous drones have been identified, analysed and detailed.

System specifications and design. The user requirements were translated into system specifications in terms of a) the drone hardware and ground station platform and b) the needed software functionalities. The required software modules were listed and their interfaces were detailed. Wireless communications infrastructure specifications were developed. Furthermore, the general architecture of cooperative mission planning and mission execution has been designed.

Mission planning and control. A specific novel language has been designed to allow the Director to describe shooting missions. A centralized algorithm for planning the mission has been designed and on-drone software modules have been developed for mission execution. The tasks assigned to each drone are executed when the associated event has been detected. In emergency situations, drones can compute a safe path to the closest landing spot. Drone cinematography has been modelled in terms of more than 20 shooting modes and various shot (framing) types. Furthermore, novel algorithms for formation control, autonomous trajectory tracking and multi-drone collision avoidance have been proposed.
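The event-triggered execution and emergency-landing ideas above can be sketched as follows. This is a minimal illustration, not the project's actual planner: the class name, task bindings and distance-based landing choice are all assumptions made for the example.

```python
import math

# Illustrative sketch (not MULTIDRONE's real mission executor): each drone
# holds tasks keyed by the event that triggers them, and in an emergency
# picks the nearest safe landing spot by straight-line distance.
class Drone:
    def __init__(self, name, position):
        self.name = name
        self.position = position          # (x, y) in metres
        self.tasks = {}                   # event name -> task description

    def assign(self, event, task):
        """Bind a task to the event that should trigger it."""
        self.tasks[event] = task

    def on_event(self, event):
        """Return the task bound to a detected event, or None."""
        return self.tasks.get(event)

    def closest_landing_spot(self, spots):
        """Emergency handling: choose the nearest landing spot."""
        return min(spots, key=lambda s: math.dist(self.position, s))

drone = Drone("d1", (0.0, 0.0))
drone.assign("race_start", "track lead cyclist")
print(drone.on_event("race_start"))                  # -> track lead cyclist
print(drone.closest_landing_spot([(5, 5), (1, 2)]))  # -> (1, 2)
```

In the real system the safe path to the landing spot would of course be planned around obstacles and no-fly zones; the sketch only shows the event-to-task binding and the spot selection.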

Drone perception and visual analysis. 3D maps of the shooting site will be created from LIDAR data, images/video and information from on-board sensors. The work on semantic 3D map analysis and enrichment was combined with human crowd detection algorithms to provide semantic information (landing sites, crowd gathering regions, etc.). 6D drone localization is achieved by comparing a known geometry or map with the data provided by the LIDAR. Furthermore, research work on human-centered visual information analysis has progressed significantly. Novel deep learning architectures have been developed for cyclist detection, football player detection, boat detection, human crowd detection, etc. The approaches combine fully-convolutional neural networks with novel regularizers that allow for lighter architectures. Novel research work has also been performed on visual quality assessment in various sports environments by employing realistic simulated videos and subjective testing.
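The idea of localizing by comparing sensor data against a known map can be illustrated with a toy example. This is a deliberate simplification of the 6D LIDAR approach, reduced to a 2D translation search: candidate poses are scored by the mean distance between the transformed scan points and the known map points, and the best-scoring pose wins. All names and the brute-force candidate grid are assumptions for the sketch.

```python
import math

# Toy map-based localization: score a candidate pose by how well the
# translated scan points line up with known map points (lower is better).
def score(pose, scan, map_pts):
    dx, dy = pose
    total = 0.0
    for (x, y) in scan:
        shifted = (x + dx, y + dy)
        total += min(math.dist(shifted, m) for m in map_pts)
    return total / len(scan)

def localize(scan, map_pts, candidates):
    """Return the candidate pose with the lowest alignment error."""
    return min(candidates, key=lambda p: score(p, scan, map_pts))

map_pts = [(0, 0), (1, 0), (0, 1)]
scan = [(-2, -3), (-1, -3), (-2, -2)]          # the map shifted by (-2, -3)
cands = [(x, y) for x in range(-3, 4) for y in range(-4, 4)]
print(localize(scan, map_pts, cands))           # -> (2, 3)
```

A real 6D system estimates rotation as well as translation and uses efficient registration (e.g. ICP-style matching) rather than an exhaustive grid; the sketch only conveys the compare-against-the-map principle.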

Drone platform and ground station implementation. The first experimental drone hardware has been assembled and is currently being tested. The MULTIDRONE platform software implementation has also significantly advanced. The common interface for all modules has been finalized. Furthermore, the necessary flight supervisor and media (artistic) director GUIs have been designed. Drone-to-ground video streaming over LTE/4G has been achieved successfully in the laboratory.

MULTIDRONE Dataset. The first version of the dataset has been prepared and made publicly available. This dataset is mainly composed of UAV-related audiovisual material. A significant subset of this dataset has been annotated.

Ethics, safety and security. The University of Bristol (UoB) Faculty of Engineering Ethics Committee was appointed as the Independent Ethics Advisor. Procedures justifying the collection and/or processing of personal data, as well as procedures for data collection, storage, protection, etc., were defined. Safety will be taken into account during the system specification step and through the procedures that have to be followed before, during and after the flights. Finally, privacy-preserving technologies have been researched.

Dissemination and communication activities. The project consortium laid out its detailed dissemination plan, created and populated its web page and produced 2 project newsletters. Moreover, the project created and used social media accounts with emphasis on Twitter and LinkedIn. Dissemination efforts included publications.
The consortium has already produced several novel research results that go beyond the state of the art. On-drone visual information analysis focused on proposing novel lightweight deep neural networks. These approaches were designed for the tasks of target detection/recognition, crowd detection, target pose estimation and automatic map analysis. Additionally, new research directions have been explored, such as deep reinforcement learning for drone control, new neuron types having a paraboloid decision boundary and deep autoencoders for facial image analysis. Furthermore, research has been performed on real-time target tracking in sports videos, privacy-preserving technologies and drone localization without GPS.

In terms of media production planning, the project has developed a novel language for the description of shooting missions. These missions are translated into drone tasks for a mission planner. A novel media director dashboard GUI has been designed and is being developed. Novel algorithms for planning multi-drone task allocation and scheduling have been designed. A gimbal control solution for tracking of moving objects has also been designed and implemented. New algorithms for multiple drone collision avoidance have been developed and tested.
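The gimbal-tracking problem mentioned above reduces, at its core, to computing pan and tilt angles that aim the camera from the drone at the target. The following sketch shows that geometry only; the function name and the flat-Earth coordinate convention are assumptions, and the actual solution also involves gimbal dynamics and smoothing.

```python
import math

# Hedged sketch of gimbal pointing: pan (yaw) and tilt (pitch) angles, in
# degrees, that aim a camera from drone_pos at target_pos. Positions are
# (x, y, z) in metres; z is altitude.
def gimbal_angles(drone_pos, target_pos):
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    pan = math.degrees(math.atan2(dy, dx))      # heading in the x-y plane
    ground = math.hypot(dx, dy)                 # horizontal distance
    tilt = math.degrees(math.atan2(dz, ground)) # negative = camera down
    return pan, tilt

# Drone hovering 10 m above a point 10 m away along +y: the camera pans
# toward +y (about 90 degrees) and tilts roughly 45 degrees downward.
print(gimbal_angles((0, 0, 10), (0, 10, 0)))
```

Re-evaluating these angles as the target moves gives a simple pursuit-style tracking loop; a production controller would additionally rate-limit and filter the commands.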

Regarding wireless communications, THALES LTE/4G and WiFi mesh technologies have been adapted to meet the project's needs. As such, the LTE/4G infrastructure is highly miniaturized while retaining high performance. A dedicated Quality of Service (QoS) management scheme has been developed. Furthermore, the WiFi mesh allows drones to share their status as well as the target status.
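One common way to realize QoS management of this kind is priority scheduling of traffic classes. The sketch below is an assumption for illustration, not THALES' implementation: safety-critical control messages are sent before telemetry, and telemetry before bulk video frames.

```python
import heapq

# Illustrative QoS scheduler (assumed traffic classes, not the project's
# actual scheme): lower priority number is sent first; the enqueue index
# keeps ordering stable within a class.
PRIORITY = {"control": 0, "telemetry": 1, "video": 2}

def schedule(packets):
    """Return (kind, payload) packets in QoS order."""
    heap = [(PRIORITY[kind], i, kind, payload)
            for i, (kind, payload) in enumerate(packets)]
    heapq.heapify(heap)
    return [(kind, payload)
            for _, _, kind, payload in
            (heapq.heappop(heap) for _ in range(len(heap)))]

sent = schedule([("video", "frame-1"), ("control", "land now"),
                 ("telemetry", "battery 40%")])
print([kind for kind, _ in sent])   # -> ['control', 'telemetry', 'video']
```

Real LTE/4G QoS operates on bearer classes at the network level rather than a single application queue, but the ordering principle is the same.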

The project is expected to have a significant impact on the development of innovative multiple drone systems, which achieve measurable service level gains in AV shooting. It will have measurable improvements in the provision of multiple drone autonomy. Furthermore, it will set new frontiers for TV programme production and drone cinematography, while overcoming barriers due to regulations and improving public acceptance of this technology.
MULTIDRONE project overview.