Mechanisms of collective predator detection and information transfer in African ungulates

Periodic Reporting for period 1 - UNGULATE (Mechanisms of collective predator detection and information transfer in African ungulates)

Reporting period: 2017-08-01 to 2019-07-31

A central goal in the field of animal behavior is to understand the causes and consequences of animal sociality. Predation risk is one of the major drivers of group formation: animals in groups benefit from early detection of threats, which leads to a higher probability of surviving a predator attack. This phenomenon, known as "collective detection", arises from the monitoring behavior of individuals in the group and relies on the efficient transfer of information from initial detectors to unaware group mates. Collective detection and intragroup information transfer are understudied phenomena because traditional observation methods limit our ability to collect the high-resolution data needed to explore dynamic behavioral processes. Studying collective processes requires continuous, high-resolution data on all individuals within a group simultaneously, whereas conventional methods force observers to choose between collecting detailed data on one animal and collecting coarse-grained data on many animals. Video-based methods are used in laboratory studies to generate high-resolution data on animal groups. However, these methods require high-contrast backgrounds and consistent lighting, and are thus unsuitable for field studies, where backgrounds are visually complex and lighting conditions are uncontrollable.

Our research program had two goals. First, we aimed to develop video-based field methodologies and analytical tools that would enable us to efficiently film groups of wild animals and automatically extract behavioral data from these videos. Second, we aimed to use these data to explore the social, environmental, and biological factors that drive individual vigilance behavior and affect the efficiency of information flow within groups.

Delays in acquiring drone operation permits resulted in delays to our research timeline; as such, we were unable to complete our behavioral analyses during the fellowship. However, we continue to pursue these goals outside of the context of the fellowship and intend to produce scientific results in the coming year. This research is important for understanding collective sensing in groups, but it may also have important conservation implications: as wildlife populations are depleted and habitats are altered, the size, composition, and spatial ecology of animal groups may change in ways that affect their ability to respond effectively to threats. If collective detection no longer functions properly, the cost-benefit ratio of group formation may be affected, leading to changes in population dynamics and viability.
The objectives of our program were to answer two sets of scientific questions:
1. How does information regarding a threat flow from detectors of the threat to naïve individuals? Does the efficiency of information transfer vary between species? What social and environmental conditions enable or impede efficient information transfer?
2. In the absence of threats, what social and ecological factors affect individual vigilance behavior? Are there consistent differences in the vigilance behavior of individuals subject to conditions that favor effective information transfer compared to those found in conditions less conducive to reliable information transfer?

We used drones to collect aerial footage of wild animal groups. We carried out predator simulation experiments in which we presented humans on foot to ungulate groups in order to observe and record their responses. We conducted 8 experimental trials on Grevy’s zebra, 7 trials on plains zebra, 35 trials on impala, and 2 trials on other species. We collected observational data on 12 Grevy’s zebra groups, 18 plains zebra groups, 49 impala groups, 13 buffalo groups, and 10 groups of other ungulate species. After completing an observation, we used a drone to photograph the surrounding landscape in order to produce 3D models of the animals' habitat.

We manually annotated 50,000 individual animals in the video footage and used these annotations to train neural networks to automatically detect animals. Once individuals are detected in each frame, we stitch these locations together to achieve continuous tracks for all individuals. Compared to manual video scoring, our tracking software reduces human labor by a factor of more than 2000. We then project the locations from video frame space into geographic space and embed each animal into our 3D landscape maps. These software solutions were developed in collaboration with Benjamin Koger (Max Planck Institute of Animal Behavior).
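The stitching step described above can be illustrated with a minimal sketch: greedy nearest-neighbor linking of per-frame detections into continuous tracks. This is a toy illustration of the general technique, not the project's actual tracking software; the function name, distance threshold, and data layout are all hypothetical.

```python
import math

def link_tracks(frames, max_dist=50.0):
    """Greedily link per-frame detections (x, y) into tracks by
    nearest-neighbor matching between consecutive frames."""
    # Start one track per detection in the first frame
    tracks = [[det] for det in frames[0]]
    for dets in frames[1:]:
        unmatched = list(dets)
        for track in tracks:
            if not unmatched:
                break
            last = track[-1]
            # Closest unmatched detection to the track's last position
            best = min(unmatched, key=lambda d: math.dist(last, d))
            if math.dist(last, best) <= max_dist:
                track.append(best)
                unmatched.remove(best)
        # Detections left unmatched start new tracks (e.g. an animal
        # entering the frame)
        tracks.extend([[d] for d in unmatched])
    return tracks

# Two animals detected across three frames
frames = [
    [(0.0, 0.0), (100.0, 100.0)],
    [(2.0, 1.0), (101.0, 98.0)],
    [(4.0, 2.0), (103.0, 97.0)],
]
tracks = link_tracks(frames)  # two tracks, one per animal
```

Real trackers must additionally handle missed detections, occlusions, and identity swaps, which is where most of the engineering effort lies.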

We also developed DeepPoseKit, a software toolkit for estimating animal posture. This software was developed primarily by collaborator Jacob Graving (MPI of Animal Behavior) and builds on posture estimation methods previously used in neurobiology studies of laboratory animals. DeepPoseKit improves upon these methods by doubling processing speed with no loss of accuracy. Our method is the first to be applied to UAV imagery of wild animals in complex natural environments. The details of this method and all software code are provided in a paper that has been accepted for publication in eLife.
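Posture estimation methods of this kind typically have a neural network predict one confidence map (heatmap) per body part, with the keypoint read off as the location of maximum confidence. The readout step can be sketched as follows; this is a toy illustration of the general idea, not DeepPoseKit's actual code, and the function name and data are hypothetical.

```python
def keypoints_from_heatmaps(heatmaps):
    """Extract one (row, col, confidence) keypoint per body part by
    taking the location of maximum confidence in each 2D map."""
    keypoints = []
    for hmap in heatmaps:  # one 2D confidence map per body part
        best = (0, 0, hmap[0][0])
        for r, row in enumerate(hmap):
            for c, val in enumerate(row):
                if val > best[2]:
                    best = (r, c, val)
        keypoints.append(best)
    return keypoints

# Toy 4x4 confidence maps for two body parts (e.g. head and tail)
head = [[0.0] * 4 for _ in range(4)]
head[1][2] = 0.9
tail = [[0.0] * 4 for _ in range(4)]
tail[3][0] = 0.8
kps = keypoints_from_heatmaps([head, tail])
# → [(1, 2, 0.9), (3, 0, 0.8)]
```

In practice the maps are produced by a fully convolutional network and the argmax is refined to subpixel precision, but the per-part heatmap readout is the core of the approach.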

We are using these software tools to extract behavioral data from our video footage. We have begun to analyze the data extracted so far in order to answer the questions posed above. Upon completion of data extraction and analysis, we will disseminate our findings in scientific journals.

Dissemination efforts achieved:
• a methods paper published in eLife
• 7 oral presentations to scientific audiences in 3 countries (Netherlands, USA, Germany)
• 2 poster presentations to scientific audiences in 2 countries (USA, Italy)
The project is among the first to develop and apply drone and automated image-based tracking methods to field studies of behavioral ecology. This research field is moving toward the adoption of new technologies and quantitative methodologies, as evidenced by the focus of the 2019 Association for the Study of Animal Behaviour Summer Conference: “New Frontiers in the Study of Animal Behavior”. The ER presented two talks at this conference. The project has become a well-known example of the application of novel technological methods to the study of classic questions in animal behavior, and the ER is actively promoting such advances to professionals in her field. She has delivered 9 professional presentations thus far and was recently invited to participate in a special symposium at next year’s Animal Behavior Society Conference, entitled “Weaving the Future of Animal Behavior: Technological Advances to See the Unwatchable”. The ER’s research group is directly providing novel tools, as evidenced by the open access publication of the DeepPoseKit posture estimation software and code.

The behavior of individual animals is a basic driver of many higher-order processes, including population dynamics, interspecies interactions, speciation and extinction. Historically, animal behavior data has been laborious to collect, requiring trained researchers to spend many hours in the field directly observing animals. By developing tools to automate the collection of behavioral data, we are encouraging the use of large, objective datasets in the field of animal behavior. These tools will also make it easier for practitioners in adjacent fields, such as wildlife conservation and landscape management, to collect animal behavior data and generate actionable insights to aid in conservation initiatives.
Illustration of DeepPoseKit pose estimation software. Credit: Jacob Graving
Movement of a Grevy's zebra herd over a 45 minute period. Credit: Benjamin Koger