CORDIS - EU research results

Gestural Origins: Linguistic Features of pan-African Ape Communication

Periodic Reporting for period 3 - GESTURALORIGINS (Gestural Origins: Linguistic Features of pan-African Ape Communication)

Reporting period: 2022-03-01 to 2023-08-31

Language may be the most powerful social tool any species has evolved: we use it for physics and poetry, for gossip and jokes. Understanding the origins of language speaks to the fundamental question of what it means to be human. But what, if anything, makes human language unique? What did we need to communicate that took us beyond the systems of signals used by the other species around us today? Many other species’ communication also involves rich exchanges of nuanced information; but humans do more than broadcast information: we use language to share the ideas and intentions in our own minds with the minds of those around us.

It revolutionised our understanding of non-human communication when we discovered that great apes use their gestures to convey meaningful information in a similar language-like way: ape gesture is essential to understanding what language is, and how human language evolved. Beyond meaning, two core features of human language are social learning and syntactic structure. These are universals, present across cultures. We all learn words and how to use them from others, leading to languages and dialects. We all use syntax, expressing different meanings by recombining words. In fact, these two particular features are common in animal communication: sperm whales learn songs from others; finches re-order notes into different songs. But, in a significant evolutionary puzzle, both appear absent in the communication of our closest great ape relatives.

The discovery of meanings in ape gesture came from studying ape communication in the challenging natural environments that allow chimpanzees to express their communication system in full. A single study of a single group: it was the tip of the iceberg. Employing pan-African data across 17 ape and 9 human groups, the Gestural Origins project tackles three major objectives.

(1) Is there cultural variation in ape gesture? We recognise that to understand human behaviour, we must study people across diverse cultures and environments. Within the 8 subspecies of African great ape there are hundreds of groups with unique cultures, inhabiting habitats as diverse as rainforest and savannah. To investigate whether features of human language exist in the communication of non-human great apes, we must compare ape communities, including human ones, within and across populations on a new, pan-African scale. We will look at how biological inheritance, physical environment, and social interaction affect how apes acquire and use gestures.

(2) When apes combine signals, does it change their meaning? Moving beyond sequential structure, we will look at how apes combine signals to construct meaning, and at how the speed, size, and timing of gestures affect meaning. We will draw on our rare access to multi-generational ape datasets.

(3) Human-ape gesture. Ape gesture research to date has neglected the one ape that may be crucial to addressing these questions: us. With two new approaches, we will turn the tables on comparative research, using ape field methods (focal follows, playback experiments) to investigate human behaviour. We will investigate adults’ and children’s use and understanding of gestures to compare them directly to other apes.

The Gestural Origins project and team have made substantial progress, breaking new ground in developing open-access methods and frameworks for the study of ape communication, and in the scale and scope of data collection on ape gesture. In addition to the core team (PI, post-doc, 5 PhD students), we have built a collaborative network across 22 research sites and groups, conducted 9 expeditions to 6 remote field locations, and collected and extracted over 31,000 videos, from which we have coded over 16,600 gestures from almost 500 ape signallers across 8 subspecies. In addition, we have presented our work to scientific and public audiences around the world (from small-scale workshops in Oslo, Tanzania, and Uganda, to television documentaries that reached millions).

1. Our most significant achievement is in having established a dataset of wild ape gesture that is already an order of magnitude larger than anything that has existed in our field. Most papers to date contain on the order of several hundred to several thousand gesture cases, and typically include data from 1-3 social groups. Our dataset already approaches 17,000 gesture cases and, at our current rate of progress, will include over 30,000 within the next two years. Across 8 (sub)species of apes we already have data from over 500 signallers and 17 social groups. This has not simply been an exercise in refining existing understanding through more detailed data: the fundamental objectives we want to address can only start to be explored with data at this scale.

2. We have developed new data collection protocols and direct-in-video coding methods that allow us to collect data in a comparable way across all species - including humans - living in a wide range of environments. To date, most gestural research has focused on chimpanzees; here we used pilot data collection across other species, including gorillas, bonobos, and humans, to make sure we could develop a truly ape-wide approach. Doing so is not straightforward: a particular behaviour in a chimpanzee may represent aggression, while the same behaviour in a gorilla may represent something very different, and it takes substantial work to set up and verify a universal approach. Our principle is not to dictate that gesture and other forms of communication be coded in a specific way, but to allow sufficient flexibility that users can adapt the scheme to their own needs while still allowing direct comparison across projects and questions. We have built a robust new open-access coding tool in ELAN with substantial flexibility built in, for example tiers that allow users to capture vocal or facial signals where they are interested in doing so, or tiers that focus specifically on intentional production or gestural exchange. We have particularly focused on the exploration of gesture form, gesture meaning, and gesture structure. The templates are accompanied by a detailed user guide and example and training video sets. We anticipate publication of the coding tool in Spring 2022. Publishing it will also allow us to incorporate additional collaborations with scientists who use this tool, allowing more detailed exploration of our planned objectives now and into the future.
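As an illustration of how annotations made with an ELAN template of this kind can be read programmatically, the minimal sketch below uses the third-party pympi-ling library to list tiers and extract annotations from an .eaf file. The file name and tier name are hypothetical examples, not the project's published template.

    # Minimal sketch: reading annotations from an ELAN (.eaf) file with the
    # pympi-ling library. File and tier names here are hypothetical examples.
    import pympi

    eaf = pympi.Elan.Eaf("chimp_focal_follow.eaf")

    # List every annotation tier defined in the template
    print(eaf.get_tier_names())

    # Each annotation on a tier is returned as (start_ms, end_ms, value)
    for start_ms, end_ms, value in eaf.get_annotation_data_for_tier("GestureType"):
        print(f"{start_ms}-{end_ms} ms: {value}")

Keeping each kind of signal on its own tier is what allows projects with different interests to add or drop tiers while their gesture annotations remain directly comparable.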

3. To date, over 20,000 people have taken part in our online studies, hosted on our website (www.GreatApeDictionary.com) which also serves to make our research findings accessible to the public and to other researchers. For example, anyone can go there to request access to collaborate with our video archives. To make these data-arks more accessible, we recently published a searchable Open Data database for ~14,000 videos of chimpanzee behaviour. Our original Great Ape Dictionary experiment is now an online learning game that has been widely used as a science communication tool, for example in popular television documentaries on the BBC and in classrooms around the world. We have a YouTube channel that hosts video examples of our ape gestures, which are regularly used by other researchers as well as by science educators. Our GitHub page github.com/Wild-Minds hosts our data and code.

4. Video coding is a gold standard across animal behaviour research, allowing researchers to extract rich behavioural datasets and validate reliability. Like others, we have access to large video archives from which we can extract data on animal behaviour as it occurs in natural habitats. However, in practice, these videos are only useful if data can be extracted efficiently. Manually locating relevant footage in tens of thousands of hours is extremely time-consuming, as is the manual coding of animal behaviour, which requires extensive training. Computational coding of animal behaviour makes locating and analysing data much faster and more reliable. However, until recently, machine learning tools called deep neural networks were only able to track behaviour in ‘clean’, predictable environments (stable and free of visual noise). In practice this made them impossible to use on fieldwork-based data: wild animals live in visually noisy, unpredictable environments. In 2021 we were able to train the first network models using DeepLabCut to locate and track ape body movements in wild handheld footage of great apes. We will publish DeepWild as an open-access tool in 2022 and hope that it will help to transform future research for ourselves and others.
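To give a concrete sense of the kind of pose-tracking workflow described above, the sketch below follows the standard, publicly documented DeepLabCut pipeline for training a network and analysing new footage. It is a generic illustration under assumed file paths and labels, not the DeepWild pipeline itself.

    # Generic DeepLabCut pose-estimation workflow (not the DeepWild pipeline);
    # project name, video paths, and body-part labels are hypothetical.
    import deeplabcut

    # 1. Create a project; body parts (e.g. wrists, elbows, head) are then
    #    defined in the generated config.yaml before labelling.
    config_path = deeplabcut.create_new_project(
        "ape-gesture", "researcher",
        ["videos/field_clip_01.mp4"],
        copy_videos=True,
    )

    # 2. Extract a subset of frames and label them manually as training data.
    deeplabcut.extract_frames(config_path, mode="automatic")
    deeplabcut.label_frames(config_path)

    # 3. Build the training dataset, train the network, and check its accuracy.
    deeplabcut.create_training_dataset(config_path)
    deeplabcut.train_network(config_path)
    deeplabcut.evaluate_network(config_path)

    # 4. Run the trained network over new field footage to track body movements.
    deeplabcut.analyze_videos(config_path, ["videos/field_clip_02.mp4"])
    deeplabcut.create_labeled_video(config_path, ["videos/field_clip_02.mp4"])

The difficult step for wild footage is the training data: frames labelled from handheld field videos are what allow the network to cope with the visual noise that defeats models trained on captive or laboratory recordings.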

Exploring the gestural communication of apes from different subspecies is not simply about building big datasets that give us more detail on the questions we can already ask by studying one or two groups. There are some questions that you can only answer at scale. We could not describe human language by only collecting data in St Andrews, Addis Ababa, or Shanghai - we need to understand the full range of ways in which it is expressed across cultures to see the human universals. Just like human communities, the communities of other ape species can be small or large, egalitarian or despotic, cohesive or dispersed, and can live in wide open savannah or dense rainforest. Each one has its own unique social and physical environment, and, moreover, each group of apes also has its own unique culture. We need to be able to put all of these diverse pieces of the puzzle together to see the road map along which human language evolved.

Gestural Origins gives us that road map: not only can we explore the similarities and differences between human language and ape gesture, but by building a rich, detailed picture we can make new predictions about the communication of groups as yet unstudied - a robust way in which to test hypotheses about language evolution. We have established a dataset of wild ape gesture that is already an order of magnitude larger than anything that has existed in our field, including 17,000 gesture cases and anticipated to reach 30,000 over the next two years. Across 8 (sub)species of apes we have data from over 500 signallers and 17 social groups, and we expect to extend this significantly, for example covering not only additional social groups but multi-generational data within a social group. This has not simply been an exercise in refining existing understanding through more detailed data: the fundamental objectives we want to address can only start to be explored with data at this scale. Building a fully accessible video data archive will allow us to address current barriers to research on wild ape behaviour, as well as providing an invaluable data-ark for species in urgent conservation crisis. To achieve this we are building new data collection and analysis tools that will provide an open-access framework for exploring questions of comparative communication across diverse species of primates and beyond.

[Image: Gesture types from the Great Ape Dictionary - example chimpanzee gesture images]