CORDIS - EU research results

Polarizing Chats? Political Misinformation on Discussion Apps in India and Brazil.

Periodic Reporting for period 2 - POLARCHATS (Polarizing Chats? Political Misinformation on Discussion Apps in India and Brazil.)

Reporting period: 2023-07-01 to 2024-12-31

The goal of ERC POLARCHATS is, broadly speaking, to understand the causes and consequences of exposure to misinformation in two large social media markets representative of the current “infodemic” in the global South.
The misinformation crisis constitutes one of the most important challenges in countries such as India and Brazil, both of which have over the past few years emerged as hotspots in the global misinformation crisis. In recent years, messages containing misinformation about electoral security, terrorism, Covid-19, and vaccine safety have circulated widely in both countries, arguably leading to a slew of problematic behaviors offline, and possibly affecting the outcome of elections (Sinha et al. 2019). As argued by the Indian state itself, both partisan and non-partisan threads often contain fake or erroneous news, much of which is suspected of leading to offline crimes. Mob lynching incidents triggered by hateful messages circulated on WhatsApp have attracted particular attention, both in India and Brazil. These cases have been well documented in prominent international press outlets, with some observers referring to misinformation as a major “public health crisis”. All of this points to the need for independent, unaffiliated researchers to investigate this issue, as such research would be in the public interest.
POLARCHATS accordingly has four objectives:


1. Analyze why and how political actors choose to adopt and use discussion apps. POLARCHATS’ first objective is largely descriptive. Indian and Brazilian political parties have over the past few years expanded their organizations to include additional low-level workers in charge of social media (Chauchard 2020; Magenta et al. 2018). These individuals typically create WhatsApp groups that they use to communicate about the performance of their party and, often, to disseminate misinformation (Evangelista and Bruno 2019). The goal of this work package is to document the structure of this important new appendage to traditional organizations and to better understand the motivations of party strategists. Who are the social media workers of parties? What are their motivations? What are their leaders’ motivations as they strategically choose to invest in WhatsApp, as opposed to other media? How are these alleged “misinformation machines” organized?


2. Quantify the amount of political misinformation circulating on political/partisan and non-political/non-partisan WhatsApp groups, in WhatsApp’s two biggest markets (India and Brazil). Little more than anecdotal evidence exists to confirm the idea that political misinformation abounds on WhatsApp, especially during elections. As noted by Tucker et al. (2018), we need more precise, more comparative, and more systematic evidence of both the extent and the nature of misinformation on discussion apps. Specifically, encryption means that we currently lack evidence on several metrics:
1. the total proportion of misinformed content (relative to other content).
2. the proportion of that misinformed content that is political, electoral or identity-related in nature (as opposed to other content, for instance entertainment).
3. the proportion of discussions/threads that are organized along political or partisan lines.
4. the proportion of misinformed content appearing in various formats (text, visuals, memes, etc.).
5. the proportion of content focusing on each likely “hot button topic” for misinformation (for instance: probity of political leaders, corruption, ethnic and religious relations, racism, electoral integrity).
In light of encryption, POLARCHATS develops an innovative strategy to overcome practical and ethical concerns attached to the collection of discussion app data, as described in the methodology section.


3. Examine the causal determinants of belief in political misinformation circulated on WhatsApp. To gain further understanding of political misinformation on discussion apps, POLARCHATS will identify the causal determinants of individual-level belief in information and misinformation. When misinformation appears on a thread, why and when are users likely to believe it? This part of the project will rely on two innovative research designs (described at length in the methodology section below). It will test, among other hypotheses, whether variations in the intensity of ties between users (as measured by their offline ties and shared identities), in their similarity along politically relevant characteristics, and in the size of groups predict their propensity to believe misinformation.


4. Examine the effects of political misinformation on the range of downstream attitudes in box 3 (Fig. 1). To gain further understanding of political misinformation on chat apps, the team will examine the downstream effects of common political misinformation in each country.

In the aforementioned journalistic view, it is argued that misinformation, because some of the citizens who are exposed to it believe it, leads to dramatic changes in these political attitudes and behaviors. Yet so far we have no causal evidence credibly establishing that misinformation does have an impact on these downstream political attitudes, and even less evidence establishing that the mechanism resembles this causal pathway. Worse, some results openly challenge this argument: misinformation appears to have only limited effects on citizens’ levels of political knowledge (Allcott & Gentzkow 2017) or on participation (Guess et al. 2020). For this reason, this section of the project will explore the extent to which exposure to common political misinformation does impact three key sets of political attitudes that may be affected by the misinformation circulating on WhatsApp in India and Brazil:
1. Hatred/antagonism toward targeted social groups – minority and disadvantaged groups especially.
2. Support for non-democratic tactics and values – for instance: vigilantism, extra-judicial killings and violent activism.
3. Support for the party/leader disseminating misinformation.

As defined in the initial grant agreement between UC3M and the ERC, there are four work packages.

Work Package 1 relies on qualitative interviews with party workers and other stakeholders to address the first objective.

Work Package 2 relies on a novel, GDPR-compatible strategy for data donation to collect social media data from group chats in both countries to address objective 2.

Work packages 3 and 4 rely on survey-experiments and field-experiments to address objectives 3 and 4.
The PI, along with co-authors (Niloufer Siddiqui of the University at Albany and Sumitra Badrinathan of American University), has started running a series of experiments in India as part of WP3. One paper from this project is now under review at the discipline’s foremost journal, the American Political Science Review.
Since this is a project that requires a number of building blocks, most of the PI’s activity has, however, focused on developing the project’s capacity to attain its ambitious objectives and eventually succeed by:
1. Fulfilling the ethical obligations put forward in the ERC’s ethical review. We have also formed a project-specific ethical review board.
2. Developing the technical tools necessary to collect the relevant data. This is especially true of WP2, which required building a web interface allowing respondents to easily and rapidly donate their social media data.
- This technical challenge has required extensive collaboration with a number of co-authors (chiefly Kiran Garimella of Rutgers University) and, subsequently, hiring and directing the work of a series of contractors/programmers.
- It has also required a number of small-scale pre-testing and testing exercises to guarantee the reliability of the tool. The tool is now essentially ready to be released and used in the WP2 data collection.
- It has finally required that the PI engage in a series of sustained and complex discussions with ethics committees and data protection officers in Spain, Brazil, and India in order to ensure that the tool was GDPR-compatible (it is now officially compliant in both India and Brazil). The project is also currently undergoing ethical review at the ERC.
3. Developing partnerships with Brazilian and Indian higher education institutions.
- Over the course of two trips to Brazil, the PI has developed the capacity of the project and of UC3M to run WP2 in Brazil by establishing a formal partnership with FGV-ECMI, the country’s best media and communication school (based in Rio de Janeiro), thereby also instituting a collaboration with several researchers there (Amaro Grassi and Victor Piaia), and by visiting colleagues at the Federal University of Pernambuco in Recife.
- This guarantees that the project involves institutions in these countries and that the PI can obtain ethics committee approval.
4. Identifying and meeting contractors in India and Brazil able and willing to execute the many tasks necessary for the data collection of WP2 and WP4.
5. Developing the research designs of the large, complex experiments that are at the core of WP4.
- The research design for this part of the project was developed as part of a collaboration with Rajeshwari Majumdar of NYU.
- It is now ready to be submitted for ethics committee review. The pre-analysis plan will be presented at conferences in the fall.
- We plan to run the India version of the experiment over the winter of 2024. The Brazil component will be run at a later date.
6. Hiring team members. By the end of the summer, the team will formally comprise the PI, two post-docs (Paul Atwell and Fernando Mello), and a doctoral student at UC3M (Rodrigo), in addition to a number of external collaborators and co-authors and to the project manager at UC3M (Sergio Torres).
7. Developing collaborations and enlisting co-authors in aforementioned tasks 1 and 2.
Main co-authors and collaborators on the project so far are:
• Niloufer Siddiqui, University at Albany, SUNY
• Sumitra Badrinathan, American University
• Rajeshwari Majumdar, New York University
• Kiran Garimella, Rutgers University
• Natalia Bueno, Emory University
• Tiago Ventura, Georgetown University
• Victor Piaia, FGV-ECMI
• Amaro Grassi, FGV-ECMI
• Nara Pavão, UFPE

As noted elsewhere, the PI (Simon Chauchard) has also extensively presented the content of the project, and in some cases design plans, in relevant research networks in Europe, Brazil and the USA.
While we have (lightly) amended the grant to reflect the fact that we need more external technical contractors and fewer full-time personnel to successfully execute it, plans remain very much in line with what the grant agreement specified. We remain confident that achieving the aforementioned objectives will constitute important milestones improving the current state of the art in the discipline.
What we have so far achieved as part of WP2 is likely the best example of this.
While high-quality evidence about Facebook and Twitter users’ “information diets” now exists (Guess et al. n.d.; Barberá et al. 2015), there is currently NO data available anywhere about how much misinformation circulates (or how it circulates) on discussion apps / group chats. Moreover, the ethical and technical challenges that researchers face are immense. Encryption means that we currently lack evidence on several crucial metrics:
1. the total proportion of misinformed content (relative to other content);
2. the proportion of that misinformed content that is political, electoral, or identity-related in nature (as opposed to other content, for instance entertainment);
3. the proportion of discussions/threads that are organized along political lines;
4. the proportion of content focusing on each likely “hot button topic” for misinformation (for instance: probity of political leaders, corruption, ethnic and religious relations, racism, electoral integrity).

All of this points to the need for independent, unaffiliated researchers to develop innovative and ethics-compliant strategies to investigate this issue.
This is what this project allows us to do. We developed a large-scale data donation program for WhatsApp data. In this program, our general strategy is to ask users to donate some of their WhatsApp data for social science research.
As we do so, we provide them with extensive guarantees regarding privacy and anonymization and highlight that their (anonymized-at-the-source) data will at no point be shared beyond the core members of the research team (for now, only Simon Chauchard, though in the lead-up to the research this will be extended to one or two more UC3M employees subject to the same restrictions and code of conduct).

Importantly, in the design we outline below, ALL field staff (enumerators and local partners) will have no access to the data collected. To ensure that this is the case, they will sign a data-processing agreement (as per GDPR Article 28) clearly outlining their role, identifying a number of restrictions, and penalizing data-processing behaviors outside the scope of their mission. While field staff make the data collection possible, the data never transit through their own devices (they are instead instantly encrypted and uploaded to a secured server that only the PI has access to), nor do they subsequently have access to the server on which the data are securely stored.

We also refrain from asking users to donate one-on-one threads and concentrate on group threads, in order to limit privacy concerns. While large group threads are technically private, many of them are in practice public or semi-public in India and Brazil, as both public and private actors add users to very large groups without properly asking for their consent. This type of group is our main target in this research project.
The key technical innovation of the project is to make this donation process relatively seamless for consenting users, so that they may donate the data they wish to donate with minimal effort, with the assistance of one of our research associates (who, once more, will assist but never access the data).
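The anonymization-at-the-source logic described above can be illustrated with a minimal sketch. This is not the project's actual implementation; all names here (`pseudonymize`, `anonymize_message`, `DEVICE_SALT`) are hypothetical, and it assumes a simple scheme in which direct identifiers are replaced on the donor's device by salted HMAC pseudonyms before anything is uploaded, so that the research team receives data it cannot link back to a phone number:

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret: generated once on the donor's phone and
# never transmitted, so pseudonyms cannot be reversed by the research team.
DEVICE_SALT = b"per-device-random-salt"

def pseudonymize(identifier: str) -> str:
    """Map an identifier to a stable, irreversible pseudonym.

    HMAC-SHA256 keyed with the device-local salt: the same sender always
    maps to the same pseudonym (so threads stay analyzable), but the
    original identifier cannot be recovered without the salt.
    """
    return hmac.new(DEVICE_SALT, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def anonymize_message(msg: dict) -> dict:
    """Strip direct identifiers from one chat message before upload."""
    return {
        "group": pseudonymize(msg["group_id"]),
        "sender": pseudonymize(msg["sender_phone"]),
        "timestamp": msg["timestamp"],
        "text": msg["text"],  # the content itself is what the study analyzes
    }

raw = {
    "group_id": "example-group-42",
    "sender_phone": "+5521999990000",
    "timestamp": "2024-03-01T10:15:00",
    "text": "Forwarded: ...",
}
clean = anonymize_message(raw)
# The payload uploaded to the secured server contains no phone number:
assert "+5521999990000" not in json.dumps(clean)
```

In a real pipeline the cleaned payload would additionally be encrypted on the device and uploaded directly to the PI's secured server, so that it never transits through field staff devices in readable form.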

We expect that WP1 (so far untouched) as well as the final stages of WP3 and WP4 will yield equally beneficial and innovative results.