CORDIS - EU research results

Development and Mass-dissemination of Intervention to Mobilize Pro-social Bystander Reactions to Hostile Content on Social Media

Periodic Reporting for period 1 - STANDBYCOMMS (Development and Mass-dissemination of Intervention to Mobilize Pro-social Bystander Reactions to Hostile Content on Social Media)

Reporting period: 2023-11-01 to 2025-04-30

In recent years, online hostility—ranging from hate speech to disinformation and incitement—has emerged as a major societal challenge. It undermines democratic dialogue, threatens the well-being of individuals, and erodes trust in public institutions. While social media platforms have introduced various moderation tools, many are now reducing investments in professional content moderation and fact-checking. Instead, they are increasingly relying on users themselves to flag, counter, or respond to harmful content—a strategy often referred to as "crowd-moderation."
However, research consistently shows that most users remain passive bystanders when confronted with hostile online content. Only a small, vocal minority actively engage, and unfortunately, this minority is often composed of those spreading negativity. This imbalance leaves online spaces vulnerable and creates an urgent need for tools that empower a broader spectrum of users to participate in maintaining a respectful digital environment.
The STANDBYCOMMS project was launched to address this challenge. Its core objective was to develop, test, and disseminate a scalable, behaviorally informed intervention that motivates ordinary users to take simple, constructive actions when they witness hostility online. Instead of focusing on punitive moderation or censorship, the intervention—called “Speak up, Report, Support”—promotes a pro-social bystander approach. It encourages users to report harmful content, express disagreement in respectful ways, or support those targeted by hostility.
By grounding the intervention in behavioral science and testing it in real-world settings, the project aimed to create a practical, evidence-based tool that could be adopted by media outlets, public communicators, and civil society actors.
The expected impact of the project is twofold: first, to make online platforms more respectful through increased user participation in moderation; and second, to shift the broader culture of online engagement toward empathy and responsibility.
The STANDBYCOMMS project focused on refining, testing, and disseminating a behavioral intervention—“Speak up, Report, Support”—designed to encourage pro-social bystander responses to hostile content on social media. The intervention is grounded in Protection Motivation Theory, aiming to increase users’ motivation and perceived ability to engage in constructive actions such as reporting harmful content or engaging in counter-speech.
The project began with qualitative research, including 18 semi-structured interviews with end-users such as journalists, NGO staff, and politicians in Denmark. These interviews informed the adaptation of the intervention by identifying key implementation concerns, such as the need for shorter formats, customizable design, and concerns about backlash from hyper-engaged users.
Informed by these findings, the project team—working closely with behavioral science experts at the Melbourne Centre for Behaviour Change—developed a suite of six intervention tools: three video formats (74s, 30s, and 15s) and three matching infographics. The shorter formats retained theoretical coherence while improving usability and shareability. A structured feedback survey distributed at an international behavioral science conference confirmed the intervention’s practical and theoretical strengths.
To test the effectiveness of the new and improved versions of the intervention, the project conducted a large-scale survey experiment with nationally representative samples in the US and Australia. Results showed that all versions increased participants’ willingness and confidence to respond to online hostility, with the original long-form video being most effective overall.
The intervention was further field-tested in a real-world experiment involving Danish local politicians. Politicians were randomly assigned to post the intervention on some days and not others. While the study showed some evidence of increased constructive counter-speech, statistical power was limited due to low participation and non-compliance with posting protocols.
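The within-person randomization used in the field experiment (each politician posts the intervention on some days and not others) can be sketched as follows. This is a minimal illustration, not the project's actual protocol; the 4-week window and function names are assumptions made for the example.

```python
import random

def assign_posting_days(days, seed=42):
    """Randomly split a study window into treatment days (the
    politician posts the intervention) and control days (no post).
    A fixed seed keeps the assignment reproducible per participant."""
    rng = random.Random(seed)
    shuffled = list(days)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return sorted(shuffled[:half]), sorted(shuffled[half:])

# Hypothetical 4-week study window, numbered by day
study_days = range(1, 29)
treatment_days, control_days = assign_posting_days(study_days)
```

Balancing treatment and control days within each participant lets every politician serve as their own comparison, which is why non-compliance with the posting schedule is so costly for statistical power.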
To address these limitations, a second field experiment was launched in collaboration with the national media outlet TV 2, using a more robust design. Here, the intervention is automatically posted in response to algorithmically detected hostility, ensuring timely and targeted delivery. This design is expected to produce more reliable data on intervention effectiveness at scale.
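The trigger logic behind the TV 2 design — post the intervention automatically once detected hostility crosses a threshold — could look roughly like the sketch below. The keyword heuristic, threshold value, and function names are illustrative assumptions; the actual system presumably uses a trained classifier.

```python
HOSTILITY_THRESHOLD = 0.8  # assumed operating point, not the project's

def hostility_score(comment: str) -> float:
    """Stand-in for a trained hostility classifier: a crude
    keyword heuristic returning a score in [0, 1]."""
    hostile_terms = {"idiot", "liar", "traitor"}
    words = (w.strip(".,!?").lower() for w in comment.split())
    hits = sum(w in hostile_terms for w in words)
    return min(1.0, hits / 2)

def should_post_intervention(comment: str) -> bool:
    """True when detected hostility crosses the threshold,
    i.e. when the intervention video/infographic would be
    posted as an automated reply to the comment thread."""
    return hostility_score(comment) >= HOSTILITY_THRESHOLD
```

Tying delivery to detected hostility, rather than to a fixed schedule, is what makes the design more robust: the intervention arrives exactly where and when bystanders are exposed to hostile content.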
A key technical achievement was the development of the onlinestandby.com platform, where end-users can access all intervention tools, along with implementation guidance and evidence summaries.
Together, these activities provide a scientifically validated, scalable approach to activating wider user participation in online content moderation, with the potential for substantial societal impact in combating online hostility.
The STANDBYCOMMS project has advanced the state of the art by developing and validating a novel, evidence-based intervention that addresses a key gap in current strategies for tackling online hostility: the activation of ordinary users as constructive bystanders. While most existing approaches focus on platform-level moderation or direct confrontation with hostile users, “Speak up, Report, Support” shifts attention to encouraging subtle, pro-social actions such as reporting content or engaging in counter-speech—actions that are more acceptable to the broader public.
A core innovation lies in the integration of behavioral science theory with real-world implementation. The intervention tools were developed through iterative collaboration with international experts, refined through end-user feedback, and validated through both experimental and field-based research. This dual focus on scientific rigor and practical usability sets the intervention apart from most existing materials, which are often neither theory-driven nor systematically evaluated.
The project has also delivered an open-access online platform (onlinestandby.com) enabling broad dissemination of the tools and lowering the barrier for adoption by media outlets, public communicators, educators, and civil society actors. The platform provides not only intervention materials in multiple formats (and soon languages) but also implementation guidance and evidence summaries.
For further uptake and success, several needs must be addressed. First, continued field-testing is required to generate robust, generalizable evidence across diverse contexts. Second, broader dissemination—supported by targeted marketing—will be key to scaling adoption. Third, the development of automated delivery systems, such as AI-driven identification of hostile content, is a promising avenue for further exploration. Fourth, uptake and impact can be strengthened by developing an educational component aimed at fostering early awareness and skills for pro-social bystander behavior among youth. Finally, further funding and policy support will help bridge the gap between proof-of-concept and large-scale impact.
In sum, STANDBYCOMMS has delivered a scientifically validated, user-friendly intervention with strong potential to improve digital civility. The project’s outcomes include a full suite of multilingual intervention tools, tested formats for dissemination, and a foundation for future educational and technological integration.