Project description
Zooming in to rethink global health imagery
Global health images often contain biased stereotypes, reinforcing harmful views from colonial times. These stereotypes are learned and perpetuated by AI, worsening societal biases. Supported by the Marie Skłodowska-Curie Actions (MSCA) programme, the AIrbrush project will use advanced AI to generate and assess images to promote fair representation in global health visuals. Specifically, the project expands on purposeful generation and value-sensitive evaluation of AI-generated visuals. Its approach aims to decolonise global health and its visual culture by analysing, evaluating, and theorising the reproduction of biased depictions by AI. The project’s findings will inform academic articles, webinars, collaborations with the WHO AI and Ethics research group, and art exhibitions, catalysing change across sectors.
Objective
This project engages with the proliferation of abusive and biased stereotypes from colonial and humanitarian photography through generative AI technology, and investigates their consequences for science and society. This is a pressing issue: AI absorbs and learns from real images which, in the case of global health, have been marked by racism, coloniality and sexism, meaning that existing images become a training substrate from which generative AI learns biased depictions and perpetuates negative stereotypes. Such cycles must be studied and broken in order to move toward more equal postcolonial societies and to promote a culture of value-sensitive depictions of vulnerable people. The project builds on and greatly expands the emerging methodology of purposeful generation and value-sensitive evaluation of AI-generated Global Health visuals, recently pioneered by Prof. Koen Peeters (the supervisor) and Dr. Alenichev, and encapsulated in a Lancet Global Health article in August 2023. Offering a first-ever systematic study of AI-generated Global Health visuals, AIrbrush sets five core objectives and asks: How should the international community account for generative AI as part of the internationally set goal of decolonising Global Health and its visual culture, and tackle biased depictions of race, class, gender, and other socially enacted markers of similarity and difference? AIrbrush answers this question by analysing the substrate of real global health images that AI learns from, evaluating how AI reproduces and modifies such tropes, theorising this relationship, and outlining societal outcomes for the future of respectful depictions in the AI era. The findings from this study will be disseminated through academic articles, a thematic webinar, a collaboration with the WHO AI and Ethics research group, and an art exhibition at ITM (the host) and elsewhere, among other outputs.
Programme(s)
- HORIZON.1.2 - Marie Skłodowska-Curie Actions (MSCA) Main Programme
Funding Scheme
HORIZON-TMA-MSCA-PF-EF - HORIZON TMA MSCA Postdoctoral Fellowships - European Fellowships
Coordinator
2000 ANTWERPEN
Belgium