Project description
Users' browsing history and its link to disinformation
The internet is one of the main sources of information. However, social networks use algorithms designed to promote content tailored to users' preferences. Since content is distributed according to how interesting it appears to the user, its accuracy cannot be guaranteed. The ERC-funded FARE_AUDIT project will develop a tool to audit online search engines. Its objective is to increase transparency and reduce the influence of search engines on the spread of disinformation. Specifically, the project will generate different browsing histories and test how the algorithms of different search engines use this history to rank the results they return. The project's findings will shed light on how browsing history influences search engine results and the likelihood of being targeted by disinformation.
Objective
The spread of disinformation is a serious problem that affects social structures and threatens democracies worldwide. Citizens increasingly rely on (dis)information available online, either somewhat passively, through social media feeds, or actively, by using search engines and specific websites. In both scenarios, algorithms filter and select the displayed information according to the user's past preferences. There is a real risk that algorithms reinforce the user's beliefs and create (dis)information bubbles, by offering less divergent views, or even by directing users to low-credibility content. For these reasons, serious efforts have been made to identify and remove "fake news" websites and to minimize the spread of disinformation on social media, but we have not witnessed equivalent attempts to understand and curtail the role of search engines. FARE_AUDIT addresses this imbalance and offers an innovative, broadly usable tool to audit search engines. It will help to 1) better understand how browsing history influences search engine results, particularly the likelihood of being directed to disinformation, 2) create a system that democracy-promoting institutions and concerned citizens can use to identify new disinformation in near real time, and 3) breach information bubbles by simulating how search results would differ if users had a different online profile. Because it relies on web crawlers, our tool is privacy-protecting and does not require any real user data. Moreover, the proposed system anticipates the announced shift from cookie tracking to fingerprinting and takes advantage of the expected brief overlap between the two systems to learn from both and broaden its scope. Overall, we expect this novel tool to have a meaningful social impact by increasing public awareness of the role of search engines in the spread of disinformation, and by equipping organizations with a tool to detect and monitor disinformation, especially in political contexts.
Field of science
Programme(s)
- HORIZON.1.1 - European Research Council (ERC) Main Programme
Funding scheme
ERC-POC - Proof of Concept Grant
Host institution
3004 516 Coimbra
Portugal