Project description
Users’ browsing history linked to disinformation
The internet is a top source of information. However, social networks use algorithms to promote posts according to users’ preferences. Since content is delivered based on what is most ‘interesting’ to a user, there is no guarantee that it is the most accurate. The ERC-funded FARE_AUDIT project will develop a tool to audit online search engines. The aim is to increase transparency and reduce their impact on the spread of disinformation. Specifically, the project will generate different browsing histories and test how the algorithms of different search engines use this history to rank their suggested results. The findings will shed light on how browsing history influences search engine results and the likelihood of being directed to disinformation.
Objective
The spread of disinformation is a serious problem that impacts social structures and threatens democracies worldwide. Citizens increasingly rely on (dis)information available online, either somewhat passively, through social media feeds, or actively, by using search engines and specific websites. In both scenarios, algorithms filter and select the displayed information according to users’ past preferences. There is a real risk that algorithms reinforce users’ beliefs and create (dis)information bubbles, by offering less divergent views or even directing users to low-credibility content. For these reasons, serious efforts have been made to identify and remove “fake news” websites and to minimize the spread of disinformation on social media, but we have not witnessed equivalent attempts to understand and curtail the role of search engines. FARE_AUDIT addresses this imbalance and offers an innovative, broadly usable tool to audit search engines. It will help to 1) better understand how browsing history influences search engine results, particularly the likelihood of being directed to disinformation, 2) create a system that democracy-promoting institutions and concerned citizens can use to identify new disinformation in near real-time, and 3) breach information bubbles by simulating how search results would differ if users had a different online profile. Because it relies on web crawlers, our tool is privacy-protecting and does not require any real user data. Moreover, the proposed system anticipates the announced shift from cookie tracking to fingerprinting and takes advantage of the expected brief overlap between the two mechanisms to learn from both and broaden its scope. Overall, we expect this novel tool to have a meaningful social impact by increasing public awareness of the role of search engines in the spread of disinformation, and by equipping organizations with a tool to detect and monitor disinformation, especially in political contexts.
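The core of the audit described above — building distinct synthetic browsing profiles and comparing how a search engine ranks results for each — can be sketched in outline. The profile names, the mocked result lists, and the `rank_overlap` helper below are illustrative assumptions for this sketch, not part of the project’s actual implementation; in a real audit the result lists would come from web crawlers querying live search engines with different browsing histories.

```python
from itertools import combinations

def rank_overlap(a, b, depth=10):
    """Average Jaccard overlap of the top-k prefixes of two ranked
    result lists: a simple, top-weighted rank-similarity measure.
    Returns 1.0 for identical rankings, 0.0 for fully disjoint ones."""
    scores = []
    for k in range(1, depth + 1):
        top_a, top_b = set(a[:k]), set(b[:k])
        scores.append(len(top_a & top_b) / len(top_a | top_b))
    return sum(scores) / len(scores)

# Mocked search results per synthetic browsing profile (illustrative only).
# A blank profile serves as the personalization-free baseline.
results = {
    "news_reader":   ["bbc.com", "reuters.com", "nytimes.com", "apnews.com"],
    "fringe_reader": ["fringe.example", "bbc.com", "blog.example", "reuters.com"],
    "blank_profile": ["bbc.com", "reuters.com", "apnews.com", "nytimes.com"],
}

# Pairwise comparison: low overlap between a profile and the blank
# baseline indicates stronger history-driven personalization.
for (p1, r1), (p2, r2) in combinations(results.items(), 2):
    print(f"{p1} vs {p2}: overlap = {rank_overlap(r1, r2, depth=4):.2f}")
```

Averaging the Jaccard overlap over increasing prefix depths weights agreement at the top of the ranking more heavily, which matters because users rarely look past the first few results.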
Programme(s)
- HORIZON.1.1 - European Research Council (ERC) Main Programme
Funding Scheme
HORIZON-ERC-POC - HORIZON ERC Proof of Concept Grants
Host institution
3004-516 Coimbra
Portugal