CORDIS - EU research results

ALgorithms EXposed. Investigating Automated Personalization and Filtering for Research and Activism

Periodic Reporting for period 1 - ALEX (ALgorithms EXposed. Investigating Automated Personalization and Filtering for Research and Activism)

Reporting period: 2018-12-01 to 2020-02-29

Personalization algorithms—filtering digital content on the basis of someone's profile—increasingly mediate the web experience of users. By forging a specific reality for each individual, personalization algorithms silently shape customized "information diets": in other words, they determine which news, products, and even opinions and rumors users are exposed to. By restricting users' possibilities, they ultimately infringe on their agency. As exposed by the Cambridge Analytica scandal (2018), personalization online is supported by the questionable data sharing practices at the core of the business models of the social media industry. Yet personalization algorithms are proprietary and thus remain inaccessible to end users. The few experiments auditing these algorithms rely on data provided by the platform companies themselves; they are highly technical, hardly scalable, and fail to put social media users in the driver's seat.

The ALgorithms EXposed (ALEX) project aims at unmasking the functioning of personalization algorithms on social media and shopping platforms. It is 'data activism' in practice, as it uses publicly available data for awareness raising and citizen empowerment. ALEX pursues four goals: i) open-source software development and stabilization: building on the alpha version of fbtrex, ALEX developed working prototypes of browser extensions analyzing the outcomes of Facebook's News Feed algorithms, content delivery algorithms on YouTube, and Amazon's dynamic pricing algorithms. It also documented the corresponding methodology, allowing others to customize the software to study other platforms; ii) the release of two spin-off products for fbtrex: a series of Jupyter notebooks enabling researchers and advanced users to perform expert analysis of algorithmic biases, and the so-called Dashboard, allowing low-skill users to monitor their own social media consumption patterns; iii) the design and organization of data literacy modules on algorithmic personalization, in collaboration with two nongovernmental organizations; and iv) testing the feasibility of a business model based on the provision of customized software and consultancy services to promote tool take-up and the future sustainability of the project.
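To illustrate the kind of expert analysis the Jupyter notebooks enable, the sketch below compares the feeds observed by two test profiles. It is a minimal, hypothetical example (the profile data and source names are invented, and the actual fbtrex notebooks may use different measures): each profile logs the sources of the posts its News Feed displayed, and a simple total variation distance between the two source distributions quantifies how differently the algorithm treated the two users.

```python
# Hypothetical sketch of a feed-comparison analysis; the data below is
# invented for illustration and does not come from the fbtrex project.
from collections import Counter

def source_shares(impressions):
    """Return each source's share of a profile's observed feed."""
    counts = Counter(impressions)
    total = sum(counts.values())
    return {src: n / total for src, n in counts.items()}

# Two profiles that follow the same pages but see different feeds.
profile_a = ["outlet1", "outlet1", "outlet2", "outlet1", "outlet3"]
profile_b = ["outlet2", "outlet2", "outlet2", "outlet3", "outlet1"]

shares_a = source_shares(profile_a)
shares_b = source_shares(profile_b)

# Total variation distance between the two source distributions:
# 0 means identical feeds, 1 means completely disjoint feeds.
sources = set(shares_a) | set(shares_b)
tvd = 0.5 * sum(abs(shares_a.get(s, 0) - shares_b.get(s, 0)) for s in sources)
print(round(tvd, 2))  # prints 0.4
```

A divergence of 0.4 here signals that, despite identical follow lists, the algorithm served the two profiles substantially different information diets, which is exactly the kind of filtering effect ALEX set out to expose.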