
Profiling and targeting news readers – implications for the democratic role of the digital media, user rights and public information policy

Periodic Reporting for period 4 - PersoNews (Profiling and targeting news readers – implications for the democratic role of the digital media, user rights and public information policy)

Reporting period: 2020-02-01 to 2021-05-31

On the web, news media are using AI and data analytics to personalise news and serve readers stories and advertisements that match their individual interests. For example, the New York Times uses algorithms to give readers story recommendations based on their reading history, or to hide stories they have already read. European news organisations, such as the BBC in the UK and de Volkskrant in the Netherlands, as well as newer players like Facebook, are also experimenting with personalisation.
News personalisation can allow media companies to better serve their users by, for example, helping them deal with information overload or by serving them more interesting content. Monetising the resulting increase in user attention can also have financial benefits for the industry. However, the increasing personalisation of news also raises concerns about the interests of media users, the role of the media in society, and the media's relationship with news readers. For example, what is the impact of AI and algorithms in newsrooms? What are users' concerns with regard to privacy or the diversity of the news they receive? Will personalisation result in the creation of 'filter bubbles'? Who controls the algorithm? And how can the media use AI and algorithms in a way that is responsible and respects the fundamental rights to privacy and freedom of expression?
In the PersoNews project, we combine legal and empirical research to answer these questions. We conduct surveys and talk to users to better understand their concerns, but also to learn where they may find personalisation useful. We interview journalists and technical experts in the media, and talk to regulators. These insights feed into the legal analysis, in which we examine what rights users have and whether they need additional rights, but also what 'algorithmic journalistic ethics' could look like to guarantee that AI and data are used in the interest of users, society and the media's role to inform the public.
In workpackage 1, we performed various empirical analyses (surveys, focus group research) of the impact of AI and algorithmic recommendations on users, their concerns and expectations, and the potential for technical and/or legal solutions to alleviate these concerns. We also studied the broader societal impact of these technologies on the public sphere, including the question of whether personalised recommendations create filter bubbles, and if so, for whom.

In workpackage 2, we were among the first to conduct in-depth interviews with newsrooms in Europe on how they use AI and algorithms to personalise news, and on the professional, economic and ethical considerations involved in doing so.

In workpackage 3, the focus is on the (fundamental) rights of users, in particular under freedom of expression law and the newly adopted GDPR.

Workpackage 4 has developed a unique normative framework to answer questions regarding explainability and the information that the media should be required to provide to their audience and the public.

Workpackage 5 has engaged in an extensive review of democratic theory and, on the basis of these insights, of what the democratic role of news recommendations can be. In the context of this framework we also developed the most advanced, normatively informed metrics to date for diverse recommender design, which we are translating, in cooperation with media organisations, into a diversity toolkit that allows media companies to assess the diversity of their recommendations. We played an important role in testing the filter bubble hypothesis and in developing directions for future media law and policy to respond to the impact of AI and algorithms on media markets. We also had a prominent role in the debate about the regulation of platforms, again informing the debate with concrete suggestions for future policy. Because of our research and expertise, the Council of Europe commissioned us to summarise some of our main research findings in a report to the Ministerial Conference on 'AI and freedom of expression'.
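To give a flavour of what a recommendation diversity measure can look like (this is a purely illustrative sketch, not the project's actual toolkit or metrics), one simple option is Shannon entropy over the topic categories of the articles recommended to a reader. All names and topic labels below are hypothetical.

```python
from collections import Counter
from math import log2

def topic_diversity(recommended_topics):
    """Shannon entropy (in bits) over the topic labels of a recommendation list.

    Higher values mean recommendations are spread more evenly across topics;
    0 means every recommended item shares the same topic.
    """
    counts = Counter(recommended_topics)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Hypothetical example: topics of ten articles recommended to one reader
recommendations = ["politics", "politics", "sports", "culture",
                   "politics", "economy", "culture", "politics",
                   "sports", "economy"]
print(f"Topic diversity: {topic_diversity(recommendations):.2f} bits")
```

Normatively informed metrics of the kind developed in the project go further than such a frequency-based measure, for example by weighting viewpoints or sources according to a chosen democratic theory, but the sketch illustrates the basic idea of scoring a recommendation list rather than individual items.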
PersoNews is the first comprehensive project in Europe that studies news personalisation from both an empirical and a normative perspective. Our research into the concerns and expectations of users, as well as our suggestions for how to give users more agency, is an important contribution to the debate on the responsible use of AI and data analytics in the media. Our in-depth interviews with newsrooms contribute to our understanding of the impact of digital technologies on journalistic roles and routines, but also, more generally, to an understanding of journalistic values in the digital realm and of how digital technology can contribute to the democratic role of the media. In so doing, our research benefits from the expertise that we have built up in the area of fundamental rights, data protection and consumer law regulation, and it helps formulate concrete solutions, which inform not only the academic debate, but also the activities of the media, new players such as social media platforms, and regulators such as national regulatory authorities in Europe, the Council of Europe and the European Commission. Until the end of the project we expect to continue our work on user rights and concrete possibilities for giving users more agency; to define, in cooperation with the media, principles of journalistic algorithmic ethics; to analyse the existing legal framework and develop, where needed, concrete solutions for the governance of algorithmic news recommendations in the media and on platforms; and finally, informed by fundamental rights and democratic theories, to develop a normative theory of personalised news.