Most of us are familiar with information overload. But in an increasingly polarised society, is the ability to read only what interests us, from pre-selected sources, fanning the flames of selective bias and shutting down our ability to see both sides? The project PersoNews (Profiling and targeting news readers – implications for the democratic role of the digital media, user rights and public information policy), supported by the European Research Council, investigated the impact of the trend towards personalisation on the role of digital media in society, and how that impact can be assessed. Who controls the algorithms behind the content we see? What rights do users have? And how does personalisation affect trust? The project’s principal investigator, Natali Helberger, is distinguished university professor of Law and Digital Technology, with a special focus on artificial intelligence, at the University of Amsterdam.
A double-edged sword
“The public is keen to be better informed, both in terms of news quality and relevance, and also because they are interested in the diversity of recommendations,” notes Helberger. Many existing news recommender systems are designed to show content that matches the user’s preferences and to keep them on the site for longer, creating opportunities for targeted advertising. “These are legitimate goals of a news recommendation algorithm, but they are short term, often informed by economic interests and not by a societal perspective. In other words, they are not embracing the role that recommenders could play in a diverse and healthy media landscape,” says Helberger. She explains that news recommendation algorithms that do not simply serve up more of ‘the same’, or aim only to increase clicks and advertising sales, have great potential.

The project brought together scholars from law, communication science, journalism studies and artificial intelligence to create a comprehensive view of news personalisation from the perspective of users, newsrooms, society and the law. The team devised surveys and focus group research to understand how users perceive and experience news personalisation, and what their concerns and expectations are. To gain insight into providers’ priorities, PersoNews conducted interviews with newsroom professionals. “The insights from that research informed our work on defining emerging journalistic algorithmic ethics,” Helberger adds. “Empirical insights into users’ attitudes informed our legal exploration into the role of the law in addressing the concerns that users have, for example, issues surrounding personal data and privacy. We also conceptualised ways of realising more diversity in recommendations.”
Throughout the project the team worked with journalists, editors and data scientists from organisations such as the United Kingdom’s BBC, Europe’s RTL Group, the VRT in Belgium and the German ZDF, along with newspapers such as the Dutch Volkskrant and Het Financieele Dagblad. “Through our legal and policy research on user rights and media regulation, the project sought to contribute to ongoing debates on the responsible use of AI in the media. We have shared our insights with policymakers, such as the European Commission, the Council of Europe and national governments in Canada, Germany, the Netherlands, Norway and the United Kingdom,” says Helberger. Among other outcomes, PersoNews published an award-winning paper describing the democratic role of news recommenders. This has led to invitations for follow-up research and has formed the basis for several projects looking into the ‘diverse recommender’ model. It has also served as the basis for a Schloss Dagstuhl manifesto by a worldwide group of experts in the field.
PersoNews, personalisation, media, algorithms, democracy, journalism, digital media, artificial intelligence