Periodic Reporting for period 1 - PROFILE (Safeguarding Equality in the European Algorithmic Society: Tackling Discrimination in Algorithmic Profiling through EU Equality Law)
Reporting period: 2020-09-05 to 2022-09-04
PROFILE has explored the role of law, and in particular EU equality law, in confronting the inequalities generated by algorithmic profiling. Its overarching objective was to identify the key legal challenges raised by algorithmic discrimination and to assess how the legal framework for equality protection in the European Union can adequately address machine-mediated inequalities. PROFILE has addressed the problem through three central inquiries:
1) How does algorithmic discrimination differ from human discrimination?
2) How do these new forms of technology-driven discrimination disrupt and challenge the anti-discrimination legal framework in place in the EU?
3) What regulatory solutions can be proposed, within and outside EU anti-discrimination law, to effectively remedy algorithmic discrimination?
1) Research
The project's findings are organised across three work packages (WPs):
• WP1 mapped the specific forms of discrimination arising from algorithmic profiling and the deployment of other algorithmic systems. It theorised the notion of 'algorithmic discrimination' by establishing a taxonomy of algorithmic harms. It confirmed that algorithmic systems generate and amplify complex forms of inequality.
• WP2 assessed how some forms of technology-driven discrimination escape the anti-discrimination legal framework currently in place. In particular, it showed that certain forms of algorithmically induced behavioural discrimination, proxy discrimination and intersectional discrimination are not easily redressed under EU equality law. In addition, WP2 showed how the normative equilibria anchored in EU anti-discrimination rules are disrupted by the 'algorithmisation' of numerous areas of life. Problems arise, for instance, in relation to hierarchies of protection in EU equality law, uncertainties about the definition of protected grounds, the cardinal dichotomy between direct and indirect discrimination, liability in the human-machine relationship, and proof and comparability in the 'black box'.
• WP3 explored how concepts of EU equality law could be revisited to address algorithmic discrimination. In particular, it proposed that the notion of 'instruction to discriminate' be legally engineered to create a duty to reasonably prevent algorithmic bias from leading to discrimination, and to clarify liability for algorithmic discrimination in machine-learning systems.
2) Dissemination
During the MSCA fellowship, I delivered more than 60 talks, training sessions, seminars and guest lectures across Europe to disseminate the findings of PROFILE to a range of audiences: academics, law and policy professionals (lawyers, judges, CSOs and policy-makers, among others), undergraduate and postgraduate students, and the general public. When Covid-19 restrictions in place at the time prevented travel, these dissemination activities were conducted online; when travel was possible, they were conducted in person.
3) Organisation of events
I organised two scientific events as part of PROFILE. The first, entitled 'EU law through the lens of data science and computational approaches', was held at iCourts, University of Copenhagen, Denmark, on 8 April 2022 and gathered 19 scholars and practitioners from across Europe working in different areas of EU law, law and technology, and data science. The objective was to discuss data science and computational analysis as methodologies for analysing EU law and to reflect critically on new research directions in the field. The second event, entitled 'The Law is not a decision tree', was a research retreat held on 12-13 December 2022 at the University of Edinburgh, United Kingdom. Its aim was to investigate how notions of bias and fairness, as approached in computer science, intersect with notions of discrimination and equality, as approached in law. The event gathered five researchers and allowed us to work together on a paper translating these notions across disciplines.