
Protecting Personal Data Amidst Big Data Innovation

Periodic Reporting for period 1 - PROTECT (Protecting Personal Data Amidst Big Data Innovation)

Reporting period: 2019-08-01 to 2021-07-31

The overall goal of PROTECT is to grow a new generation of 14 Early Stage Researchers (ESRs), who will graduate as PhDs via a unique multidisciplinary, inter-sectoral and international European Training Network (ETN). The network will investigate the protection of the rights and interests of individuals and groups impacted by the continuous large-scale analysis of personal data, while still enabling the economy and society to benefit from rapid innovation in digital applications that collect and use this data. The PROTECT ESRs will implement a Personal Career Development Plan (PCDP) that will enable them to integrate and apply arguments, analyses and tools from across the fields of law, ethics and knowledge engineering, so that they can take on leading research and governance roles within the digital services industry and public policy sectors to address challenges of data protection, data ethics and data governance.

The rate of technological innovation, now accelerated by big data and machine learning, increasingly outpaces public policy debate and the development of new regulation for the protection of personal data. This comes as the scale and social impact of data analysis is rapidly increasing. Tech companies, especially SMEs, face complex legal and ethical implications resulting from the collection of personal data from users. The pace of change and its complex technical nature overwhelm individuals and enterprises when they try to weigh the impact of how their personal information is used, especially when this use also delivers attractive personalisation of services. PROTECT ESRs will develop new ways of empowering users of digital services, individually and collectively, to understand the risks they take with their rights and interests when they go online.
To date, the 14 Early Stage Researchers (ESRs; 8 female, 6 male) have been recruited and have commenced PhDs in law, philosophy, or computer science. COVID restrictions meant that the training workshop planned for 2020 was conducted virtually, while the one in 2021 was conducted as a hybrid of a virtual and a face-to-face workshop at the University of Twente. The schedule of academic and industrial secondments planned for the ESR cohort was severely disrupted by COVID restrictions and their impact on potential hosts. However, some virtual secondments were undertaken, and on-site secondments are now commencing where restrictions permit, in a compressed form where necessary. The ESRs have also embraced the new opportunities that the pandemic has presented for remote dissemination and communication through virtual conferences and workshops; online posts, videos and podcasts; and public engagement through remote citizen engagement events.

The technical research work of the PROTECT network is conducted through three multidisciplinary workpackages (WP1, 2 and 3). These workpackages combine ESRs researching privacy law, the philosophy sub-discipline of technology ethics, and the computer science sub-discipline of knowledge engineering.

WP1, “Privacy Paradigm”, focuses on how an organisation’s data processing intent and the expectations of its data subjects can be better aligned by building consensus around standard forms for privacy policies, including human-readable language, technical legal code and machine-readable code. WP2, “Ethics of Personalisation”, focuses on the strategic methodological concerns raised by new digital technology that builds an intimate representation of individuals to better tailor services to them, but which also raises new risks to privacy and personal autonomy. WP3, “Personal Data Governance”, focuses on handling the uncertainty and risk involved in accurately informing and guiding the architectural and technological decisions that an organisation must make in responding to changes in its business, information and technology context.
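As an illustration of the kind of machine-readable privacy policy that WP1's standard forms envisage, the sketch below encodes a simple policy as RDF using the rdflib Python library and terms modelled on the W3C Data Privacy Vocabulary (DPV). The policy content, the example namespace and the exact property names are assumptions made for illustration, not outputs of the project, and should be checked against the current DPV specification.

```python
# A minimal, illustrative sketch of a machine-readable privacy policy as RDF.
from rdflib import Graph, Namespace

EX = Namespace("https://example.org/policy#")   # hypothetical policy namespace
DPV = Namespace("https://w3id.org/dpv#")        # W3C Data Privacy Vocabulary terms

g = Graph()
g.bind("dpv", DPV)
g.bind("ex", EX)

policy = EX["newsletter-policy"]
g.add((policy, DPV.hasPurpose, DPV.Marketing))            # why the data is processed
g.add((policy, DPV.hasProcessing, DPV.Collect))           # what is done with the data
g.add((policy, DPV.hasLegalBasis, DPV.Consent))           # GDPR legal basis relied upon
g.add((policy, DPV.hasPersonalData, EX["EmailAddress"]))  # which personal data is involved

print(g.serialize(format="turtle"))   # the machine-readable form of the policy
```

A human-readable summary and the formal legal text of the same policy could then be generated from, or linked to, this single machine-readable source.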

To date, the workpackages have undertaken detailed literature reviews of the problem domain from each discipline, captured use cases and conducted further problem-specific investigations. WP1 identified that sufficient solutions exist for standard forms for privacy policies used by organisations communicating with individuals, but not for privacy policies developed by communities of data subjects using decentralised personal online datastores to which organisations may seek access. It then used citizen engagement activities to explore attitudes to taking more communal control over privacy policies. WP2 initially examined conceptual issues related to the notion of personalisation, produced an overview of generic technical features of digital personalisation technologies, and reviewed the main ethical and legal issues identified in the literature. To address the high level of uncertainty in the evolution of personalisation technologies, it undertook a foresight analysis of four such technologies as part of an anticipatory ethics assessment and introduced a concept-term model for the anticipatory ethics analysis of these technologies. WP3 assessed existing approaches to ethical and privacy risk assessment and, on that basis, identified the need to address structural injustices as well as individual harm. It developed an initial model that maps the range of ethical concerns captured by the EU High Level Expert Group onto risk-based management systems that enterprises could adopt.
Progress has been made on standard form privacy policies that go beyond the current individual-focussed notice-and-consent model to address the collective development of privacy policies enabled by the new generation of decentralised personal online datastores such as SOLID. This offers new opportunities for communal data governance in, for example, future data cooperatives that operate in the interests of communities of data subjects, as envisaged for trusted data intermediaries in the EU's proposed Data Governance Act. This may provide a basis for GDPR data protection rights to be managed more directly and efficiently by data subjects collectively, while also addressing emerging issues around group privacy.
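A purely hypothetical sketch of what such communal data governance could look like is given below: members of a data cooperative agree the purposes for which access to their decentralised datastores may be granted, and an organisation's request is checked against that shared policy. The class and field names are invented for illustration and are not part of SOLID or of PROTECT's results.

```python
# Hypothetical communal access-policy check for a decentralised datastore.
from dataclasses import dataclass, field

@dataclass
class CommunityPolicy:
    community: str
    allowed_purposes: set[str] = field(default_factory=set)

    def permits(self, requester: str, purpose: str) -> bool:
        # A real policy would also consider legal basis, retention, recipients, etc.
        return purpose in self.allowed_purposes

policy = CommunityPolicy(
    community="neighbourhood energy cooperative",
    allowed_purposes={"grid load research", "community energy planning"},
)

print(policy.permits("utility-company", "targeted advertising"))   # False
print(policy.permits("research-institute", "grid load research"))  # True
```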

Progress in the ethics of personalisation advances the application of anticipatory technology ethics approaches to this important class of digital technologies. The use of open semantic models to capture the core concerns of such analyses, across variations in methodology, allows for future cross-methodology comparison and refinement, as well as the development of findable, accessible, interoperable and reusable (FAIR) representations of personalisation technology ethics assessments. This may enable the more systematic study of ethical and privacy concerns in a way that can keep pace with the accelerating capability and application scope of personalisation technology.
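By way of illustration only, the sketch below shows how a single finding from an anticipatory ethics assessment of a personalisation technology might be captured as structured, FAIR-friendly data. The schema and the example values are assumptions and do not reflect the semantic models actually developed in WP2.

```python
# Illustrative record of one anticipatory ethics assessment finding.
from dataclasses import dataclass, asdict
import json

@dataclass
class EthicsFinding:
    technology: str        # personalisation technology under assessment
    time_horizon: str      # foresight horizon of the scenario considered
    ethical_concern: str   # concern identified (e.g. autonomy, privacy)
    affected_group: str    # who is impacted by the concern
    evidence: str          # scenario or literature supporting the concern

finding = EthicsFinding(
    technology="recommender system",
    time_horizon="5-10 years",
    ethical_concern="erosion of personal autonomy through nudging",
    affected_group="platform users",
    evidence="foresight scenario analysis",
)

# Serialising to a plain JSON record keeps the finding findable and reusable
# by other tools, independent of the methodology that produced it.
print(json.dumps(asdict(finding), indent=2))
```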

Progress in the governance of personal data advances the use of risk management models to address the ethical and privacy risks of AI technology that must be assessed by enterprises. The use of open semantic models to capture such risk analyses can support future requirements for enterprises to undertake and document AI risk assessments under the EU's proposed AI Act. These open semantic models also have the potential to contribute to the development of the harmonised standards required by the AI Act. The development of open risk management models also enables further study of structural injustices arising from enterprise use of AI, which may not be immediately addressed under the AI Act proposal.
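As a hedged illustration of how an ethical concern could be mapped onto a conventional enterprise risk register, the sketch below records a single risk against one of the EU High-Level Expert Group's trustworthy-AI requirements and scores it with a simple likelihood-times-severity scheme. The entry, field names and scoring are illustrative assumptions, not the risk model developed in WP3.

```python
# Illustrative risk-register entry linking an HLEG requirement to enterprise
# risk management; all values and the scoring scheme are assumptions.
from dataclasses import dataclass

@dataclass
class AIRiskEntry:
    hleg_requirement: str   # e.g. "Privacy and data governance"
    hazard: str             # how the AI system could cause harm
    affected_parties: str   # individuals or groups bearing the risk
    likelihood: int         # 1 (rare) to 5 (almost certain)
    severity: int           # 1 (negligible) to 5 (critical)
    mitigation: str         # planned control or safeguard

    @property
    def risk_score(self) -> int:
        # Simple likelihood x severity scoring, as used in many risk frameworks.
        return self.likelihood * self.severity

entry = AIRiskEntry(
    hleg_requirement="Diversity, non-discrimination and fairness",
    hazard="profiling model disadvantages a minority group",
    affected_parties="loan applicants from under-represented groups",
    likelihood=3,
    severity=4,
    mitigation="bias audit and human review of automated decisions",
)
print(entry.risk_score)  # 12: prioritise mitigation and document the assessment
```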
Members of the PROTECT consortium at a workshop at the University of Twente, Netherlands.