Project description
Justifying AI decision making
Black-box artificial intelligence (AI) systems based on machine learning (ML) are widely used in automated decision-making. They help manage data and process decisions quickly, but these automated decisions can inherit biases from the collected data or be unfair. Moreover, they are neither explainable nor transparent, depriving clients of the right to an explanation. The EU-funded XAI project aims to produce meaningful explanations for AI/ML systems. The research focuses on how to design transparency into ML models, how to produce controlled black-box explanations, and how to reveal the data and algorithms used, unfairness, and causal relationships in decision processes. The project will also help formulate ethical and legal standards for AI.
Objective
A wealthy friend of mine asks his bank for a credit card for a vacation, only to discover that the credit he is offered is very low. The bank teller cannot explain why. My stubborn friend continues his quest for an explanation up to the bank executives, to discover that an algorithm lowered his credit score. Why? After a long investigation, it turns out that the reason is: bad credit by the former owner of my friend's house.
Black-box AI systems for automated decision-making, often based on ML over (big) data, map a user's features into a class or a score without explaining why. This is problematic not only for the lack of transparency, but also for the possible biases the algorithms inherit from human prejudices and collection artefacts hidden in the training data, which may lead to unfair or wrong decisions.
I strive to solve the urgent challenge of how to construct meaningful explanations of opaque AI/ML systems, introducing the local-to-global framework for black-box explanation, articulated along three lines: a) the language for explanations, in terms of expressive logic rules with statistical and causal interpretation; b) the inference of local explanations revealing the decision rationale for a specific case; c) the bottom-up generalization of many local explanations into simple global ones. An intertwined line of research will investigate both causal explanations, i.e. models that capture the causal relationships among the features and the decision, and mechanistic/physical models of complex-system physics that capture the data-generation mechanism behind specific deep learning models.
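To make the local-explanation idea concrete, here is a minimal, hypothetical sketch (not the project's actual method): a toy opaque scorer is queried on random perturbations of one instance, and a single logic-style rule is derived from the feature values that separate the two outcomes. The `black_box` function, the feature names, and the perturbation scale are all illustrative assumptions.

```python
import random

def black_box(income, debt):
    """Hypothetical opaque classifier: approves (1) only above a hidden cutoff."""
    return 1 if (0.7 * income - 0.9 * debt) > 50 else 0

def local_rule(instance, feature_names, predict, n=500, scale=30.0, seed=0):
    """Sketch of a local explanation: perturb the instance, query the black
    box, and build one rule from per-feature midpoints between the mean
    value in the instance's class and in the opposite class."""
    rng = random.Random(seed)
    target = predict(*instance)
    points = [[v + rng.gauss(0, scale) for v in instance] for _ in range(n)]
    labels = [predict(*p) for p in points]
    conditions = []
    for i, name in enumerate(feature_names):
        same = [p[i] for p, y in zip(points, labels) if y == target]
        other = [p[i] for p, y in zip(points, labels) if y != target]
        if not same or not other:
            continue  # feature does not discriminate in this neighbourhood
        m_same = sum(same) / len(same)
        m_other = sum(other) / len(other)
        cut = (m_same + m_other) / 2
        conditions.append(f"{name} > {cut:.1f}" if m_same > m_other
                          else f"{name} <= {cut:.1f}")
    return f"IF {' AND '.join(conditions)} THEN class = {target}"

# The denied-credit anecdote above, as a local rule for one applicant:
rule = local_rule([100.0, 60.0], ["income", "debt"], black_box)
print(rule)  # e.g. "IF income <= ... AND debt > ... THEN class = 0"
```

In the local-to-global framework, many such case-specific rules would then be generalized bottom-up into a compact global description of the black box; this sketch covers only the local step.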
I will also develop: an infrastructure for benchmarking, for users' assessment of the explanations, and for the crowdsensing of observational decision data; an ethical-legal framework, for compliance and for the impact of our results on legal standards and on the right-of-explanation provisions of the GDPR; and case studies in explanation-by-design, with priority given to health and fraud detection.
Fields of science (EuroSciVoc)
CORDIS classifies projects with EuroSciVoc, a multilingual taxonomy of fields of science, through a semi-automatic process based on NLP techniques. See: https://op.europa.eu/en/web/eu-vocabularies/euroscivoc.
Keywords
Project’s keywords as indicated by the project coordinator. Not to be confused with the EuroSciVoc taxonomy (Fields of science)
Programme(s)
Multi-annual funding programmes that define the EU’s priorities for research and innovation.
H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC)
MAIN PROGRAMME
Topic(s)
Calls for proposals are divided into topics. A topic defines a specific subject or area for which applicants can submit proposals. The description of a topic comprises its specific scope and the expected impact of the funded project.
Funding Scheme
Funding scheme (or “Type of Action”) inside a programme with common features. It specifies: the scope of what is funded; the reimbursement rate; specific evaluation criteria to qualify for funding; and the use of simplified forms of costs like lump sums.
ERC-ADG - Advanced Grant
Call for proposal
Procedure for inviting applicants to submit project proposals, with the aim of receiving EU funding.
ERC-2018-ADG
Host institution
Net EU financial contribution. The sum of money that the participant receives, minus the EU contribution passed on to its linked third party. It reflects the distribution of the EU financial contribution between direct beneficiaries of the project and other types of participants, such as third-party participants.
56126 PISA
Italy
The total costs incurred by this organisation to participate in the project, including direct and indirect costs. This amount is a subset of the overall project budget.