Project description

Justifying AI decision making

Artificial intelligence (AI) "black box" systems built with machine learning (ML) are widely used in automated decision-making. They help manage data and process decisions quickly, but these automated decisions can inherit biases from the collected data or be unfair. Moreover, they are often neither explainable nor transparent, depriving those affected of the right to an explanation. The EU-funded XAI project aims to produce meaningful explanations for AI/ML systems. The research focuses on how to design transparency into ML models, how to produce controlled black-box explanations, and how to reveal the data and algorithms used, as well as unfairness and causal relationships in decision processes. The project will also formulate ethical and legal standards for AI.

Objective

A wealthy friend of mine asks his bank for a vacation credit card, only to discover that the credit he is offered is very low. The bank teller cannot explain why. My stubborn friend continues his quest for an explanation up to the bank executives, only to discover that an algorithm lowered his credit score. Why? After a long investigation, it turns out that the reason is bad credit by the former owner of my friend's house.

Black box AI systems for automated decision making, often based on ML over (big) data, map a user's features into a class or a score without explaining why. This is problematic not only for the lack of transparency, but also for possible biases inherited by the algorithms from human prejudices and collection artefacts hidden in the training data, which may lead to unfair or wrong decisions.
I strive for solutions to the urgent challenge of how to construct meaningful explanations of opaque AI/ML systems, introducing the local-to-global framework for black box explanation, articulated along three lines: a) the language for explanations, in terms of expressive logic rules with statistical and causal interpretation; b) the inference of local explanations that reveal the decision rationale for a specific case; c) the bottom-up generalization of many local explanations into simple global ones. An intertwined line of research will investigate both causal explanations, i.e. models that capture the causal relationships among the features and the decision, and mechanistic/physical models of complex system physics that capture the data generation mechanism behind specific deep learning models.

I will also develop: an infrastructure for benchmarking, for users' assessment of the explanations and the crowdsensing of observational decision data; an ethical-legal framework, for compliance and impact of our results on legal standards and on the "right to explanation" provisions of the GDPR; and case studies in explanation-by-design, with priority given to health and fraud detection.

Fields of science
natural sciences > computer and information sciences > artificial intelligence > machine learning > deep learning

Keywords
automated decision making, black-box models, machine learning, data mining, statistical logic rules, rule learning, transparency, right to explanation, participatory platform

Programme(s)
H2020-EU.1.1.
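The local-explanation step (line b above) can be sketched in code. The following is a minimal illustration, assuming scikit-learn: a black-box classifier is queried on a synthetic neighborhood of one instance, an interpretable shallow decision tree is fitted as a local surrogate, and the tree branch covering the instance is read off as a logic rule. All names, the surrogate-tree technique, and the sampling scheme are illustrative assumptions, not the project's actual method or API.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Stand-in black box: an opaque ensemble trained on synthetic data.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
black_box = RandomForestClassifier(random_state=0).fit(X, y)

def local_rule(instance, n_samples=1000, sigma=0.3):
    """Return a local explanation for `instance` as a conjunctive rule.

    Samples a Gaussian neighborhood around the instance, labels it with
    the black box, fits a shallow decision tree as a local surrogate,
    and walks the tree to the instance's leaf, collecting the split
    conditions along the path.
    """
    # Synthetic neighborhood, labeled by querying the black box.
    Z = instance + sigma * rng.standard_normal((n_samples, instance.size))
    labels = black_box.predict(Z)

    # Interpretable local surrogate.
    surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Z, labels)
    tree = surrogate.tree_

    # Walk from the root to the instance's leaf; each split is one condition.
    node, conditions = 0, []
    while tree.children_left[node] != -1:  # -1 marks a leaf in sklearn trees
        f, thr = tree.feature[node], tree.threshold[node]
        if instance[f] <= thr:
            conditions.append(f"x[{f}] <= {thr:.2f}")
            node = tree.children_left[node]
        else:
            conditions.append(f"x[{f}] > {thr:.2f}")
            node = tree.children_right[node]

    pred = black_box.predict(instance.reshape(1, -1))[0]
    return " AND ".join(conditions) + f"  =>  class {pred}"

print(local_rule(X[0]))
```

In the full local-to-global framework, many such local rules (one per explained instance) would then be generalized bottom-up into a small set of global rules; that aggregation step is not shown here.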
- EXCELLENT SCIENCE - European Research Council (ERC) (Main Programme)

Topic(s)
ERC-2018-ADG - ERC Advanced Grant

Call for proposal
ERC-2018-ADG

Funding Scheme
ERC-ADG - Advanced Grant

Host institution
SCUOLA NORMALE SUPERIORE
Net EU contribution: € 915 500,00
Address: PIAZZA DEI CAVALIERI 7, 56126 Pisa, Italy
Region: Centro (IT) > Toscana > Pisa
Activity type: Higher or Secondary Education Establishments
Total cost: € 915 500,00

Beneficiaries (3)

SCUOLA NORMALE SUPERIORE (Italy)
Net EU contribution: € 915 500,00
Address: PIAZZA DEI CAVALIERI 7, 56126 Pisa
Region: Centro (IT) > Toscana > Pisa
Activity type: Higher or Secondary Education Establishments
Total cost: € 915 500,00

UNIVERSITA DI PISA (Italy)
Net EU contribution: € 1 022 000,00
Address: LUNGARNO PACINOTTI 43/44, 56126 Pisa
Region: Centro (IT) > Toscana > Pisa
Activity type: Higher or Secondary Education Establishments
Total cost: € 1 022 000,00

CONSIGLIO NAZIONALE DELLE RICERCHE (Italy)
Net EU contribution: € 562 500,00
Address: PIAZZALE ALDO MORO 7, 00185 Roma
Region: Centro (IT) > Lazio > Roma
Activity type: Research Organisations
Total cost: € 562 500,00