Machine Learning Explainability Via Microaggregation and Shallow Decision Trees

Artificial intelligence (AI) is being deployed in missions that are increasingly critical for human life. To build trust in AI and avoid an algorithm-based authoritarian society, automated decisions should be explainable. This is not only a right of citizens, enshrined for example in the European General Data Protection Regulation, but a desirable goal for engineers, who want to know whether the decision algorithms are capturing the relevant features. For explainability to be scalable, it should be possible to derive explanations in a systematic way. A common approach is to use simpler, more intuitive decision algorithms to build a surrogate model of the black-box model (for example a deep learning algorithm) used to make a decision. Yet, there is a risk that the surrogate model is too large for it to be really comprehensible to humans. We focus on explaining black-box models by using decision trees of limited depth as a surrogate model. Specifically, we propose an approach based on microaggregation to achieve a trade-off between the comprehensibility and the representativeness of the surrogate model on the one side and the privacy of the subjects used for training the black-box model on the other side.
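The pipeline the abstract describes can be sketched in a few lines: microaggregate the training records into small groups, query the black box on the group centroids, and fit a depth-limited decision tree on those centroids as a comprehensible surrogate. The sketch below is illustrative only: it uses a random forest as a stand-in black box and a naive fixed-size microaggregation heuristic, not the authors' exact algorithm; the minimum group size `k` and `max_depth` are assumed parameters.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=5, random_state=0)

# Stand-in black-box model (e.g. this would be a deep network in practice).
black_box = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def microaggregate(X, k):
    """Naive fixed-size microaggregation: sort records by distance to the
    overall mean and cut into groups of k similar records, replacing each
    group by its centroid. Each centroid stands for >= k subjects, which
    is what gives the privacy guarantee (k-anonymity-style)."""
    order = np.argsort(np.linalg.norm(X - X.mean(axis=0), axis=1))
    centroids = []
    for start in range(0, len(order) - len(order) % k, k):
        group = X[order[start:start + k]]
        centroids.append(group.mean(axis=0))
    return np.array(centroids)

centroids = microaggregate(X, k=5)       # larger k: more privacy, coarser surrogate
labels = black_box.predict(centroids)    # query the black box on centroids only

# Shallow (depth-limited) decision tree as the comprehensible surrogate.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(centroids, labels)

# Fidelity: how often the surrogate agrees with the black box on the data.
fidelity = (surrogate.predict(X) == black_box.predict(X)).mean()
```

Tuning `k` and `max_depth` is exactly the trade-off the abstract names: a deeper tree trained on more centroids is more faithful to the black box but less comprehensible and less privacy-protective.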

Data and Resources
  • BibTeX
  • HTML
Additional Info
Field Value
Creator Blanco-Justicia, Alberto
Creator Domingo-Ferrer, Josep
Creator Martinez, Sergio
Creator Sanchez, David
Group Ethics and Legality
Group Social Impact of AI and explainable ML
Publisher Elsevier
Source Knowledge-Based Systems, 2020, vol. 194, p. 105532.
Thematic Cluster Privacy Enhancing Technology [PET]
system:type JournalArticle
Management Info
Field Value
Author Pozzi Giorgia
Maintainer Pozzi Giorgia
Version 1
Last Updated 8 September 2023, 18:26 (CEST)
Created 10 February 2021, 13:42 (CET)