Explanation of Deep Models with Limited Interaction for Trade Secret and Privacy Preservation

An ever-increasing number of decisions affecting our lives are made by algorithms. For this reason, algorithmic transparency is becoming a pressing need: automated decisions should be explainable and unbiased. A straightforward solution is to make the decision algorithms open source, so that everyone can verify them and reproduce their outcomes. However, in many situations the source code or the training data cannot be published for industrial or intellectual-property reasons, since they are the result of long and costly experience (as is typically the case in banking or insurance). We present an approach whereby the individual subjects of automated decisions can collaboratively, and in a privacy-preserving manner, elicit a rule-based approximation of the model underlying the decision algorithm, based on limited interaction with the algorithm or even only on how the subjects themselves have been classified. Furthermore, because the approximation is rule-based, it can also be used to detect potential discrimination. We present empirical work to demonstrate the practicality of our ideas.
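To make the idea concrete, the following is a minimal illustrative sketch (not the paper's actual protocol): subjects pool only their own attribute values together with the decision each of them received from an opaque model, and fit a simple single-feature threshold rule that best matches the pooled decisions. The black-box function, feature names, and data below are all hypothetical stand-ins.

```python
def black_box(income, debt_ratio):
    # Hypothetical opaque decision model (stand-in for, e.g., a bank's
    # internal credit-scoring algorithm; its logic is unknown to subjects).
    return 1 if income > 50_000 and debt_ratio < 0.4 else 0

# Each subject contributes only their own attributes and received decision:
# this models "limited interaction" (one classification per subject).
subjects = [(30_000, 0.20), (60_000, 0.30), (80_000, 0.50),
            (55_000, 0.10), (40_000, 0.35), (90_000, 0.20)]
records = [((inc, dr), black_box(inc, dr)) for inc, dr in subjects]

def best_threshold(records, feature_idx):
    """Find the single-feature rule 'predict 1 iff feature > t' (or '< t')
    that agrees most often with the pooled decisions."""
    values = sorted({x[feature_idx] for x, _ in records})
    best = (0.0, None, None)  # (agreement, threshold, direction)
    for t in values:
        for direction in ('>', '<'):
            preds = [(x[feature_idx] > t) if direction == '>'
                     else (x[feature_idx] < t) for x, _ in records]
            acc = sum(int(p) == y for p, (_, y) in zip(preds, records)) / len(records)
            if acc > best[0]:
                best = (acc, t, direction)
    return best

acc, t, d = best_threshold(records, 0)
print(f"rule: income {d} {t}  (agreement {acc:.0%})")
# The extracted rule is human-readable, so subjects can inspect it for
# suspicious thresholds on sensitive or proxy attributes.
```

The imperfect agreement is expected: a one-feature rule cannot capture the hidden debt-ratio condition, which is why richer rule sets (as in the paper) give better approximations.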

Data and Resources
To access the resources you must log in.
  • BibTeX
  • HTML
Additional Info
Field Value
Creator Domingo-Ferrer, Josep
Creator Pérez-Solà, Cristina
Creator Blanco-Justicia, Alberto
Group Ethics and Legality
Group Social Impact of AI and explainable ML
Publisher Association for Computing Machinery
Source WWW '19: Companion Proceedings of The 2019 World Wide Web Conference, May 2019, Pages 501–507
Thematic Cluster Privacy Enhancing Technology [PET]
system:type ConferencePaper
Management Info
Field Value
Author Pozzi Giorgia
Maintainer Pozzi Giorgia
Version 1
Last Updated 8 September 2023, 17:50 (CEST)
Created 10 February 2021, 13:15 (CET)