Beyond Distributive Fairness in Algorithmic Decision Making: Feature Selection for Procedurally Fair Learning

With the widespread use of machine learning methods in numerous domains involving humans, several studies have raised questions about the potential for unfairness towards certain individuals or groups. A number of recent works have proposed methods to measure and eliminate unfairness from machine learning models. However, most of this work has focused on only one dimension of fair decision making: distributive fairness, i.e., the fairness of the decision outcomes. In this work, we leverage the rich literature on organizational justice and focus on another dimension of fair decision making: procedural fairness, i.e., the fairness of the decision making process. We propose measures for procedural fairness that consider the input features used in the decision process, and evaluate the moral judgments of humans regarding the use of these features. We operationalize these measures on two real-world datasets using human surveys on the Amazon Mechanical Turk (AMT) platform, demonstrating that our measures capture important properties of procedurally fair decision making. We provide fast submodular mechanisms to optimize the tradeoff between procedural fairness and prediction accuracy. On our datasets, we observe empirically that procedural fairness may be achieved with little cost to outcome fairness, but that some loss of accuracy is unavoidable.
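
As context for the abstract's mention of submodular mechanisms that trade off procedural fairness against prediction accuracy, the following is a minimal illustrative sketch of a greedy, fairness-constrained feature-selection loop. The helper names (accuracy_with, fairness_of, min_fairness) and the greedy strategy are assumptions introduced here for illustration only; the paper's actual fairness measures and submodular optimization routines may differ.

```python
# Illustrative sketch only: greedily add the feature with the best accuracy
# gain while the procedural-fairness score of the selected set stays above a
# threshold. The callables and the greedy scheme are assumptions, not the
# paper's exact algorithm.

from typing import Callable, Set, List


def greedy_fair_feature_selection(
    features: List[str],
    accuracy_with: Callable[[Set[str]], float],  # e.g. validation accuracy of a model trained on this feature set
    fairness_of: Callable[[Set[str]], float],    # e.g. aggregate human judgment that every used feature is fair to use
    min_fairness: float,
) -> Set[str]:
    selected: Set[str] = set()
    remaining = set(features)
    while remaining:
        # Keep only candidates whose addition leaves the set procedurally fair enough.
        candidates = [f for f in remaining
                      if fairness_of(selected | {f}) >= min_fairness]
        if not candidates:
            break
        # Pick the candidate with the largest marginal accuracy gain.
        best = max(candidates,
                   key=lambda f: accuracy_with(selected | {f}) - accuracy_with(selected))
        if accuracy_with(selected | {best}) <= accuracy_with(selected):
            break  # no further accuracy improvement available
        selected.add(best)
        remaining.discard(best)
    return selected
```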

Additional Info
Creator: Grgic-Hlaca, Nina
Creator: Zafar, Muhammad Bilal
Creator: Gummadi, Krishna P.
Creator: Weller, Adrian
DOI: 10.1145/3178876.3186138
Group: Social Impact of AI and explainable ML
Publisher: Machine Learning Group at the University of Cambridge
Source: AAAI 2018
Thematic Cluster: Other
system:type: ConferencePaper
Management Info
Author: BRAGHIERI MARCO
Maintainer: BRAGHIERI MARCO
Version: 1
Last Updated: 8 September 2023, 17:03 (CEST)
Created: 5 April 2021, 18:32 (CEST)