How the machine 'thinks': Understanding opacity in machine learning algorithms

This article considers the issue of opacity as a problem for socially consequential mechanisms of classification and ranking, such as spam filters, credit card fraud detection, search engines, news trends, market segmentation and advertising, insurance or loan qualification, and credit scoring. These mechanisms of classification all frequently rely on computational algorithms, and in many cases on machine learning algorithms to do this work. In this article, I draw a distinction between three forms of opacity: (1) opacity as intentional corporate or state secrecy, (2) opacity as technical illiteracy, and (3) an opacity that arises from the characteristics of machine learning algorithms and the scale required to apply them usefully. The analysis in this article gets inside the algorithms themselves. I cite existing literatures in computer science, known industry practices (as they are publicly presented), and do some testing and manipulation of code as a form of lightweight code audit. I argue that recognizing the distinct forms of opacity that may be coming into play in a given application is a key to determining which of a variety of technical and non-technical solutions could help to prevent harm.
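By way of illustration (a minimal sketch, not code from the article itself): the third form of opacity described in the abstract can be seen even in a toy machine-learned spam filter, where the trained model's "reasoning" is a table of learned per-token weights rather than human-readable rules. The sketch below assumes scikit-learn is available; the four-message corpus and its labels are invented for the example.

```python
# Illustrative sketch (not from the article): why a machine-learned spam
# filter can be opaque even with full access to its code and parameters.
# Assumes scikit-learn; the toy corpus below is invented for the example.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

corpus = [
    "win a free prize now", "claim your free money",      # spam
    "meeting agenda for monday", "lunch at noon today",   # ham
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = ham

# Bag-of-words features: every distinct token becomes a dimension.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)

model = LogisticRegression().fit(X, labels)

# The trained filter's "logic" is just one learned weight per token.
# At realistic scale (hundreds of thousands of tokens with interacting
# features), this table is not something a human can read as a rule.
for token, weight in zip(vectorizer.get_feature_names_out(), model.coef_[0]):
    print(f"{token:10s} {weight:+.3f}")
```

Even in this four-message toy, the output is a list of signed coefficients, not a stated rule; scaling the same setup to a production corpus is what produces the machine-learning opacity the article distinguishes from secrecy and technical illiteracy.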

Data and Resources (login required)
  • HTML
  • BibTeX
Additional Info
Creator: Burrell, Jenna, jburrell@berkeley.edu
DOI: https://doi.org/10.1177/2053951715622512
Group: Ethics and Legality
Group: Social Impact of AI and explainable ML
Publisher: SAGE Publications
Source: Big Data & Society, January–June 2016: 1–12
system:type: JournalArticle
Thematic Cluster: Other
Management Info
Author: Pozzi Giorgia
Maintainer: Pozzi Giorgia
Version: 1
Last Updated: 8 September 2023, 18:30 (CEST)
Created: 3 March 2021, 19:40 (CET)