Bias in algorithmic filtering and personalization

Online information intermediaries such as Facebook and Google are slowly replacing traditional media channels, thereby partly becoming the gatekeepers of our society. To deal with the growing amount of information on the social web and the burden it places on the average user, these gatekeepers have recently started to introduce personalization features: algorithms that filter information per individual. In this paper we show that these online services that filter information are not merely algorithms. Humans not only affect the design of the algorithms, but they can also manually influence the filtering process even when the algorithm is operational. We further analyze filtering processes in detail, show how personalization connects to other filtering techniques, and show that both human and technical biases are present in today’s emergent gatekeepers. We use the existing literature on gatekeeping and search engine bias and provide a model of algorithmic gatekeeping.
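To make the abstract's distinction concrete, the toy sketch below shows how an algorithmic ranking step and a manual human override can both shape what a user ultimately sees. This is an illustrative assumption for readers of this entry, not the paper's model of algorithmic gatekeeping; all function and field names (personalized_feed, user_interests, manual_blocklist) are hypothetical.

    # Illustrative sketch only -- not the paper's method.
    # A toy "personalized filter": an algorithmic ranking step followed by a
    # human-curated override, showing that the output is not purely algorithmic.

    def personalized_feed(items, user_interests, manual_blocklist, top_k=3):
        """Rank items by overlap with a user's interests, then apply a
        human-curated blocklist before returning the top results."""
        def score(item):
            # Algorithmic step: count how many of the item's topics match the user.
            return len(set(item["topics"]) & set(user_interests))

        ranked = sorted(items, key=score, reverse=True)
        # Human step: editors can suppress items even after the algorithm has run.
        visible = [it for it in ranked if it["id"] not in manual_blocklist]
        return visible[:top_k]

    if __name__ == "__main__":
        items = [
            {"id": 1, "topics": ["politics", "europe"]},
            {"id": 2, "topics": ["sports"]},
            {"id": 3, "topics": ["politics", "technology"]},
        ]
        # Item 1 would rank highest algorithmically, but the manual blocklist removes it.
        print(personalized_feed(items, user_interests=["politics"], manual_blocklist={1}))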

Tags
Data and Resources
  • BibTeX
  • HTML

Additional Info
Field Value
Author Bozdag, Engin, bozdag@tudelft.nl
DOI https://doi.org/10.1007/s10676-013-9321-6
Group Ethics and Legality
Publisher Springer
Source Ethics and Information Technology, volume 15, pages 209–227 (2013)
Thematic Cluster Text and Social Media Mining [TSMM]
system:type JournalArticle
Management Info
Field Value
Author Pozzi Giorgia
Maintainer Pozzi Giorgia
Version 1
Last Updated 19 July 2022, 16:32 (CEST)
Created 9 February 2021, 13:01 (CET)