Article

Algoritmi decisionali e discriminazione percepita: uno studio sugli audits di sicurezza nei sistemi di self-checkout della Gdo.

2021 - Franco Angeli

pp. 118-137

  • The increasing diffusion, in many important sectors, of algorithmic decision systems, often generated by machine learning, has prompted the study of the biases that may be introduced in the various phases of the creation, implementation, and use of these algorithms. Especially relevant is the possibility of discriminatory biases, whose identification is made difficult by the often opaque nature of algorithms, which prevents a direct analysis of their internal logic. After an introductory overview of predictive and decisional algorithms and of the different types of algorithmic bias, some recent cases of algorithmic discrimination are described.
  • We then present a field study, conducted in the city of Pisa in November 2019, on foreign customers' perception of discrimination in self-scan and self-checkout security audits in supermarkets. The study shows that algorithmic discrimination is not strongly felt as an issue by foreign respondents, even when they are confronted with the possibility of discriminatory profiling and report low satisfaction with self-checkout systems. [Publisher's text].

Is part of

Sociologia e ricerca sociale: 125, 2, 2021