Data Article / Data Paper

Automated face recognition assists with low-prevalence face identity mismatches but can bias users

Details

Citation

Mueller M, Hancock PJB, Bobak AK, Cunningham EK, Watt RJ & Carragher D (2024) Automated face recognition assists with low-prevalence face identity mismatches but can bias users. British Journal of Psychology.

Abstract
We present three experiments to study the effects of giving information about the decision of an automated face recognition (AFR) system to participants attempting to decide whether two face images show the same person. We make three contributions designed to make our results applicable to real-world use: participants are given the true response of a highly accurate AFR system; the face set reflects the mixed ethnicity of the city of London, from which participants are drawn; and only 10% of the face pairs are mismatches. Participants were equally accurate when given the similarity score of the AFR system or just the binary decision, but shifted their bias towards "match" and were over-confident on difficult pairs when given only binary information. No participants achieved the 100% accuracy of the AFR system, and they had only weak insight into their own performance.
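The abstract reports a shift in response bias towards "match" when participants saw only the AFR system's binary decision. As an illustration only, the sketch below shows one standard way such a bias can be quantified with signal-detection measures (sensitivity d' and criterion c); the trial counts, the log-linear correction, and the function name are hypothetical assumptions and do not reproduce the authors' analysis.

# Illustrative sketch: quantifying a response bias towards "match"
# using signal-detection theory. Hit = responding "match" to a genuine
# match pair; false alarm = responding "match" to a mismatch pair.
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Return sensitivity (d') and criterion (c); negative c = liberal bias towards 'match'."""
    # Log-linear correction guards against hit/false-alarm rates of exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)
    return d_prime, criterion

# Hypothetical counts for a low-prevalence block (90 match pairs, 10 mismatch pairs):
print(dprime_and_criterion(hits=85, misses=5, false_alarms=6, correct_rejections=4))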

Keywords
attitudes towards AI; automated face recognition; decision making; deep neural networks; face matching; face recognition

Journal
British Journal of Psychology

Status: Early Online
Funders: Experimental Psychology Society
Publication date online: 30/11/2024
Date accepted by journal: 08/10/2024
URL: http://hdl.handle.net/1893/36533
ISSN: 0007-1269
eISSN: 2044-8295

People (4)

Dr Anna Bobak
Senior Lecturer, Psychology

Dr Daniel Carragher
Research Assistant, Psychology

Miss Emily Cunningham
Tutor (ASF), Psychology

Professor Roger Watt
Emeritus Professor, Psychology

Files (1)