Can an algorithmic system be a 'friend' to a police officer's discretion? ACM FAT 2020 translation tutorial

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This tutorial aims to increase understanding of the importance of discretion in police decision-making. It will guide computer scientists, policy-makers, lawyers and others in considering the practical and technical issues crucial to avoiding prejudicial outcomes and instead developing algorithms that are supportive - a 'friend' - to legitimate discretionary decision-making. It combines explanation of the relevant law and related literature with discussion based upon deep operational experience in the area of preventative and protective policing work.

Autonomy and discretion are fundamental to police work, not only in relation to strategy and policy but also for day-to-day operational decisions taken by front-line officers. Such discretion 'recognizes the fallibility of interfacing rules with their field of application' (Hildebrandt 2016). This discretion is not unbounded, however: English common law expects discretion to be exercised reasonably and fairly. Conversely, discretion must not be fettered unlawfully, whether by failing to take a relevant factor into account when making a decision, or by abdicating responsibility to another person, body or 'thing'. Algorithmic systems have the potential to contribute to factors relevant to the decision in question at the point of interaction between their outputs and the real-world outcome for the victim, offender and/or community.

Algorithmic decision tools present a number of challenges to legitimate discretionary police decision-making. Unnuanced outputs could be highly influential on the human decision-maker (Cooke and Michie 2012) and may undermine discretionary power to deal with atypical cases and 'un-thought of' factors that rely upon uncodified knowledge (Oswald 2018).

Practical and technical considerations will be crucial to developing machine learning algorithms (MLA) that are supportive of discretionary decision-making. These include the methodological approach; the design of the human-computer interface, having regard to the decision-maker's responsibility to give reasons for their decision; the avoidance of unnuanced or over-confident framing of results; understanding of the policing context in which the MLA will operate; and consideration of the implications of organisational culture and processes for the MLA's influence.
Original language: English
Title of host publication: FAT* '20: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency
Place of Publication: New York
Publisher: ACM
Pages: 698
Number of pages: 1
ISBN (Electronic): 9781450369367
DOIs
Publication status: Published - 27 Jan 2020

