Abstract
1. The UK Government’s draft ‘Algorithmic Transparency Standard’ is intended to provide a standardised way for public bodies and government departments to provide information about how algorithmic tools are being used to support decisions. The research discussed in this report was conducted in parallel to the piloting of the Standard by the Cabinet Office and the Centre for Data Ethics and Innovation.
2. We conducted semi-structured interviews with respondents from across UK policing and commercial bodies involved in policing technologies. Our aim was to explore the implications for police forces of participation in the Standard, to identify rewards, risks and challenges for the police, and areas where the Standard could be improved, and thereby to contribute to the exploration of policy options for expanding participation in the Standard.
3. Algorithmic transparency is both achievable for policing and could bring significant rewards. A key reward of police participation in the Standard is that it provides the opportunity to demonstrate proficient implementation of technology-driven policing, thus enhancing earned trust. Research participants highlighted the public good that could result from the considered use of algorithms.
4. Participants noted, however, a risk of misperception of the dangers of policing technology, especially if use of algorithmic tools was not appropriately compared to the status quo and current methods.
5. Participation in the Standard provides an opportunity to develop increased sharing among police forces of best practices (and things to avoid), and increased thoughtfulness among police force personnel in building and implementing new tools. Research participants were keen for compliance with the Standard to become an integral part of a holistic system to drive reflective practice across policing around the development and deployment of algorithmic technology. This could enable police to learn from each other, facilitate good policy choices and decrease wasted costs. Otherwise, the Standard may come to be regarded as an administrative burden rather than a benefit for policing.
6. Several key areas for amendment and improvement from the perspective of policing were identified in the research; addressing these could improve the Standard for the benefit of all participants. They include a need for clarification of the scope of the Standard, and of the stage of project development at which the Standard should apply. It is recommended that consideration be given to a ‘Standard-Lite’ for projects at the pilot or early stages of the development process in order to build public understanding of new tools and applications. Furthermore, the Standard would benefit from a more substantial glossary (to include relevant policing terms) and additional guidance on the level of detail required in each section and on how accuracy rates should be described, justified and explained in order to ensure consistency.
7. The research does not suggest any overriding reason why the Standard should not be applied in policing. Suitable exemptions for sensitive contexts and tradecraft would be required, however, and consideration given to ensuring that forces have the resources to comply with the Standard and to respond to the increased public interest that could ensue. Limiting the scope initially to tools on a defined list (to include the most high-risk tools, such as those that produce individualised risk/predictive scores) could assist in mitigating concerns over sensitive policing capabilities and resourcing. A non-public version of the Standard for sensitive applications and tools could also be considered, which would be available to bodies with an independent oversight function.
8. To support police compliance with the Standard, supplier responsibilities – including appropriate disclosure of algorithmic functionality, data inputs and performance – should be covered in procurement contracts and addressed up front as a mandatory requirement of doing business with the police.
9. As well as contributing to the piloting of the Standard, it is recommended that the findings of this report be considered at NPCC level, by the College of Policing and by the office of the Chief Scientific Advisor for Policing, as new sector-led guidance, best practice and policy are developed.
| Original language | English |
| --- | --- |
| Number of pages | 30 |
| Publication status | Published - 7 Jul 2022 |
Keywords
- Algorithms
- Machine Learning
- Police
- Transparency
- Standards
- Law
- Regulation
- Ethics