A Force for Good?
- As lockdown is set to ease, the RSA warns that police forces must not rush into the use of AI without proper safeguards or public consultation.
- Freedom of Information (FOI) requests reveal a lack of clarity from police around the use of new technologies such as facial recognition and predictive policing.
- Police are failing to consult the public on how these technologies are being used, or to provide enough information on artificial intelligence (AI) and automated decision systems (ADS).
- RSA warns that proper scrutiny and engagement are necessary as forces patrol the easing of lockdown restrictions from Monday.
Police forces risk undermining policing-by-consent and their relations with ethnic minority communities, a report warns, as new FOIs reveal that only one force which has adopted AI such as facial recognition has consulted local communities about its use.
A Force for Good? from the RSA’s Asheem Singh and Will Grimond, says that AI technology offers huge potential to improve policing, but must be carried out “for purposes of improving police work rather than simply as a cost-cutting measure.”
The Royal Society for the encouragement of Arts, Manufactures and Commerce says this is especially important in the context of the easing of the lockdown from Monday.
The RSA's FOI requests to all police forces in the United Kingdom found:
- Of the police forces reporting use of AI or ADS, just one reported carrying out public engagement. An FOI returned in March found that the Met, which began a programme of facial recognition in February, had no record of consulting the public, despite suggesting that this would take place alongside deployment.
- While most police forces reported not using AI or ADS, several forces reported using ‘predictive policing’, where statistical analysis influences the deployment of police forces, and South Wales and the Metropolitan Police forces have deployed live facial recognition programmes.
Report authors Singh and Grimond warn this could harm relations with particular communities, stating: “Racial and gender biases can be exacerbated by technologies as they are based on historic data, and we fear that a lack of transparency could undermine the principle of policing-by-consent.”
The report also warns that the guidelines given to police staff are varied and often inadequate, failing to deal with the specific implications of using AI and ADS. This patchwork approach also means that public consultation is rarely built into the procurement and deployment process.
The police roll-out of technology such as facial recognition has not been without controversy. Last year South Wales Police faced a court battle over its use of facial recognition, and in March the Equality and Human Rights Commission called for it to be halted until better scrutiny is available and the law has been improved.
“Adopting new technologies without adequate cultural safeguards, especially around deliberation and transparency, risks storing up considerable problems for the future, for both community cohesion and truly innovative technological uptake”, the authors conclude.
The report comes amid a police-enforced lockdown of the UK in response to the Covid-19 pandemic. The authors warn that increased police powers mean it is more important than ever that police use of these technologies comes with appropriate safeguards.
A Force for Good? calls for the use of deliberation to provide scrutiny and inform the public on how AI and ADS are being used by the police, and for citizens’ juries on ethics in policing with balances for ethnicity and gender.
Deliberative bodies have been trialled in various settings, including this year’s climate assemblies. Last year the RSA published a toolkit based on the results of an initial round of deliberative bodies in Democratising Decisions about Technology.
Asheem Singh, Head of the RSA’s Tech and Society programme, said:
“Over the last few years we have seen a rapid proliferation of the use of technology by our police forces. Innovation is exciting and welcome, but there are causes for concern in the lack of public engagement that has come with these technologies. Racial and gender biases can be exacerbated by technologies as they are based on historic data: we need to talk about that.
“Our findings indicate a lack of transparency and input from the public on how these new technologies are being used, which in turn undermines the principle of policing-by-consent. It’s fine to cut costs but not at the expense of the improvements forces have made in their relations with BME communities. Law enforcement should work with civil society groups to provide proper consultation around how AI and ADS are being used.
“This has implications beyond policing. As lockdown begins to ease from today, we need to be sure that new tech is being deployed with all the public’s best interests in mind. We have models for deliberating and discussing these complex technological challenges. We want to ensure government has the tools to do its job – but that means ensuring that those tools are beyond reproach and consented to and trusted by all.”
Ash Singleton, Head of Media and Communications, firstname.lastname@example.org, 07799 737 970.
The RSA (Royal Society for the encouragement of Arts, Manufactures and Commerce) is an independent charity which believes in a world where everyone is able to participate in creating a better future.
Through our ideas, research and a 30,000 strong Fellowship, we are a global community of proactive problem solvers, sharing powerful ideas, carrying out cutting-edge research and building networks. We create opportunities for people to collaborate, influence, and demonstrate practical solutions to realise change.
Our work covers a number of areas, including the rise of the 'gig economy'; robotics & automation; education & creative learning; and reforming public services to put communities in control.