AI could detect rogue police officers, says expert

A House of Lords committee heard that algorithms could be used to monitor officers’ behaviour but are not, because of a lack of ‘political will’.
Christopher McKeon | 12 October 2021

Artificial intelligence could be used to identify “rogue” police officers but for a lack of “political will”, a group of peers has heard.

Professor Karen Yeung, of the University of Birmingham, told the Lords Justice and Home Affairs Committee to think carefully about who is being targeted by crime-fighting algorithms.

Prof Yeung, who researches AI at Birmingham Law School, said: “We’re not building criminal risk assessment tools to identify insider trading or who’s going to commit the next corporate fraud, because we’re not looking for those kinds of crimes and we do not have high volume data.

“This is really pernicious. What is going on is that we are looking at high volume data, which is mostly about poor people, and we are turning them into prediction tools about poor people and we are leaving whole swathes of society untouched by these tools. So, this is a serious systemic problem and we need to be asking those questions.”


Alluding to concerns that the police should have identified the risk posed by serving police officer Wayne Couzens before he murdered Sarah Everard in March 2021, Prof Yeung added: “Why are we not collecting data, which is perfectly possible now, about individual police behaviour?

“We might have tracked down rogue individuals who were prone to committing violence against women. We have the technology.

“We just don’t have the political will to apply them to scrutinise the exercise of public authority in more systematic ways than the way in which we are towards poor people.”

Prof Yeung made her comments at a session of the Justice and Home Affairs Committee on Tuesday examining the use of new technology in law enforcement, during which she called for greater transparency of how algorithms are designed and used in the criminal justice system.

The committee also heard concerns regarding police use of live facial recognition software, which Silkie Carlo, director of Big Brother Watch, described as “disproportionate”.

Ms Carlo said the Metropolitan Police had achieved just 11 true positive matches over “four or five years” of testing on the streets of London along with “an awful lot of false positive matches”, after capturing tens if not hundreds of thousands of people’s faces.

Even some of the positive matches, she added, were of people who were not wanted in connection with any crime but appeared on databases of people with mental health problems or protesters.

She said: “Their current rate over the entirety of their employment is 93% false positive matches, so I struggle to see a world in which that could be considered proportionate.”

Prof Yeung added that the police did not know how many false negatives the technology had returned — cases where someone it should have matched passed undetected — because it had only been used in live trials rather than under controlled, scientific conditions.

The Metropolitan Police say they use facial recognition in a lawful and proportionate way.
