Face Recognition

UK police are concerned AI will lead to bias and over-reliance on automation


British police have expressed concern that using AI in their operations may lead to increased bias and an over-reliance on automation.

A study commissioned by UK government advisory body the Centre for Data Ethics and Innovation warned that police felt AI may “amplify” prejudices.

The Royal United Services Institute (RUSI) interviewed 50 experts for the research, including senior police officers.

Racial profiling remains a serious problem: young black men are stopped and searched more often than young white men. The experts interviewed by RUSI worry that these human prejudices could make their way into algorithms trained on existing police data.

The report also notes that individuals from disadvantaged backgrounds tend to use public transport more frequently. With data likely to be collected from public transport use, this increases the likelihood of those individuals being flagged.

The accuracy of facial recognition algorithms has often been questioned. Earlier this year, the Algorithmic Justice League tested several major commercial systems and found that the algorithms struggled particularly with darker-skinned females.

A similar report published by the American Civil Liberties Union focused on Amazon’s Rekognition facial recognition system. When tested against members of Congress, it incorrectly flagged those with darker skin more often.

Both findings show the potentially devastating societal impact if such technology were rolled out publicly today. It’s good to hear British authorities are at least aware of the potential complications.

RUSI reports that experts in the study want clearer guidelines established for acceptable use of the technology. They hope this will give police forces the confidence to adopt potentially beneficial new technologies in a safe and responsible way.

“For many years police forces have looked to be innovative in their use of technology to protect the public and prevent harm and we continue to explore new approaches to achieve these aims,” Assistant Chief Constable Jonathan Drake told BBC News.

“But our values mean we police by consent, so anytime we use new technology we consult with interested parties to ensure any new tactics are fair, ethical and producing the best results for the public.”

You can find the full results of the RUSI’s study here.

