Police are investing in facial recognition and AI. Not everyone thinks that it's going well

September 9, 2021 | Hi-network.com

While the deployment of new technologies in law enforcement agencies is booming, there also seems to be growing pushback from those who will be most affected by the tools.   

Image: Matthew Horwood / Getty Images News

Police officers are using algorithms such as facial-recognition tools to carry out law enforcement, often without supervision or appropriate testing. Citizens are now voicing their discontent, in what could be a new wave of backlash against such technologies.

Invited to speak before UK lawmakers as part of an inquiry into the use of algorithms in policing, a panel of experts from around the world agreed that while the deployment of new technologies in law enforcement agencies is booming, there also seems to be growing pushback from those who will be most affected by the tools. 

"With respect to certain technologies, we've begun to see some criticism and pushback," said Elizabeth Jo, professor of law at the University of California, Davis. "So for example, while predictive policing tools were embraced by many police departments in the 2010s let's say, in the US you can see small movements towards backlash." 

Earlier this year, for instance, the local government of King County in Washington voted to ban local police from using facial-recognition technology. The technology is typically used to locate persons of interest by comparing faces from live camera feeds against a pre-determined watch list; when it identifies a possible match, it generates an alert to warn police officers. 
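To make that matching step concrete, here is a minimal, purely illustrative sketch of a watch-list comparison. It assumes faces have already been converted into numerical embeddings; the names (`watchlist`, `THRESHOLD`, `check_frame`) and the threshold value are hypothetical, not taken from any vendor's product.

```python
import numpy as np

# Hypothetical illustration: each face is represented as a fixed-length
# embedding vector; real systems derive these from a trained neural network.
rng = np.random.default_rng(0)
watchlist = {
    "person_of_interest_1": rng.normal(size=128),
    "person_of_interest_2": rng.normal(size=128),
}
THRESHOLD = 0.6  # similarity above this raises an alert (tuning is vendor-specific)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_frame(face_embedding: np.ndarray) -> list[str]:
    """Compare one detected face against every watch-list entry."""
    return [
        name for name, ref in watchlist.items()
        if cosine_similarity(face_embedding, ref) >= THRESHOLD
    ]

# Simulated camera frame: a face that closely resembles a watch-list entry.
live_face = watchlist["person_of_interest_1"] + rng.normal(scale=0.1, size=128)
for match in check_frame(live_face):
    print(f"ALERT: possible match for {match}")  # the officer-facing alert
```

The threshold is the crux in practice: set too low, it generates false alerts against innocent passers-by; set too high, it misses the people it is meant to find.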

King County's move was depicted by advocacy groups as reflective of a growing movement across the US to ban the use of facial-recognition technology by police forces. Four city councils in California (Oakland, San Francisco, Alameda and Berkeley) have also passed facial-recognition bans, while multiple cities and towns across the country have implemented laws to regulate the technology. 

Vermont and Virginia have even passed statewide legislation to ban or regulate the use of facial recognition by the police. 

While it is one of the most debated and discussed tools, facial recognition is just one of a series of new technologies that law enforcement agencies around the world have adopted in recent years. 

From body-worn cameras and automatic number-plate recognition systems, to CCTV surveillance cameras in the UK, all the way to screening algorithms tasked with predicting the risk of recidivism for young offenders in New Zealand, the last decade has seen a boom in the use of emerging technologies in police departments. 

One factor that is largely at play, according to Jo, is the influence of the private sector, which often develops the tools used by officers and, therefore, has a huge stake in making sure the technology is adopted.  

Only a few months ago, for example, one of the leading manufacturers of technology products for law enforcement agencies in the US, Axon, announced a new program to equip every police officer in the country with a free body camera for a one-year trial, and the company claims to have already generated interest from hundreds of police agencies. 

The problem? With next-to-no rules in place at a national level to control the spread of these tools in police departments, said Jo, much of the adoption of new technologies is left completely unchecked. 

"As the American here I suppose I must rely on the terrible analogy of saying we are the Wild West when it comes to these technologies, meaning that there has been outright experimentation in the US in respect to many different kinds of technologies," said Jo.  

"We've seen adoption of many kinds of technologies all around the US kind of on a case-by-case basis." 

This can partly be attributed to US-specific hierarchies and the distribution of power between federal and local governments, which means that there is no nationwide rule that can apply to every police department. But the issue of unsupervised police technology is far from being US-specific. 

Across the Atlantic, a recent report highlighted similar problems in UK policing. The Committee on Standards in Public Life found that new technologies are introduced in law enforcement agencies with very little oversight, and often with no clear process for evaluating, procuring or deploying the tools. 

Police algorithms, as a result, are often used with little transparency, to the point that citizens might not even be aware that a particular technology is being used against them. 

Rosamunde Elise Van Brakel, a digital criminologist at the Vrije Universiteit Brussel, painted a similar picture in Belgium. "In Belgium it is very unclear how the procurement is done, there is no transparency about the rules, if they have to abide to certain steps in procurement with regards to the police," said Van Brakel. "It's all very unclear and there is no public information to be found about how decisions are made." 

This is problematic because examples of the misuse of technology in police departments abound, and they are now coming to the fore. Earlier this year, for example, police in Detroit, Michigan, were sued by a man who was wrongfully arrested after a facial-recognition algorithm mistook him for a shoplifter.

The victims of flawed algorithms in policing are likely to be from communities that have historically been discriminated against: multiple studies from established institutes like MIT or Harvard, for instance, have demonstrated that facial recognition platforms have particular difficulty in distinguishing the faces of people with darker skin. 

In 2017 in the UK, for example, police officers in Durham started using an algorithm called the Harm Assessment Risk Tool (HART), which predicted the risk of re-offending for individuals who had been arrested, based on the data of 104,000 people arrested in the city over a five-year period.  

Among the data used by HART were suspects' age, gender and postcode; and because geographical information can act as a proxy for the make-up of particular communities, the decisions made by HART risked being biased against those communities.  
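HART was reported to be a random-forest model trained on custody records, but the sketch below is not that model: it is a synthetic illustration (made-up data, a hypothetical postcode encoding) of how a feature like postcode can come to dominate such a model's predictions when the historical data it learns from is already skewed by geography.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative sketch only: synthetic data, not the real HART model or its
# training set. The point is that when historical arrest data is skewed by
# postcode, a model trained on it learns and reproduces that skew.
rng = np.random.default_rng(42)
n = 5_000
age = rng.integers(18, 60, size=n)
gender = rng.integers(0, 2, size=n)          # encoded 0/1
postcode_area = rng.integers(0, 10, size=n)  # 10 hypothetical postcode districts

# Synthetic "re-offended" label that depends heavily on postcode area,
# mimicking historically over-policed neighbourhoods in the training data.
p = 0.1 + 0.06 * postcode_area
reoffended = rng.random(n) < p

X = np.column_stack([age, gender, postcode_area])
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, reoffended)

# The learned importances show postcode dominating the prediction, which is
# how geography becomes a stand-in for the communities living there.
for name, importance in zip(["age", "gender", "postcode_area"],
                            model.feature_importances_):
    print(f"{name}: {importance:.2f}")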

And as these missteps multiply, so do citizens' concerns.  

"What we are beginning to see are limited individual cases where individuals who are being criminally prosecuted are raising questions about a particular technology that is being used against them and trying to find out something about how that technology is being used," said Jo. 

Last year, a UK citizen named Ed Bridges won a court case against South Wales Police (SWP) after he complained that he was filmed without his consent by a facial recognition van. The court found that the use of live facial recognition breached privacy rights, data protection laws and equality laws, and that tighter rules were needed to manage the deployment of facial recognition technologies.   

The Bridges case directly led to a redraft of the rules surrounding surveillance cameras in the country, published earlier this year. The updated Surveillance Camera Code of Practice provides new guidance on the use of live facial recognition, drawing specifically on lessons learned from the SWP fiasco. 

For Van Brakel, this new awareness of surveillance technologies is linked to the COVID-19 pandemic, and the profusion of digital tools that governments developed to cope with the crisis, ranging from contact-tracing apps to vaccine passports. 

"The public debate has really kicked off with the pandemic," said Van Brakel. "Citizens feel happy if technologies are used in a targeted way, in the context of anti-terrorism or organized crime. But now, with the pandemic, what has been happening is the technologies are focusing on the whole population, and people are questioning the government. And we see there is a clear loss of trust in the government." 

In France, for example, the government trialed facial-recognition software in a Paris metro station, with six cameras that could identify passengers who had failed to wear a mask. Barely a week after the experiment began, the French data protection agency CNIL put an end to the trial, condemning the privacy-intrusive nature of the technology, and the cameras were removed. 

In another sign of change coming, the EU Commission recently published draft regulations on the use of artificial intelligence, which included a ban on some forms of facial recognition by law enforcement agencies. 

It remains to be seen whether rules will soon be in place to keep the use of algorithms by police departments in check. In the meantime, citizens and civil society groups are making their voices heard, and they are unlikely to quiet down.
