Defendant: South Wales Police
Claimant: Ed Bridges (Civil Rights Campaigner)
Location: South Wales
Claim: Facial recognition technology is biased, unreliable and an infringement of the right to privacy.
Campaigners are calling for South Wales police and other forces to stop using facial recognition technology.
Ed Bridges, a civil liberties campaigner, argues that the capturing of thousands of faces by the Welsh force was indiscriminate and disproportionate.
The South Wales police force said it was confident that “this is a judgment that we can work with” and said its use of the technology – a pilot for forces across England and Wales – was expected to continue with minor changes.
It said live deployments in the force’s area had resulted in 61 people being arrested for offences including robbery and violence, theft and court warrants. No unlawful arrests had been made, it added.
Facial recognition technology maps faces in crowds and compares them to images of people on a watchlist, which can include suspects, missing people and other persons of interest to the police.
Other forces have begun to follow South Wales’s lead. In February, the Met announced plans to deploy live systems, which automatically scan against 5,000 biometric profiles, in shopping centres and other crowded areas of London.
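The internals of systems such as NeoFace Watch are not public, but the matching process described above is commonly implemented by converting each detected face into a numeric "embedding" and comparing it against the embeddings of people on the watchlist. The sketch below is a minimal, hypothetical illustration of that comparison step, assuming embeddings are already available as lists of numbers; the names, threshold and similarity measure are illustrative assumptions, not details of any force's deployment.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors: 1.0 means identical
    # direction, 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_watchlist(face_embedding, watchlist, threshold=0.8):
    # Compare one scanned face against every watchlist entry and return the
    # best match above the threshold as (name, score), or None if no entry
    # is similar enough. The threshold of 0.8 is an illustrative assumption.
    best_name, best_score = None, threshold
    for name, ref_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_name is not None else None

# Hypothetical usage: two watchlist entries, one probe face close to the first.
watchlist = {
    "person_of_interest_a": [1.0, 0.0, 0.0],
    "person_of_interest_b": [0.0, 1.0, 0.0],
}
print(match_against_watchlist([0.9, 0.1, 0.0], watchlist))
print(match_against_watchlist([0.0, 0.0, 1.0], watchlist))  # no match
```

The choice of threshold is the operational crux: set it low and the system flags many innocent passers-by (false positives); set it high and it misses genuine matches. The bias concerns raised in the case turn on the fact that error rates at any given threshold can differ across demographic groups.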
The claimant argued that there is “too broad a discretion” left to police officers in applying the technology. South Wales police also breached its public sector equality duty by failing to properly investigate whether the facial recognition algorithms were biased in terms of race or sex.
Daragh Murray, a senior lecturer at the school of law at Essex University, said he believed the judgment meant police forces could not be confident that their use of the technology would survive legal challenge in future trials. “The only way this can be resolved for sure is if proper legislation is now introduced.”
Concerns have been raised that facial recognition technology can be biased in terms of race and gender, and in particular that it is less accurate at identifying black people – although few studies have been conducted by the UK authorities.
In 2018, a researcher at MIT’s Media Lab in the US concluded that software supplied by three companies made mistakes in 21% to 35% of cases for darker-skinned women. By contrast, the error rate for light-skinned men was less than 1%.
Following the ruling, Matt Jukes, the chief constable of South Wales police, said he recognised that “public confidence, fairness and transparency are vitally important”, and that “academic analysis” had been commissioned to look at the performance of the NeoFace Watch software, supplied by the Japanese firm NEC.
The South Wales force has already captured 500,000 faces, the overwhelming majority belonging to people not suspected of any wrongdoing. Bridges’ face was scanned while he was doing Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.
The 37-year-old said: “For three years now South Wales police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”
South Wales police said it did not intend to take the case to the supreme court, but would instead refine its policies.
Last year, the King’s Cross Central Limited Partnership became one of the first property companies to say it had used facial recognition software in two street cameras at its London site until 2018 for reasons of “public safety”, but after an outcry it said it had abandoned plans to deploy the controversial technology more widely.
Source: The Guardian