A new set of guidelines for the use of CCTV facial recognition by police has been criticised as too broad and non-specific. The Surveillance Camera Code of Practice, updated for the first time in eight years, “provides guidance on the appropriate use of surveillance camera systems by local authorities and the police” including the use of automatic facial recognition (AFR), which compares people’s faces with a watch list, alerting police when there is a match.
The technology is controversial, giving police the ability to keep close watch on communities and to monitor the whereabouts of specific individuals placed on the watch list.
According to the government, the updated guidance takes account of recent legislation, “in particular Data Protection legislation, and the judgment in Bridges v South Wales Police.” The updates include: the need to consider adverse impacts on protected groups, a requirement to publish the categories that can put someone on the watch list, and an obligation to delete unused biometric data quickly.
Bridges v South Wales Police was a 2020 case concerning two occasions on which Ed Bridges’ face was scanned: while Christmas shopping in Cardiff in 2017, and while participating in an anti-arms protest outside the Motorpoint Arena in 2018. Arguing that South Wales Police’s use of AFR had caused him distress, Bridges took the force to court. Three Court of Appeal judges ruled in his favour, finding that there was no clear guidance on where AFR could be used or who could be put on the watch list. “Too much discretion is currently left to individual police officers,” the judges said.
Though the government claims the new guidance has considered the Bridges case, Megan Goulding, a lawyer for Liberty, believes “These guidelines fail to properly account for either the Court’s findings or the dangers created by this dystopian surveillance tool.”
Following the Bridges case, Tony Porter, the Surveillance Camera Commissioner at the time, developed 72 pages of guidance on the use of facial recognition. He describes the Home Office’s current update as “bare bones”, failing to provide “much guidance to law enforcement”, adding: “I don’t really think it provides a great deal of guidance to the public as to how the technology will be deployed.”
Automatic facial recognition has also come under criticism for being ineffective. In 2020 the Metropolitan Police scanned 13,000 people’s faces but made only one arrest as a result. Eight people were flagged by the technology in that period, and seven of them were entirely innocent. Silkie Carlo, director of Big Brother Watch, argued: “the police’s own data shows facial recognition surveillance is dangerously inaccurate and a waste of public money, let alone a serious assault on our civil liberties.”
Adding to her criticism of the new guidelines, Megan Goulding says: “The changes to this surveillance guidance show that it is impossible to regulate for the dangers created by a technology that is oppressive by design. The safest, and only, thing to do with facial recognition is to ban it.”
Philip English is a member of the YCL’s Manchester Branch