Police Scotland urged to abandon facial scanning tech plans amid warnings it is 'not fit' and 'unethical'

Experts sound caution over Police Scotland’s push towards live facial recognition biometric software

Controversial new artificial intelligence-powered facial recognition software being considered by Scotland’s national police force is not fit for apprehending serious criminals, and is beset by ethical concerns that risk undermining public trust in law enforcement, a leading expert has warned.

Police Scotland is looking at the possibility of introducing live facial recognition technology (LFR) as part of sweeping reforms, with Chief Constable Jo Farrell insisting she wants to “open up conversations” about it and other biometric tools that would “enable us to tackle crime and keep people safe.”


Ms Farrell told The Scotsman last week that while she was “very alive” to privacy concerns around LFR - which compares a live camera feed, or multiple feeds, of faces against a predetermined watchlist - she considered it would be an “abdication” of her duties if she did not “keep pushing” to reach the point where the technology can be used “appropriately and without bias.”

The Metropolitan Police deploying live facial recognition technology in Croydon, south London, earlier this year. Picture: PA

Four years ago the issue was the subject of a report by Holyrood’s justice sub-committee, which criticised LFR’s “lack of accuracy” and concluded that it was “currently not fit for use by Police Scotland.” Last month, however, Police Scotland’s chief data and information officer, Andrew Hendry, met with Martyn Evans, chair of the Scottish Police Authority, to discuss the use of AI and, specifically, LFR. While no proposed timescale for its introduction has been announced, the force has been urged not to adopt the tech.

Angela Daly, a professor at the University of Dundee’s Leverhulme Research Centre for Forensic Science, and an internationally recognised authority in the regulation and governance of new technologies, said she had seen nothing to assuage her concerns over LFR.

“It is not fit for purpose in its deployments,” she told Scotland on Sunday. “For example, not catching the kinds of serious criminals the police claim, and instead identifying people who have committed lesser and non-violent crimes. When it is used it is not done in a proportionate and necessary manner, and it is generally unethical in its development and deployment.”


LFR, regarded by some policing figures as capable of achieving significant crime detection breakthroughs, is only routinely used in the UK by the Metropolitan Police and South Wales Police. The experience of those forces raises questions around its efficacy, given the legal challenges and criticism that have followed.

Police Scotland Chief Constable Jo Farrell said she wanted to ‘keep pushing’ with a view to using the technology. Picture: Andrew Milligan/PA Wire

In 2019, an independent report on the Met’s LFR rollout found facial recognition matches were deemed verifiably correct on just 19 per cent of occasions. The force has since said its LFR algorithm is exceeding accuracy expectations. Lindsey Chiswick, the Met’s director of intelligence, said earlier this summer the technology was “precise, efficient, and effective.” Even so, the Met is experiencing pushback. In June, Shaun Thompson, a black anti-knife crime activist who was threatened with arrest after the technology wrongly identified him, applied for a judicial review of LFR’s use.

Elsewhere, South Wales Police has deployed LFR on 119 occasions. The data shows there were 72 resultant arrests, but LFR also produced 2,833 false alerts. The Court of Appeal also found the Welsh force’s initial use of LFR breached privacy rights and broke equalities law, with no clear guidance on where it could be used, and who could be put on watchlists.
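Taken together, the South Wales figures imply that only a small fraction of LFR alerts led to an arrest. A back-of-envelope calculation, using only the numbers reported above and treating arrests as confirmed matches (an assumption made purely for illustration), runs as follows:

```python
# Illustrative precision estimate from the South Wales Police figures
# reported above. Assumption (for illustration only): each arrest counts
# as a correct match, and false alerts are the only incorrect matches.
arrests = 72          # alerts that led to an arrest
false_alerts = 2833   # alerts that flagged the wrong person

total_alerts = arrests + false_alerts
precision = arrests / total_alerts  # share of alerts pointing at the right person

print(f"{precision:.1%}")  # roughly 2.5% of alerts resulted in an arrest
```

On those assumptions, fewer than three in every hundred alerts produced an arrest, which is the efficacy gap critics point to.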

The scope of such watchlists is itself vexed. The Met’s definition includes individuals subject to court or bail orders, but also people “where there are reasonable grounds to suspect that the individual depicted is about to commit an offence.” Such ambiguity, say critics of LFR, is indicative of a flawed system. The campaign group Big Brother Watch said it had received reports of people wrongfully placed on watchlists being stopped.


Public consultation around the use of LFR in Scotland will be crucial. Dr Anna Bobak, a senior lecturer in psychology at the University of Stirling who contributed to the justice sub-committee’s work, said that while AI technology was an “important tool” in reducing crime and helping with menial tasks, LFR was “much more controversial.”

The Metropolitan Police and South Wales Police are the only UK forces to routinely use LFR, but Police Scotland is considering following their lead. Picture: PA

She said: “Everyday surveillance, given the bias that LFR often poses, seems unlikely to be accepted by the public. Britain embraces policing by consent, and as such, consent of the public, with special care taken to consult minority groups, should be sought to introduce live face recognition, be it routinely, or in exceptional circumstances.”

Unlike in the EU, where new legislation rules LFR to be an “unacceptable risk” for routine policing of public spaces, no explicit legal framework exists authorising its use in Scotland or the rest of the UK. The Scottish Biometrics Commissioner’s statutory code of practice says such tech must be used in a way that is both proportionate and strictly necessary. It also states that police must ensure that algorithms for biometric matching are “free from bias and are non-discriminatory on the grounds of race, gender, or any protected characteristic.” But is LFR up to the job? Not according to Dr Gideon Christian, a law professor at the University of Calgary, whose research into the technology’s ability to correctly identify people of different ethnicities found the highest error rate, around 35 per cent, for the faces of black women.

Questions around bias also extend to the way the tech is utilised. Prof Daly said there were ethical issues around the way LFR has been used in lower income areas in England, especially those with large black and minority ethnic populations. “Such deployments exacerbate bias in the technology itself by reinforcing structural bias in society as regards the policing of some and not others, which can lead to less trust in the police by the community,” she said.


Dr Bobak said it was important for forces to analyse the LFR software they use. “If the decision were to be taken to introduce LFR, an independent group of experts - for example, computer scientists and psychologists - should be consulted before purchasing such software, rather than relying purely on technical information provided by profit driven companies trying to sell their products.”

©National World Publishing Ltd. All rights reserved.