December 21 2024

Eight out of 10 suspects identified by Met’s facial recognition ‘innocent’

'Data path' by r2hox, from Flickr, Creative Commons

New research into the Metropolitan police’s use of facial recognition technology has found that more than eight out of 10 suspects (81%) flagged by the system are innocent. The authors of the report, which was commissioned by Scotland Yard, conclude that without explicit legal authorisation it is ‘highly possible’ that the ‘live facial recognition’ trial process would be held unlawful if challenged before the courts.

The technology allows for the real-time biometric processing of video imagery in order to identify individuals. Its use raises various human rights concerns. For example, the Surveillance Camera Commissioner has noted that the significant future capability of facial recognition software may prove even more intrusive on citizens’ privacy than some forms of covert surveillance.

The anti-surveillance group Big Brother Watch has, moreover, compared the use of facial recognition to a large-scale identity check, equivalent to checking papers or taking fingerprints at a physical checkpoint.

Between 2016 and 2019 the Met conducted a total of 10 test deployments trialling facial recognition. Written by Professor Peter Fussey and Dr Daragh Murray of the University of Essex, the report evaluates the accuracy of the technology at recognising the faces of individuals recorded on watchlists. The authors find that facial recognition matches are verifiably correct in fewer than one in five instances.
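
The headline statistic is simple arithmetic on the trial’s match counts. Below is a minimal sketch assuming the counts widely reported from the Essex study (42 computer-generated matches, of which eight were verifiably correct); these specific numbers do not appear in this article and are used for illustration only.

```python
# A minimal sketch of the arithmetic behind the headline figures.
# Assumed counts (widely reported from the Essex study, not stated in
# this article): 42 computer-generated matches, 8 verifiably correct.
matches = 42           # alerts generated by the live facial recognition system
verified_correct = 8   # alerts confirmed as genuine watchlist matches

accuracy = verified_correct / matches   # ~0.19: correct in fewer than one in five instances
error_rate = 1 - accuracy               # ~0.81: the 'eight out of 10 (81%)' figure

print(f"Verifiably correct matches: {accuracy:.0%}")   # -> 19%
print(f"Incorrect matches:          {error_rate:.0%}") # -> 81%
```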

Human rights law requires that any interference with individuals’ rights be in accordance with the law, pursue a legitimate aim and be ‘necessary in a democratic society’. As established in S and Marper v UK – a case concerning the retention of DNA, fingerprints and cellular samples – police forces must ‘strike the right balance’ between the pursuit of a policy aim and interference with individual rights. Because facial recognition involves biometric processing, it engages both data protection and the right to private life.

The report finds that the implicit legal authorisation claimed by the Metropolitan police for the use of facial recognition – coupled with the absence of publicly available, clear, online guidance – is likely inadequate when measured against the ‘in accordance with the law’ requirement. Of the legal sources cited by the Met, only the common law and the Protection of Freedoms Act 2012 could potentially establish an implicit legal basis for the technology. This legal ambiguity in turn diminishes the ‘foreseeability’ of how facial recognition technology is used.

In response, Duncan Ball, the Met’s deputy assistant commissioner, said the force was ‘extremely disappointed with the negative and unbalanced tone of this report’. ‘We have a legal basis for this pilot period and have taken legal advice throughout,’ he said. ‘We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.’

Facial recognition technology may also fall short of being ‘necessary in a democratic society’. As established by the Surveillance Camera Commissioner’s March 2019 guidance on ‘Police Use of Automated Facial Recognition Technology with Surveillance Camera Systems’, the Met must prepare impact and risk assessment documents. 

In the view of the report’s authors, however, these have been inadequate. There has been a lack of effective consideration of alternative measures, and the criteria for including ‘wanted’ persons on watchlists contain ‘significant ambiguity’. Reduced clarity risks eroding public trust, says the report.

As the European Court of Human Rights recently held in Big Brother Watch v UK, the law must be sufficiently clear such that the public know when and how public authorities are empowered to use surveillance methods.

Despite the Met’s claims that the deployments were on a trial basis, there was in fact no clear distinction between the research objectives of the trial and the operational use of the technology. Professor Fussey and Dr Murray wrote: ‘Treating live facial recognition camera avoidance as suspicious behaviour undermines the premise of informed consent.’

‘The arrest of live facial recognition camera-avoiding individuals for more minor offences than those used to justify the test deployments raises clear issues regarding the extension of police powers and of “surveillance creep”.’

Public concern over discrimination, in both the technical performance and the police deployment of live facial recognition, is also widespread. Earlier this year, for example, San Francisco banned the use of facial recognition technology by city government agencies. The report contends that the prohibition on discrimination requires police forces to take active measures to ensure rights compliance.

Hannah Couchman, policy and campaigns officer at the human rights group Liberty, said it would ‘display an astonishing and deeply troubling disregard for our rights if the Met ignored this independent report and continued to deploy this dangerous and discriminatory technology’.

The authors further argue that the use of facial recognition technology may have a ‘chilling effect’ on democratic participation whereby individuals refrain from lawfully exercising their democratic rights because of a fear of police monitoring. This potentially inhibits their right to freedom of expression. 

The first court case against police use of facial recognition began in May in Cardiff. According to Liberty, the South Wales force will use facial recognition at the Wales National Airshow in Swansea this Saturday.
