Analysts find system often wrongly identifies people and could breach human rights law
Police are facing calls to halt the use of facial recognition software to search for suspected criminals in public after independent analysis found matches were only correct in a fifth of cases and the system was likely to break human rights laws.
Academics from the University of Essex were granted access to six live trials by the Metropolitan police in Soho, Romford and at the Westfield shopping centre in Stratford, east London.
They found the system regularly misidentified people who were then wrongly stopped. They also warned of “surveillance creep”, with the technology being used to find people who were not wanted by the courts. And they warned it was unlikely to be justifiable under human rights law, which protects privacy, freedom of expression and the right to protest.
Similar facial scanning software is being used in shopping centres, where it is embedded in advertising hoardings to track shoppers’ age, gender and even mood. It has also been deployed by other police forces in Manchester, Leicester and South Wales – where it will be used this weekend at the Swansea airshow. Officers will be scanning for “persons of interest” and “other persons where intelligence is required” as well as wanted criminals, the force said.
David Davis MP, a former shadow home secretary, said the research by Prof Peter Fussey and Dr Daragh Murray at the University of Essex’s Human Rights Centre showed the technology “could lead to miscarriages of justice and wrongful arrests” and poses “massive issues for democracy”.
“All experiments like this should now be suspended until we have a proper chance to debate this and establish some laws and regulations,” he said. “Remember what these rights are: freedom of association and freedom to protest; rights which we have assumed for centuries which shouldn’t be intruded upon without a good reason.”
The Neoface system used by the Met and South Wales police is supplied by Japanese company NEC, which markets the same technology to retailers and casinos to spot regular customers, and to stadium and concert operators to scan crowds for “potential troublemakers”.
Scotland Yard insisted its deployments were legal and successful in identifying wanted offenders, and that the public would expect it to trial emerging technology.
Deputy assistant commissioner Duncan Ball said the force was “extremely disappointed with the negative and unbalanced tone of this report”.