Facial recognition software used by the Metropolitan Police in the UK has returned false positives in more than 98 percent of alerts generated, according to The Independent. The UK’s biometrics commissioner, Professor Paul Wiles, says that legislation to govern the technology is “urgently needed.”
The technology scans faces in a live video feed and compares them against images stored in a reference library or watch list. It has been deployed at large events such as the Notting Hill Carnival and a Six Nations rugby match, The Independent reports.
The Met’s system produced 104 alerts, of which only two were later confirmed as positive matches. The force says it does not consider the inaccurate matches “false positives” because each alert was checked a second time after it was generated.
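Those figures are where the “more than 98 percent” headline number comes from: 102 of the 104 alerts were wrong. A quick arithmetic check (a sketch for illustration, not from the article):

```python
# Verify the reported false positive rate:
# 104 alerts total, 2 later confirmed as genuine matches.
alerts = 104
confirmed = 2

false_positives = alerts - confirmed          # 102 incorrect alerts
rate = false_positives / alerts * 100         # share of alerts that were wrong

print(f"{false_positives} of {alerts} alerts incorrect = {rate:.1f}%")
# 102 / 104 comes to roughly 98.1 percent, consistent with "more than 98 percent"
```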
The South Wales Police uses a different facial recognition program, which has returned more than 2,400 false positives across 15 deployments since June 2017. Wiles adds: “I have told both police forces that I consider such trials are only acceptable to fill gaps in knowledge and if the results of the trials are published and externally peer-reviewed. We ought to wait for the final report, but I am not surprised to hear that accuracy rates so far have been low as clearly the technology is not yet fit for use.”
The Met says there is no current end date for its facial recognition experiment, and it has made no arrests through the system.