The Metropolitan Police has confirmed it has made its first arrest using its controversial live facial recognition system.
Scotland Yard tonight announced a 35-year-old woman was detained in the busy Oxford Street area for failing to appear at court.
The woman had been previously charged with a serious assault on an emergency service worker, police said.
The technology uses live cameras to scan the faces of passers-by, checking them against a watchlist of wanted suspects.
Police have pressed ahead with plans to use the cameras at key locations in the capital, despite concerns over the technology's accuracy and fears it could compromise innocent people's privacy.
Trials of the cameras had previously taken place on 10 occasions in locations such as Stratford's Westfield shopping centre and the West End of London. South Wales Police have also carried out tests.
The Met said it tested the system during these trials using police staff whose images were stored in the database.
The results suggested that 70% of wanted suspects walking past the cameras would be identified, while only one in 1,000 people scanned generated a false alert.
But because the overwhelming majority of people scanned are not on any watchlist, even that low false-alert rate means false alerts can outnumber genuine matches.
An independent review of six of these deployments found that only eight out of 42 matches were "verifiably correct".
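The gap between the headline figures and the review's findings is a base-rate effect, which a back-of-envelope calculation makes clear. The crowd size and number of watchlisted people below are purely illustrative assumptions, not figures from the Met:

```python
# Back-of-envelope base-rate check (illustrative numbers only).
detection_rate = 0.70      # reported chance a watchlisted person is flagged
false_alert_rate = 1 / 1000  # reported false alerts per person scanned

crowd = 10_000             # hypothetical crowd scanned in one deployment
on_watchlist = 5           # hypothetical watchlisted people in that crowd

true_alerts = detection_rate * on_watchlist
false_alerts = false_alert_rate * (crowd - on_watchlist)
share_correct = true_alerts / (true_alerts + false_alerts)

print(f"true alerts: {true_alerts:.1f}, false alerts: {false_alerts:.1f}")
print(f"share of alerts that are correct: {share_correct:.0%}")
```

Under these assumed numbers, false alerts (about 10) outnumber true alerts (about 3.5), so only roughly a quarter of matches would be correct, broadly in line with the review's eight out of 42.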
The cameras are now planned to be in use for five to six hours at a time, with bespoke lists of suspects wanted for serious and violent crimes drawn up each time.
Cameras will be clearly signposted, covering a "small, targeted area", and police officers will hand out leaflets about the facial recognition scanning.
Assistant Commissioner Nick Ephgrave previously said the Met has "a duty" to use new technologies to keep people safe, adding that research showed the public supported the move.
"We all want to live and work in a city which is safe: the public rightly expect us to use widely available technology to stop criminals," he said.
"Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people's privacy and human rights. I believe our careful and considered deployment of live facial recognition strikes that balance."