London’s Metropolitan Police will trial an automated facial recognition system to identify people at this weekend's Notting Hill Carnival as the government continues to drag its feet on the use of the technology.
This is only the second time that British cops have openly trialled live automated facial recognition (AFR) systems.
Last year, Leicestershire Police also trialled AFR at Download Festival – though the trial was later found not to have been part of the policing plan for the event, and police didn't bother assessing how effective it had been afterwards.
According to the Met, the AFR system at the Notting Hill Carnival “involves the use of overt cameras which scan the faces of those passing by and flag up potential matches against a database of custody images. The database has been populated with images of individuals who are forbidden from attending Carnival, as well as individuals wanted by police who it is believed may attend Carnival to commit offences.”
Speaking to The Register, the government's Surveillance Camera Commissioner, Tony Porter, said that "the Surveillance Camera Code of Practice requires relevant authorities such as Local Authorities and Police Forces to ensure they use surveillance cameras effectively, efficiently and proportionately."
"This is so communities can be sure that they are being protected by this technology rather than spied on. I would expect any organisation that is using tools like automatic facial recognition to do so transparently so members of the public know it is being used and what its use is for," explained Porter.
Even if the use of AFR complies with the code, the Met’s collection of custody images has been a greater source of controversy. In his annual report earlier this year, the Biometrics Commissioner warned that the Home Office was cruising for a lawsuit in this area, particularly after a High Court ruling in 2012, R (RMC and FJ) v MPS, in which Lord Justice Richards found:
[T]he just and appropriate order is to declare that the [Metropolitan Police's] existing policy concerning the retention of custody photographs … is unlawful. It should be clear in the circumstances that a 'reasonable further period' for revising the policy is to be measured in months, not years.
However, according to a Freedom of Information request made by pressure group Liberty last year, the Met confessed that in the three years since the ruling it had deleted the images of only 560 people because "the current I.T. system which holds MPS custody images was not designed or built to accommodate a complex retention policy."
In response to a Parliamentary question reported in the Birmingham Mail, Baroness Williams of Trafford reported that by 15 July this year, there were “over 19 million custody images, which may include images other than of faces, uploaded by forces onto the PND (Police National Database).”
“Of these, 16,644,143 had been enrolled in the facial image recognition gallery and are searchable using automated facial recognition software,” Williams revealed – a figure representing roughly a quarter of the UK’s entire population.
This area is expected to receive enhanced attention when the Home Office publishes its long-awaited Biometrics Review, as well as its Custody Images Review. Though both of these have been completed, the Home Office has not published them, which The Register’s sources have claimed is a result of redrafting the “rubbish” reports.
The expansion of AFR use in London follows another story last year in which The Register revealed the Met had begun buying up to 30,000 body cameras as part of its ambition to become “the most transparent police force in the world”.
In response to a request made under the Freedom of Information Act by campaigner Pippa King, the MPS claimed that its mugshot-matching facial recognition system could not be used with live footage, saying that facial images would have to be transferred to the police's facial recognition system after the event.
“If a match is made by the system” being used at Notting Hill Carnival, however, the Met’s officers “will be alerted and will seek to speak with the individual to verify their identity, making an arrest if necessary.” This strongly suggests that the system actually is capable of being used with live images.
The Met has promised to supply The Register with a comment. ®