
UK police lack framework for adopting new tech like AI and face recognition, Lords told

Governance structure is 'a bush, not a tree' – whatever that means

UK police forces have no overarching rules for introducing controversial technologies like AI and facial recognition, the House of Lords has heard.

Baroness Shackleton of the Lords' Justice and Home Affairs Committee said the group had found 30 organisations with some role in determining how the police use new technologies, but no single body to guide and enforce their adoption.

Under questioning from the Lords, Kit Malthouse, minister for crime and policing, said: "It is complicated at the moment albeit I think most [police] forces are quite clear about their own situation."

Malthouse admitted the governance structure for introducing new technologies in UK policing was "a bush, not a tree."

Seemingly pleased with the metaphor, he said: "Some people may say within that bush there is protection or within a tree, things become more assertive. Others may agree that the clarity of the tree is preferable."

Malthouse also pointed out there is a National Policing Digital Strategy, which claims to be a "new digital ambition for the UK police service to leverage digital technologies to build capability."

The minister said it would help police "make sure that they have the right infrastructure, the right governance to sort all this stuff out."

The government was also trying to rationalise the governance of some technologies by bringing oversight of biometrics and surveillance cameras together, he said.

On the use of data, the Information Commissioner's Office also had powers in this area. "The ICO grows as a body and you can see, over time, things migrating in that direction," said Malthouse.

Last year outgoing Information Commissioner Elizabeth Denham took the unusual step of warning about the future independence of the ICO in light of the government's proposals for changing data legislation.

Ultimately, chief constables were responsible for the use of technologies and data by their own forces.

"The chief constable has to be accountable before the law, and that normally focuses minds," Malthouse told the committee.

Bias in AI tools is already a concern to police, according to the Royal United Services Institute, a defence and security thinktank.

In a report last year, it said officers were concerned such software may "amplify" prejudices, meaning some groups could become more likely to be stopped in the street and searched.
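To make the amplification worry concrete, here is a minimal, hypothetical sketch of the feedback loop such reports describe: a "hot spot" model that sends patrols wherever past records are highest. The area names, rates, and allocation rule are all invented for illustration and are not drawn from any real policing system.

```python
import random

# Hypothetical sketch of a predictive-policing feedback loop. Two areas
# have IDENTICAL true offence rates; area A merely starts with more
# recorded stops. All names and numbers are invented for illustration.

random.seed(0)

TRUE_OFFENCE_RATE = 0.1       # the same in both areas
PATROLS_PER_ROUND = 100

records = {"A": 60, "B": 40}  # historical recorded stops (initially skewed)

for round_no in range(1, 11):
    # Greedy allocation: patrol the area with the most recorded stops.
    hot_spot = max(records, key=records.get)
    new_stops = sum(
        random.random() < TRUE_OFFENCE_RATE for _ in range(PATROLS_PER_ROUND)
    )
    records[hot_spot] += new_stops
    share_a = records["A"] / sum(records.values())
    print(f"round {round_no}: {records}, A's share = {share_a:.0%}")

# Area B generates no new records because it is never patrolled, so the
# model keeps "confirming" that A is the problem area: the initial skew
# in the data is amplified rather than corrected, despite identical
# underlying offence rates.
```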

AI has already found its way into policing in the UK. According to a blog post from the Parliamentary Office of Science and Technology, Durham Constabulary's Harm Assessment Risk Tool uses machine learning to predict how likely an offender is to reoffend in the next two years. Police have also trialled facial-recognition technology to identify people automatically from live video footage (such as CCTV).
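For a sense of what such a tool involves under the hood, below is a minimal, hypothetical sketch of a reoffending-risk classifier in the same general mould: a model trained on historical case features to output a risk score. The features, data, and model choice are invented for illustration and are not HART's actual inputs or code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical sketch of a reoffending-risk classifier. Features and
# data are synthetic; this is NOT HART's actual model, inputs, or code.

rng = np.random.default_rng(0)
n = 1000

# Invented case features: age, number of prior offences, and months
# since last offence. Real tools use many more (and more contentious)
# inputs.
X = np.column_stack([
    rng.integers(18, 70, n),   # age
    rng.integers(0, 15, n),    # prior offences
    rng.integers(0, 120, n),   # months since last offence
])

# Synthetic label: "reoffended within two years" (purely simulated).
logits = 0.3 * X[:, 1] - 0.02 * X[:, 2] - 0.03 * (X[:, 0] - 18)
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# The output is a probability, which such tools typically bucket into
# risk bands (e.g. low / moderate / high) for officers to act on.
probs = model.predict_proba(X_test)[:, 1]
print("accuracy:", model.score(X_test, y_test))
print("example risk scores:", probs[:5].round(2))
```

The catch, and one root of the bias concern above, is that the training label really means "was recorded as reoffending", so any skew in past enforcement is carried straight into the scores.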

The Lords' committee was also concerned that police forces did not always have the capacity to evaluate new technologies.

"Some of our witnesses have worried that across that spread of forces, not all will have the capacity to assess and evaluate this new technology being sold to them by some pretty persuasive entrepreneurs in many cases," Lord Peter Ricketts said.

"I wonder what you think about the issue of some sort of central body that could undertake assessing and type marking of technologies so that individual police forces could then go ahead and procure it with more confidence."

Malthouse said that while there was some central evaluation of more mature technologies, central government needed to "be slightly careful not to stifle innovation, or indeed people willing to try things." ®
