UK police lack framework for adopting new tech like AI and face recognition, Lords told

Governance structure is 'a bush, not a tree' – whatever that means


UK police forces have no overarching rules for introducing controversial technologies like AI and facial recognition, the House of Lords has heard.

Baroness Shackleton of the Lords' Justice and Home Affairs Committee said the group had found 30 organisations with some role in determining how the police use new technologies, but no single body to guide and enforce their adoption.

Under questioning from the Lords, Kit Malthouse, minister for crime and policing, said: "It is complicated at the moment albeit I think most [police] forces are quite clear about their own situation."

Malthouse admitted the governance structure for introducing new technologies in UK policing was "a bush, not a tree."

Seemingly pleased with the metaphor, he said: "Some people may say within that bush there is protection or within a tree, things become more assertive. Others may agree that the clarity of the tree is preferable."

Malthouse also pointed out there is a National Policing Digital Strategy, which claims to be a "new digital ambition for the UK police service to leverage digital technologies to build capability."

The minister said it would help police "make sure that they have the right infrastructure, the right governance to sort all this stuff out."

The government was also trying to rationalise governance of some technologies by bringing biometrics and surveillance cameras together, he said.

In terms of the use of data, the Information Commissioner's Office also had powers over the police. "The ICO grows as a body and you can see, over time, things migrating in that direction," said Malthouse.

Last year outgoing Information Commissioner Elizabeth Denham took the unusual step of warning about the future independence of the ICO in light of the government's proposals for changing data legislation.

Ultimately, chief constables were responsible for the use of technologies and data by their own forces.

"The chief constable has to be accountable before the law, and that normally focuses minds," Malthouse told the committee.

Bias in AI tools is already a concern to police, according to the Royal United Services Institute, a defence and security thinktank.

In a report last year, it said officers were concerned such software may "amplify" prejudices, meaning some groups could become more likely to be stopped in the street and searched.
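To illustrate the kind of "amplification" officers were worried about, here is a minimal, hypothetical sketch in Python of a feedback loop: if patrols follow the existing record, and only patrolled areas can add to that record, a small initial disparity in the data keeps growing even when the underlying rates are identical. The areas, numbers, and allocation rule are illustrative assumptions, not a description of any deployed system.

    import random

    # Hypothetical sketch, not any force's actual system: each day the single
    # available patrol goes to whichever area has the most recorded incidents.
    # Both areas have the SAME true incident rate, but only the patrolled area
    # can add to the record, so an initial gap in the data widens on its own.

    def simulate(days=1000, true_rate=0.3, seed=0):
        random.seed(seed)
        recorded = {"area_a": 5, "area_b": 4}          # small initial disparity in the data
        for _ in range(days):
            target = max(recorded, key=recorded.get)   # patrol follows the data
            if random.random() < true_rate:            # identical underlying rate everywhere
                recorded[target] += 1                  # but only the patrolled area is observed
        return recorded

    print(simulate())   # roughly 300 recorded incidents vs 4: the gap in the data has ballooned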

AI has already found its way into policing in the UK. According to a blog from the Parliamentary Office of Science and Technology, Durham Constabulary's Harm Assessment Risk Tool uses machine learning to predict how likely an offender is to reoffend in the next two years. Meanwhile, police have also trialled facial-recognition technology to identify people automatically from live video footage (such as CCTV).
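For readers unfamiliar with how such risk tools are put together, the sketch below shows, in broad strokes, how a reoffending-risk score might be produced from historical custody records using an off-the-shelf classifier. The feature names, toy data, and model choice are assumptions for illustration only, not Durham Constabulary's actual HART design.

    # Illustrative sketch only: features, data, and model choice are assumptions,
    # not the HART tool's actual design.
    from sklearn.ensemble import RandomForestClassifier

    # Toy historical custody records: [age, prior_offences, months_since_last_arrest]
    X_train = [
        [19, 4, 2],
        [45, 0, 60],
        [31, 2, 12],
        [23, 6, 1],
        [52, 1, 36],
    ]
    # Label: 1 if the individual reoffended within two years, 0 otherwise
    y_train = [1, 0, 0, 1, 0]

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    # Estimated probability of reoffending within two years for a new custody record
    new_record = [[27, 3, 6]]
    print(model.predict_proba(new_record)[0][1])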

The Lords' committee was also concerned that police forces did not always have the capacity to evaluate new technologies.

"Some of our witnesses have worried that across that spread of forces, not all will have the capacity to assess and evaluate this new technology being sold to them by some pretty persuasive entrepreneurs in many cases," Lord Peter Ricketts said.

"I wonder what you think about the issue of some sort of central body that could undertake assessing and type marking of technologies so that individual police forces could then go ahead and procure it with more confidence."

Malthouse said that while there was some central evaluation of more mature technologies, central government needed to "be slightly careful not to stifle innovation, or indeed people willing to try things." ®


Other stories you might like

  • SEC probes Musk for not properly disclosing Twitter stake
    Meanwhile, social network's board rejects resignation of one its directors

    America's financial watchdog is investigating whether Elon Musk adequately disclosed his purchase of Twitter shares last month, just as his bid to take over the social media company hangs in the balance. 

    A letter [PDF] from the SEC addressed to the tech billionaire said he "[did] not appear" to have filed the proper form detailing his 9.2 percent stake in Twitter "required 10 days from the date of acquisition," and asked him to provide more information. Musk's shares made him one of Twitter's largest shareholders. The letter is dated April 4, and was shared this week by the regulator.

    Musk quickly moved to try and buy the whole company outright in a deal initially worth over $44 billion. Musk sold a chunk of his shares in Tesla worth $8.4 billion and bagged another $7.14 billion from investors to help finance the $21 billion he promised to put forward for the deal. The remaining $25.5 billion bill was secured via debt financing by Morgan Stanley, Bank of America, Barclays, and others. But the takeover is not going smoothly.

    Continue reading
  • Cloud security unicorn cuts 20% of staff after raising $1.3b
    Time to play blame bingo: Markets? Profits? Too much growth? Russia? Space aliens?

    Cloud security company Lacework has laid off 20 percent of its employees, just months after two record-breaking funding rounds pushed its valuation to $8.3 billion.

    A spokesperson wouldn't confirm the total number of employees affected, though told The Register that the "widely speculated number on Twitter is a significant overestimate."

    The company, as of March, counted more than 1,000 employees, which would push the jobs lost above 200. And the widely reported number on Twitter is about 300 employees. The biz, based in Silicon Valley, was founded in 2015.

    Continue reading
  • Talos names eight deadly sins in widely used industrial software
    Entire swaths of gear relies on vulnerability-laden Open Automation Software (OAS)

    A researcher at Cisco's Talos threat intelligence team found eight vulnerabilities in the Open Automation Software (OAS) platform that, if exploited, could enable a bad actor to access a device and run code on a targeted system.

    The OAS platform is widely used by a range of industrial enterprises, essentially facilitating the transfer of data within an IT environment between hardware and software and playing a central role in organizations' industrial Internet of Things (IIoT) efforts. It touches a range of devices, including PLCs and OPCs and IoT devices, as well as custom applications and APIs, databases and edge systems.

    Companies like Volvo, General Dynamics, JBT Aerotech and wind-turbine maker AES are among the users of the OAS platform.

    Continue reading

Biting the hand that feeds IT © 1998–2022