Right to contest automated AI decision under review as part of UK government data protection consultation

Right not to be subjected to solely automated decisions might not be keeping pace with 'data-driven economy' says document

The UK government has launched a consultation that suggests it could water down individuals' rights to challenge decisions made about them by artificial intelligence.

In documents released today, the Department for Digital, Culture, Media & Sport (DCMS) said that the need “to provide human review [of AI decisions] may, in future, not be practicable or proportionate.”

In the UK’s current implementation of the EU’s General Data Protection Regulation (via the Data Protection Act 2018), people have the right not to be subject to a solely automated decision-making process that has significant effects on them. However, these rights should be reviewed, according to a consultation [PDF] launched by the DCMS.

In a comprehensive review of UK data protection law and data strategy, the department said the “issues need to be viewed in the context that the use of automated decision making is likely to increase greatly in many industries in the coming years."

The DCMS went on: “The need to maintain a capability to provide human review may, in future, not be practicable or proportionate, and it is important to assess when this safeguard is needed and how it works in practice."

“It is critical that UK GDPR's provisions for automated decision-making and profiling work for organisations and for individuals. It is therefore important to examine whether Article 22 and its provisions are keeping pace with the likely evolution of a data-driven economy and society, and whether it provides the necessary protection,” it continued.

The consultation then points to the much-derided Taskforce on Innovation, Growth and Regulatory Reform (TIGRR) document from May this year, which recommended that Article 22 of the UK GDPR, which sits alongside the Data Protection Act 2018, should be removed.

Instead, use of solely automated AI systems should be allowed on the basis of “legitimate interests or public interests”. The government consultation went on to ask participants whether they agree.

Richard Cumbley, partner and global head of law firm Linklaters’ technology practice, said: "Removing the current provisions entirely are likely to look odd when both the EU and China are seeking to regulate this area, to protect consumers better.

"Rather than removing the provisions in UK GDPR, it would however be possible to focus the area where the regulation applies to those areas of real potential harm, and in particular remove concerns over the creation of training sets, without removing the current provisions. That would avoid throwing the baby out with the bath water."

UK law is currently in line with the EU's GDPR, and it is on that basis that an adequacy decision from the political and trading bloc allows data sharing between the UK and the EU post-Brexit.

Any divergence from the EU data protection principles could have serious consequences, including the loss of data adequacy, Cumbley pointed out.

“A loss of adequacy for the UK would, in data terms, largely cut us off from the EU. This would have immediate and significant real-world effects and result in costly new compliance measures,” he said.

The consultation paper also takes a swipe at Article 5, which states among other things that data should be "collected for specified, explicit and legitimate purposes" and be "adequate, relevant and limited to what is necessary."

The UK government is proposing to implement “a more flexible and risk-based accountability framework which is based on privacy management programmes”.

“Under this framework, organisations would be required to implement a privacy management programme tailored to their processing activities and ensure data privacy management is embraced holistically rather than just as a 'box-ticking' exercise,” the consultation document said.

However, Georgina Kon, technology and media partner at law firm Linklaters, pointed out earlier this year that the EU would likely take a dim view of any tampering with Article 5, which is fundamental to its principles, and that such a change could undermine the adequacy decision.

Incidentally, Article 5 was also a target in the TIGRR report, which was penned earlier this year by a group of pro-Brexit MPs headed up by Iain Duncan Smith.

Secretary of state Oliver Dowden said: “Now that we have left the EU, we have the freedom to create a bold new data regime: one that unleashes data’s power across the economy and society for the benefit of British citizens and British businesses whilst maintaining high standards of data protection.”

The launch of the consultation coincides with plans for a new governance model for the Information Commissioner’s Office – the UK’s data watchdog – including an independent board and chief executive to mirror the governance structures of other regulators.

“Reforms will broaden the remit of the ICO and empower the Information Commissioner to champion sectors and businesses that are using personal data in new, innovative and responsible ways to benefit people’s lives in areas such as healthcare - building on the use of data in tackling Covid-19 - and financial services,” the ICO release said.

The move follows the selection of John Edwards as the government’s preferred candidate as the new Information Commissioner. He is set to replace Elizabeth Denham.

Jim Killock, executive director of campaign group the Open Rights Group, said: "The government's proposal would make it easier for companies to use and abuse your health data, and easier for government to run so-called 'mutant algorithms' to decide your future.

"They are inviting thousands of health, insurance, and advertising lobbyists to rip up your rights in the name of innovation.

"And to top it all they want to stitch up the way the commissioner of the ICO is appointed in future so they can keep it controlled by Conservative-friendly industry executives," Killock said. ®
