Info Commish tells us we shouldn't let artificial ignorance make all our decisions

Reckons GDPR will help us challenge algo-driven outcomes

Algorithms should not be solely responsible for criminal sentencing, and a change in the law may be required to open up public data sets involving health information.

These were just some of the topics Information Commissioner Elizabeth Denham touched on today in a wide-ranging Parliamentary hearing about the use of AI in decision-making.

Denham told MPs on the Science and Technology Committee that companies and organisations must be able to explain how decisions are made by machines, but stopped short of saying they should be required to make that process available to the public, citing concerns over commercial confidentiality.

"When dealing with public-sector data and the use of AI systems, there may be more of an argument for publication of [that] data as long as it's not personal data," she said.

When it comes to using medical information for the purposes of research, she said that a change in law may be required to open up data sets for broad use of AI. The ICO previously said that the Royal Free NHS Foundation Trust failed to comply with the UK's Data Protection Act when it provided 1.6 million patient details to Google's DeepMind.

Asked what her biggest concerns were for the future use of AI in decision-making, she pointed to any area in which it has a significant impact on people's lives.

"AI decisions in the criminal justice system could be part of [the] decision [making process], but there needs to be human intervention around sentencing and parole."

Software has been developed to predict how likely a criminal is to reoffend, and is used by the courts to hand out appropriate punishments. However, it has been criticised for having a racial bias.

Denham was confident that the forthcoming GDPR will bring in new rules and rights to protect individuals from unfair algorithmic decisions.

"The most important change in data protection law is accountability – organisations have to identify risks, take steps to mitigate them and explain that to the regulator."

However, she said the ICO will need to "upskill" to ensure it is able to perform "algorithmic auditing", adding that the body was working with the Alan Turing Institute to create a framework for understanding the "explainability" of algorithms. "Opaque algorithms are not going to cut it."

On the subject of the ICO's probe into the use of algorithms to target individuals for political campaigns, Denham said it was "probably the most complicated investigation the ICO has ever taken" as it involved various companies, organisations and social media platforms.

She said the body has served an information notice on UKIP to compel it to release information relating to the EU referendum, and one other organisation whose name has not yet been published.

"From a data protection perspective we're looking at how that is happening and make it transparent to the public," she said. "We hope to reveal what is happening in terms of the use of data. It's not an investigation into fake news, bots, or financing, just the data protection issues of micro-targeting." ®
