Brit law firm files suit against Google and DeepMind over use of hospital patients' data

Royal Free Hospital saga continues as representative action brought


A UK law firm is bringing legal action on behalf of patients it says had their confidential medical records obtained by Google and DeepMind Technologies in breach of data protection laws.

Mishcon de Reya said today it planned a representative action on behalf of Mr Andrew Prismall and the approximately 1.6 million individuals whose data was used as part of a testing programme for medical software developed by the companies.

It told The Register the claim had already been issued in the High Court.

DeepMind, acquired by Google in 2014, worked with the search software giant and Royal Free London NHS Foundation Trust under an arrangement formed in 2015.

The law firm said that the tech companies obtained approximately 1.6 million individuals' confidential medical records without their knowledge or consent.

The Register has contacted Google, DeepMind and the Royal Free Hospital for their comments.

"Given the very positive experience of the NHS that I have always had during my various treatments, I was greatly concerned to find that a tech giant had ended up with my confidential medical records," lead claimant Prismall said in a statement.

"As a patient having any sort of medical treatment, the last thing you would expect is your private medical records to be in the hands of one of the world's biggest technology companies.

"I hope that this case will help achieve a fair outcome and closure for all of the patients whose confidential records were obtained in this instance without their knowledge or consent."

The case is being led by Mishcon partner Ben Lasserson, who said: "This important claim should help to answer fundamental questions about the handling of sensitive personal data and special category data.

"It comes at a time of heightened public interest and understandable concern over who has access to people's personal data and medical records and how this access is managed."

The law firm argued the action would be an important step in addressing the "very real" public concerns about large-scale access to, and use of, private health data by technology companies. It said the case also raises questions about the precise status and responsibility of such technology companies in the data protection context, both in this instance and potentially more generally.

In 2017, Google's use of medical records from the hospital's patients to test a software algorithm was deemed legally "inappropriate" by Dame Fiona Caldicott, the then National Data Guardian at the Department of Health.

In April 2016, it was revealed that the web giant had signed a deal with the Royal Free Hospital in London to build an application called Streams, which can analyse patients' details and identify those with acute kidney injury. The app uses a fixed algorithm, developed with the help of doctors, so it is not technically AI.

The software – developed by DeepMind, Google's AI subsidiary – was first tested with simulated data. But it was tested again using 1.6 million sets of real NHS medical files provided by the London hospital, and not every patient was aware that their data was being given to Google to test the Streams software. Streams has since been deployed in wards, and thus now handles real people's details, but during development it used live medical records as well as simulated inputs.

Dame Fiona told the hospital's medical director, Professor Stephen Powis, that he had overstepped the mark, and that patients had given no consent for their information to be used in this way pre-deployment.

A subsequent Information Commissioner's Office investigation found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.

In a data-sharing agreement uncovered by the New Scientist, Google and its DeepMind artificial intelligence wing were granted access to current and historic patient data at three London hospitals run by the Royal Free NHS Trust. ®
