DeepMind doesn’t fully understand the complexity of the problems it is trying to address and needs to think through the broader implications of its work, a panel in the UK has said.
Google’s healthcare arm last year appointed an independent panel to assess its work, and the team of nine - including former GDS man Mike Bracken and ex-Lib Dem MP Julian Huppert - have just published their first annual report.
In it, the panel said it was concerned that DeepMind did not fully understand the “scale and complexity of the problems” it was trying to fix with its technology - currently being used to help identify patients at risk of acute kidney injury and research eye disease.
The company needs to increase its broad engagement with clinicians in different departments, academics and other healthcare bodies, the panel said.
DeepMind should also consider the broader implications of its work - both of the long-term bid to increase use of artificial intelligence in healthcare and the short-term rollout of kidney injury app Streams to other hospitals.
For instance, the panel said it was concerned some hospitals might feel Streams was “parachuted in”, or that it might be used for performance management or in litigation.
On data security, the panel found 10 low-severity vulnerabilities - two in the Streams app and eight in its web API - and one medium-severity vulnerability in its data centre server build.
It recommended DeepMind mitigate these, as well as offering the safe advice to "stay abreast of the latest versions and patching".
Elsewhere, the report said that DeepMind had failed to carry out enough public engagement.
This is especially important given the perception that data processed by DeepMind might be shared with Google, the reviewers said.
In comments that allude to the Chocolate Factory’s insidious data-gathering activities, Huppert said that “‘good enough’ is not good enough for a company so closely linked to Google, a company that already reaches into every corner of our lives”.
Google should be held to higher standards
The report noted that a lot of existing hospital data handling is pretty dodgy - from the insecure storage of vast swathes of paper forms to doctors’ use of Snapchat to send scans to other clinicians.
But, Huppert said, just because that's the current state of affairs, it “doesn’t excuse” DeepMind’s failings in the area.
“It’s in DMH’s interest to live up to a high standard,” he said. “And it’s right DMH is held to higher standards, even if that means they are singled out as a lightning rod for public concerns.”
However, Phil Booth, co-ordinator of campaign group medConfidential, said: “The idea of holding Google to a higher standard is fine, but at the moment it hasn't even complied with the law.”
Referring to the deal made between the Royal Free NHS Trust and DeepMind - in which not all the 1.6 million patients were aware their identifiable data was being used - Booth added: "Google is so big and powerful that the idea they're blundering around without fully knowing the law is terrifying.”
On Monday, the Information Commissioner’s Office ruled that the Royal Free had failed to comply with the Data Protection Act when it transferred this data to the company.
But the panel steered clear of issuing a strong opinion on this - commenting that DeepMind had “acted only as a data processor”, while the Royal Free was the data controller. (It’s perhaps worth noting that the new EU General Data Protection Regulation places direct obligations on data processors for the first time.)
The panel said that the “lack of clarity” in the initial data sharing agreement between the hospital and DeepMind was concerning, but at the same time praised the company’s correction of the agreement - which came six months after the data sharing was stopped following a public outcry.
Booth said that, although the panel “shouldn’t interpret the law - that's the ICO's job - they could have considered the Royal Free deal more than they did. Ideally, there would have been a section that looked at how this went so badly wrong”.
Meanwhile, DeepMind’s relationship with other public bodies has been brought into question this week, with a report in The Conversation claiming that the company is eyeing up the data in the government’s flagship 100,000 Genomes Project.
According to Edward Hockings, a PhD student at the University of the West of Scotland, a Freedom of Information request has revealed that the body running the project, Genomics England, has met with Google.
The response said that there had been one meeting with Google in the past 12 months, which Genomics England described as a “general introductory meeting” at which DeepMind was discussed “among other subjects”.
However, meetings between government and big businesses are fairly common and one meeting a year does not scream ‘formal partnership’ at us.
For its part, Genomics England told The Reg that there were “no active conversations ongoing with Google or Google DeepMind” - but that it was important “for the future of medicine” that it meets and collaborates with the “most innovative emerging technologies and companies”. ®