DeepMind Health told to explain business model, relationship to Google
Review panel: We don't think AI firm has a hidden agenda, but…
Alphabet-owned AI company DeepMind Health needs to clarify its relationship with Google and explain how it plans to turn a profit, the firm's independent review panel has said.
In a report (PDF) published today, the group told DeepMind Health – which has run into controversy for its work with the NHS, especially the Streams project that slurped 1.6 million patient records – to be more transparent about its business model.
The nine-strong panel, appointed by DeepMind Health in a bid to demonstrate its commitment to openness, warned that without such clarity the public would suspect a hidden agenda.
Central to the panel's second annual report on the firm are 12 principles for DeepMind Health, which are described as a set of expectations and a framework the panel will use to assess the firm's future activities.
They include calls for it to be anti-monopoly and only turn a reasonable profit – not things stablemate Google, or many tech giants, are famed for.
The panel underlined the point, warning that in the current climate of distrust it wouldn't take much for the firm to lose whatever public trust it might have.
"Given the current environment, and with no clarity about DeepMind Health's business model, people are likely to suspect that there must be an undisclosed profit motive or a hidden agenda," the report said.
Although it emphasised that it doesn't believe there is such an agenda – and that DeepMind Health hasn't made any revenues on its NHS projects yet – the panel noted that there would be "very significant concerns" if the business model involved targeted advertising or selling data.
In response, DeepMind Health said (PDF) that it was working on its longer-term business model, but that "rather than charging for the early stages of our work, our first priority has been to prove that our technologies can help improve patient care and reduce costs".
Insulate yourself from Alphabet, avoid vendor lock-in
The panel was equally concerned about DeepMind Health's relationship with Alphabet and Google, saying it needed to offer much more clarity – an issue also raised in its previous report.
"There is still a very notable gap in communication about their relationship to Google," the report said. "We noted this last year and it is still as conspicuous by its absence now as it was then."
Moreover, the panel questioned whether the company would be able to maintain that level of separation, and advised it to consider how to make this a more concrete long-term commitment.
"To what extent can DeepMind Health insulate itself against Alphabet instructing them in the future to do something which it has promised not to do today?" the reviewers asked.
"Or, if DeepMind Health's current management were to leave DeepMind Health, how much could a new CEO alter what has been agreed today?"
The firm needs to find a way of "entrenching its separation" from Alphabet, and that firm's AI arm, DeepMind – which does make a profit – "more robustly", the panel said.
In addition, the health business needs to find a way to be transparent about its business model and "stick to that without being overridden by Alphabet".
Similarly, the business needs to be clearer about the pledges it has already made. For instance, DeepMind Health brags that data "will never be connected to Google accounts or services" – but one of its projects uses a Google cloud service.
Again, the panel stressed that it believed the service was chosen for its high level of security, but said the arrangement "might lead some to think that this promise was already being broken". The company should work on its language so people have a better understanding of exactly what such statements mean.
Beyond its relationship with the search engine giant and its parent firm, the panel also asked DeepMind Health to commit to other measures that most vendors would simply laugh at.
That includes taking steps to make sure health organisations don't get locked in to its products.
"There are many examples in the IT arena where companies lock their customers into systems that are difficult to change or replace," the panel intoned sagely.
"We do not want to see DeepMind Health putting itself in a position where clients, such as hospitals, find themselves forced to stay with DeepMind Health even if it is no longer financially or clinically sensible to do so; we want DeepMind Health to compete on quality and price, not by entrenching legacy positions."
Expansion of Streams app at the Royal Free
Unlike last year's review, the latest assessment placed less emphasis on information governance and data protection issues, which were under heavy scrutiny in 2017 thanks to concerns about the legality of the Royal Free deal.
However, the panel did say that the firm needed to "continue to be very cautious" about information governance to avoid a second wave of public concern, and to ensure it makes public all its contracts with public sector partners – something that has not yet happened in every case.
Indeed, the Royal Free has only just handed over to journalists its Privacy Impact Assessment of Streams – the app that aims to help doctors detect acute kidney injury. Despite the document being drawn up last summer, the hospital declined an FoI request for its release in November.
The assessment is for the extension of the Royal Free NHS trust's use of Streams to Barnet Hospital.
At the moment, the app is in use at the Royal Free hospital, but a spokesperson for the trust told The Register that – in light of an audit published earlier this week – it was planning to roll it out to Barnet soon.
"Given the conclusion of the audit that we are processing information lawfully, we will be looking to do so in the near future," a spokesperson said.
The trust also confirmed that it was in the process of terminating an historic memorandum of understanding with DeepMind Health that covered a now-scrapped AI project using depersonalised data, as recommended by the audit. ®