Sharing medical records with researchers: Assumed consent works in theory – just not yet in practice

The UK shows us how not to run an opt-out approach

Register Debate Welcome to the latest Register Debate in which writers and experts go head to head on technology topics, and you – the reader – choose the winning argument. The format is simple: we propose a motion, the arguments for the motion will run this Monday and Wednesday, and the arguments against on Tuesday and Thursday.

During the week you can cast your vote on which side you support using the poll embedded below, choosing whether you're in favour or against the motion. The final score will be announced on Friday, revealing whether the for or against argument was most popular. It's up to our writers to convince you to vote for their side.

This week’s motion is: Assumed consent is the right approach for sharing healthcare patients’ data, beyond their direct care. Or to put it another way: patient records should be shared with medical researchers on an opt-out basis.

The debate around the benefits of sharing medical data for the greater good versus individuals' expectations of confidentiality and consent has become heated, to say the least, over the last year and a half. But if consent is not just assumed, but informed, do we all stand to benefit? Our contributors serve up their own prescriptions, but you get to decide.

Arguing AGAINST THE MOTION today is Joe Fay, who has written about the technology business for 30 years and has edited publications in London and San Francisco.

There are undoubted theoretical benefits to the massive aggregation of health data to enable research into public health in general and new or rare conditions in particular. Collectivist countries such as Germany and Singapore are seen as having better served their citizens during the pandemic than more individualistic societies such as the US, Brazil, or the UK.

This surely strengthens the argument in favour of assumed consent, its proponents would argue. But individual choice matters, and MedConfidential’s Phil Booth eloquently demonstrated earlier this week why the use of data must be consensual, safe and transparent. And, crucially, informed. So, the NHS in England could, arguably should, demonstrate the benefits of an assumed consent approach with its General Practice Data for Planning and Research (GPDPR) programme, unkindly dubbed the “great GP data grab.”

As apparently the largest non-military organisation in the world, what the NHS does sets an example for healthcare systems around the planet. In the case of the latest NHS effort, citizens are reassured that the data will be pseudonymised, with unique codes replacing key identifying data. NHS Digital will be “able to use the software to convert the unique codes back to data that could directly identify patients in certain circumstances, and where there is a valid legal reason.”
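The scheme NHS Digital describes – unique codes standing in for identifying fields, convertible back only by the data controller – can be sketched roughly as keyed pseudonymisation. This is an illustrative sketch only, with hypothetical names and keys; it is not NHS Digital's actual implementation, and it shows why "pseudonymised" is weaker than "anonymised": whoever holds the key or mapping can re-identify patients.

```python
import hmac
import hashlib

# Hypothetical key material held by the data controller.
SECRET_KEY = b"held-by-the-data-controller"

def pseudonymise(identifier: str) -> str:
    """Replace an identifier with a deterministic unique code.

    Deterministic, so the same patient links up across datasets --
    which is precisely what makes the codes valuable, and risky.
    """
    digest = hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# The controller keeps a private mapping so codes can be converted
# back "in certain circumstances, and where there is a valid legal
# reason". Anyone who obtains this table (or the key) can do the same.
reidentification_table: dict[str, str] = {}

def record(identifier: str) -> str:
    code = pseudonymise(identifier)
    reidentification_table[code] = identifier
    return code

code = record("943 476 5919")        # hypothetical NHS number
original = reidentification_table[code]  # trivially reversed by the key-holder
```

The point the sketch makes: pseudonymisation moves the re-identification risk into the key and the mapping table, it does not eliminate it.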

The problem is that many in tech land will think that, smart as the folks in NHS Digital are, there will be other less scrupulous parties out there who will be smarter, and more numerous. The NHS hasn’t “informed” us exactly how the protections around its vast data store will be an insurmountable challenge for the cyber criminal community.

Likewise, NHS Digital’s insistence that when it comes to commercial organisations, sharing would be restricted to partners with a “legal basis and legitimate need” is less than reassuring. It lists just three examples of the sorts of organisations that might get access, and simply says insurance companies won’t, leaving much open to interpretation. And the fact is that contexts and interpretations change over time.

Technologies change. Policies change. And governments change too – which is why the Taliban now has its very own biometric database of people who worked with the previous US-backed government. One thing remains constant though. Companies that style themselves as digital disruptors and AI pioneers will really want to get their hands on the NHS’s treasure trove of data. Sorry, OUR data.

And on past performance, these same disruptors have an alarming tendency to operate a few steps ahead of policy makers and regulators. So, there are many reasons why the public might feel less than assured about assumed consent in general, and the GPDPR program specifically.

But the main reason citizens won’t be assured is because so many of them are largely unaware the GPDPR program even exists. By launching the programme in the midst of a pandemic, when GPs are already overstretched and their surgeries are off limits for many patients, the government has virtually guaranteed deep scepticism about the plan – and by extension, about the whole notion of assumed consent.

There may be a point at which it really does make sense for assumed – but also informed – consent to be the default model for sharing medical data. But we’re not going to reach that point until clinicians, researchers, and citizens can agree a reasonable framework for delivering such a system, and a transparent way of informing citizens. In the UK, we’ve seen no such thing. Unfortunately, polls don’t lend themselves to “not yet” answers. So, by default, the answer to this proposal is NO. ®

Cast your vote below. We'll close the poll on Thursday night and publish the final result on Friday. You can track the debate's progress here.


