NHS OSS white paper is 'disappeared'

Might have confused people into thinking we had a policy


In a weirdly Maoist piece of public self-criticism, the National Health Service Information Authority has withdrawn a white paper on Open Source Software and the NHS. As the hole where it used to be now says: "The Information Authority regrets the publication of the internal discussion paper - Open Source. Any proposed actions expressed in the paper were personal views and did not represent NHS IA or government policy."

To "avoid further confusion" the document has therefore been withdrawn. This confusion, presumably, has reigned since the internal discussion document formerly known as a white paper was published in January 2002. In the intervening period the paper, by NHSIA principal consultant Colin Smith, has been widely accepted as the 'state of the art' as far as NHS OSS policy is concerned. It suggests (it remains available elsewhere, if not on the NHSIA site) that barriers to more widespread use of OSS could be overcome by promoting awareness among NHS users, by producing "a specific NHS policy on OSS, based upon the government proposals", by the provision of guidelines on licensing, and through the encouragement of the development of an OSS market in healthcare systems. As Smith noted in, er, January 2002, the UK government had recently published a draft policy on OSS and "there is an urgent need for a response by the NHSIA."

The difficulty would appear to be that there doesn't seem to have been much response beyond Smith's white paper, and as its suggestions do not appear to have been implemented, their continued presence on the NHSIA site might be thought of as embarrassing. Or as the NHSIA has it, confusing. The NHSIA would however appear to be pulling back, rather than just ending confusion/embarrassment. As the confession says: "Adoption of Open Source across the NHS was and is a Departmental policy decision particularly where there is a potential impact on the reliability and performance of information and infrastructure systems that are critical to the delivery of health care." And this reliability issue is elaborated further down with: "Open Source code typically comes without ownership, support or maintenance. NHS infrastructure and information systems are critical to the delivery of quality care and therefore guarantees on the reliability and future maintenance of systems are required."

Which could be read as actively discouraging 'unofficial' and 'unapproved' open source initiatives within the NHS. Note also that the NHSIA says it takes "policy direction regarding Open Source from the e-Envoy office, OGC and the Department of Health", and: "No policy decision has yet been made by the Department of Health or OGC to nominate a designated guardian for Open Source code across the NHS." So the NHSIA's position now is that OSS policy will be decided at higher levels, and those higher levels will decide whether or not the NHS gets a designated guardian.

On a related NHS matter, The Register's confidence in the Department of Health's abilities to define policies and implement systems took something of a knock last week when, answering questions relating to the NHS National Programme for Information Technology (NPfIT), Health Minister John Hutton sounded disturbingly like a man more familiar with the brochures than the actual kit. For example, in answer to a question from Keith Vaz regarding the transference of data from existing systems to the new one: "The migration of data is not a novel process and occurs at local level every time there is a refresh or upgrade of existing systems. Standard IT protocols allow for data to be recovered if problems are encountered during transfer to ensure that data is not lost, and the NPfIT's contracts with its suppliers require data back-ups to be taken regularly."

So that's all right then. Should you be interested, much more John Hutton brochureware can be found in Hansard, here. ®

Related stories:

Doctors give sickly outlook for NHS IT
NHS IT costs skyrocket

