NHS COVID-19 app update blocked by Apple, Google over location privacy fears

New version for England dead on arrival just as UK eases lockdown rules


An update for the NHS's COVID-19 test-and-trace app for England has been blocked by both Apple and Google because it added the ability for users to store and share location data.

The arm of the National Health Service that developed the app signed an agreement with both tech giants that it would not gather location data through the software, in order to protect people's privacy. The latest update, however, allows folks to share a log of places where they have checked in as a way of warning other users.

Users of the app have been able to scan a QR code when entering a public location or venue such as a shop or restaurant – many of which are required to display a sign with such a code – but the data remained on the phone.

When local authorities identified a location as a hotspot for COVID-19, it would be added to a central database; the mobile app would check whether the user had been near any locations in that database, and if so, relay this to the user. The new update, however, would allow users to upload their location data if, for example, they tested positive and wanted to alert others of where they had been. The update was timed to coincide with a relaxation of lockdown rules in the United Kingdom.
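The on-device matching described above can be sketched roughly as follows. This is a hypothetical illustration only: the type names, fields, and matching logic are assumptions for clarity, not the NHS app's actual code or data formats.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class CheckIn:
    venue_id: str         # identifier decoded from the venue's QR code poster
    scanned_at: datetime  # when the user scanned it

@dataclass(frozen=True)
class RiskyVenue:
    venue_id: str         # venue flagged by local authorities
    window_start: datetime
    window_end: datetime  # period during which the venue was a risk

def find_risky_visits(check_ins, risky_venues):
    """Match the locally stored check-in log against the downloaded
    list of at-risk venues. All matching happens on the handset,
    so no location data ever leaves the device."""
    hits = []
    for visit in check_ins:
        for venue in risky_venues:
            if (visit.venue_id == venue.venue_id
                    and venue.window_start <= visit.scanned_at <= venue.window_end):
                hits.append(visit)
                break
    return hits
```

The key privacy property is that the central server only ever publishes the list of risky venues; the user's own visit history stays local unless, as in the blocked update, the user chooses to upload it.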

Apple and Google have both decided that this additional data sharing violates the agreement the NHS signed with them, and have blocked the new code from being rolled out to handsets. People are still able to download and use the old version.

The UK government’s Department of Health has not explained how it managed to develop an entire update without realizing it would be blocked. An FAQ [PDF] written by Google and Apple last year clearly stated that their COVID-19 notification system for apps “does not share location data from the user’s device with the Public Health Authority, Apple, or Google.”

So trying to find a way around that policy decision was unlikely to be welcomed. In an update last week, the government announced the “venue history sharing” plan and described it as “privacy-protecting.” We were told:

If an app user tests positive, they will be asked to share their venue history in a privacy-protecting way via the app. This will allow venue alerts to be generated more quickly, and improve the ability to identify where outbreaks are occurring and take steps to prevent the virus spreading.

Presumably the Department of Health thought that making the sharing opt-in, and requiring (or at least prompting) user action rather than automating it, would be a way around the restriction.

The blunder, reported by the BBC, recalls the UK government’s earlier attempt to create an app that would store and share data by fudging a rarely used Bluetooth connectivity feature of the phones’ operating systems: an approach that ultimately failed, forcing the NHS to adopt the special notification APIs the tech giants provide for COVID-19 apps.

It’s tempting to view the blocking as another example of how tech companies are more powerful than governments thanks to their control of mobile operating systems on devices that billions of people carry around with them at all times. But in this case, the UK government did sign an agreement and it’s not hard to see how the update would break it.

In the meantime, Scotland, for one, has come up with an alternative way of gathering the same sort of data: it has created an entirely separate app called Check In Scotland that does the same thing but doesn’t directly feed into its contact-tracing app, Protect Scotland. ®

