ICO Deputy exposes Data Protection law wish list

Harmonisation of EU data protection law may be a pipe-dream


A question of harm?

The DIC outlined that the ICO wants a “harm” or “risk-based” approach towards the protection of privacy (the harm approach is key to understanding the APEC Framework agreement). This is a seductive idea because in many instances the data controller can identify potential harm (eg, when processing personal data of a confidential nature). That is one reason for the promotion of Privacy Impact Assessments, which are designed to allow a data controller to quantify such harm prior to any processing of personal data.
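
To make the idea concrete – and purely as a sketch of mine, since neither the ICO nor APEC prescribes any such scoring – a harm-based assessment reduces in practice to a scoring exercise of roughly this kind:

```python
# Illustrative sketch only: a "harm/risk-based" pre-processing assessment.
# The categories, weights, and threshold are invented for this example;
# they are not ICO guidance or any published PIA methodology.

HARM_WEIGHTS = {
    "health_data": 5,       # confidential by nature
    "financial_data": 4,
    "contact_details": 1,
    "public_domain": 0,
}

def assess_harm(data_categories, threshold=4):
    """Crude risk score for a proposed processing operation.

    Returns the score and whether processing may proceed without
    further safeguards under this (hypothetical) methodology.
    """
    score = sum(HARM_WEIGHTS.get(cat, 2) for cat in data_categories)
    return score, score < threshold

score, proceed = assess_harm(["contact_details", "health_data"])
print(f"risk score: {score}; proceed without safeguards: {proceed}")
```

The difficulty, as the next paragraphs explain, is where the weightings come from: they are the data controller's guesses, and Lindop's point was precisely that only the data subject can supply them.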

However, a word of warning: old-timers like myself, steeped in data protection history, will remember that a "harm debate" took place in the UK some 35 years ago, well before the UK had any data protection law, and that the notion of a data protection regime based on "harm" was firmly rejected by the Lindop Committee in its 1978 Report on data protection (Cmnd 7341, paragraphs 18.24-18.27).

Lindop concluded that there was no objective standard by which a data controller could assess harm prior to the processing of personal data, because there was no way an organisation could judge whether its personal data or its processing would be sensitive or non-sensitive. Sensitivity is a subjective assessment that can only be accurately judged by each data subject concerned; and of course, such assessments can change over time and with context.

For example, in the UK of the 1950s, most gays were fearful of others knowing of their sexuality, unlike today – though that fear persists in parts of Africa. Those who have eagerly contributed to the fount of universal knowledge (eg, via YouTube or Facebook offerings about themselves) can easily regret that contribution when the context changes to looking for employment. The sensitivity associated with the name and address of a Jewish friend, written in an address book, changes dramatically if that book is lost and falls into the hands of the Gestapo.

In other words, an assessment made now can change in an instant (I have friends who took a wonderful holiday in Egypt only six weeks ago) – and if that is the case, what is the value of such assessments, or of an approach based on harm?

That is why Lindop concluded that the only real issue was whether the data identified or related to a particular living individual; if so, then all the data protection principles should be applied. However, having established that the principles did apply, Lindop concluded that their impact would be modified by a number of factors – for instance, whether there was foreseeable harm to the data subject, the sensitivity of the personal data, or whether the personal data were in the public domain.

Lindop, I believe, was in the fundamentalist camp – the Principles apply – and any pragmatism comes with enforcement and with the analysis of what went wrong.

Modification to the Principles

The ICO will push data minimisation and Privacy by Design mechanisms as key changes to any new law. Although this is not a new pronouncement by the ICO, I would argue that many of these requirements already form part of the current Data Protection Principles. For instance, data minimisation can be achieved by application of the Third Data Protection Principle – for example, why do you need to register your details on a website to access its free content? Isn’t that an example of excessive collection of personal data?

Of course, website owners can make such collection of personal data relevant. For instance, a data controller might want to keep records of who visits its sites so that it can modify content to meet the aspirations of visitors, or even deliver some marketing to those registered (heaven forbid). However, such purposes (and marketing choices) should be declared to those who register via a fair processing notice.
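
As a sketch of what Third Principle minimisation means in practice (the field lists and function below are hypothetical examples of mine, not ICO guidance):

```python
# Hypothetical registration handler: collect only what the declared purpose
# requires (the Third Principle). The field lists are illustrative.

REQUIRED_FIELDS = {"email"}                # needed to deliver the content at all
OPTIONAL_FIELDS = {"marketing_opt_in"}     # purpose declared in the fair processing notice

def minimise(submitted: dict) -> dict:
    """Discard any submitted field the declared purposes do not justify."""
    allowed = REQUIRED_FIELDS | OPTIONAL_FIELDS
    return {k: v for k, v in submitted.items() if k in allowed}

form = {"email": "reader@example.com", "date_of_birth": "1970-01-01",
        "phone": "01234 567890", "marketing_opt_in": True}
print(minimise(form))
# {'email': 'reader@example.com', 'marketing_opt_in': True}
```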

I also think that many aspects of Privacy by Design link to the Seventh and Sixth Principles – for example, obligations to have regard to the “state of the art” in relation to the security of the processing of personal data, or to respect the rights of data subjects so that they have choices over who can access their personal data and when. Free subject access can easily be designed into any new project that involves the processing of personal data.
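
And as a sketch of subject access “designed in” from the outset – again, the class below is my own illustration, not a prescribed mechanism:

```python
from dataclasses import dataclass, field

# Illustrative only: subject access and data-subject choice "designed in"
# from the start. The class and method names are my own invention.

@dataclass
class SubjectRecord:
    subject_id: str
    data: dict
    authorised_viewers: set = field(default_factory=set)  # the subject's choices

    def grant_access(self, viewer: str) -> None:
        """The data subject decides who may see the record (Sixth Principle)."""
        self.authorised_viewers.add(viewer)

    def view(self, requester: str) -> dict:
        """Free subject access for the subject; others need authorisation."""
        if requester == self.subject_id or requester in self.authorised_viewers:
            return self.data
        raise PermissionError(f"{requester} is not authorised by the data subject")

record = SubjectRecord("subject-1", {"email": "reader@example.com"})
record.grant_access("gp-surgery")
print(record.view("subject-1"))   # subject access always succeeds
print(record.view("gp-surgery"))  # authorised third party
```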

The ICO would like to see collective redress available. Many of you know that the current PECR Regulations allow aggrieved recipients of marketing messages to claim compensation for damage caused by messages sent in breach of the Regulations. So how much is one individual damaged by a single spam message? Somewhere between 0.01p and 0.1p would be a healthy overestimate. The result is that nothing happens on the PECR "compensation for damages" front; however, if damage is claimed collectively, the costs and risks to the spammer are much increased, as the arithmetic below shows.
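
The arithmetic is easily sketched (the per-message figure is the rough overestimate above; the size of the spam run is an assumption of mine):

```python
# Back-of-envelope arithmetic: why an individual PECR claim goes nowhere
# but a collective claim bites. The per-message figure is the rough
# overestimate above; the size of the spam run is assumed for illustration.

damage_per_message_pence = 0.1          # "healthy overestimate" per recipient
recipients = 10_000_000                 # assumed size of one spam run

individual_claim_pounds = damage_per_message_pence / 100
collective_claim_pounds = individual_claim_pounds * recipients

print(f"one recipient's claim: £{individual_claim_pounds:.3f}")   # £0.001
print(f"collective claim:      £{collective_claim_pounds:,.0f}")  # £10,000
```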

However, I should add that the Commissioner already has powers to protect the collective. For instance, a Monetary Penalty Notice could be applied to spammers using personal data (eg, an email address is personal data) where there has been blatant disregard for the email marketing rules.

With respect to notification (a hated activity), the DIC pointed out that the ICO is funded by notification fees: reduce notification and the Government would have to pick up the tab. My solution is to have the ICO funded directly by Parliament; it is far too easy for an executive to strangle data protection progress by withholding state grant-in-aid from the regulator.

Finally, the Commissioner is fond of soft law – so expect more Codes of Practice in the UK.

Conclusion

What do I think? There will be little progress and the UK’s Data Protection Act will be largely unchanged in the current decade. There might be tweaks at the edges – but no fundamental change.

This story originally appeared at HAWKTALK, the blog of Amberhawk Training Ltd.
