We may be leaving the EU, but some EU law – with significant consequences for the IT community – will carry on. One such is the General Data Protection Regulation (GDPR).
We signed up to it before the referendum and agreed to persist with it after. If businesses wish to continue to process data belonging to EU firms, and to avoid some potentially eye-watering fines, they must be compliant with the GDPR no later than May 2018.
And yet a March survey of IT decision makers in UK companies by information management firm Crown Records Management found an extraordinary 44 per cent believe GDPR will not apply to UK business after Brexit.
Twenty-four per cent are no longer preparing for it, while some 4 per cent have not even begun to prepare.
Rachel Aldighieri, managing director of the Direct Marketing Association, reckons only two-thirds of her particular industry believe their businesses are going to be ready for GDPR when it comes into force.
"The risk to businesses of not being compliant in time is significant. Our advice to all businesses is to prepare now, as it will take time to fully implement the systems and processes required," Aldighieri said.
Yet Aldighieri is clear: "Brexit does not change the behaviours that companies must adopt as part of GDPR."
Fixing the many flaws of the Data Protection Act
As a data protection veteran, I remember how large swathes of the business world went into headless-chicken mode in the run-up to the Data Protection Act (DPA) 1998, and its predecessor, the 1984 version.
First time around, I was with a direct mailing house, which feared its lucrative business, churning out millions of mail shots to reluctant consumers, would come to a grinding halt. In 1998, as acting direct marketing manager for one of the UK's largest financial institutions, I spent many an hour talking to £500-an-hour lawyers over whether the new DPA would impact a call-centre operation handling millions of calls a year. Yet, despite the initial alarm, these fears turned out to be baseless.
In both cases, the reasons had to do with limitations within the DPA. First and foremost was an inadequate enforcement regime. An under-resourced Information Commissioner could pursue only a fraction of breaches reported and, if successful, the penalties were negligible: an average of around £500 – and then only after warnings had been given and discussion had taken place. That has been changing and, with the GDPR, the ultimate transformation of data protection enforcement from pathetic to draconian is now reality. For some offences, the maximum penalty can be €20m or 4 per cent of global turnover.
Data processors are no longer immune from prosecution: previously, the law recognised data controllers as ultimately (legally) responsible for any breaches of the law, with processors, unless they acted in criminal fashion, largely exempt. No more.
At the same time, individuals are now empowered – encouraged even – to pursue companies that process their data unlawfully.
A second major flaw in the DPA was always around consent, which appeared to be a paramount condition of any data processing. Yet over time, this eroded: "consent" is nowhere directly defined in law; while the law's second principle, which, on the surface, requires data to be processed only for specified purposes, provides a further get-out, allowing processing: "For a purpose that is in relation to the purpose(s) you have specified and could be reasonably expected by the data subject."
In other words, as practices become commonplace, an individual's right to object to them dwindles.
History therefore teaches that laws on personal data are rarely as powerful, or as defining, as the hype suggests when it comes to giving consumers significant rights over how their data is used.
Is this time different? The last 18 months or so has seen the publication of an entire bookshelf of guides to what the law might mean in theory. That provided by the Information Commissioner's Office, which will, eventually, be tasked with enforcing the GDPR, is worth perusing.
What, though, are the main practical implications of the GDPR – and how prepared are businesses to deal with the operational consequences?
Let's start with customer consent to data processing. The GDPR goes some way beyond current legal requirements by abolishing "implicit" and opt-out consent.
In future, data subjects will be able to withdraw consent at any time: and it should be as easy to withdraw consent as to grant it. No more will companies be able to recruit online, then rebuff objectors, by demanding they write in via snail mail.
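The symmetry the GDPR demands, that withdrawing consent be as easy as granting it, can be illustrated with a simple consent ledger in which both operations go through exactly the same interface. This is a hypothetical sketch, not a reference to any real system; the class and method names are invented for illustration:

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Illustrative consent store: granting and withdrawing consent
    are symmetric, single-call operations, as the GDPR expects."""

    def __init__(self):
        # (subject_id, purpose) -> chronological list of (timestamp, action)
        self._records = {}

    def grant(self, subject_id, purpose):
        self._log(subject_id, purpose, "granted")

    def withdraw(self, subject_id, purpose):
        # Withdrawal uses exactly the same mechanism as granting --
        # no extra hoops such as a demand to write in by post.
        self._log(subject_id, purpose, "withdrawn")

    def has_consent(self, subject_id, purpose):
        events = self._records.get((subject_id, purpose), [])
        return bool(events) and events[-1][1] == "granted"

    def _log(self, subject_id, purpose, action):
        self._records.setdefault((subject_id, purpose), []).append(
            (datetime.now(timezone.utc), action)
        )
```

The point of the design is that "write in via snail mail" style asymmetries are impossible by construction: there is only one code path for changing consent state, whichever direction the change goes.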
Consent must be freely given, which will not be the case where there is imbalance of power between data controller and subject. This could be bad news for public authorities who presently process much data on the basis that if it is done for the public good, then that is good enough.
An upgrading of biometric data to "sensitive" status means explicit consent may now be required before processing a customer's image or voice data. Most significant of all, profiling of customers – the automated processing of personal data to evaluate certain personal aspects relating to a natural person – cannot take place without some human input.
Taken together, this could be a serious check on UK business, particularly financial services. Profiling is now central to much decision making: an industry has grown up around the creation of statistical models which are increasingly used to determine not just creditworthiness, but also what options an individual will be offered in marketing terms.
Moreover, the direction of travel has been to disempower customer-facing staff, while shifting volume business to call centres. All those "computer says no" phone calls sound suspiciously like automated processing for which manual intervention is specifically rejected.
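In practice, the human input the profiling rules anticipate might look like routing adverse model outcomes to a reviewer rather than actioning them automatically. A minimal sketch, assuming a hypothetical credit-scoring function (the threshold and review queue are invented for illustration):

```python
# Queue of scores awaiting a human decision (illustrative only).
REVIEW_QUEUE = []

def credit_decision(score, threshold=0.7):
    """Hypothetical automated credit scorer. Rather than refusing
    applicants outright ("computer says no"), any non-approval is
    referred to a human reviewer, supplying the manual intervention
    that purely automated processing lacks."""
    if score >= threshold:
        return "approved"
    # The model alone never issues a final refusal.
    REVIEW_QUEUE.append(score)
    return "referred to human reviewer"
```

The design choice is that the automated path can only ever say "yes" on its own; every "no" passes through a person, which is the opposite of the disempowered-call-centre pattern the article describes.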
As for voice recognition software designed to spot fraud, it might be argued that it is only ever a starting point and that ultimate decisions are left to call centre operators. It might equally be claimed that, as a mechanism for identifying potential criminality, it is exempt from some provisions of the GDPR. Still, many customers dislike voice recognition programs, and we have yet to see which way the law will fall on that issue.
Claims that such processing is a prerequisite of service delivery will also be tested, as the GDPR is clear that processing must be part of the service: not a pre-condition of being eligible for it.
A large question mark must also now hang over the practices of one UK political party that allegedly used automated profiling as a means to target messages at the last election. Traditional politics in the UK may, in the end, be curbed not by stricter spending limits, but by the GDPR simply banning these practices.
Right to be forgotten
IT departments also face new rights of deletion: data controllers must erase personal data "without undue delay" if the data is no longer needed, the subject objects, or the processing was unlawful.
The difficulty is that most systems do not delete data. Following a government amnesty for gay men convicted under outdated laws, special routines had to be written to update criminal records, including purging archived data. Yet if there is one thing that working with IT departments has taught me, it is how infernally difficult it is to achieve the latter: not just because IT staff are disinclined ever to touch what has been archived, but because processing archives is often difficult, expensive and disruptive.
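The operational gap – live systems can delete a record, archives often cannot – can be made concrete with a sketch of an erasure sweep. Everything here (the store names, the `delete` method) is invented for illustration; a real archive might be offline tape or write-once media, which is precisely why the archive column so often comes back as a failure:

```python
class DataStore:
    """Illustrative data store. A real archive tier might be tape or
    WORM storage, where in-place deletion is hard or impossible."""

    def __init__(self, name):
        self.name = name
        self.records = {}

    def delete(self, subject_id):
        # Returns True if a record existed and was removed.
        return self.records.pop(subject_id, None) is not None

def erase_subject(subject_id, live_stores, archive_stores):
    """Right-to-erasure sweep: the request is only fully honoured
    once every copy, archived ones included, has been removed.
    Returns a per-store report for audit purposes."""
    report = {}
    for store in live_stores + archive_stores:
        report[store.name] = store.delete(subject_id)
    return report
```

The per-store report matters: "without undue delay" implies being able to show which copies are gone and which, typically the archived ones, still need attention.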
Professor Merlin Stone, once described as godfather of modern database marketing, argued: "The GDPR is the ultimate in terms of codification of what the best marketers have believed all along but few companies have practiced – that the holding of customer data should be primarily for the benefit of the customer – in terms of what data is held, for how long, and how it is used, and once that stops being true, you had better watch out."
The real issue, which many supposedly GDPR-ready companies may not have spotted, is that data "subjects" are about to be elevated in status to significant players in the world of data: this is not just about new legal rights, but an inversion of the entire edifice of data protection.
Customers can ask questions, stop processing, demand you remove their data. If you get it wrong – if a personal data breach results in a high risk to the rights and freedoms of individuals – you are now required to inform not just the Information Commissioner, but the individual concerned.
The focus, therefore, needs to be not on internal systems, so much as the process for dealing with individuals who, if ever they cotton on to what new rights they have acquired, are likely to become that much more demanding, and empowered. And unless the focus over the next 12 months is on putting in place customer-facing systems, it is unlikely that most businesses will be ready for them. ®