Gartner's top tip to data crunchers on the eve of GDPR? Don't be creepy
Why do people forget they're a customer outside the office?
Businesses risk losing millions from the investments they made in data and analytics if they don’t respect their customers’ privacy, according to Gartner research director Bart Willemsen.
Speaking at the Gartner Data and Analytics Summit in London, UK, yesterday, Willemsen said that organisations – and, crucially, those making business decisions – needed to change the way they think about data.
“I fear that when we go to work, for some reason, we leave ourselves at home... We are the customer when we’re not in the office,” Willemsen said.
He used this argument to suggest that technologists should have a better understanding of what is and isn’t appropriate use of data, and emphasised that businesses needed to be aware of what he described as “the creepy line”.
As explanation, he said a child in a clown suit at a birthday party would be adorable; a 2m-tall (6ft 7") man dressed as one in a darkened alley would be creepy.
Those that don’t toe the line could see themselves losing business, Willemsen said, pointing to a 2017 PwC survey that found 87 per cent of customers would take their business elsewhere if they didn’t trust a company to behave responsibly.
A loss of trust – or a failure to gain it in the first place – could have a “huge” impact on the business, he said.
For instance, one analysis found that businesses spent $50bn on data and analytics in 2016. If even a third of those customers chose to exercise their right to be forgotten – set out under the European Union’s incoming General Data Protection Regulation, which takes effect on 25 May – it would cost firms a total of $17bn.
However, Willemsen focused more on the spirit of the GDPR than on offering a cheat sheet for firms hoping to comply just within the letter of the law.
He emphasised the importance of data minimisation and pseudonymisation, and of making sure that consent was gained in a specific, ongoing manner – rather than aiming for one initial blanket consent for all sorts of activities.
“You don’t exist with the sole purpose of processing data,” he said. “You touch personal data every day – do you know why?”
The two main questions businesses should ask are: if they were the customer, would this still be on the right side of the creepy line? And can they do it with less data? “If you can, why shouldn’t you?”, he asked.
At the same time, businesses should follow three ground rules: only use directly identifiable data where appropriate and when there is a lawful basis – whether that is consent or legitimate interests; ensure that inputs to analytics are pseudonymised; and ensure that outputs are always anonymous.
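The second ground rule – pseudonymised inputs – can be sketched in a few lines. This is a hypothetical illustration, not anything Willemsen presented: it replaces a direct identifier with a keyed hash before the record enters an analytics pipeline, with the key assumed to be held outside the analytics environment so the mapping can't be trivially reversed by analysts.

```python
import hmac
import hashlib

# Assumed setup: the key lives outside the analytics environment
# (e.g. in a secrets manager), so analysts only ever see pseudonyms.
SECRET_KEY = b"kept-outside-the-analytics-environment"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a deterministic keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "purchases": 3}
analytics_input = {
    "user": pseudonymise(record["email"]),  # pseudonym, not the email
    "purchases": record["purchases"],
}
```

Because the hash is keyed rather than a plain digest, the same identifier always maps to the same pseudonym (so joins still work), but an outsider without the key cannot confirm a guess by hashing it themselves.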
In the long term, businesses should architect aggregate data sets in a way that puts customer privacy first, Willemsen said, as well as looking at new ways of preserving privacy in analytics systems.
These might include differential privacy, trusted third parties, privacy-aware machine learning – which can be programmed not to give an output if the sample size is too small – or blockchain used in a way that allowed information to be deleted if someone requested it.
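Two of those ideas – refusing to answer on too small a sample, and differential privacy – can be combined in one short sketch. The threshold and privacy budget below are assumptions for illustration, not figures from the talk:

```python
import math
import random

MIN_SAMPLE = 20   # assumed policy threshold, not a Gartner figure
EPSILON = 1.0     # privacy budget: smaller means more noise

def noisy_count(values):
    """Answer a counting query only on a large enough sample,
    adding Laplace noise so no single individual's presence
    can be inferred from the output."""
    n = len(values)
    if n < MIN_SAMPLE:
        return None  # withhold the output entirely
    # Sample Laplace(scale = 1/epsilon) via inverse transform
    u = random.random() - 0.5
    noise = -(1.0 / EPSILON) * math.copysign(1, u) * math.log(1 - 2 * abs(u))
    return n + noise
```

The small-sample check blocks the classic re-identification trick of narrowing a query until it describes one person; the noise means that even permitted answers never reveal an exact count.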
Willemsen’s talk echoed comments made in the day’s opening keynote by distinguished analyst Ted Friedman, who said that generating trust was “job number one for everybody in this room”.
However, given the subject matter, there were far fewer oblique or direct references to the Facebook-Cambridge Analytica scandal than one might have expected.
Friedman’s nod to this weekend’s elephant in the room was to note that “never has there been a moment... where forces of the outside world have been so relevant to our industry... and vice versa”. ®