Snowden's NSA leaks have galvanised the storage world
Vendors raise their game after gov securo-busting revealed
Anyone following the fortunes of the world’s biggest technology companies will have noticed a trend: every one of them has gone potty for privacy.
This is not out of some sudden moral urge but because their futures depend on proving that they are good at protecting people’s personal data.
The Edward Snowden leaks, in particular the revelations about the National Security Agency (NSA) PRISM project, which saw the intelligence agency hoover up data from servers at Google, Facebook, Microsoft and many others, have brought about this predilection for privacy.
It is not just the average consumer the tech titans have to pander to either. Businesses have clear concerns about who can access their information and whether they have a Snowden-type character in their ranks.
In a recent CyberArk survey of 373 C-level and IT security executives across North America, Europe and the Asia-Pacific, 37 per cent of respondents said Snowden’s breach of NSA security had influenced their security strategy more than any other incident over the past year.
Industries are facing difficult decisions. Where and how to store data tops the list of priorities. Whom to trust has become a pertinent question in access management and procurement alike. Storage and security have become sexy again.
Indeed, one of the material outcomes of Snowden’s leaks has already been realised: inspired by renewed consumer and business interest in privacy, technology is becoming more secure.
Under a cloud
But with understandable anxieties around the legal ways in which business data can be accessed by government entities, and with so many vendor promises of truly effective data privacy, it is difficult to know who to trust with storage.
A few years ago, it was taken as a given that many businesses would chuck as much information into the cloud as was feasible. Even today, the market seems set for epic growth.
The US-based Technology Business Research Public Cloud Benchmark report from July suggested the public cloud market would grow 20 per cent year-on-year to $67bn in 2014, reaching $110bn by 2018.
There is a sense that organisations have to trust the technology their vendors create, regardless of where they are based and what laws they are subject to. If intelligence agencies are determined to hack everything, how can anyone avoid their gaze?
“If what Snowden and others are saying is correct, then the NSA – and by inference pretty much anyone else – can drive a pretty large Sherman tank through any security system,” says Quocirca analyst Clive Longbottom.
“Is your carefully tended acreage of EMC, NetApp, IBM, HP, Dell, HDS storage any more secure than AWS’s, Microsoft Azure’s or John Doe Cloud Inc’s?
“You have to ask yourself two questions: first, is there any chance that we can be more secure than a company that specialises in technology and knows that information security is core to its very existence?
“And second, who would really give a damn about what we hold on our disks anyway? Sure, there may be personal, identifiable information on there but the NSA won’t really care about that, and if there are credit card details, then both we and the cloud provider would need to be demonstrably PCI DSS compliant anyway.”
Some more equal than others
Those industries that do have to care about what the NSA sees – defence contractors and pharmaceuticals, for example – are not avoiding the cloud altogether. They are simply asking more of their suppliers when it comes to security, or investing in products that guarantee extra levels of privacy.
The key question here, says Freeform Dynamics analyst Tony Lock, is whether we can be certain our data is secure, be it stored in the cloud or on storage platforms we run ourselves.
“This should lead organisations to accept that not all data has the same value and requires similar security,” he says.
“When it comes to selecting storage platforms, whether internal or cloud, it's a case of buyer beware. Make sure the systems you plan to use are suited to the demands placed upon them.
“For data this is challenging as the sensitivity of information can vary wildly over time – even within a few days, if not minutes.”
Some organisations, constricted by regulations over the geography of certain kinds of data or by their own risk strategy, will not be able to go to the cloud at all. In-house systems will certainly grant them more control over their information.
As for government access to information, any business must open up its data centre when an official arrives with a court-granted warrant, but it is more likely to put up a fight than a provider that has to stay cosy with the powers that be.
Data security could be better in-house or in the cloud, depending on the resources available and the quality of the systems handling information at rest and in transit. That uncertainty is part of what makes private clouds attractive to businesses trying to handle deluges of data.
Deciding whether an in-house or cloud infrastructure is safer is a sticky question. It might not be the right one either. A more pressing concern for many is the extra operational challenge of on-premise data storage.
In interviews with senior IT professionals in France, Germany, the Netherlands, Spain and the UK, storage and security company Iron Mountain found some were experiencing year-on-year increases of up to 60 per cent in requests for data, either from employees or external parties such as government bodies. The company said this was causing an “invisible drain” on IT.
To ease the pain, Iron Mountain recommends a tiered information storage approach that defines what is most used, most critical and most confidential, as well as what is essentially dormant, and structuring storage, access and backup accordingly. Cloud, of course, would ease the burden somewhat.
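Iron Mountain's recommendation can be sketched as a simple classification policy. The tier names, thresholds and record attributes below are illustrative assumptions for the sketch, not anything the company prescribes:

```python
from datetime import date, timedelta

# Hypothetical tiers, ordered from most to least demanding storage.
TIERS = ("critical", "active", "confidential-archive", "dormant")

def assign_tier(last_accessed: date, accesses_per_month: int,
                confidential: bool, today: date) -> str:
    """Illustrative tiering rules; a real policy would be far richer."""
    age = today - last_accessed
    if confidential and age > timedelta(days=365):
        return "confidential-archive"  # rarely used but must stay protected
    if age > timedelta(days=365):
        return "dormant"               # candidate for cheap, offline storage
    if accesses_per_month > 100:
        return "critical"              # hot data: fast storage, frequent backup
    return "active"

today = date(2014, 9, 1)
print(assign_tier(date(2014, 8, 30), 500, False, today))  # critical
print(assign_tier(date(2012, 1, 1), 0, True, today))      # confidential-archive
```

Once each record carries a tier, storage, access controls and backup frequency can be chosen per tier rather than uniformly across the estate.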
Wherever data resides, businesses want to know they can trust the algorithms determining the quality of the encryption.
Given that cracking is part of global intelligence agencies’ modus operandi, anyone worried about surveillance will demand tried and tested encryption. After the hoo-ha surrounding the Dual Elliptic Curve Deterministic Random Bit Generator, which the NSA helped design and allegedly laced with weaknesses, the more paranoid types will want to avoid any standards that the intelligence agencies played a significant part in developing.
There have been calls from anti-surveillance firms such as Silent Circle to ditch anything given the stamp of approval by the US government-funded National Institute of Standards and Technology.
Meanwhile, there is a growing interest in different forms of encryption. A Frost & Sullivan report on the public key infrastructure (PKI) industry recently found the market earned revenues of $357m in 2013 and estimated it will hit $533m by 2017.
“PKI is the best compromise for strong encryption,” says Jean-Noël Georges, author of the report and global programme director for ICT practices at Frost & Sullivan.
“Only public keys are shared with the client. Private keys are stored in an HSM [hardware security module] or something similar.”
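Georges's point about key separation can be illustrated with textbook RSA. The tiny primes and lack of padding make this a teaching toy only, never production crypto, and the `Hsm` class is merely a stand-in for hardware that performs decryption without ever releasing the private key:

```python
# Classic textbook RSA parameters (far too small for real use).
p, q = 61, 53
n = p * q   # 3233: the modulus, part of the public key
e = 17      # public exponent, shared with clients
d = 2753    # private exponent: e*d ≡ 1 (mod (p-1)*(q-1))

class Hsm:
    """Stand-in for a hardware security module: the private exponent
    never leaves this object; callers only get a decrypt operation."""
    def __init__(self, private_exponent: int, modulus: int):
        self._d, self._n = private_exponent, modulus

    def decrypt(self, ciphertext: int) -> int:
        return pow(ciphertext, self._d, self._n)

hsm = Hsm(d, n)

message = 65                     # must be smaller than n in textbook RSA
ciphertext = pow(message, e, n)  # anyone holding the public key can encrypt
print(ciphertext)                # 2790
print(hsm.decrypt(ciphertext))   # 65
```

The shared half (`e`, `n`) can encrypt but not decrypt; only whatever holds `d` can recover the message, which is exactly why that half lives inside an HSM.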
According to Paul Simmonds, CEO of the Global Identity Foundation and board member at cloud encryption provider CipherCloud, only forms of encryption based on open standards should be trusted, and then only where the business, and no one else, holds the keys.
Longbottom thinks in most cases a 256-bit form of encryption is adequate. “Sure, it may be that the NSA has the keys to the one that you choose, but if you are paranoid, wrap it in one form, say AES, and then re-wrap it in another, say 3DES,” he says.
“It will slow the hell out of everything but your paranoia may be assuaged slightly.”
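Longbottom's wrap-then-re-wrap idea can be sketched in outline. The hashlib-derived keystream below is a stand-in for real ciphers such as AES and 3DES and offers no real security; the point is only the layering, where the outer layer must be peeled before the inner one:

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream from repeated SHA-256 hashing; NOT a real cipher."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def wrap(key: bytes, data: bytes) -> bytes:
    """One layer of 'encryption': random nonce + XOR with the keystream."""
    nonce = os.urandom(16)
    ks = _keystream(key, nonce, len(data))
    return nonce + bytes(a ^ b for a, b in zip(ks, data))

def unwrap(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ks, ct))

# Wrap in one form, then re-wrap in another; unwrap in reverse order.
inner_key, outer_key = b"key-one", b"key-two"
blob = wrap(outer_key, wrap(inner_key, b"customer records"))
plain = unwrap(inner_key, unwrap(outer_key, blob))
print(plain)  # b'customer records'
```

Every layer adds a full pass over the data, which is where the "slow the hell out of everything" cost comes from.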
Whatever happens, data needs to be protected in transit and at rest, says Brian Babineau, vice-president of product and channel marketing at Barracuda Networks.
“There are a number of tools users can employ to encrypt their data at the source with personal keys, which by default protects them from end to end. Most users don’t take that approach,” he says.
“They more typically employ user authentication to validate who is accessing the data at the source, network security like a VPN to encrypt the data in transit, and a combination of user authentication and encryption at rest at the target to ensure the data stays secure.”
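The first link in the chain Babineau describes, validating who is asking for the data, can be sketched with an HMAC-signed access token. The secret, user names and token format here are invented for illustration:

```python
import hashlib
import hmac

# Assumption for the sketch: a secret held only by the storage service.
SERVER_SECRET = b"example-only-secret"

def issue_token(user: str) -> str:
    """Sign the user name so the service can later verify it untampered."""
    return hmac.new(SERVER_SECRET, user.encode(), hashlib.sha256).hexdigest()

def verify_token(user: str, token: str) -> bool:
    # compare_digest avoids leaking information through timing differences.
    return hmac.compare_digest(issue_token(user), token)

token = issue_token("alice")
print(verify_token("alice", token))    # True
print(verify_token("mallory", token))  # False
```

Only after a check like this succeeds would the data travel, over a VPN or TLS, to be re-encrypted at rest on the target.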