A senior figure at the anti-virus giant McAfee once told this writer the security industry was a mess: too many vendors trying to do too many things. But the industry's messiness mirrors the threat landscape it is trying to calm.
Just look at what’s happened in the past six months. Two of the most significant breaches in the history of the web have occurred, with the attacks on US retailer Target and auction giant eBay. There was also the small matter of the Heartbleed vulnerability in OpenSSL, one of the most high-profile web security flaws to date. The attacks aren’t letting up, and there appears to be no end in sight.
One common, depressing theme runs through each of these incidents: many organisations still aren’t getting the old basics right. The whole security sector, from vendors to customers, needs to be sensible in its response, says Javvad Malik, analyst at 451 Research.
“The industry’s been messy for a while now and it’s important the industry responds in a pragmatic and unified manner to try and win back confidence of businesses that investing in security isn’t a completely lost cause. We’re still bad at managing the basics, patching, privilege identity management, tools that overload users with alerts within which important issues can be missed,” Malik says.
Whilst IT teams are often told they need a new approach to protecting the business, they have to get the old problems nailed first.
A new approach to malware protection
Beyond the rudimentary matters, though, coping with these manifold problems effectively will require traditional protections to change and new ones to emerge from research labs and find their way into businesses. “As the internet touches more and more areas of our lives – smart devices, currencies such as Bitcoin, cloud and virtualisation – simply reacting to threats is no longer the most effective way to protect both individuals and organisations,” says David Emm, senior security researcher at Kaspersky Lab.
That does not mean killing anti-virus, however, even if the traditional signature-based approaches have failed. Modern AV systems, the best ones, at least do some heuristic and reputation analysis, rather than just try to detect malicious software that’s already been seen in the wild.
Emm says malware detection technologies should look beyond the static and evaluate objects or applications within the context of a specific environment, questioning what it’s doing there, what it’s connecting to and what it has been designed to do compared with its expected behaviour.
“This enables security experts to identify anything that is being used for nefarious purposes before an attack has been carried out,” he adds.
“There have been those who have said that ‘AV’ protection is dead but what this really means is that we have to go beyond traditional signature-based protection and use more sophisticated technologies including heuristics, sandboxing, proactive behaviour detection, cloud-enabled threat intelligence, application control, automatic exploit prevention, secure banking and more.”
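The shift Emm describes, from static signatures to behaviour evaluated in context, can be sketched in miniature. The following is a hypothetical illustration rather than any vendor's engine: the application name, the expected-behaviour profile and the specific checks are all invented for the example, which simply compares what a program is observed doing against what it was designed to do.

```python
# Hypothetical behavioural check, not a real AV engine. The profile format
# and the example application are invented for illustration.

# Expected-behaviour baseline: which hosts an application should talk to,
# and whether it is ever expected to modify the registry.
EXPECTED = {
    "invoice_viewer.exe": {
        "net_hosts": {"updates.example.com"},
        "writes_registry": False,
    },
}

def suspicious(app_name, observed_hosts, writes_registry):
    """Flag behaviour that deviates from the application's expected profile."""
    profile = EXPECTED.get(app_name)
    if profile is None:
        # Unknown application: escalate for sandbox or reputation analysis.
        return True
    if not set(observed_hosts) <= profile["net_hosts"]:
        # Connecting to hosts outside its declared set, e.g. a C&C server.
        return True
    if writes_registry and not profile["writes_registry"]:
        # Doing something it was never designed to do.
        return True
    return False
```

The point is the question being asked: not "have we seen this file before?" but "is this object behaving as something in its position should?", which is what lets deviations be caught before a signature exists.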
Out of the perimeter
Beyond malware, the rise of mobile has dismantled the IT perimeter, which is why the firewall has taken a battering in recent years. And yet it survives as a technology, whether in the traditional or “next-generation” sense. Rather than taking down the firewall, the response to the death of the perimeter should be a layered approach, not a rip-and-replace strategy, says Professor Alan Woodward, of the computing department at the University of Surrey.
“I think we can take lessons from how physical security has been mounted historically: something as simple as a castle didn’t have just one wall. There were layers of walls and eventually a redoubt within which the most precious items were kept,” Woodward says.
“Defence in depth has to be seen as the default approach. With the rise of insider threats and spear phishing attacking those with privileged access, the perimeter is becoming less of an absolute barrier to intruders but is still causing some attacks to bounce off so it would seem a little silly to simply let it crumble.”
Emm says such an approach needs to focus more on the individual, rather than attempt blanket security measures. “People do still work in the office, connecting to corporate servers so that network still needs to be protected. However, the workspace has become more diverse in that many people will work from home on a laptop or on the move with their smartphone or tablet,” Emm adds.
“This leads to a host of additional security issues - from people logging on to insecure Wi-Fi networks that could potentially be being watched by cybercriminals, to losing their device on public transport - and it is these devices that are not protected by traditional policies, firewalls, endpoint protection and mail filtering that exist in offices.
“We therefore need to look at a security solution that protects the individual, taking into account new devices and policies and procedures for untrusted environments – i.e. ‘follow-me’ security.”
This all points to a need for better intelligence systems: ones that can alert organisations to anomalies on the network caused by zero-day threats and access inconsistencies, while allowing for deeper analysis of attacker behaviour. The most effective are likely to be built on Big Data technology, drawing together different data types to determine the nature of a threat. Hadoop and big data warehousing projects will likely remain the domain of large enterprises; for smaller ones, Security Information and Event Management (SIEM) technologies are the more realistic option, the most attractive being those that deliver actionable intelligence and pull in as many different sources as possible.
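One example of the kind of access inconsistency such systems are meant to surface is "impossible travel": the same account logging in from two distant locations within a window too short to travel between them. A minimal sketch of that correlation rule, assuming an invented event format of (timestamp, user, country) tuples drawn from several log sources and an illustrative 60-minute threshold:

```python
from datetime import datetime, timedelta

# Hypothetical SIEM-style correlation rule, not a real product's logic.
# Events are assumed to be (timestamp, user, country) tuples merged from
# multiple log sources (VPN, web apps, mail), per the article's point about
# pulling in as many sources as possible.

def impossible_travel(events, window=timedelta(minutes=60)):
    """Flag users seen in two different countries within the given window."""
    alerts = []
    last_seen = {}  # user -> (timestamp, country) of most recent login
    for ts, user, country in sorted(events):
        if user in last_seen:
            prev_ts, prev_country = last_seen[user]
            if country != prev_country and ts - prev_ts < window:
                alerts.append((user, prev_country, country))
        last_seen[user] = (ts, country)
    return alerts
```

A real deployment would weigh many more signals, but the design choice is the same: correlation across sources turns individually unremarkable log lines into an actionable alert.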
Proper intelligence systems aren’t just useful for understanding the adversary; they’re also likely to save businesses money. Research from the Ponemon Institute last year, which looked at 234 breached organisations, found that those that had invested in security intelligence systems saved an average of nearly $2m compared with those that hadn’t.