How do you protect your online systems? Cultivate an insider threat
Challenge your people to try to break into your systems, and see how interesting life gets for your colleagues
Opinion People are the biggest problem in corporate infosec. Make them the biggest asset.
The numbers are so bad because we're doing people wrong.
Here's news you can use. If you're trying to secure a corporate network, you almost certainly can't. When you can't, that network can be cracked open and hoovered clean of good stuff in a few hours.
We know this because of course we do, but also because of a SANS Institute survey of 300 security professionals. Not the ones who try to repel attackers, but the other lot, who try to kick down the walls and get into your systems. The ethical consent-seeking pen-testing sort, of course, but attackers nonetheless.
In a failed attempt to cheer you up, the institute points out the different lengths of time each stage of a successful attack takes, to help you plan where to put your detection resources. What it doesn't point out is what's implicit in the survey's structure. Ethical online attacks are a source of information like no other. If you could have a white-hat team permanently on hand, casting their gaze over every change you make, how much better would you sleep at night?
Don't drop off too soon. The other thing the survey doesn't mention is why, after all these years and billions spent on improving data security, it's worse than ever.
The team you actually have to hand comprises, sadly, the attacker's best friends. According to Verizon's 2022 Data Breach Investigations Report, over 80 percent of breaches succeed because of bad actions by employees. There are many classes of error, from email slip-ups to active fraud, behind the headline figure, but it boils down to the first rule of cybersecurity: people are the problem.
The reaction of many organizations, especially larger ones, is to impose strict infosec rules with harsh punishment for transgression. Which is how we get to that 80 percent, of course. Nobody dares admit to anything, let alone to the things they do to circumvent the rules and get their job done.
But you don't need a company of cowed co-workers. What you want is a company of hackers.
That may sound suicidal to those who think hacking is attacking, but less so to those who think hacking is a mindset. Hollywood and the daily media are in the first camp: online attackers are at best amoral thrill-seekers, but mostly they're quasi-criminals bent on mischief or actual crime. Reg readers will be in the second camp, and rightly.
Hacking as a mindset is characterized by themes of curiosity, tenacity, imagination and the joys of discovery and invention. It's not a skill set, although it naturally evolves one. It is playful, pattern matching, and problem solving. Teaching people to think like hackers is better than teaching them to fear the bogeyman, in many dimensions.
Take phishing. The standard corporate approach to training non-infosec people to avoid phishing is to explain the principles, show some examples, perhaps run some faux phishing campaigns to shame those who don't "get it", and move on. Another chunk of that 80 percent gets wired in.
It's much better to teach people how to write phishing emails. Lay out the principles of social engineering, and reward the best efforts. It doesn't matter what technical skills or awareness people have, at the very worst they'll learn just as much as with the traditional way. But some will start to think creatively about security.
Most people just need basic awareness, and that's good, but you can take the hacker mindset as deep as you like. Thinking about security in depth as a set of components that have to be hardened, tested, and maintained is a whole heap of toil.
Thinking about the kill chain, the pathway through a system that extracts the prize, is much more engaging. If you want to read the CEO's expense claims history, how would you go about it? Say that's not a lot more fun than asking how to protect the HR and finance systems, we double-dare you. It's an uncomfortable leap, offering vuln bounties in place of written warnings, but where would you be happier working?
There are consequences to inculcating this mindset as widely as possible. You won't be able to fire people for poking around in your infrastructure, but you will if they do it, find something, and don't tell you. You'll need different rules, the sort external pen testers work by, about what to do when vulnerabilities are discovered, and how to behave responsibly when given responsibility.
The biggest risk might seem to be success, if you do find yourself with a lot of hackers on staff who've taught themselves far too much about your systems. That's best addressed by looking at the fraud triangle: people commit fraud when they're motivated, have the opportunity, and can rationalize what they're doing as not so wrong.
Motivation doesn't change with a corporate hacker mentality. Opportunity certainly goes up with the acquisition of skills – but rationalization becomes harder. The company isn't being stupid or distrustful or uncaring, it's asking its people to come inside, have fun with serious problems, and play by the rules.
On the upside, not only do you get many more eyeballs on security – your permanent testing team – you get people with the hacker mindset of creative analysis and problem solving. That's contagious into the job that's actually printed on their business cards. And if you're in a company that has a problem with that, may we suggest you make fixing that your number one priority.
Yes, your systems are vulnerable. Yes, attackers will get through. Yes, people are the problem. They're also your greatest security asset. Just buy them some white hats. ®