When something bad happens to our systems, our applications or our security, it's almost certain that our organisation is not the first it has happened to. We won't be the first in the world, or in our industry, or in our country, or probably even in our area. Why, then, does it feel like we are?
The answer is simple: the people to whom it's happened before haven't told anyone. And why? Because it's considered an unwise thing to do. Admitting your failings can dent your reputation, your share price, and your revenues.
Yet this is at odds with what we, as IT managers and cybersecurity people, tell people within our organisations. We stand in front of rooms full of people – or, more recently, sit in front of laptop cameras trying to remember what rooms full of people look like – and say: hey, if you fall for a phishing campaign, or you inadvertently delete a directory, or you lose your laptop, get in touch straight away and we'll help you get it sorted.
Tell us about your mistakes and, as long as you're not being malicious or idiotically negligent, we'll help you get them fixed. Yet can you imagine your company telling the world that its systems went TITSUP because the generator ran out of diesel, or because of a previously undiscovered software bug, or that your six levels of redundancy all failed and broke your country's emergency services phone lines? Even when people write to El Reg's "On Call", their names are Regomised to protect the innocent/guilty/optimistic/misguided/just plain daft from retribution, finger-pointing, abject shame, and P45s.
The thing is, though, there's more to admitting problems than shouting from the rooftops. After all, when we encourage work colleagues to tell us when they dropped a clanger, we're not asking them to grass themselves up via an all-users email or the kitchen noticeboard. Instead, they're asked to confide in a controlled group that they can (hopefully) trust with the details.
And this is the key point: if we want to share our troubles with others, in the hope that they will reciprocate and both parties will benefit as a result, whom shall we decide to trust?
Well, one option is to trust nobody and to share anonymously: there are resources we can sign up to in order to do precisely this. In the security industry in which this correspondent plays by day, for instance, we have the NCSC's CiSP (Cyber Security Information Sharing Partnership) – a really handy information-sharing web portal with a rigorous sign-up process (not just anyone can have access), facilities for either showing or hiding your identity when you post, and a tightly defined system for categorising posts by sensitivity, from "red" (disclosure of which is very tightly restricted) through "amber" and "green" to "white" (freely distributable with some simple caveats, such as respecting copyright).
So that kind of facility is a start, and a useful one at that. But it's not terribly conversational: you can write responses to people's posts, but conversations build up over hours or days rather than minutes or seconds. And there's a balance to be struck in controlling the audience: the more rigorous the vetting, the harder it is to build an audience of a decent size.
So how do you do something conversational with a bunch of people whose opinions you respect, while retaining a reasonable chance of not blabbing your woes to the public at large? The simplest option is to seek out your peers at local networking events – and even if there aren't any locally run user groups in your field, you can look to professional bodies such as the BCS, (ISC)², ISACA and the like, all of which have regional branches or chapters where you can meet the people you'd like to get to know and share with.
Clearly you won't just dive into an organisation's networking session and start bleating about your woes: such events are merely a starting point at which you can table the idea of getting a few of you together once in a while to share experiences in a controlled, semi-formal environment, with the ground rules set out clearly and a conscious commitment from all involved to abide by them.
The best-known – one might say ubiquitous – guideline is the Chatham House Rule, of course, which Chatham House themselves describe thus: "When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed." The rule is perfect for what we need: we can benefit from what we say to each other and trust our colleagues to be discreet.
I used the word "trust" just then, but can we actually trust people to abide by the rules? And you know what... we all need to take a leap of faith and work on the premise that, yes, we can. In this world full of cybercriminals where we're told not to trust anyone, we have to buck that trend once in a while – specifically for people we know or people in groups whose principles are sound and whom we believe won't breach our confidences.
So what's stopping us from doing this? Company policies. The obvious ones are the confidentiality policies and associated clauses in the staff handbook and/or our contracts; as we all know, HR and compliance people love confidentiality clauses. But then people go further and start invoking stuff like the clause that you're not allowed to speak on behalf of the company unless you're authorised to do so (which the network manager and the cyber guys won't be).
Team Secrecy is not the team you want to be on
And here's where it gets messy, because in most companies I've come across there have been two camps: one that favours sharing information and one that prefers secrecy. The big problem is that not only does Team Secrecy get a veto, but in many cases just one person wields that veto – and their decision to keep saying "no" is generally based not on science but on a personal hunch, or maybe a bad experience from years past.
What should happen, of course, is the adoption of a risk-based approach. Yes, the downside is a risk that someone will blab, but the upside – the immense value of knowing more about who's had issues, who's been hacked, what good and bad experiences people have had with mutual services providers – will generally outweigh the downside by an order of magnitude.
So, we've looked at local groups – whose other upside is, of course, that you can meet in the back room of a suitable hostelry – but how about opening it wider? Again, be bold, be trusting and have a bit of faith in people's integrity. During lockdown I've done a few group sessions run by an online security information provider, for example: the format is a Chatham House Rule closed video chat with a dozen or so people with similar interests to your own, on a particular subject, chaired/led by whoever's sponsoring the event.
Team Secrecy would be horrified at the concept, but in all cases we've shared concerns, challenges, and war stories, we've given and received opinions and advice, and we've ended each hour-long call better informed and with a bunch of ideas to take away. As I write this, I have another to go to in a few days (on the challenges of data security in the financial services industry, if you're wondering) and I'll carry on doing them. And while I, and my peers across the UK who also join in, get the value of sharing information, the nervous sceptics simply fail to do so.
Information sharing is a game of give and take, of course. But all many of us do is take – scouring the internet for tales of who got hacked, who had a major outage, which service provider is the latest to hack off its allegedly "valued" customers. And such research is of course valuable, but it misses a massive opportunity.
Carry on doing that, then, but while you do, try to step out of your comfort zone. Scour your industry and your local area for a potential audience, stand up and say: "Hey, from time to time I have stuff that goes wrong", or "I've got some interesting experiences with phishing scams", or "I'd like to sanity check my approach on something, anyone fancy a beer and a chat?"
Because the chances are that when you do, you'll get responses from a whole bunch of people who, until now, were either terrified of or prevented from (or both) indulging in information sharing and getting the massive benefits that it brings. ®