There's security – then there's barbed wire-laced pains in the arse
How do you strike a balance with compliance and UX?
If IT has a reputation as the gatekeeper, the security department is the one providing the locks and barbed wire.
End users think IT security is a hassle: complex passwords, password expiry and multi-factor authentication are tolerated when they are made mandatory, but nobody is thrilled about it.
But look at it from the consumer's perspective. Even before it was reported that devs working for Cambridge Analytica had used data harvested from more than 50 million Facebook profiles without permission, consumers expected those holding their information to protect it as far as possible. That included access by authorised people only. Facebook and Cambridge Analytica will have sharpened those anxieties.
Meanwhile, with each successive data breach, expectations have evolved and individuals and governments increasingly expect breach notifications.
All good for the consumer – but a headache for organisations, IT departments and end users.
How, then, do we enforce security and compliance without getting in the way of the user's "experience"? Dare we even ask that question, or do security and compliance rule above all – regardless of usability? Maybe that depends on who you are asking.
A secure and compliant IT environment
Once upon a time, IT security was easier. Configure your firewalls, put your internet-facing servers in a DMZ, lock down your desktops (including restricting floppy drives or USB access) and add good antivirus software. Control remote access via a bank of dial-up modems, with security tokens thrown in for good measure. Now the internet is on everyone's phone, and corporate data sits in the cloud, accessed from the same machine a teenager uses to download torrents.
For the IT pro, security and compliance have become "choose the right tools and configure them correctly". That sounds so simple. Let's throw in "educate your users" so they aren't entering their login credentials into dodgy phishing links because they were expecting a parcel delivery.
The fun starts when you try to decide what the "right" tools are and how they should be configured. Evaluating your options for threat protection, data loss protection, conditional access, cloud access security brokers or mobile device management is complicated.
The natural starting point is your current tech stack and seeing what your vendor of choice can offer. It's also natural to jump on your forum of choice and ask the tech tribe for their experiences and recommendations.
In the background is the fact you're probably constrained by budget and, as with any technology choice, your decision won't be validated until after its implementation. Good luck if it's a one-year (or more) licensing deal.
What's less obvious, though, are the security and compliance controls that already exist right under our noses. From the very bottom of the tech stack up, there are "geek knobs" to tweak, most of them happily left at their defaults. BIOS and operating systems have become more secure by default (ignoring inbuilt vulnerabilities for the purpose of this paragraph), yet there's still skill in reviewing those defaults to see if they meet your organisational needs. Technical blogs and best-practice recommendations exist for a reason – they give us comfort that we're securing our environment in the best way possible, if such a thing exists.
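To make "reviewing the defaults" concrete, here's a minimal sketch of the habit: diff a service's effective settings against a hardening baseline and flag anything left at an unwanted default. The setting names and baseline values are illustrative assumptions, not official recommendations for any particular product.

```python
# Minimal sketch: compare a service's effective settings against a
# hardening baseline. Setting names and baseline values are
# illustrative, not official hardening guidance.

def parse_config(text):
    """Parse simple 'Key Value' lines, ignoring comments and blanks."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(" ")
        settings[key] = value.strip()
    return settings

BASELINE = {  # hypothetical hardening targets
    "PermitRootLogin": "no",
    "PasswordAuthentication": "no",
    "X11Forwarding": "no",
}

def audit(config_text):
    """Return (key, found, wanted) for every setting off the baseline."""
    current = parse_config(config_text)
    return [(key, current.get(key, "<default>"), want)
            for key, want in BASELINE.items()
            if current.get(key, "<default>") != want]

sample = """
PermitRootLogin yes
X11Forwarding no
"""
for key, found, wanted in audit(sample):
    print(f"{key}: found {found!r}, baseline wants {wanted!r}")
```

Anything absent from the config file is reported as `<default>` – the whole point being that an unreviewed default is still a decision, just one somebody else made for you.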
And don't think just because you're using Software-as-a-Service you're safe. Let's assume you've reviewed the SaaS app first before purchasing it to check that its features meet your security and compliance requirements. You did that before spinning up your first team in Slack, right? You might find you need a premium plan or paid security add-on. And then there are settings in the application itself – who can create folders or channels, who can delete things and who information can be shared with. All need to be carefully tweaked to reflect your organisation's needs and to protect your users from themselves. Repeat after me: "The default settings aren't always right for us."
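The same default-checking habit applies to SaaS settings. As a hypothetical sketch – assuming the vendor lets you export workspace settings as JSON, which many do in some form – you could validate the export against an organisational policy. The setting names and policy values below are invented for illustration; real exports differ per vendor.

```python
# Hypothetical sketch: validate a SaaS workspace settings export
# against an organisational policy. Setting names are made up for
# illustration; real vendor exports will differ.
import json

POLICY = {  # assumed organisational requirements
    "who_can_create_channels": "admins_only",
    "external_sharing": "disabled",
    "who_can_delete_messages": "admins_only",
}

def check_settings(export_json):
    """Return a list of human-readable policy violations."""
    settings = json.loads(export_json)
    violations = []
    for key, required in POLICY.items():
        actual = settings.get(key, "<unset>")
        if actual != required:
            violations.append(
                f"{key}: is {actual!r}, policy requires {required!r}")
    return violations

export = json.dumps({
    "who_can_create_channels": "everyone",  # a typical vendor default
    "external_sharing": "disabled",
})
for violation in check_settings(export):
    print(violation)
```

Run periodically, a check like this catches both settings the vendor shipped permissive and settings an admin quietly loosened later.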
A secure and compliant corporate culture
So here's the thin security line: how far do you go with locking everything down?
On the desktop, do you prevent users from executing any installation file? In SaaS apps, do you prevent the creation of new folders or channels? Do you disallow access from non-corporate devices entirely (through HR policy enforcement rather than technical controls, if necessary)? The bad news is there is no standard answer. All of these decisions are judgement calls and will vary from one organisation to another. That seems odd when you'd think security would be a standard thing everybody does the same way. Work for more than one company in your career (even in a non-tech role) and you'll see how wildly different those decisions can be. Isn't choice a great thing?
Maybe in an ideal world, vendors would truly make products that are secure by default and came with no geek knobs to tweak. I can only imagine the uproar. IT pros hate losing control.
If, in the real world, we turned all those knobs way down and locked all the things, would we have an end-user mutiny on our hands? Again, the answer varies. A bank or government defence agency would have no issue with that approach. Yet some businesses with fewer than 20 staff would scream if they ever had to change their PC login password or, heaven forbid, were forced to use individual login accounts for each person (I kid you not). Unfortunately, some of those small businesses handle things like financial data.
It's a scary thing to note that some corporate cultures are more accepting of security and compliance measures than others. Usability plays a big part in this acceptance, including a shift in what end users are used to. On your first day as a bank employee, you get your own login with an expiring password and a locked-down desktop. You're subjected to loss-prevention training that covers things like internal fraud, and you're told in no uncertain terms to lock your computer when you're away from it. Small businesses miss that memo and just trust their colleagues. Try to change how people operate and you'll get resistance, especially if you're taking away some of their freedoms.
The bank employee induction is a great example of a security-conscious culture, driven by management and accepted as the norm. If we could just figure out how to replicate that across every organisation, maybe our data would be a little safer.
Actions and consequences
Step too far into the security world, however, and you may be faced by a mutiny. Locking down all the things sure makes them secure, but it can be counter-productive and stifle a modern, work-from-anywhere culture.
So much so that shadow IT is a thing, where workers implement their own workarounds to get things done outside of IT's approval. Cloud security tools such as Microsoft's Cloud App Security now exist to detect the use of SaaS solutions from your network, shining a light on what may not be authorised.
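The core idea behind such discovery tools can be sketched simply: scan outbound proxy logs for known SaaS domains that IT hasn't sanctioned. The log format, domain lists and matching rule below are illustrative assumptions for the sketch, not how Cloud App Security actually works.

```python
# Sketch of the idea behind shadow-IT discovery: scan outbound proxy
# logs for known SaaS domains that aren't on the sanctioned list.
# Log format and domain lists are illustrative assumptions.
from collections import Counter
from urllib.parse import urlparse

SANCTIONED = {"slack.com", "office.com"}          # approved by IT
KNOWN_SAAS = {"slack.com", "office.com",
              "dropbox.com", "trello.com"}        # catalogue of SaaS apps

def shadow_it_report(log_lines):
    """Count hits on known SaaS domains that IT hasn't sanctioned."""
    hits = Counter()
    for line in log_lines:
        # assume the URL is the last field of each log line
        host = urlparse(line.split()[-1]).hostname or ""
        # naive registered-domain match: www.dropbox.com -> dropbox.com
        domain = ".".join(host.split(".")[-2:])
        if domain in KNOWN_SAAS and domain not in SANCTIONED:
            hits[domain] += 1
    return hits

logs = [
    "10:01 user1 https://www.dropbox.com/upload",
    "10:02 user2 https://slack.com/api",
    "10:03 user1 https://www.dropbox.com/upload",
]
for domain, count in shadow_it_report(logs).most_common():
    print(f"{domain}: {count} requests")
```

Commercial tools do this at scale with far richer catalogues and risk scoring, but the principle is the same: you can't decide what to allow until you know what's actually in use.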
An Australian horror story tells of a guy spinning up an AWS instance from a cafe across the road from work to test a theory for a project. Access to AWS was blocked from the corporate network, and the IT approval process was too cumbersome and time-consuming for a small proof of concept. He figured out an easier way: his mobile phone's hotspot, tethered to his work PC. While IT people cringe at this, the company in question is quick to point out it involved test sample data, and it patted the guy on the back for a successful outcome.
So, from personal Dropbox accounts to your company data in random cloud PaaS systems, these are the knee-jerk reactions to the IT security barbed wire, when we make it all too hard for the business to get their work done.
Can we build HR policies to dissuade people from doing this? Sure, but they're toothless tigers if management are going to pat people on the back instead. And let's not get started about what you do when management are the ones putting in the workarounds.
Maybe the best approach is to create that security-conscious culture. Change the employee perspective by asking how they would feel if another company leaked their sensitive personal or financial information – to your customers, you are that company. It also doesn't hurt to examine your controls and see how they impact an employee's day. A simple process change or technology tweak could remove some of the friction while still maintaining a secure and compliant environment.
Maybe we can only dream of a world where security, compliance and ease of use can co-exist. Maybe we need to suck it up and put security first, regardless. It's a battle that's raged since the dawn of the password. Decades on, even with biometrics supposedly making things that much more impregnable, we're still wondering how to strike the right balance. ®