Ethics isn't a county east of London, but it's the only way to look at security

We are all human beings, we live in a community, and everything we do affects others


Column The trouble with good ideas is that, taken together, they can be very bad. It's a good idea to worry about supply chain malware injection – ask SolarWinds – and a good idea to come up with ways to stop it. It's even a good idea to look at major open-source software projects, such as the Linux kernel, with their very open supply chain, and ask – is this particularly vulnerable? After all, a poisoned Linux kernel would be bad enough to make people forget SolarWinds.

Guess what? If you have all of those good ideas and decide to test the Linux kernel supply chain by poisoning it, you have had a bad idea. A very bad idea. An idea bad enough to get your entire university banned from kernel devland, as researchers from the University of Minnesota found out.

Two researchers having a very bad idea is hardly news. But it wasn't just them: lots of other people were involved, including the university's Institutional Review Board, which said it believed the plan did not constitute "human research" and tossed the researchers a waiver.

This was the very most bad, terrible, opposite-of-good bit of the whole saga, because what happened was at heart profoundly unethical. And not just because the conceit that it didn't "involve humans" was confoundingly stupid: the Linux kernel is part of the daily lives of everyone with an Android phone, and that's before you start talking about cloud services. All these people, Minnesota ethics review board, are humans.


Let's take two more thunderingly egregious feckfests of late – GoDaddy's phishy festive present of a fake Christmas bonus to ensnare its own employees into breaking email infosec rules, and the European Super League. Both were OK'd by senior management in positions of very highly paid responsibility, and both brought shattering reputational damage down on their organisations.

In the case of the ESL gobsmacker, the plan even got passed by a panel of experts expressly charged with spotting public peta-facepalms. In all these cases, ethics had become at best a sterile tick-box exercise, and the safeguards merely a way to go wrong with confidence.

The ethical principle that should have applied is very simple: will the people your decision affects the most feel they've been treated fairly? Even if you think you know the answer, ask them anyway. If the answer is no, then do something else.

That's it. That's the tweet.

The best thing about this is that it works even if you're not about to drop your corporate kecks to reveal your new giant bullseye tattoo on your backside. This week alone, I tried in vain to help a friend who had been ground down and finally locked out of their WFH company laptop by corporate IT protocol and practice.

Days of work were lost, and they – like many of their colleagues – got further entrenched in their use of illicit productivity tools just to get their job done. They aren't unaware of security – far from it; they live in fear of accidentally causing a security breach and being punished under the draconian rules in place.


Do they feel fairly treated? What do you think? Is that security policy working? Of course it isn't. It's unethical, in that it puts people in insidious no-win positions and then punishes them for it.

If you have worked for organisations of any size, you'll be able to point to cases large and small where you were treated unfairly by policies or situations caused by people so far above your pay scale you've never talked to them. It's even better if you're a customer of a concern which decides to drop you in it for reasons you don't understand, don't believe, or find disrespectful.

If the Minnesota researchers had asked the Linux kernel maintainers "is this a good idea?" they'd have learned just how bad it was without getting their entire organisation banished to the naughty step.

If GoDaddy had quietly asked some juniors the same question, they'd have got the same answer. And for the ESL to think it need not even ask its own managers, players and fans – well, that at least is proof of the basic inhumanity of venture capitalism. Not that we need any more lessons in that, thanks.

It's not as if you can't find an ethical way of testing supply chain security, or teaching people about not clicking on links, or comprehensively destroying the very heart of an industry based, in the end, on love – no, scratch that last one.

But the rest of us, the people who make the decisions that affect others and who, in our turn, are affected, must recognise that respect for humanity in IT isn't a burden, it's a way to work better. If you can't do a job without that as the first and last principle you apply, that job should not be done. ®
