Who, Me? Beware the perils of being too helpful as we kick the working week off with a Who, Me? starring both DNS and the need of management to put someone's – anyone's – head on a spike.
Our tale takes us back to 2016, when "Sam" was working for an IT service provider. Sam's services had been rented out to a company that had just been gobbled up by a much larger entity. "There was a rebranding exercise under way," he told us, "and one of the changes was to switch a few thousand users to the parent company's email domain."
All straightforward enough, although the specific project Sam was working on was not directly related to email when the inevitable cock-up occurred.
Unsurprisingly, all the users at the newly acquired subsidiary lost email access. The problem was odd – it cleared up overnight, but things fell over again when the users turned up. The new parent company insisted the problem could not have been at its end, and since no other subsidiaries were affected, it must be Sam's site.
It was also very much not Sam's problem.
However, as the outage ground into its third day, curiosity got the better of him and Sam did a little investigating. It transpired that all the users at the subsidiary were connecting to the parent's Exchange servers via the internet. Microsoft's venerable Forefront Threat Management Gateway (TMG) took care of firewall duties.
Sam had experience of the now-discontinued security product. It featured Flood Mitigation, and there was a very good chance that the sudden wave of traffic from the subsidiary had been mistaken for a denial-of-service attack.
Going beyond the call of duty, he agreed to explain his theory to the company's outsourced team. "The call," he told us, "was laboured, there had obviously been a lot of tension prior to this, but it was over in 10 minutes."
The Exchange team took Sam's suggestion about as well as a bucket of cold vomit. The setting, they insisted, was not even turned on. Although confident it was, Sam backed away from demanding proof. It wasn't his problem anyway.
And yet still the issue niggled away, doubtless magnified by the wailing of users around him.
Another idea came to Sam. He had changed an application to use the parent company's email server, but had been given an internal IP for it that travelled over a recently set-up network tunnel. If he tweaked the hosts file so Outlook went to that address for mail, it might just work.
It did indeed. Outlook was roused from its sulk and performed perfectly. Internal system spoke unto internal system and all was splendid.
Sam presented the solution to the beleaguered service management team. A local DNS mod would make all the problems (at least those related to email) go away. It was, after all, how internal corporate email was usually set up anyhow.
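For the curious, the sort of change involved is a one-line hosts-file entry – the hostname and IP address below are purely illustrative, not details from Sam's story:

```
# Windows hosts file: C:\Windows\System32\drivers\etc\hosts
# Point the mail hostname at an internal IP reachable over the tunnel,
# bypassing the public route through the parent's TMG firewall.
10.20.30.40    mail.parentcompany.example
```

The hosts file is consulted before DNS, so Outlook resolves the mail server to the internal address without any change to the corporate DNS zones – which is why a local tweak like this can quietly reroute traffic for an entire site.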
Due process was followed, the parent company was informed, and Sam logged what he planned to do and his contact details in the requisite change management form. The change was to take place at 3pm.
Doubtless weary from batterings administered by angry users, the manager of the local team approached Sam shortly before 3 and asked him to action the change: "He said they had taken enough heat on this, there was really no need to wait until exactly 3pm.
"So I went ahead a few minutes early."
Email began to flow once more. For a few precious moments Sam basked in the glow of being the hero of the hour, day, perhaps year.
Then all hell broke loose.
First Sam got a call from the change manager at the parent company. Email had started working so the change wasn't needed.
"Er, yes, that's right," he replied, "because I fixed it, albeit a few minutes early..."
And that was when the shouting began.
"The next three days," he told us, "were the most stressful I've ever had."
First he was berated for making the change early. Then the bigwigs got involved and the aggression was ramped up a few more notches. Five years on, and "I still can't comprehend the anger," he told us. Was it mere embarrassment? A regional subsidiary taking things into their own hands where the head office had failed?
Whatever it was, the parent company wanted a head presented on a plate for the email fiasco and escalated things to Sam's employer (who were more concerned with keeping their client happy than with the ins and outs of the situation).
It got to the point where accusations were flying that Sam had broken the email in the first place or that he had violated the integrity of DNS. Heck, he probably had no idea what he was even doing!
The parent company and the newly acquired subsidiary were also at loggerheads as the spat intensified. Shrieking from head office demanded that the change be reversed, while those at the other end of the chain feared what might be unleashed should email suddenly then stop.
As a mere contracted resource, Sam had a miserable time and began to dread the start of the working day. "Eventually," he told us, "I went to the guys in service management to tell them how much I regretted ever helping them and about the flak I was continuing to get."
Sam had looked for help when things first kicked off but had been told to ignore the fuss. The service manager clearly had not grasped how bad things were.
This time they did. It took 10 minutes for the local boss to turn up at Sam's desk and say: "You will not hear another word about this."
We assume bottoms were kicked rather than the email server reborked, but true to his word, all the nasty emails and angry meeting invites suddenly stopped.
"That was the end of it."
While the level of ineptitude on show might, admitted Sam, be hard to believe, "those who have worked in the cesspit of outsourced IT, I'm sure they'll get it!"
There is a saying: no good deed goes unpunished. It's almost as common as the one about DNS. But what would you have done in Sam's situation, or that of the team at the parent company?
Ever thrown a drowning user a life jacket, only to have it angrily thrown back? Tell your story in an email to Who, Me? ®