What’s the real point of being a dev? It's saving management from themselves

Insubordination in the face of impossible fads

Comment What’s the point of being a developer? My experience of being one taught me that it isn’t primarily the coding – it’s something far more important than that.

Last week at Parliament, discussion touched on the role The Register plays in the media landscape. The chairman of the House of Lords’ inquiry on Artificial Intelligence, Tim Clement-Jones, very much values the voice it brings – the views it airs are often ones you should hear, but don’t.

The views I expressed about AI may have startled the inquiry, but they're expressed all too frequently in the Reg's comments section, by people who have to make things work.

I was struck by a parallel between today's AI mania and another, earlier hype, from when I was a coder. Last week, I was the only witness to have first-hand experience of coding in a mission-critical environment – life or death stuff. But I didn’t have a chance to share the story at the time.

You might enjoy my story, though. It goes like this.

Deep in the mists of time, some 25 to 30 years ago, there was a belief that software production would change radically. This belief, or dogma, far exceeded today’s hype about machine learning or AI. And this dogma was so powerful, you either had to believe in it, or do a really good job of pretending that you believed it.

This was O-O, or object-oriented software. Software production would radically change, the experts predicted, with developers producing Lego-like components. What were called “monolithic” programs would become history.
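For readers too young to remember, the pitch went roughly like this – a hypothetical sketch, not any vendor's actual library: you'd buy or build a self-contained "brick", and customers would snap bricks together into finished systems.

```python
# The O-O vision, roughly: ship small, self-contained, reusable "bricks"
# that any application could snap together, instead of one monolithic program.

class PeakDetector:
    """A hypothetical reusable 'component' – drop it into any app."""

    def __init__(self, threshold: float):
        self.threshold = threshold

    def detect(self, samples):
        # Return indices of local maxima that rise above the threshold.
        return [i for i in range(1, len(samples) - 1)
                if samples[i] > self.threshold
                and samples[i] > samples[i - 1]
                and samples[i] >= samples[i + 1]]

# The promise: customers assemble working systems from bricks like this.
detector = PeakDetector(threshold=5.0)
print(detector.detect([0, 2, 7, 3, 1, 6, 8, 6, 2]))  # -> [2, 6]
```

The brick itself is trivial; the dogma was that the whole industry would reorganise around making and selling such bricks.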

Now, the data our software handled was genuinely life or death stuff: we wrote chromatography systems whose customers included the FDA, pharmaceutical giants and big oil companies, amongst others. One particular bug, it was whispered, had caused a traffic jam of oil tankers into the Gulf. In this commercial environment, it is considered reasonable to fire staff for catastrophic errors. However, the O-O dogma had started with academics, who are rarely fired for any kind of error.

We object

Of course, software production didn’t change radically into the component utopia that the academics had envisaged. The “monolithic” programs – Excel, databases, CRM systems and so on – are what’s still made and sold today. A handful of software houses eked out a living selling objects or O-O libraries – Steve Jobs’ NeXT was one of them. Some O-O techniques found their way into the mainstream, but the basic model didn’t change. The hype collapsed spectacularly.

Customers, it turned out, didn’t want a box of Lego bricks dropped on their floor, only to be told to assemble it themselves. This was a cost the academics and theoreticians of software production hadn’t calculated. And for their part, developers didn’t particularly want to be in the brick business. They’d much rather be in the house-building business, where the margins are better. And once you’ve built a brick, what else do you do? Build a better brick?

It was the last time anyone listened to computer science academics who had been steeped in theory. After the hype collapsed, the internet came along, and the market scrambled to get on the internet.

We saw it coming...

Down in the trenches where I was, no programmer believed that the future would be O-O. Not one. When the glorious O-O vision was explained to us, we politely nodded and got back to work. This is because the best programmers – contrary to the popular image of troglodytes lurking in caves – spent a lot of time with the customers who used the software, as field engineers. You needed the best brains you had to get a complex Unix system up and running in an environment like an oil refinery or a doping lab back in the day. And it’s still true today: the skillset of a field engineer needs to be a problem-solving one. The programmers knew better than anyone what the customers wanted, and the customers wanted a working IT system, thank you, not a bunch of Lego bricks.

Now here’s the thing. The higher up the corporate structure you went – either on the producers’ side, at the software houses, or at their customers, IT departments – the more management talked about The Next Big Thing.

Middle management would say things like, “Yes of course we’re working on our object-oriented libraries / toolkit, they’ll be here next year, for sure.”

And at the top, management felt they could talk about nothing else. They’d say things like: “We’re at the vanguard of the industry shift to object-oriented computing.” Stuff like that. Some companies even changed their names to show they had one foot in the future!

So the role of the programmers was one of silent insubordination: the goal was to save management from themselves. And we’ve seen this replayed with a succession of technology hypes ever since. Each time, the role of the Reg reader seems to be the same: keep the PHB from getting carried away and ruining everything.

This scenario plays out in many Reg stories, but particularly so with AI and IoT stories. Coincidentally, these are two media-driven fads that were (largely) generated outside the technical community.

Journalists and business profs largely invented the “automation revolution” by scaring other journalists about robots taking their jobs. The giant consultancies invented “IoT” because it was a fantastic opportunity for integration and big databases. Both ML and M2M offer useful and potentially valuable competitive advantages for a company, in some situations. But as with O-O in its day, the useful bits are buried under a mountain of unrealistic hype.

The reason the media class is obsessed with AI is that it only listens to itself: the shop floor voice isn’t heard too often. I’m glad I gave it an airing. Technology fads are now sold directly to top management via think-tanks and TED talks, which is why companies rush out and hire graduates with ML on their CV for six-figure salaries – and then don’t know what to do with them. Soon they’ll be puzzled as to why those hires didn’t bring about an instant transformational leap in productivity. Bless ‘em.

So who’ll be there to save the management from themselves in the cloud and outsourcing era, when the last BOFH has gone home? Companies have already denuded themselves of a lot of shop-floor common sense. Soon, I presume, IT management will be able to do whatever it wants – lining their CVs with fads. There’ll be no one to save them from themselves at all. ®
