The case for handcrafted software in a mass-produced world
As AI automates programming, it could be worth exploring the value of bespoke code
Part 2 – A thought experiment: If the computer business responds to commoditization and globalization like other manufacturing industries do, where does that leave programmers – and users?
This piece is the second in a series titled: The future of software. You can read Part 1 over here.
The advent of mass-produced software
The software industry is in a mess of its own creation. To paraphrase Douglas Adams: Software is big. Really big. You just won't believe how vastly, hugely, mind-bogglingly big it is.
As Adams also put it: To summarize, people are a problem. There are all manner of proffered solutions around communications, leadership, development methodologies, and more, but most of the issues boil down to trying to coordinate large numbers of people and getting them to work together. As a side effect, this has also led the industry to focus on certain types of programming languages and related tools, ones that bring their own problems. The underlying problem, though, is that we have billions, maybe trillions, of lines of code, and that's far too much for anything but large teams of people to handle.
Increasingly, the only practical ways to deal with it involve automation. Developers have been automating compilation and linking for decades. In under 20 years, there's been a massive rise in distributed version control systems, notably Git, and even software distribution methods based around Git-like tools, such as the Red Hat-backed Flatpak.
Now multiple companies are trying to widen the extent of that automation, using LLM bots – usually marketed as "AI" of some kind – to assist programmers. The massive growth of FOSS facilitates this by creating vast amounts of source code that the bots can ingest, digest, and regurgitate on demand (yum).
For now, humans write most of it. As we said last time, though, that is set to change.
Yet today, we don't rewrite much. Major projects, such as OSes, are just too big. There's so much code that it's no longer practical to study it, restructure it, or fundamentally change it.
Once a single kernel contains tens of millions of lines of code, mere humans can't read the whole thing and redo it. All that's realistic is to nibble around the edges, tweaking or adding new parts, occasionally removing old ones that aren't needed any more.
Meaning that now, for the most part, we just add to it. That's why it keeps growing.
What is visibly approaching, with the advent of large language models powered by the transformer algorithm, is automating that part too: Bot-powered code generation, where, given a specific task, an LLM bot can write new code.
This is where multiple corporations would like the software industry to go next. Today, helping programmers write code; next year, writing code for them; the year after that, replacing that problematic, expensive meatware.
It will be small-scale at first: just small tweaks to existing codebases, or adding small modules and functions. A human will define an API and some input and output formats, and specify some tests to show that it works… and LLMs will generate code to do it, and to test it, and then interpret the test results, and tweak the code until it passes. (And then, probably, help the humans redefine the tests a few times, until the program does what someone somewhere originally wanted.)
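To make that workflow concrete, here is a minimal sketch in C of the human-written half of such an exchange: a declared API plus the acceptance tests the generated code would have to pass. The function parse_temperature() and its contract are invented for illustration, and the implementation shown is simply the kind of thing a bot would be expected to converge on.

```c
#include <assert.h>
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>

/* Human-defined API: parse a string such as "21.5C" into tenths of a
 * degree Celsius; return false on malformed input. In the scenario
 * above, only this declaration and the tests below come from a person. */
bool parse_temperature(const char *text, int *tenths_out);

/* The kind of implementation an LLM would be asked to generate, then
 * iterate on until every test passes. */
bool parse_temperature(const char *text, int *tenths_out) {
    char *end;
    double value = strtod(text, &end);          /* numeric part */
    if (end == text || strcmp(end, "C") != 0)   /* must end in 'C' */
        return false;
    *tenths_out = (int)(value * 10.0 + (value < 0 ? -0.5 : 0.5));
    return true;
}

/* Human-specified acceptance tests. */
int main(void) {
    int tenths;
    assert(parse_temperature("21.5C", &tenths) && tenths == 215);
    assert(parse_temperature("-3.0C", &tenths) && tenths == -30);
    assert(!parse_temperature("potato", &tenths));
    return 0;
}
```

The point isn't the parser itself, but where the human effort sits: in the contract and the tests, not in the code that satisfies them.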
As this gets tried and accepted, the next few iterations are not hard to foresee. After automatic generation of minor additions to existing codebases come bigger-scale adjustments: Tweaking existing code to fit other existing code, then automatic refactoring, adjusting and maintaining vast codebases to get them to work together.
As in pretty much all other manufacturing industries before, over time, the economies of replacing fallible humans with machines that do a good-enough job, supervised by rather fewer and more interchangeable humans, become irresistible for any management team.
After that? Here my speculations become a lot more blue-sky, meaning probably wrong. But here are some ideas. One thing that LLM bots are pretty good at is translating between human languages. Perhaps they can be trained to translate between machine languages too.
Interpilers, or source-to-source translators, have existed for decades. They take code in one high-level language and "compile" – translate – it into another. For example, here's a list of 99 FOSS tools that take code in other languages and emit C, or occasionally another language.
On a larger scale, this could enable a form of automated bug-hunting by mass language conversion: Take code in unsafe languages, profile and test it so that its behavior is quantified, then translate it into a stricter language and recompile it, flushing out whole classes of bugs at scale.
The painfully trendy Rust is far from the only more-type-safe language with C-family syntax, but its semantics are quite different from C's. There are, however, whole families of other languages that could fit, such as the "Wirthian" languages of the late, great Niklaus "Bucky" Wirth: Pascal, Modula-2, and the Oberon family. Use an LLM to translate from C to some Oberon dialect, say, just to make things more robust – and then, possibly, back again for better interoperability.
Mass translation of large codebases into safer languages simply to improve reliability. Effectively impossible, or at least vastly expensive, using humans – but at least imaginable using LLMs.
The next step from that is obvious: Automatically fixing the bugs it finds.
If something doesn't translate because, say, it relies on constructs like pointer arithmetic, which have no direct equivalent in languages outside the C family, then it's possible to rewrite that section of the code to achieve the same task another way. Given a corpus of source code with enough such examples, one can imagine an LLM bot that could identify whole categories of memory safety errors in code – and replace the offending parts.
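As a trivial, hand-rolled illustration of the kind of rewrite involved (not anything an actual bot has produced), here are two C functions that do the same job: the first leans on pointer arithmetic, which most languages outside the C family can't express directly, while the second uses plain indexing, which maps almost line-for-line onto bounds-checked languages such as Pascal, Oberon, or Rust.

```c
#include <stddef.h>
#include <stdio.h>

/* Sums an array by walking a pointer across it – idiomatic C, but the
 * pointer arithmetic has no direct equivalent in most other languages. */
long sum_ptr(const int *a, size_t n) {
    long total = 0;
    for (const int *p = a; p < a + n; p++)
        total += *p;
    return total;
}

/* The same job expressed with a plain index – semantically identical,
 * but straightforward to translate into stricter languages. */
long sum_idx(const int *a, size_t n) {
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += a[i];
    return total;
}

int main(void) {
    int data[] = {3, 1, 4, 1, 5};
    printf("%ld %ld\n", sum_ptr(data, 5), sum_idx(data, 5));
    return 0;
}
```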
Large-scale, mass-production of code may never happen, but certainly many companies are trying to bring it about. If it happens, and it might, about the only thing it's possible to confidently predict is that the results won't be very good.
But that is normal for almost any other kind of mass-produced products. Compared to hand-made versions, they are often poor, but predictable: Good enough, and so cheap that repairs cease to be worth it. You know what you're going to get, and you know it won't last as long as an expensive hand-made version, so you will need to replace it sooner… but it's cheap enough that that's acceptable.
There are wilder possibilities too. Programming languages evolved for computers by computers and no longer human-readable, like Google Translate's internal intermediary language. Entire custom languages and tools, produced by machines for a single production run, then discarded. Dynamic communications protocols, defined on the fly during use. Stochastic APIs, and code evolving to fit around existing units that for legal or other reasons can't be changed.
What becomes of the old hand-made stuff?
I am not really here to discuss the issues of machine-generated, mass-produced software, though.
The more interesting questions are: Is this the only way forward? What about the trade of hand-made software?
There are people who enjoy programming. Not as many as actually do it, but there are dedicated craftsmen, not just in FOSS but in many other areas, from gaming, to embedded systems and firmware, to real-time systems.
What if most programmers get automated out of a job? Some people who like programming might also enjoy supervising and guiding bots who do it for them, but I suspect many won't.
A possible alternative is that software will go the same way as clothing, food and drink, furniture, and many other manufactured goods: mass production for the bulk of the market, with a smaller trade in hand-made goods alongside it.
Hipster software
Perhaps we could see an era of bespoke, artisanal, hand-crafted, small-batch software. Yes, it's expensive, but just feel the quality.
Even today, $5 of hardware can usefully run a multi-gigabyte free OS with another few gigs of software running on top, making disposable Unix computers not only feasible, but affordable to use just once.
In an era of plentiful, cheap, machine-produced software, a market could evolve for smaller, simpler software, designed by one skilled artisan, or a tiny team, like they did in the old days of the 20th century. Tools intentionally kept so small and simple that each important element can fit completely into one person's head.
There are various potential reasons. Aesthetic ones: It's more interesting, or more pleasant to use. Recreational ones: It's more fun to make, and people find niches where they can make a living doing it without ever going mainstream. Commercial ones: You can afford it, and doing it this way makes the result easier to maintain, faster, and able to be kept more private and more secure.
In other industries where alternative, hand-made products have successfully made resurgences, from beer to boots, there are some common elements.
Doing things "retro style" is one. It's near-universal, not just from nostalgia, but because small-scale production frequently uses old-fashioned methods, simply because these tools and methods were perfected over long periods when there was no alternative. If there's no other way to do something than by hand, you work out how to get really good at doing it.
Talking to some younger people, I get the sense that they see computers from a few decades ago as toys: so tiny and simple and limited that, by modern standards, they look ridiculous now. Operating systems that occupied just a few kilobytes must be useless.
But this isn't the case. As an example, Niklaus Wirth's Oberon System is a complete bare-metal multitasking OS and development environment, yet it's around one ten-thousandth the size of the Linux kernel alone. Fifty years ago, tens of thousands of businesses around the world ran on CP/M, with a resident size of a few kilobytes. Small can be mighty.
Compared to anything stamped out by machines, hand-built stuff will cost more – if not in money, then in human effort. But, on the other hand, it's often higher quality: Cheaper to deploy, able to run in much less space, and needing far less frequent maintenance.
All items coming off a single production line are meant to be identical, interchangeable, and interoperable. That's less necessary with small-batch stuff. We expect modern OSes to run tens of thousands of different applications, just as mass-made clothing must fit millions of bodies. Bespoke stuff doesn't have to: It just has to fit one body, really well.
In terms of computer compatibility, as long as something can work with the prevailing standards – media and file formats, network protocols, or whatever – that may be enough. Most of those were defined in the pre-mass-production era anyway. If a device can do the job it was built for, it doesn't need to be compatible with everything else.
Backward compatibility is a trap that the industry has walled itself into. This vulture has been in the industry for around 36 years now, and has worked through half a dozen complete transitions: Eight-bit to 16-bit, MS-DOS to Windows, 16-bit to 32-bit Windows, Win9x to NT, and then from Windows to mostly Linux.
It all runs on later generations of the same hardware. Compatibility is a red herring: The whole industry has abandoned entire previous generations, and repeatedly at that. So long as the old stuff can still run in a VM, or in an emulator, the new stuff doesn't need to do everything its predecessors did. Keep the old stuff in a box for old times' sake, occasionally bring the box out to get one old job done, and otherwise, ignore it.
Another potential driver could be legislation. Governments have already tried to impose legal standards of accountability on software, such as the EU Cyber Resilience Act, which caused alarm in FOSS circles. For instance, the Debian project was very concerned. The GPL itself is so wary of liability that it says, in block capitals:
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
That's not so bad; compare with a Microsoft EULA, such as the Security Essentials one:
DISCLAIMER OF WARRANTY. The software is licensed "as-is." You bear the risk of using it. Microsoft gives no express warranties, guarantees or conditions.
Perhaps it's time for at least some parts of the computer industry to grow up a little and become more like the vehicle industry: Mandated by law to comply with safety standards, and coming with a guarantee.
This will be very hard to accomplish, not merely for existing code, but for existing languages. But maybe artisanal code could do it. If it could, then it would win acceptance from institutional customers, and that might force the mass-production industry to catch up.
Retro-style tech doesn't just mean old-fashioned. Before the computer business turned into a mass-market industry, there were more exotic and experimental tools that for various reasons didn't scale up.
An example is Donald Knuth's famous note [PDF]:
Beware of bugs in the above code; I have only proved it correct, not tried it.
Formally verified software is possible, but it's not easy. For now, it's only practical for small, simple, isolated systems – but then that's exactly the sort I am proposing.
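For a flavor of what that involves, here is a minimal, hand-written sketch using ACSL-style annotations of the kind consumed by verification tools such as Frama-C. The function max_of() and its deliberately partial contract are invented for illustration; real verified systems carry far richer specifications.

```c
#include <stddef.h>

/* Partial specification: the result must be at least as large as every
 * element of the array. Proving it relies on the loop invariants below. */
/*@ requires n >= 1;
    requires \valid_read(a + (0 .. n-1));
    assigns \nothing;
    ensures \forall integer k; 0 <= k < n ==> \result >= a[k];
*/
int max_of(const int *a, size_t n) {
    int best = a[0];
    size_t i = 1;
    /*@ loop invariant 1 <= i <= n;
        loop invariant \forall integer k; 0 <= k < i ==> best >= a[k];
        loop assigns i, best;
        loop variant n - i;
    */
    while (i < n) {
        if (a[i] > best)
            best = a[i];
        i++;
    }
    return best;
}
```

A prover can then attempt to discharge those obligations mechanically; the catch is that writing the annotations is often harder than writing the code, which is why this only scales to small, carefully bounded systems.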
This could favor a revival of more advanced tools and methods from before the current era of consolidated, commercialized, commoditized software.
Every mainstream operating system is built using imperative languages – which means almost every programming language you've ever heard of, from Algol to Zig. Their statements are commands, which change the program's state. But there are other ways to build software: relatively exotic declarative and logic languages that may not scale well to large teams, but with which small teams – and individuals – can work near-miracles. Wikipedia lists some 20 purely functional languages alone, plus many more that aren't purely functional, including many forms of Lisp. All have long histories and lots of work behind them, and yet remain seriously obscure.
If the AI pushers' dreams do come true, and bring a revolution to the software industry, that could foster and fertilize an alternative market in hand-crafted code. And that could have a disproportionate impact. ®