New software sells new hardware – but a threat to that symbiosis is coming

Complex software packages need ever gruntier specs... and Koomey’s Law awaits

Comment A few months back, I wrote that buying software is a big lie. All lies have consequences, of course. The worst kind of consequences are the ones you didn't see coming. So let's look at some of those, and some other lies they conceal.

As we said last time, you can't really buy software. Commercial software is mostly – but as some readers pointed out, not always – proprietary. Proprietary software has both pros and cons, but so does FOSS. It's not always all about money. Last time, we argued that convenience – minimizing expenditure of work, time, and effort – can be even more important than simple financial cost.

The differences between commercial software and FOSS have lots of important consequences when it comes to hardware, too. With commercial software, the relationships are generally easy to follow. With FOSS, it's more complicated, with contradictory and conflicting effects. That's what I want to look at this time.

Software sells hardware, so use hardware to sell software

Software vendors, whether of FOSS or proprietary software, don't work in isolation. Anyone selling operating systems, especially, must work closely with hardware vendors.

In some important ways, the goals of these two camps are aligned. Both need you to pay for their products, but more importantly, they need to find ways to make you keep on paying. For hardware vendors, it's relatively easy: just keep making the product faster and more capacious. Ideally, newer versions should use less power and be more manageable.

For software vendors, it's trickier. As a general rule, newer versions of established software products tend to be bigger, which makes them slower and more demanding of hardware resources. That means users will want, or better still need, newer, better-specified hardware to run the software they favor. That's good for the hardware vendors. However, it's not just one way: the newer software has additional features, to tempt users to upgrade, and if their new computer runs it faster, that's good for the software vendors.

As we pointed out last time, this pushes software companies toward tactics such as licensing, keys, product activation, support contracts and more. File-format lock-in is by no means limited to proprietary vendors: you could also describe .DEB format versus .RPM format, or Ansible versus Salt, in much the same terms. You pick one, and once you've invested in it, changing to another means costs: in effort, in money, in downtime, and so on.

When you link the need to sell newer software to the need to sell newer computers, you get things like OEM software bundling and product-compatibility labelling. Software and hardware vendors give each other incentives to promote and support each other's products, and both gain.

The result is a feedback loop. As former Microsoft CTO Nathan Myhrvold described it:

Software is a gas.

Whatever container you put it in, it expands to fill it. This can seem as inevitable as physics, and maybe that's why people often describe it in terms of "laws". One example is "Andy and Bill's Law":

What Andy giveth, Bill taketh away.

Harsher still is Wirth's Law:

Software is getting slower more rapidly than hardware becomes faster.

There are laws about that, too, because computers are no longer getting so much faster every year or two. Moore's Law has been losing impetus for well over a decade, supplanted by Koomey's Law: computers are getting smaller, cheaper, cooler-running, and lasting longer on battery power. That's good. It's led to the rise of inexpensive phones and tablets, which is helping more people get online. That's also good.

At the same time, when large, complex software packages mature, vendors start to run out of new features to drive big-bang releases. One alternative is software subscriptions. Pay a fee, and you get a continual trickle of new features, plus the reassurance that you're up to date and so a little more secure. Can't find enough bullet points to justify a new $250 boxed version? That's fine: just ask for $25 each year, and customers get a few new features and some peace of mind.

That makes it increasingly tempting to just keep using your old software on your old computer for as long as possible. You've paid for both, they're yours, so if a new one won't be massively faster, just stay with what you've got.

This has led to some remarkable ramifications. Adobe recently came up with a magnificent new wrinkle on this: a subscription scheme called Pantone Connect. In 2021, it announced it was removing the Pantone™ color libraries from its Creative Cloud products, and since August, if you don't have the $15-per-month subscription, all the Pantone colors in your old files just… turn black.

Such subscription-based schemes can provide strong motivation to climb over the hump of relearning and switch to FOSS. Then, you get off the upgrade ladder, and keep using that old-but-still-good-enough PC until it fails, and then replace it with something similarly low-end and cheap.

Free software and cheap hardware: What's not to like?

The continuing development of both hardware and software imposes several pressures, though. Software isn't an inert gas: as Wirth noted, it keeps growing in complexity, and for-profit or not, companies and non-commercial organizations offering FOSS need to keep operating somehow.

If you're not selling anything, supporting older hardware keeps your users happy. That's good, and it's a motivation to support older, lower-end kit that the vendors of proprietary OSes and expensive apps don't share.

But you still need to keep those FOSS OSes and apps updated: fix bugs, maintain defences against new threats and new exploits. And at the same time, hardware doesn't stay still. Computers are machines, ones that are still evolving. All machines eventually wear out, and replacement computers will be different: newer, more capable. That means newer components and newer drivers, even whole new types of subsystem.

Sure, some freeloaders just use this stuff and don't contribute anything back. Many might not even realize that they're doing it: Apple's macOS is built on a foundation of open source, and so are Android and ChromeOS. There's nothing at all wrong with that.

But the people who are funding the development are the ones paying for RHEL or SLE. Obviously, they don't run their businesses on old, low-end kit: they're using high-end servers in datacenters. Some of the companies running those datacenters also sponsor the development.

Another major group funding the development of FOSS OSes is the hardware vendors themselves.

The interests of these companies lie in selling new hardware, and selling support for OSes to run on them.

If it is not already obvious, these are somewhat conflicting goals. The people paying for the bulk of the work are not the most numerous consumers of it, and the desires of the billions of end-users are to a degree opposed to those of the people paying developers to work on it.

Once you grasp this, some industry trends make more sense. For instance, the move to 64-bit processors and processing.

After all, 32-bit processors worked pretty well. The 80386DX appeared in 1985, and Intel was still launching 32-bit-only chips such as the Core Duo in 2006. That two-decade span took us from Quarterdeck DESQview and OS/2 2.0, which let power users multitask multiple DOS apps side by side, to Windows XP and the first Intel Macs.

The simple version of the story is that once PCs got more than three-and-a-bit gig of RAM, you needed a 64-bit OS to access it. But that isn't actually true. A 32-bit OS is perfectly able to access more than 4GB of memory, using a CPU feature called PAE (Physical Address Extension) – plus, on Windows, an API called AWE (Address Windowing Extensions). It's just that this was turned off in 32-bit desktop Windows: it was only enabled in the 32-bit server versions. (As ever, there was a workaround, but rare is the corporate that would support that.)
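
As a quick illustrative aside of my own – Linux-only, and nothing to do with any vendor's tooling – here's a minimal Python sketch that checks whether the CPU advertises PAE, plus the "lm" (long mode) flag that marks it as 64-bit capable, by reading the flags line the kernel exposes in /proc/cpuinfo:

    # Minimal sketch, assuming a Linux x86 box: does the CPU advertise
    # PAE (large physical addresses from a 32-bit OS) and long mode
    # ("lm", ie x86-64 capability)? Both appear in /proc/cpuinfo.

    def cpu_flags(path="/proc/cpuinfo"):
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    # The line looks like "flags : fpu vme de pse ..."
                    return set(line.split(":", 1)[1].split())
        return set()

    flags = cpu_flags()
    print("PAE supported: ", "pae" in flags)
    print("64-bit capable:", "lm" in flags)

Whether a given 32-bit Windows edition would actually use RAM above 4GB was a licensing decision, not a hardware one – which is rather the point.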

Today, it's moot. The PC began its move to 64-bit way back in 2000, when AMD published the x86-64 spec; the first Athlon 64 chips followed in 2003. Windows Server dropped support for x86-32 hardware when Windows Server 2008 R2 came out in 2009, and when Windows 11 emerged in late 2021, there was no 32-bit edition.

The real point is that big enterprise servers needed lots of memory, and there came a point when supporting 32-bit kit no longer helped adoption or sales. Linux vendors discussed dropping 32-bit versions back in 2016, and it started to happen a few years later. Ubuntu dropped 32-bit support in 2019, although it kept support for 32-bit apps. Even ChromeOS Flex requires an x86-64 processor. Today, only non-commercial distros such as Debian still support it. Even openSUSE Tumbleweed is dropping its x86-32 version.

Microsoft makes money from selling operating systems and upgrades, so it has long worked very hard to keep newer OS releases compatible with old applications. Apple, conversely, has been giving OS updates away for free since OS X 10.9, so it readily drops older hardware. Versions since macOS 10.15 no longer support 32-bit apps, and macOS 13 will not support pre-2017 Macs. Since version 13 isn't out yet, it's too soon to speculate about macOS 14, but at some point Apple will surely stop selling x86 machines and supporting them in its OS.

All the same, if customers are happy with their old machine and their old OS, companies which rely on revenues from new software sales need to find ways to persuade them to upgrade. Microsoft has run "Designed for Windows $VERSION" logo schemes for decades, long before "Windows XP-ready". The PC keyboard didn't even have a Windows key until the marketing campaign for Windows 95 made one a selling point.

So, today in the proprietary-OS world, Windows 11 needs a version 2.0 TPM chip. If Agent P's proposals are adopted, so will many Linux distros. SUSE's next-gen enterprise distro, known as ALP, needs a CPU that meets the x86-64-v2 level, although openSUSE Tumbleweed won't – for now.
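
Again purely as an illustration of my own (and again Linux-only), here's a rough Python sketch of what that baseline means in practice: it checks /proc/cpuinfo for the feature flags the x86-64 psABI requires for the v2 level.

    # Rough sketch, assuming a Linux x86 box: does this CPU meet x86-64-v2?
    # On top of baseline x86-64, the v2 level requires CMPXCHG16B,
    # LAHF/SAHF, POPCNT, SSE3, SSSE3, SSE4.1 and SSE4.2. The names below
    # are the kernel's /proc/cpuinfo flags (SSE3 shows up as "pni").

    V2_FLAGS = {"cx16", "lahf_lm", "popcnt", "pni", "ssse3", "sse4_1", "sse4_2"}

    flags = set()
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                break

    missing = V2_FLAGS - flags
    if missing:
        print("Not x86-64-v2; missing:", ", ".join(sorted(missing)))
    else:
        print("This CPU meets the x86-64-v2 baseline")

A Core 2 Duo, for example, falls at the SSE4 hurdles – exactly the kind of kit a rising baseline leaves behind.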

At some point after the baseline inevitably moves up for the paid, enterprise distros, the non-commercial distros – Ubuntu, Fedora, openSUSE, and the 270+ others – will follow.

As the Raspberry Pi Desktop and Alpine Linux both show, a Core 2 Duo PC is still a quite capable computer today. A high-end one with a dedicated GPU can even run a 3D-composited desktop such as Cinnamon, thanks to LMDE.

But not for much longer. Sadly, inevitably, this sort of support and technology is going to disappear from even the free mainstream Linux distros, just as it has disappeared from commercial OSes and apps. The tooling and the subcomponents come from upstream projects whose income comes from supporting newer kit. Eventually, the gap gets too wide, and the same kernels and the same compilers can't support both 32-bit kit from the 1990s and current-generation stuff. For a while, the likes of NetBSD will come to the rescue, but they can't do so forever.

Commercial software sells new hardware, and new hardware sells new software, in a relatively simple relationship – one which has propelled some very profitable businesses. Free software, on the other hand, tends to get pulled along by new hardware developments. In turn, newer generations of free software pull non-commercial OSes and apps along with them. The behavior of such a system is complicated at best, which is why elastic tow-ropes carry big warnings.

Bootnote

Wirth's Law springs from a 1995 paper in the IEEE's journal Computer, titled A Plea for Lean Software. You can read this online in PDF. Your reporter has also shared a text version.

Wirth himself credits the observation to the late Martin Reiser in his book The Oberon System: User Guide and Programmer's Manual (1991, PDF):

a critical observer may observe that software manages to outgrow hardware in size and sluggishness.

®
