The International Space Station will deorbit in glory. How's your legacy tech doing?

Your past projects may be a pain, but can they rain fiery death from above?

Opinion The International Space Station is showing its age. It's older than a third of the world's population: more than two and a half billion people have never known a time without humans in orbit.

Bits and pieces of it keep going wrong, most recently the EVA spacesuits; Russia may or may not be about to bail; and it's more Red Dwarf than the Enterprise when it comes to space germs.

You thought it was difficult getting a cleaner to come to your apartment in the city? From one point of view, the ISS is worn out, super-expensive to run, no longer contributing much to space exploration, and soaking up far too much of space engineers' brain time.

Does that apply to any legacy technology you're familiar with? IT legacies don't make cool videos of the Earth from space or astronauts tumbling in microgravity, so they lose out heavily to the ISS on the public relations front. They won't be consuming 15 percent of a $22 billion budget [PDF] either.

The most important difference between the ISS and your line-of-business app still working gamely away on a virtualized Windows XP instance (XP still has over 0.3 percent market share, for heaven's sake) is that the ISS is a project designed to die. NASA is planning its demise in the next five to eight years [PDF].

Along with the headline stuff like scientific research and technology testing – there are many thousands of results that are ours to enjoy – the world also honors the work of the ISS for the experience it has given us of long-term crewed missions. On the roadmap of space, the ISS bridges the misguided missile of the Shuttle and the return to the Moon and beyond.

When we next leave orbit outwards instead of inwards (the ISS will come crashing down into the Pacific Ocean, apparently), it'll be because of the legacy of the ISS.

Legacy IT could play a similarly honorable role in organizational long-term planning. It doesn't. Nobody thinks in those terms. If you're very, very lucky, the solitary nod to posterity may be some documentation that's been kept a bit up-to-date (you won't be very, very lucky).

Project lifecycles become more myth than management after the work's been done. Asking at the start of a project, "What do we expect to learn by building and running this, and how do we migrate that knowledge onwards?" is not part of tradition or general practice.

But the fact that such ideas seem more alien than 'Oumuamua is in part due to the chronic amnesia that afflicts corporates so desperate to reinvent themselves that they forget biological evolution is nothing but refactored legacy.

It's also your fault, IT practitioners, with your determination to keep the science in computer science science fiction. Take the agile methodology of software development.

The idea first took rigorous shape with rules and reasoning in 2001, the year after the first crew boarded the ISS.

It recognized the problems of the strictly sequential project plan, delivered results good enough to make agile a jargon term outside software, and fed into DevOps and the cloud. You can easily find many discussions of how well it did these things, whether its time has passed, and what strengths and weaknesses have been exposed over two decades. What you cannot find is an attempt to systematically analyze what agile has taught us about software engineering and project management, in its own terms or in the context of the total history of software.

Our skills in science and technology don't progress linearly, with each step an incremental good. Many fashionable ideas look good but are rarely mentioned in polite society after they fail to match the hype. But do we learn from these failures through a shared narrative?

Software, too, has its fashion failures: Java everywhere, anyone? And the unfashionable can also rise (JavaScript, yo). But what does this mean for the future?

It's not that there's any lack of discussion about every aspect of the fabric of our digital world; it's that there's no sense of coherent intellectual analysis. It's like Anglophone politics, where any tradition of intellectual analysis has been abandoned in favor of the hot take.

Even something as fascinating and profound as the co-dependent rise of open source and the internet has received less academic attention than the prehistory of ant parasitology. Yet there's no part of commerce or culture untouched by open source and the internet.

That the ISS could be built and run with a probable life of 30 years is because the science, technology, and engineering live up to their identity as disciplines. That's how a legacy of enrichment and progress happens.

That the term legacy in IT is a mark of shame and technical debt is because we've shirked the brainwork of making it a proper discipline. As we move deeper into the 21st century with all the problems that bad digital can bring, it's our responsibility to make it one of the great human endeavors. Serious intent is no crime. Fiery death from above is not an option. ®
