After 40 years in tech, I see every innovation contains its dark opposite
From modems to the Metaverse, Mark Pesce has seen it all
Column Next month will mark forty years since I showed up for the first day of my first professional job. I knew BASIC – I'd even learned how to type RPG-II onto a deck of punched cards – but in reality, I knew nothing.
In 1982 the field was not yet very professional. Most people working as software engineers were, like myself, university dropouts.
I was very lucky to be mentored by two incredibly bright and (fortunately) very patient individuals. John taught me how to make the most of the tight microcode of the Intel 8085A, while Ethan – who had come from a background in minicomputers – showed me how to work with systems bigger than a single CPU and EEPROM.
I learned everything they could teach me. Enough that when, a year later, my boss explained his Big New Idea, I was able to prototype it (both hardware and software) from his explanation. That turned out to be the very first version of RSA's SecurID – progenitor of many of the 2FA systems in use today. (Full disclosure: I am not very good at cryptography and my first implementation was laughably easy to crack. But it proved the concept.)
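That early SecurID prototype lives on in spirit: today's standardized descendant of the hardware-token idea is the time-based one-time password (TOTP, RFC 6238), which derives a short code from a shared secret and the current time step. A minimal sketch in Python – not the original SecurID algorithm, which was proprietary, just the modern standard that follows the same concept:

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, t=None, step=30, digits=6) -> str:
    """RFC 6238 time-based one-time password: HMAC-SHA1 over the time-step counter."""
    counter = int((time.time() if t is None else t) // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and token share the secret; matching codes prove possession
# of the secret without ever transmitting it.
print(totp(b"12345678901234567890", t=59, digits=8))  # RFC 6238 test vector: 94287082
```

Because both sides compute the same code independently, an attacker who intercepts one code gains nothing once the 30-second window closes – which is exactly what made the original hardware token so compelling.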
I spent the first decade of my career writing firmware for a range of communications devices: X.25 packet assembler/disassemblers, modems, CSU/DSUs and, finally, a range of dial-up networking equipment for a startup called Shiva Corporation that allowed users to dial into an office network from anywhere to access files and printers.
Shiva followed with an IP gateway – at the time, only a few major universities and corporations had access to the internet – and I learned how to code for TCP/IP. That gave me my own Big New Idea – inspired by William Gibson's Neuromancer – for a virtual reality interface to the internet.
As I worked on that, on the other side of the world, Tim Berners-Lee developed a protocol to connect all of the world's computers into a single, hyperlinked resource. The web changed everything (and continues to), providing the foundation upon which I could build a 3D interface to the internet: that interface was VRML.
The five years at the end of the 1990s felt like the tech equivalent of the Cambrian Explosion. The earliest crude websites quickly gave way to increasingly elegant user interfaces, data navigation, media explorers, e-commerce, and much more besides. As more and more information found its way into cyberspace, information became more accessible and sharable. In an instant, humanity transitioned from a deficit of information to an unending, exhausting oversupply.
The Web 1.0 bubble burst in early 2000 – a reset wiping the board of almost every idea that couldn't be immediately monetized. The 3D web went to the Island of Lost Toys, never to be seen again – or so I thought.
Then Friendster made the web fun again; the infinite depths of information became a space of human connection, as we found our friends, families, colleagues and neighbors, and used those bonds to share and learn from one another. Social media felt like another revolution – nothing would ever be the same between us.
It turned out that was just the overture.
From the day Steve Jobs walked onstage in January 2007 with the first iPhone, only twelve years passed before half the adults on Earth owned a smartphone. These ubiquitous devices bring all of our information and all of our connections to the palms of our hands. We no longer desire to look away from their screens – the flashing lights and continuous stream of notifications promise so much, while delivering FOMO, disappointment, and negativity.
We haven't yet learned how to hold these devices at a safe distance – one that allows us to hold onto ourselves. To do that, we need time and space to think and feel. Technology helps us fill our time with such efficiency we rarely realize we need to breathe – just breathe – in order to grow.
The 3D Web has returned – rechristened as 'the Metaverse'. I doubt whether any of us are prepared for that moment when we don augmented reality 'spectacles' soon to come from Apple and Meta and Microsoft, and the screen becomes the whole of the world – when everything we see gets mediated by the massive computing, analytics and recommendation infrastructure we've developed over the last forty years.
If we've learned anything over the last four decades, it's that every innovation, however wonderful, contains its own opposite. The splendor of "knowledge at your fingertips" laid the foundations for a planetary-scale "ignorance amplifier". Massive human hyperconnectivity across social networks reawakened and accelerated our tendency to form tribes. Our drawing together begins to look more like our coming apart. Advances in artificial intelligence mean that the surveillance state can scale without human staffing – or human oversight.
We never seem to grasp that our strengths inevitably transform into weaknesses. A touch of humility could go far to help us avoid tragedy as we navigate the next forty years. Where we can admit that we do not know, there we can find some space to think, to feel, and to breathe. ®