Engineers on the brink of extinction threaten entire tech ecosystems
Resting on its laurels is costing the industry its hardies
Opinion Intel has produced some unbelievable graphs in its time: projected Itanium market share, next node power consumption, multicore performance boosts.
The graph the company showed at the latest VLSI Symposium, however, was a real shocker.
While computer science course take-up had gone up by over 90 percent in the past 50 years, electrical engineering (EE) had declined by the same amount. The electronics graduate has become rarer than an Intel-based smartphone.
That part of the technology industry which makes actual things has always been divided between hardies and softies, soldering iron versus compiler, oscilloscope versus debugger. But the balance is lost. Something is very wrong at the heart of our technology creation supply chain. Where have all the hardies gone?
Engineering degree courses are a lot of work across a lot of disciplines, with electronic engineering being particularly diverse. The theoretical side covers signal and information theory, semiconductor devices, and optical and electromagnetic theory, so your math had better be good. There's any amount of building-block knowledge needed, analog and digital, across the spectrum from millimetric RF to high-energy power engineering. And then you have to know how to apply it all to real-world problems.
This isn't the sort of course you opt to do because you can't think of anything better. You have to want to do it, you have to think you can do it, and do it well enough to make it your career. For that, you need prior exposure. You need to have caught the taste. And to make it your life, there has to be a lot of high-status, high-wage, high-interest jobs to do at the end.
For most of the history of electronics, there was a clear on-ramp for this, and an industry that didn't need to sell itself because it was inherently cool for geeks. Look at the biographies of the great names in electronics, such as Intel co-founder Robert Noyce or the father of the information age Claude Shannon, and you find them as teenage geeks pulling apart, then rebuilding, then designing radios and guitar amplifiers. The post-war generation tore down military surplus gear to teach themselves how it worked and mine components to build their own inventions.
This was practical magic, and you could start your apprenticeship by taking the back off a broken wireless. If you had the urge, it was easy to ignite the fascination. Then came the pull of working on the front line of the Cold War, the space age, the era of technological innovation. The industry had its supply of fresh creativity guaranteed.
This remained broadly true until the turn of the 21st century. A reasonably bright kid would realize that the family CRT television was in fact a particle accelerator with its own multi-kilovolt high-voltage generator, plus any amount of repurposable bits and pieces. You can have a lot of fun with that. There were old analog gadgets all over the place. You could peer inside Granny's radio and follow the signal path, component by component. That's all gone now.
By one measure, we're surrounded by more electronics in our homes than entire nations had a few decades back. Your granny's radio had maybe 10 transistors; a smart speaker, billions. But the smart speaker is a computer, like your flat-screen television is a computer, like your phone and your audio system and even your light bulbs are computers. The electronics have sunk out of sight, beneath thick alluvial layers of software, and they will do nothing without that software. Any budding geek will expend their youthful vigor on that software first, because that's where the animating genius of technology now resides. We have cut ourselves off from a primary wellspring of fascination.
It's not all bad news. Maker culture is alive and well, and access to knowledge has never been easier. You don't have to go to a library to get out books on electronic theory or find a fascinating gadget to eviscerate. It's all on YouTube. Want to take apart a laser guidance system for an RAF Tornado's bombs? Mike's Electric Stuff has you covered. But maker culture revolves around embedded processors and high-level concepts: you can build radios at home now that cost a few pounds and outperform the state of the art of a few years back, but they're software defined.
If electronics are invisible at the start of a young engineer's life, they're just as invisible in the careers they may contemplate. In the 20th century, not only were consumer electronics full of differentiated analog desirables, but so were aerospace, the military, and industry. Now everything is a screen with a UI. You still need a lot of specialized hardware, but it's vanished deep into the background. No wonder everyone who once had the itch to solder now gets ensnared by software.
Is it possible for electronics to regain its status as a primary inspiration for young technical minds? Not without a lot of work from the industry that needs those minds. The pipeline it once took for granted as the natural order is broken. To reach new talent, the magic must be re-exposed. What goes on in chip fabs, design bureaus, and product R&D is just as important – and as magical – as ever.
Selling that message in a world designed by geeks to distract geeks is going to be hard. But we have hero brands, and hero space missions, and temples where we conjure machines, atom by atom. If the industry can't look at all the incredible things it does and find a way to capture imaginations, it deserves every last heartbreaking graph of doom. ®