Will LLMs take your job? Only if you let them

If you wear black T-shirts, are you even a white-collar worker?

Opinion Let's face it. You're a smart cookie. You work in tech; you create, manage, or help others with some of the most complex machinery in human history. So is it true that LLMs are gunning for your job, as they are for the jobs of others working in similarly arcane areas of knowledge work, such as law, finance, medicine and the creative sector? Will you be at home watching daytime TV while the director of marketing asks Alexa to "create an app that makes people give us their money," and the CEO asks Bing to "come up with ideas better than the director of marketing"?

It's no good looking elsewhere for help. In the UK, the Minister for Education seems to be the spokesperson for official AI appreciation. She says that LLMs can ease the burden on teachers by doing drudge tasks like marking, but if you think that sounds monumentally stupid, she confirms it by admitting that she has no real idea what's going on. A new technology is threatening to restructure employment, and the government is barely bothering even to busk it.

We've been here before, of course. In the mid to late 1970s, the microprocessor stopped being a quirky outlier of a computing industry that was largely isolated from everyday experiences. Instead, people realized that microprocessors were going to infiltrate everywhere and change lives across the board – especially jobs.

There wasn't much sign of this realization in the UK governments of the time. They thought in silos: technology was about technology companies, and Britain's tech industry was a mess. Consumer electronics had been ceded to imports, the defense sector sucked up Cold War finance and made little of interest for anyone who didn't have to buy it. The same was true of wannabe-IBM ICL. Managed decline and the exit of the state from industrial influence was what counted as a tech policy. That a revolution was coming that would take tech into everyone's lives, potentially automating away millions of jobs, was no more a state problem than writing limericks in Sanskrit.

Then something remarkable happened. The BBC made a documentary, 1978's Now The Chips Are Down, as part of its well-established Horizon series. It covered what microprocessors were doing and could do across society and industry, and unusually for Horizon – and the BBC – said the UK was adrift, there was no political strategy, and there had to be one. The ending was unusually forceful: "What is shocking is the government has been totally unaware… the silence is terrifying. It's time to talk about the future."

This hit home. Three years later, the UK government kicked off the Microelectronics Education Programme (MEP) and backed the BBC's Computer Literacy Project – which saw the BBC Micro adopted as practically the standard school computer. None of this cost very much – the MEP was funded to roughly £35 million in today's money – but it helped fuel a national obsession with learning about, playing with, and programming computers.

Lots of expected things didn't happen, such as a viable educational software market, but things like Arm and the UK video games industry made up for that. How many people were put out of work by automation, as opposed to mismanagement and industrial decline, is impossible to say. The service sector grew at a rate unimaginable without IT, and economists will grow fat for years to come arguing over exactly what happened. That the country was primed for action, though, is indisputable.

What would the equivalent for LLM knowledge engineering look like? This time, the jobs under threat are thought to be those of lawyers, finance advisers, coders and the like, especially those lower down the food chain. As with processors, the conviction that change is coming is as strong as the admission that nobody knows what it will look like.

A crucial difference is that what processors do well and do badly is clear. With LLMs, far less so. They deal in knowledge but have no idea about truth. They mix facts and hallucinations. They amplify bias. They are great at talking to people, but have no idea what effect they are having. In short, the technology is far less mature than its breakneck adoption would imply, and how things will evolve over the next five years is unknowable. This, frankly, is our salvation.

The one aspect of LLMs that can't be denied is that they are far too flaky to be trusted without a human in the loop. A whole new field, prompt engineering, has appeared overnight, based on the strong correlation between the usefulness of the output and the cleverness of the request. People who are good at their jobs are those who know the right questions to ask and how to interpret the answers, and LLMs can't do either. They can only be a tool of greater or lesser help to the human.

So here's the plan to make ready for the "AI Revolution." Expose students to the technology, not to replace teachers or encourage dependency, but to make them skilled in critical analysis of what it does and how it does it. Encourage those whose work may be affected to find out what each new iteration is good and bad at for their particular specialties. This has three facets: yes, LLMs may be good for you, and you should know how. No, LLMs can't work without expert human interaction, and for your job you are the best expert human. Finally, you need to know how to tell your boss that they're full of it if they try to boot you out in favor of ChatGPT.

It's unlikely that the governments and public service broadcasters of 2023 will find common cause this time to create even the skeleton of an education and awareness campaign. That's OK. This time round, the microprocessor has given us access to all the things we need to do it for ourselves. The scale of the state's involvement at the time was modest; it doesn't take much to have an outsized effect.

If you're smart enough to do your job better with LLMs or smart enough to know why that's not happening, you're smart enough to keep it. Don't wait to be told. Not that you would – you're a smart cookie, after all. ®
