Hypernormalisation: Adam Curtis on chatbots, AI and Colonel Gaddafi

It's a techno-utopia on the BBC


Interview Hypernormalisation, the new film by English documentary-maker Adam Curtis, dives deeper into technology than any of his previous films for the BBC. It goes up on the Beeb's iPlayer on Sunday (at 9pm in the UK) and “it’s a bit of a monster”, he admits.

The film is pushing three hours long. I had to watch it over two nights – a binge for a single movie. But Curtis doesn’t mind: he describes it as a book with chapters, and says you can watch it however you want.

In the movie, Curtis puts the spotlight on characters such as cyber-activist John Perry Barlow; AI pioneer Joseph Weizenbaum, father of the chatbot Eliza; and Judea Pearl, the father of the murdered journalist Daniel Pearl. These are woven around long studies of Libyan dictator Colonel Gaddafi, a modern history of Syria (“Isn’t it astonishing no one’s done a proper basic history of Syria on television?” wonders Curtis) and the apparent master media manipulator of Vladimir Putin’s Russia, Vladislav Surkov.

The themes here are familiar ones for Curtis fans: how people retreat into simplified views of the world that are fantasies, and how we let politicians do this for us, too. It touches on many of his earlier series - hippies turning into self-absorbed baby boomers from The Century of the Self. There are no marmosets, but there is an agonisingly theatrical Remain campaigner, weeping for a lost European Utopia.

Ever generous, Curtis is happy to throw open the floor to Reg readers once you’ve had a chance to see it next week; he answered some of our general questions here.

First off, Gaddafi dominates the centre of the film – with weird and creepily compelling footage of the colonel before or after interviews. And what about Lester Coleman’s Trail of the Octopus: From Beirut to Lockerbie… Why focus so much on Gaddafi?

“I really was astonished when I went back to the 1980s how much of what they claimed was Gaddafi was actually Assad using all these strange terrorists in Damascus.”

But, he says, rather than deal with something complicated, Western powers found that Gaddafi, who had until then been isolated and ignored by the Arab world, fitted the bill of a cartoon villain.

“Gaddafi illustrates, like a flash of lightning on a dark night, just how corrupt, how hypocritical, and how empty of values our middle-class elites have become,” Curtis told us. “It was quite shocking to me. It’s just rubbish that he had WMDs. There were no biological weapons, and he’d got a centrifuge but none of his people knew how to put it together – it was in boxes.”

But after being demonised in the 1980s and 1990s, Gaddafi finds himself rehabilitated.

“After the Gulf War all these people go out and make him into a modern thinker: David Frost, Anthony Giddens, even Lionel Richie went out there, and said Gaddafi was good. Then, after the Arab Spring, he was a villain again, so they just dropped him.”

“I was thinking of making a sort of comedy at one point, when Gaddafi is really out of favour and the only people who will talk to him are the British National Front. The leader of the National Front donates a copy of Gaddafi’s Green Book to the public library in Hounslow.”

“I tried to explain how we don’t see a lot of the real reality, like Syria. Syria is incomprehensible to us. Books written after 2001 don’t mention Syria. The elder Assad fights this great war where he gasses all these people in Hama. Their Muslim Brotherhood people played a big role in Afghanistan and Iraq. No one mentions that. They went after Gaddafi instead.”

Social networks and internet utopianism loom large in the film. Barlow’s mad, adolescent Declaration of the Independence of Cyberspace (written at the elites’ annual retreat at Davos in 1996) finds echoes in Occupy Wall Street years later. And an early parody of AI becomes a forerunner of how people talk about themselves endlessly on social networks.

Therapy with ELIZA and the internet's hall of mirrors

Every developer over a certain age has enjoyed using the chatbot Eliza (my first encounter was an Eliza built into EMACS).

“I met Joseph Weizenbaum a long time ago, and always wanted to go back to him. He thought AI was just ridiculous and he invented Eliza as a joke – a parody. He used the psychologist Carl Rogers as a model. It just repeated everything back to you – and everyone loved it. What makes you feel secure is having yourself reflected back to you, which I think sums up our time. If people are not like you, you feel threatened. Eliza is a very good early sharp image of what was coming.”
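That reflection trick is simple enough to sketch in a few lines. Weizenbaum's original was written in MAD-SLIP with a full script of ranked keyword rules; the toy Python below (all names our own, not from the film) only shows the core idea of swapping pronouns and handing the user's words back as a question:

```python
# Toy sketch of ELIZA's Rogerian "reflection" trick - not Weizenbaum's
# original, just the pronoun-swap at the heart of it.

# Map first-person words to second-person ones (and a couple back again).
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

def reflect(statement: str) -> str:
    """Swap pronouns so the user's own words come back at them."""
    words = statement.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement: str) -> str:
    """Turn any statement into a therapist-style question."""
    return f"Why do you say that {reflect(statement)}?"

print(respond("I am unhappy with my job"))
# -> Why do you say that you are unhappy with your job?
```

The machine understands nothing; the user supplies all the meaning, which is exactly Curtis's point about having yourself reflected back to you.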

These days people request "trigger" warnings if anyone is going to annoy them with an opinion they themselves don’t hold.

“Mr Zuckerberg can now work out how to give you people like you. So you get weaker and more fragile and when you come out of your bubble, you don’t know what to do, do you?”


“But you have to be ready to challenge and persuade people who are not like you. That’s what politics is all about. Back in the 1950s, white upper-middle-class children found common cause with black activists and founded the civil rights movement.

“The left-wingers who fetishise the internet mistake process for content. But the internet is fundamentally an engineering system based on feedback, and what engineers want is to keep things stable. It’s brilliant for organising things, but when it gets you to Tahrir Square, it doesn’t tell you what to do. Tech utopianism doesn’t have a picture of the future: it’s about managing now.”

He isn’t impressed with a lot of technology fads, whether it’s “disruptive innovation”, or the techno-panic about AI.

“A lot of AI is an imaginative projection onto machines. We fetishise machines at the moment.” The media are whipping themselves into a froth about machines stealing their jobs – but, as I discussed last year, the professions have already "mechanised" themselves, hollowing themselves out.

Curtis reveals that Microsoft’s sweary chatbot Tay almost made it into the film.

The AI teenager was unleashed on the world back in March, and was soon boasting about smoking weed in front of the police and spraying around racist insults. Tay didn’t know it was being trolled.

“And immediately it was being trolled. It was terribly funny,” says Curtis.

The magic words “tech” and “disruption” don’t carry much weight with the man either.

We have a digression into liability theory – I mention last week’s arrests of Backpage executives, a company given blanket immunity by internet laws written in the 1990s. Big Tech and “digital rights” groups had backed Backpage strongly.

“It’s very interesting, and related to this, is how much of the tech world can go and set up corporations, which if [they] didn’t have ‘tech’ in the description could be much more legally liable.”

He mentions Uber and AirBnB.

“They say: ‘We’re disruptive. It’s not our fault!’ That’s like running a hotel and saying the manager shot the guest but it's not our fault, we just built the hotel,” he says.

We’ll be following up, so fire away when you’ve seen the film. ®
