Hillary Clinton: 2024 will be 'ground zero' for AI election manipulation

2016 meddling was 'primitive' compared to what's ahead

When it comes to AI possibly influencing elections, 2024 will be "ground zero," according to Hillary Clinton. 

This will be a huge election year, with more than four billion people on this planet eligible to vote in one poll or another. Generative AI is expected to be unavoidable in all that politicking: deepfake images, falsified audio, and other software-imagined material are likely to be used in attempts to sway or put off voters, undermine people's confidence in election processes, and sow division.

That's not to say nothing should be trusted, or that elections will be thrown. Instead, everyone should be mindful of artificial intelligence, what it can do, and how it can be misused.

"This is the year of the biggest elections around the world since the rise of AI technologies like ChatGPT," the former US Secretary of State, senator, and First Lady said at an Aspen Institute and Columbia University joint event on Thursday covering machine learning's impact on the 2024 global elections.

Clinton, who lost to Donald Trump in the 2016 White House race, has personal experience with election disinformation attempts and how technology can be used for nefarious purposes.

As fellow panelist Maria Ressa, Nobel Peace Prize-winning journalist and co-founder of Filipino news site Rappler, said: "Hillary was probably ground zero for all of the experimentation."

Still, the fake news stories and doctored images pushed on Facebook and other social media platforms ahead of the 2016 election were "primitive" compared to "the leap in technology" brought about by generative AI, Clinton said.

"Defamatory videos about you is no fun — I can tell you that," she added. "But having them in a way that … you have no idea whether it's true or not. That is of a totally different level of threat."

Former Secretary of Homeland Security Michael Chertoff, who was also a panelist at the Aspen-Columbia gathering, said the internet should be considered a "domain of conflict."


"What artificial intelligence allows an information warrior to do is to have very targeted misinformation, and at the same time to do that at scale, meaning you do it to hundreds of thousands, maybe even millions of people," Chertoff explained.

In previous election cycles, even those that occurred just a decade ago, if a political party or a public figure electronically sent an "incendiary" message about a candidate or elected official, this message might have appealed to some voters — but it would also likely backfire and repel many others, he opined. 

Today, however, the message "can be tailored to each individual viewer or listener that appeals only to them and nobody else is going to see it," Chertoff said. "Moreover, you may send it under the identity of someone who is known and trusted by the recipient, even though that is also false. So you have the ability to really send a curated message that will not influence others in a negative way."

Plus, while interference in previous democratic elections around the globe has involved efforts to undermine confidence or swing votes toward or away from a particular candidate — like Russia's hit-and-miss meddling in 2016 and its Macron hack-and-leak a year later in France — the election threats this year are "even more dangerous," Chertoff said. 

By that he means some kind of AI super-charged version of the Big Lie Donald Trump concocted and pushed after he lost the 2020 presidential election to Joe Biden, in which the loser wrongly claimed he was unfairly robbed of victory, leading to the January 6 storming of Congress by MAGA loyalists.

What if fake images or videos enter the collective consciousness — spread and amplified via social media and video apps — that promote that kind of false narrative, causing large numbers of people to fall for it?

"Imagine if people start to see videos or audios that look like persuasive examples of rigged elections? It's like pouring gasoline on fire," Chertoff said. "We could have another January 6."

This, he added, plays into Russia, China, and other nations' goals to undermine democracy and sow societal chaos. "In a world in which we can't trust anything, and we can't believe in truth, we can't have democracy."

Instead of worrying about people being tricked by deepfakes, Chertoff said he fears the opposite: that people won't believe real images or audio are legitimate, because they prefer alternative realities. 

"In a world in which people have been told about deepfakes, do they say everything's a deepfake? Therefore, even real evidence of bad behavior has to be dismissed," he said. "And then that really gives a license to autocrats and corrupt government leaders to do whatever they want." ®
