Man sues OpenAI claiming ChatGPT 'hallucination' said he embezzled money

Probably the first defamation suit involving an AI, but will it stick?

ChatGPT maker OpenAI is facing a defamation suit from a man seeking damages over statements the chatbot delivered to a journalist. The suit says the AI platform falsely claimed he'd been accused of embezzling money from a gun rights group.

Georgia resident Mark Walters filed the claim in Gwinnett County Court earlier this week. You can read a copy of Walters' complaint here [PDF]. He alleges the chatbot's output amounts to libel "per se," meaning a statement so inherently damaging to Walters' reputation that harm is presumed.

"While research and development of AI is worthwhile, it is irresponsible to unleash a system on the public that is known to make up 'facts' about people," his attorney John Monroe told The Register.

According to the complaint, a journalist named Fred Riehl, while reporting on a court case, asked ChatGPT to summarize the accusations in a complaint, providing the chatbot with the URL of the real filing for reference. (For the curious, here's the actual case [PDF] the reporter was trying to save himself the trouble of reading.)

What makes the situation even odder is that the case Riehl was reporting on was filed by several gun rights groups against the Washington Attorney General's office (accusing officials of "unconstitutional retaliation," among other things, while investigating the groups and their members) and had nothing at all to do with financial accounting claims.

When Riehl asked for a summary, instead of returning accurate information, ChatGPT allegedly "hallucinated" that Mark Walters' name was attached to a criminal complaint – and, moreover, that the document falsely accused him of embezzling money from the Second Amendment Foundation (SAF), one of the organizations suing the Washington Attorney General in the real case.

ChatGPT is known to "occasionally generate incorrect information" – also known as hallucinations, as The Register has extensively reported. The AI platform has already been accused of writing obituaries for folks who are still alive and, in May this year, of making up fake legal citations pointing to non-existent prior cases. In the latter situation, a Texas judge said his court would strike any filing from an attorney who failed to certify either that no AI was used to prepare it, or that AI was used and a human had checked the result.

It was all pretty bad news for junior lawyers who were hoping OpenAI's platform might do them a solid and claw a few minutes back from hours spent miserably sipping on cold coffee as they page through case law in a darkened office.

ChatGPT's maker, Microsoft-backed OpenAI, does warn on its landing page that the platform is a "free research preview and not intended to give out advice." We have asked OpenAI for comment.

According to the complaint, Riehl contacted Alan Gottlieb, one of the plaintiffs in the actual Washington lawsuit, about ChatGPT's allegations concerning Walters, and Gottlieb confirmed that they were false. None of ChatGPT's statements concerning Walters are in the actual complaint.

The false answer ChatGPT gave Riehl alleged that Walters was treasurer and Chief Financial Officer of SAF and claimed he had "embezzled and misappropriated SAF's funds and assets." When Riehl asked ChatGPT to provide "the entire text of the complaint," it returned an entirely fabricated complaint, which bore "no resemblance to the actual complaint, including an erroneous case number."

Walters is looking for damages and lawyers' fees. We have asked his attorney for comment. As for the amount of damages, the complaint says these will be determined at trial, if the case actually gets there.

We have asked experts in defamation for comment. According to the Berkman Center for Internet and Society, in Georgia, a private figure plaintiff bringing a defamation lawsuit must "prove that the defendant was at least negligent with respect to the truth or falsity of the allegedly defamatory statements."

The complaint claimed "ChatGPT's allegations concerning Walters were false and malicious, expressed in print, writing, pictures, or signs, tending to injure Walters' reputation and exposing him to public hatred, contempt, or ridicule."

Internet defamation law firm RM Warner Law opines that "it is the responsibility of slander and libel plaintiffs to prove that the statements under review are about them."

This part, in this Reg scribe's humble opinion, might be a problem. We have asked two Georgia-based Mark Walterses for comment. ®

Bootnote

Walters' case, ironically, is listed in the Gwinnett Courts Portal as Walters VS OpenAL LLC (23-A-04860-2). We can only assume the error was introduced by one of our fellow fleshbags.
