Just say no to NO FAKES Act, EFF argues
The problem is focusing on property rights rather than privacy
The Electronic Frontier Foundation (EFF) says the revised version of the NO FAKES Act, reintroduced in April, would be a disaster for free speech and innovation if signed into law.
The bill [PDF] aims to address valid concerns about AI-generated audio and visual replicas of people, the EFF argues, but goes about it the wrong way: by establishing a new intellectual property right over one's image rather than a privacy right that would deter unauthorized AI replicas. And that, the group says, is just one of its flaws.
The new right would incentivize a market for monetizing the simulation of dead celebrities and would lead to more litigation, the EFF argues.
And the associated takedown requirement, the rights group claims, would put the distribution of AI software tools at risk, encourage overly broad content removal, make it easier to unmask online content creators, and cement the power of incumbents with compliance costs that startups couldn't easily afford.
We simply must stop approaching everything as if property rights are the only way to protect people, when privacy does a better job of it
"We simply must stop approaching everything as if property rights are the only way to protect people, when privacy does a better job of it," Katharine Trendacosta, the EFF's director of policy and advocacy, told The Register in a phone interview.
"Property rights inherently create an unequal playing ground where some people's rights are worth more than others, and it's worth more time for people who can license out their image to go after people, to preserve their ability to make money or to make money off of their dead relatives."
In contrast, Trendacosta said privacy rights are designed to protect you from defamation, misinformation, and misattribution.
"Creating a property right in this case is just an idea designed to create a whole new market for certain people to be able to make money and not to protect everyone from the really understandable fears people have about digital replicas," she said.
Property vs privacy
The NO FAKES (Nurture Originals, Foster Art, and Keep Entertainment Safe) Act was initially introduced in July 2024 as a way to prevent the unauthorized production, hosting, or sharing of digital replicas of people.
The bipartisan bill aspired to give people a property right over computer-generated likenesses of themselves and to establish a notice-and-takedown mechanism to remove unauthorized digital replicas.
The proposed legislation never made it out of committee. That may be due at least in part to opposition from a handful of civil society groups like the EFF as well as the Center for Democracy & Technology, the Computer & Communications Industry Association, the American Library Association, and others.
In a public letter [PDF], various advocacy and trade groups argued at the time that the bill would overlap with existing laws, would invite abusive takedown demands, and would increase litigation, among other problems.
So two months ago, US Senators Marsha Blackburn (R-TN), Chris Coons (D-DE), Thom Tillis (R-NC), and Amy Klobuchar (D-MN), in conjunction with US Representatives Maria Salazar (R-FL) and Madeleine Dean (D-PA), reintroduced the bill, promising it would protect people's voices and likenesses from unauthorized digital recreation. The new version covers not just digital replicas but also the tools used to make them, extends the takedown requirements to more providers, and requires not only takedown but also the prevention of re-uploading.
"While AI presents extraordinary opportunities for technological advancement, it also poses some new problems, including the unauthorized replication of the voice and visual likeness of individuals, such as artists,” said Senator Tillis in a statement.
The bill's sponsors claim NO FAKES will:
- Hold individuals or companies liable if they distribute an unauthorized digital replica of an individual's voice or visual likeness;
- Hold platforms liable for hosting an unauthorized digital replica if the platform has knowledge of the fact that the replica was not authorized by the individual depicted;
- Exclude certain digital replicas from coverage based on recognized First Amendment protections; and
- Preempt future state laws regulating digital replicas.
But Trendacosta argues that the bill's exceptions for material protected as free speech under the First Amendment won't matter in practice.
"The bill itself has exceptions for what they believe to be First Amendment protected speech," she said. "That's not what's going to happen. Because when you force companies to take down things, they don't really look at context to figure out whether or not it is protected speech. They just see that it contains a version of the thing they're supposed to take down."
The bill has attracted an endorsement from YouTube and the Recording Industry Association of America (RIAA), among others.
Trendacosta said that YouTube's support reflects the fact that it already has its own content takedown mechanism.
"YouTube has Content ID, which is not required by law, but they've built it and it's a nightmare," she explained, pointing to a white paper she wrote about the problems with YouTube's system.
"And all they will have to do is crank that up to 11 to comply with this law, so it is easier for them than it is for anyone else," she said. "In fact, one of the problems we have with the bill is the requirements are so intense and onerous that no one could ever enter this market who isn't YouTube because of the startup costs."
As for the RIAA, Trendacosta said, "They've always sort of wanted this because they want to create a licensing system that they can skim money off of."
The EFF contends the broadly written legislation creates a censorship regime covering not just digital replicas but also the products and services that produce them. The revised bill, the organization says, targets "tools that can be used to produce images that aren't authorized by the individual, anyone who owns the rights in that individual's image, or the law."
There's every reason to investigate the collateral damage this bill would have if we've already got TAKE IT DOWN to address the kinds of harm that everyone really is concerned about
Trendacosta described the provision as "incredibly bizarre."
"It basically means that if you're a service that allows people to download tools, you have to take down tools and similar tools if someone tells you they exist," she explained. "But at the same time, who can send those takedowns is incredibly vague. It's a very weird restriction on a tool because, again, there are constitutional uses that are perfectly acceptable of AI tools. And so, instead of attacking use, they're attacking the tool."
In light of the passage of the TAKE IT DOWN Act, which "prohibits the nonconsensual online publication of intimate visual depictions of individuals," Trendacosta argues legislators should slow down and consider the implications of NO FAKES.
"There's every reason to investigate the collateral damage this bill would have if we've already got Take IT Down to address the kinds of harm that everyone really is concerned about," she said.
There's no indication when NO FAKES, currently referred to the Senate Judiciary Committee, will receive further consideration from lawmakers. ®