Y Combinator, startups funnily enough aren't fans of draft California AI safety law

'We believe a more balanced approach is necessary'

Venture capital firm Y Combinator and more than 140 machine-learning startups have signed an open letter opposing a hot-button AI safety bill making its way through the California legislature.

In the letter, first reported by Politico, the signatories make the same arguments for a lack of AI regulation that have been made countless times in recent months: AI is the next industrial revolution, and if we regulate it now it's only going to stifle the innovations we have yet to innovate.

"We believe a more balanced approach is necessary," the letter argues. "One that protects society from potential harm while fostering an environment conducive to technological advancement that is not more burdensome than other technologies have previously enjoyed."

"This bill, as it stands, could gravely harm California's ability to retain its AI talent and remain the location of choice for AI companies," the signatories threaten.

Are these startups even covered by the bill?

It's not immediately clear if any of the AI startups that signed the YC letter would even be covered by the bill's provisions.

California Senate Bill 1047, introduced way back in February by state Senator Scott Wiener, would impose guardrails and transparency requirements on large AI models. The bill defines these as models requiring computing power in excess of 10^26 FLOPs or costing more than $100 million to implement "using the average market prices of cloud compute."
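For a rough sense of what 10^26 FLOPs means in practice, a common back-of-the-envelope heuristic puts training compute at roughly six floating-point operations per model parameter per training token. The sketch below applies that rule of thumb to an entirely hypothetical model size and dataset; the figures are illustrative assumptions, not numbers taken from the bill or from any real training run.

```python
# Back-of-the-envelope check of whether a hypothetical training run crosses
# SB 1047's 1e26 FLOP threshold. The "6 * parameters * tokens" rule is a
# widely used rough estimate of training compute; the model size and token
# count below are illustrative assumptions, not figures from the bill.

THRESHOLD_FLOP = 1e26  # compute threshold cited in the article

def training_flop(parameters: float, tokens: float) -> float:
    """Approximate total training compute via the ~6 FLOPs/parameter/token heuristic."""
    return 6 * parameters * tokens

# Hypothetical 70-billion-parameter model trained on 15 trillion tokens.
run = training_flop(parameters=70e9, tokens=15e12)
print(f"Estimated training compute: {run:.2e} FLOPs")          # ~6.30e+24
print("Covered by the 1e26 threshold?", run > THRESHOLD_FLOP)  # False
```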

Firms in control of an AI model covered by SB 1047 would be required "to determine whether it can make a positive safety determination … before initiating training of that covered model." The bill would also establish a new board responsible for re-certifying covered AI models annually, and require the state to create a public cloud cluster dubbed "CalCompute" for researching AI safety.

According to YC, that 10^26 threshold is entirely arbitrary and problematic because "the technology is still evolving, and such specific metrics may not adequately capture the capabilities or risks associated with future models."

YC said the bill's language is vague and could be applied too broadly, claimed it is unusual in placing responsibility for AI misuse on developers rather than users, and opposed the "AI kill switch" included in the bill, which it said "could function as a de facto ban on open source AI development."

It's not clear if any of the signatories to the letter are operating covered AI models or if they simply hope that someday they too may be influential enough to warrant government action. We've asked, but haven't heard back from YC.

SB 1047 passed the California Senate last month on a 32-1 vote, with seven abstentions, but has been heavily marked up since landing in the state Assembly. Last week it cleared its first Assembly committee, with eight votes in favor and three abstentions. Wiener said last week that he welcomed revisions to the bill that allow state agencies to change the compute threshold for coverage and give AI firms flexibility in how they meet their obligations.

Wiener represents portions of San Francisco that include YC's offices.

"By taking a light touch approach to regulation that focuses exclusively on the largest companies building the most capable models, SB 1047 puts sensible guardrails in place against risk while leaving startups free to innovate without any new burdens," Wiener said.

The signatories of the YC-led letter, on the other hand, don't believe the amendments go far enough.

"Our core concerns about the bill's potential impact on innovation and California's tech economy remain," the signatories said, warning SB 1047 "could inadvertently threaten the vibrancy of California's technology economy and undermine competition."

That said, it's entirely unclear what may happen to SB 1047 in the coming months – it could very well pass the Assembly, but California Governor Gavin Newsom's signature is not a sure thing.

Signature not guaranteed

Newsom has expressed some of the same concerns included in the YC letter, namely that over-regulation of AI could lead to startups departing California for locales less likely to scrutinize them.

"We dominate in [the tech] space. I want to continue to dominate in this space," Newsom said at the GenAI Summit in San Francisco in late May. "I don't want to cede this space to other states or other countries."

Newsom has already vetoed AI regulation, and SB 1047 could very well be next in line to be tossed in the kindling pile. However, Newsom also said the AI industry needs to be regulated in some way – YC and those startups might not want it, but the big guys do.

"When you have the inventors of this technology – the godmothers and fathers – saying 'help, you need to regulate us,' that's a very different environment," Newsom said. "They're basically say[ing] 'we don't know really what we've done but you got to do something about it.'"

Newsom's office didn't respond to questions.

Wiener, for his part, claims he already has those godfathers, like Geoffrey Hinton and Yoshua Bengio, on his side. 

"Forty years ago when I was training the first version of the AI algorithms behind tools like ChatGPT, no one – including myself – would have predicted how far AI would progress," Hinton said in a statement released by Wiener's office. "Powerful AI systems bring incredible promise, but the risks are also very real and should be taken extremely seriously." ®
