SB 53, the landmark AI transparency bill, is now law in California


Senate Bill 53, the landmark AI transparency measure that has divided AI companies and made headlines for months, is now officially law in California.

On Monday, California Gov. Gavin Newsom signed the “Transparency in Frontier Artificial Intelligence Act,” which was authored by Sen. Scott Wiener (D-CA). It’s the second draft of such a bill, as Newsom vetoed the first version, SB 1047, last year due to concerns it was too strict and could stifle AI innovation in the state. It would have required all AI developers, particularly makers of models with training costs of $100 million or more, to test for specific risks. After the veto, Newsom tasked AI researchers with coming up with an alternative, which was published in the form of a 52-page report and formed the basis of SB 53.

Some of the researchers’ recommendations made it into SB 53, like requiring large AI companies to reveal their safety and security processes, allowing for whistleblower protections for employees at AI companies, and sharing information directly with the public for transparency purposes. But some aspects of the report didn’t make it in, like third-party evaluations.

As part of the bill, large AI developers will need to “publicly publish a framework on [their] website describing how the company has incorporated national standards, international standards, and industry-consensus best practices into its frontier AI framework,” per a release. Any large AI developer that makes an update to its safety and security protocol will also need to publish the update, and its reasoning for it, within 30 days. But it’s worth noting this part isn’t necessarily a win for AI whistleblowers and proponents of regulation. Many AI companies that lobby against regulation suggest voluntary frameworks and best practices, which can be seen as guidelines rather than rules, with few, if any, penalties attached.

The bill does create a new way for both AI companies and members of the public to “report potential critical safety incidents to California’s Office of Emergency Services,” per the release, and “protects whistleblowers who disclose significant health and safety risks posed by frontier models, and creates a civil penalty for noncompliance, enforceable by the Attorney General’s office.” The release also said that the California Department of Technology would recommend updates to the law each year “based on multistakeholder input, technological developments, and international standards.”

AI companies were divided on SB 53, though most were initially either publicly or privately against the bill, saying it would drive companies out of California. They knew the stakes: With about 40 million residents of California and a handful of AI hubs, the state has outsized influence on the AI industry and how it will be regulated.

SB 53 had been publicly endorsed by Anthropic after weeks of negotiations on the bill’s wording, but Meta in August launched a state-level super PAC to help shape AI policy in California. And OpenAI had lobbied against such legislation in August, with its chief global affairs officer, Chris Lehane, writing to Newsom that “California’s leadership in technology regulation is most effective when it complements effective global and federal safety ecosystems.”

Lehane suggested that AI companies should be able to get around California state requirements by signing onto federal or global agreements instead, writing, “In order to make California a leader in global, national and state-level AI policy, we encourage the state to consider frontier model developers compliant with its state requirements when they sign onto a parallel regulatory framework like the [EU Code of Practice] or enter into a safety-oriented agreement with a relevant US federal government agency.”
