SB 1047, a bill currently under consideration in California, has sparked fierce debate in the tech community, particularly among AI companies based in the state. The bill, introduced by Senator Scott Wiener, would impose strict safety requirements on companies developing large-scale AI models, with the aim of preventing potential disasters such as the creation of bioweapons or financial crises.
Key dates in California
The legislation, which the state Senate passed in May 2024, is now in its final review phase in the state Assembly, with a vote scheduled for the end of August 2024. If it passes, Governor Gavin Newsom could sign the bill into law, and it would take effect in early 2025. AI companies would likely have until mid-2025 to comply with the new safety requirements.
The bill would require AI companies to implement mechanisms to shut down systems in emergencies and take preventative measures against catastrophic risks. Companies would also be required to file a statement of compliance with the California Attorney General and could be subject to civil penalties for non-compliance.
However, the bill has met strong opposition in Silicon Valley, a center of technological innovation. Leading companies such as OpenAI have expressed concern that state-level regulation could stifle innovation and drive investment away from California. In a letter to Senator Wiener's office, Jason Kwon, OpenAI's Chief Strategy Officer, argued: "The AI revolution has just begun, and California's unique status as a world leader in AI is driving the state's economic momentum. SB 1047 would threaten that growth, slow the pace of innovation, and cause world-class engineers and entrepreneurs to leave California in search of greater opportunities elsewhere."
Nancy Pelosi, former Speaker of the House of Representatives and an influential figure in US politics, has also spoken out against the bill. Pelosi described it as "well-intentioned but ill informed" and expressed concern that it could undermine California's leading role in the technology sector, warning that the law could impose excessive restrictions and slow the progress of AI in the state.
In response to the criticism, Senator Wiener defended the need for the bill, explaining that SB 1047 aims to establish "reasonable" safety standards already accepted by leading AI labs. He pointed out that the law would apply to any company operating in California, regardless of where it is headquartered, countering concerns about a potential talent drain. Wiener commented, "In short, SB 1047 is a very sensible law requiring the big AI labs to do what they've already committed to — test their big models for catastrophic safety risks. SB 1047 is well aligned with what we know about foreseeable AI risks and deserves to be enacted."
To address some of the concerns raised, Wiener has amended the bill to remove criminal liability for non-compliant companies and to protect smaller open-source developers. These changes, however, have not been enough to overcome the opposition.
As the Assembly vote approaches, the future of SB 1047 remains uncertain. If passed, the bill could set a precedent for AI regulation in the United States and mark a turning point in the relationship between technological innovation and government regulation. It could also, however, trigger legal and economic challenges that affect California's standing as a global technology leader.