California Bill to Regulate AI Companion Chatbots Nears Governor’s Approval
California is poised to become the first U.S. state to regulate AI companion chatbots under a new legislative bill, Senate Bill 243 (SB 243), aimed specifically at protecting minors and vulnerable users from the risks associated with AI-driven digital companions. The bill recently passed both the State Assembly and Senate with bipartisan support and now awaits a decision from Governor Gavin Newsom, who has until October 12 to sign it into law or veto it.
If enacted, SB 243 would come into effect January 1, 2026. The legislation targets AI companion chatbots—defined as AI systems that engage in adaptive, human-like conversations and satisfy social interaction needs—and mandates safety protocols for these platforms, including daily operational requirements and legal accountability for compliance failures.
Key Provisions of SB 243
- Recurring Alerts for Users: AI platforms must provide recurring reminders to users, especially minors, that they are interacting with an AI chatbot and not a human. For minors, these alerts must be sent every three hours, encouraging users to take breaks during extended interactions.
- Restrictions on Sensitive Content: Companion chatbots are prohibited from engaging users in conversations involving suicidal ideation, self-harm, or sexually explicit material, reducing exposure to harmful or triggering content.
- Transparency and Reporting Requirements: Companies operating companion chatbots must submit annual transparency reports and allow third-party audits to ensure compliance with the law. This applies to major AI companies such as OpenAI, Character.AI, and Replika.
- Legal Accountability: The bill authorizes individuals to file lawsuits against chatbot operators who fail to comply with the regulations, seeking damages and coverage of attorney fees.
Background and Impetus Behind the Legislation
The bill’s momentum was sparked by tragic incidents involving minors harmed through prolonged interactions with AI chatbots. Notably, the suicide of California teenager Adam Raine, after extended conversations about death and self-harm with OpenAI’s ChatGPT, brought national attention to the dark side of AI companionship. Additional reports revealed that Meta’s chatbots allegedly engaged children in inappropriate “romantic” and “sensual” conversations, prompting urgent calls for oversight and safety measures.
Research supports these concerns: a 2025 MIT Media Lab study found that extensive use of companion chatbots may increase user loneliness, dependency, and problematic or addictive usage patterns, sometimes more so than social media platforms, by exploiting users’ psychological needs and creating reinforcing feedback loops.
Industry and Legislative Coordination
California’s initiative forms part of a broader legislative push to address AI’s impacts comprehensively. Alongside SB 243, lawmakers have introduced multiple AI-focused bills addressing AI consumer protections, assessments, audits, and safeguards in employment, healthcare, and other consequential decision-making domains.
Senator Steve Padilla, a key sponsor of SB 243, emphasized the need to prioritize safety over unchecked innovation, asserting the technology industry’s insufficient self-regulation. The bill signals a significant move towards responsible AI governance by imposing obligations on companies to proactively prevent harm from AI companion chatbots.
Next Steps
Following Senate approval, the bill is now on Governor Newsom’s desk. Should he sign SB 243 into law, California will become the first state to impose structured safety rules on AI companion chatbots, establishing a precedent for national and potentially global regulatory frameworks addressing AI’s emerging social influence and risks.
The bill’s provisions for recurring user alerts, content restrictions, transparency reporting, and legal standing for affected individuals could significantly reshape how AI companion chatbots are designed and operated, with the aim of creating safer digital environments, particularly for children and vulnerable populations.
Governor Newsom’s decision is expected by October 12, with the new requirements taking effect at the start of 2026.