The company, known for its vast array of AI characters, will remove the ability for users under 18 to engage in “open-ended” conversations with its AI by Nov. 25.
It plans to begin ramping down access in the coming weeks, initially restricting kids to two hours of chat time per day.
Character.AI noted that it plans to develop an “under-18 experience” in which teens can create videos, stories and streams with its AI characters.
“We’re making these changes to our under-18 platform in light of the evolving landscape around AI and teens,” the company said in a blog post, underscoring recent news reports and questions from regulators.
The company and other chatbot developers have recently come under scrutiny following several teen suicides linked to the technology.
The mother of 14-year-old Sewell Setzer III sued Character.AI last November, alleging the chatbot drove her son to suicide.
OpenAI is also facing a lawsuit from the parents of 16-year-old Adam Raine, who took his own life after engaging with ChatGPT.
Both families testified before a Senate panel last month and urged lawmakers to place guardrails on chatbots.
The Federal Trade Commission also launched an inquiry into AI chatbots in September, requesting information from Character.AI, OpenAI and several other leading tech firms.
“After evaluating these reports and feedback from regulators, safety experts, and parents, we’ve decided to make this change to create a new experience for our under-18 community,” Character.AI said Wednesday.
“These are extraordinary steps for our company, and ones that, in many respects, are more conservative than our peers,” it added. “But we believe they are the right thing to do.”
Check out the full report at TheHill.com.