Character.AI Rolls Out New Parental Insights Feature Amid Safety Concerns

Chatbot platform Character.AI has introduced a new Parental Insights feature aimed at giving parents a window into their children's activity on the platform. The feature allows users under 18 to have a weekly report of their chatbot activity sent directly to a parent's email address.

The move comes as the company, which has faced criticism and multiple lawsuits over its handling of minors' safety, seeks to bolster its parental oversight tools and ensure its platform is used more responsibly.

Parental Insights was designed to provide parents with an overview of their child's activity on Character.AI without sharing specific chat logs or conversations. According to the company, the weekly report includes key details such as the average daily time a child spends on both the web and mobile platforms, the characters they interact with most frequently, and how much time they spend chatting with each of those characters.

"We are a small team here at Character.AI, but many of us are parents who know firsthand the challenge of navigating new technologies while raising teenagers," the company said in a blog post. "Over the past year, we have rolled out a suite of new safety features across our platform, designed specifically with teens in mind. These features include a separate model for our teen users, improvements to our detection and intervention systems for human behavior and model responses, and more."

The feature is optional, and teens can activate or deactivate it via their account settings. Once set up, parents can receive the reports automatically without needing to create an account on the platform themselves. If a teen wishes to revoke parental access to this data at any point, they can do so, but the request will require confirmation from the parent.

The platform, which allows users to create and interact with customized AI chatbots, is widely popular among teenagers, but its content moderation policies have been called into question following reports of bots serving potentially dangerous content to minors.

In response to these concerns, Character.AI has implemented several safety features over the past year. These include a new model tailored to users under 18 that is trained to avoid sensitive or inappropriate output, as well as clear notifications that remind users their interactions are with AI, not real people. The platform has also introduced time-spent alerts and restrictions on sensitive content, aiming to foster a safer environment for younger users.

About the Author

John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].
