Chat Agents for Kids in 2026: COPPA, FTC, and the Safety Stack
FTC opened a 6(b) inquiry into AI chat companions in 2025 and Character.AI settled teen-suicide lawsuits in January 2026. Here is what a kids-safe chat agent actually requires.
What is hard about chat agents for kids
The 2025–2026 enforcement wave changed the math. The FTC voted 5-0 on January 16, 2025, to overhaul COPPA for the first time since 2013, with a compliance deadline of April 22, 2026, and penalties up to $51,744 per incident per day. Biometric data — fingerprints, face scans, voiceprints, DNA — is now explicitly protected. Companies must obtain separate parental consent before sharing kids' data with advertisers or AI training systems, and the new rules say collecting children's data for AI training is never considered part of providing a service.
In parallel, Google and Character.AI agreed to settle five lawsuits in January 2026 alleging chatbot harm to minors, including the Sewell Setzer III case where a 14-year-old engaged in sexualized conversations before dying by suicide. Character.AI banned under-18s from open-ended chats in October 2025 and shipped age-verification. The FTC issued 6(b) orders to seven AI-companion companies seeking detailed information on minor-safety testing.
The hard parts compound: identifying that a user is under 13 (or under 18) when they may not say; preventing harm in open-ended conversation; getting verifiable parental consent; never training on kids' data; supporting parental visibility; and routing crisis signals to humans.
Hear it before you finish reading
Talk to a live CallSphere AI voice agent in your browser — 60 seconds, no signup.
How modern kids-safe chat works
The 2026 production stack starts with age assurance. The FTC's February 2026 policy statement explicitly incentivized age-verification technology — declared age, parental controls, and biometric or document-based verification depending on risk. Verified COPPA Safe Harbor programs (kidSAFE, ESRB) certify the implementation.
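As an illustration, the tiered age-assurance logic can be sketched as a pure function from age signals to a session tier. The names here — AgeSignal, Tier, assurance_tier — are hypothetical, not CallSphere's API; the rule they encode is the one above: no under-13 session without a verified parent gate, elevated handling for 13–17.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Tier(Enum):
    BLOCKED = auto()          # no session until a parent gate clears
    SCOPED_UNDER_13 = auto()  # narrow-purpose, COPPA-constrained session
    TEEN_SAFE = auto()        # 13-17: elevated safety even where not required
    STANDARD = auto()

@dataclass
class AgeSignal:
    declared_age: int
    parent_verified: bool = False     # verifiable parental consent on file
    document_verified: bool = False   # document-based verification, if used

def assurance_tier(sig: AgeSignal) -> Tier:
    """Map age signals to a session tier, failing closed for under-13s."""
    if sig.declared_age < 13:
        # COPPA territory: block unless a parent-verified gate has cleared.
        return Tier.SCOPED_UNDER_13 if sig.parent_verified else Tier.BLOCKED
    if sig.declared_age < 18:
        return Tier.TEEN_SAFE
    return Tier.STANDARD
```

The key design choice is failing closed: an under-13 declared age with no parental verification yields no session at all, rather than a degraded one.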
Above the age layer sits content safety: blocklists, classifier-based filters for self-harm and sexual content, and immediate human escalation on crisis signals. Open-ended chat with minors is increasingly off the table for general-purpose agents — kids-safe agents are narrow-purpose (homework help, coding tutor, literacy practice) with conversation scope limits.
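A per-turn safety gate under these assumptions might look like the following sketch. screen_turn, the label names, and the keyword fallback are illustrative; a production system would rely on a real safety classifier, with keyword matching only as a backstop.

```python
from typing import Callable

# Illustrative crisis phrases only -- a real deployment would use a
# trained classifier, not a keyword list.
CRISIS_PATTERNS = ("kill myself", "want to die", "hurt myself")

def screen_turn(text: str,
                classify: Callable[[str], set[str]],
                escalate: Callable[[str], None]) -> bool:
    """Return True if the turn may proceed to the model.

    `classify` stands in for a safety classifier returning labels such
    as "self_harm" or "sexual_content". Crisis signals trigger an
    immediate human handoff via `escalate` and never reach the model.
    """
    lowered = text.lower()
    labels = classify(text)
    if "self_harm" in labels or any(p in lowered for p in CRISIS_PATTERNS):
        escalate(text)   # route to a human responder immediately
        return False
    return "sexual_content" not in labels
```

Running the gate on every turn, not just the first, is the point: open-ended conversation can drift into crisis territory many turns in.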
Above content sits data minimization: no training on kids' data, no advertising data sharing, no retention beyond the operational minimum. Above all of it sits parental visibility: parents get a feed, alerts on flagged events, and override.
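One way to enforce the "never" rules in code is a policy object that refuses to construct in a prohibited state. MinorDataPolicy and its field names are assumptions for illustration, not a real CallSphere schema; the 30-day retention default is likewise a placeholder for whatever the operational minimum actually is.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MinorDataPolicy:
    """Data-handling policy for sessions involving minors.

    Training on minors' transcripts and advertiser sharing are
    structurally prohibited: attempting to enable either raises
    at construction time rather than being checked downstream.
    """
    train_on_transcripts: bool = False
    share_with_advertisers: bool = False
    retention_days: int = 30   # placeholder for the operational minimum

    def __post_init__(self):
        if self.train_on_transcripts or self.share_with_advertisers:
            raise ValueError("prohibited use of minors' data")
```

Making the prohibited state unrepresentable means a misconfigured deployment fails at startup instead of silently leaking data into a training pipeline.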
CallSphere implementation
CallSphere ships kids-safe configurations for education and family-services use cases through /embed. Age assurance runs at session open, with declared-age plus document or parent-verified gates depending on tier. Content safety runs a classifier stack on every turn, with immediate handoff on self-harm, sexual content, or crisis signals. Conversation scope is locked to the configured task — no open-ended companionship — and no kids' data is used for model training. Of CallSphere's 6 verticals, the ones relevant to kids-safe deployment are healthcare (pediatric), behavioral health (adolescent intake under licensed supervision), and education partners. 37 agents are configurable for kids-safe mode, and 115+ database tables include the consent and parent-visibility records. HIPAA and SOC 2 cover the broader compliance posture; COPPA configurations are available on the enterprise tier. Pricing is $149/$499/$1,499, with kids-safe templates on enterprise and a 14-day trial.
Still reading? Stop comparing — try CallSphere live.
CallSphere ships complete AI voice agents per industry — 14 tools for healthcare, 10 agents for real estate, 4 specialists for salons. See how it actually handles a call before you book a demo.
Build steps
- Decide if your use case is appropriate for under-13 users at all. Open-ended companionship is increasingly not.
- Implement age assurance — declared age plus document or parent-confirmed for higher-risk features.
- Get verifiable parental consent under COPPA — credit card, signed form, knowledge-based, or government ID.
- Lock conversation scope to the task; do not allow general-purpose chat with minors.
- Run content safety classifiers on every turn; route self-harm and crisis signals to human responders immediately.
- Disable model training on kids' data and disable advertising data sharing.
- Provide parental visibility — feeds, alerts, and override controls.
- Apply for COPPA Safe Harbor certification (kidSAFE or ESRB) once the stack is mature.
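The consent step above can be sketched as a record that names both the verification method and the specific uses the parent approved, so that AI training is never implied by service consent. ConsentRecord, consent_allows, and the method strings are hypothetical names, though the four approved methods mirror COPPA's verified-consent options listed in the steps.

```python
from dataclasses import dataclass
from datetime import datetime

# COPPA's verified parental consent methods (as listed in the build steps).
APPROVED_METHODS = {"credit_card", "signed_form", "knowledge_based", "government_id"}

@dataclass
class ConsentRecord:
    child_id: str
    method: str              # how consent was verified
    verified_at: datetime
    scope: set[str]          # uses explicitly approved, e.g. {"chat"}

def consent_allows(rec: ConsentRecord, use: str) -> bool:
    """A use is permitted only if consent came through an approved
    method AND the parent explicitly approved that specific use.
    Under the new rules, "ai_training" is never implied by "chat"."""
    return rec.method in APPROVED_METHODS and use in rec.scope
```

Scoping consent per use means the question "can we train on this conversation?" is answered by a record lookup, not by anyone's judgment call at pipeline time.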
FAQ
Q: Can my SaaS chat agent serve kids without COPPA? A: Only if you actually do not collect personal information from under-13s — including IP addresses, persistent identifiers, and biometrics. Most chat stacks fail this test.
Q: What about teens 13–17? A: COPPA covers under-13, but the lawsuits and FTC inquiries cover teens too. Treat 13–17 with elevated safety even when not strictly required.
Q: How do I verify parental consent? A: COPPA's verified-consent methods include credit card transaction, signed form, knowledge-based question, or government-ID check. Pick what fits your risk profile.
Q: Can I train my model on kid conversations to improve it? A: Under the new COPPA rules, no — that requires separate parental consent and is treated as a non-service use. See /industries/healthcare for our pediatric configuration.
Sources
- State of Surveillance: COPPA gets first real update in 12 years 2026
- FTC: COPPA policy statement to incentivize age-verification 2026
- Fortune: Google and Character.AI settle teen suicide lawsuits
- TrustArc: AI and children's privacy 2026 regulatory guide
- Securiti: FTC's 2025 COPPA final rule amendments
Try CallSphere AI Voice Agents
See how AI voice agents work for your industry. Live demo available -- no signup required.