Reporting and Moderation
How families can flag unsafe content, what moderators look at, and what actions may happen after a report.
What families should know
When to report something
Families should report behavior that feels unsafe, inappropriate, misleading, or too personal. Reports are not only for major issues; they are also useful when something simply feels off and deserves a closer look.
What moderators review
Moderators look at the reported item, nearby context, and any safety signals connected to it. Review is meant to support safe decisions, not punish children for honest mistakes.
- Message content and repeated behavior patterns.
- Listing honesty and policy violations.
- Signs that someone is trying to move a child off-platform.
Possible outcomes
Depending on what moderators find, content may remain visible, be hidden, be returned for parent review, or result in restrictions on the account involved.
What helps a report move faster
Use this as a quick conversation guide with your child or another caregiver.
- Choose the closest reason for the report.
- Add a short note about what felt unsafe.
- Ask your child to stop replying while the issue is under review.
Related guides
Explore more pages in help & oversight if your family wants to keep going.