Child Safety Standards for tism
Effective Date: March 19, 2026
Last Updated: March 19, 2026
Our Commitment
tism is a community app for parents and caregivers of autistic family members, developed by 0.0.0 LLC (also known as ZeroZeroZero LLC). The safety of children is our highest priority. We maintain zero tolerance for child sexual abuse material (CSAM), child sexual abuse and exploitation (CSAE), and any content that exploits, endangers, or sexualizes minors.
Who Uses tism
tism is designed exclusively for adults. All account holders must be 18 years of age or older. Children do not create accounts, do not interact with the app directly, and cannot communicate with other users. Child profiles are created and managed entirely by a parent or legal guardian.
Safety Measures
Age Verification
- All users must confirm they are 18 or older during onboarding
- The app's target audience is adults only (18+)
- No features are designed for or directed at children
Content Moderation
- Every piece of user-generated content (posts, comments, messages, events) has a report function
- Every user profile has block and report options
- Reported content is reviewed promptly by our moderation team, with child safety reports prioritized above all others
- Content that violates child safety policies is removed immediately
Account Enforcement
- Accounts that post, share, or solicit CSAM or any exploitative content involving minors are permanently banned
- Accounts engaging in grooming behavior or attempting to contact minors are permanently banned
- We cooperate fully with law enforcement investigations
Child Data Protection
- Child profiles contain only information voluntarily provided by the parent (first name or nickname, birth year, interests, strengths)
- No photos of children are collected or stored
- A built-in blur tool is available when posting images, allowing users to obscure faces or sensitive areas before sharing
- Child data is visible only to members of the groups the parent has joined
- Parents have full control to edit or delete their children's information at any time
- Children's data is never shared with third parties, advertisers, or used to train AI models
Community Trust
- Access to tism requires an invite code from an existing member or registration in an open area
- Community members can verify each other through a peer verification system
- Private messaging requires mutual opt-in (both users must accept a connection)
- Connection requests require context ("How do you know this family?")
Reporting
Users can report child safety concerns in multiple ways:
- In-App: Tap the report button available on all content and user profiles, then select the appropriate category; a dedicated option exists for child safety concerns.
- Contact Form: Submit a report at tism.us/contact (select "Child safety concern")
All child safety reports are treated as the highest priority and reviewed immediately.
Reporting to Authorities
tism complies with all applicable child safety laws. When we become aware of CSAM or child exploitation:
- We report to the National Center for Missing & Exploited Children (NCMEC) via the CyberTipline as required by U.S. federal law
- We preserve relevant evidence as required by law
- We cooperate with law enforcement agencies and other relevant authorities
- We permanently ban the offending account
Crisis Resources
tism provides access to crisis resources within the app:
- 988 Suicide & Crisis Lifeline: Call or text 988
- Crisis Text Line: Text HOME to 741741
- Childhelp National Child Abuse Hotline: 1-800-422-4453
- Autism Society of America Helpline: 1-800-328-8476
- NCMEC CyberTipline: report.cybertip.org
Contact
For questions about our child safety standards or to report a concern:
0.0.0 LLC
Contact: tism.us/contact (select "Child safety concern")
Address: 1309 Coffeen Avenue STE 1200, Sheridan, WY 82801