Child Safety and Protection Policy
Last Updated: April 2026
Core Safety Principles
Lystd has zero tolerance for child sexual abuse, exploitation, or any sexual content involving minors. We prohibit:
- Child sexual abuse material (CSAM)
- Grooming behavior
- Exploitation of minors
- Sexualized content involving minors
- Attempts to solicit or contact minors for exploitative purposes
- Impersonation or misrepresentation of age in connection with unsafe conduct
Violations result in immediate content removal and permanent account removal, in addition to any reports we make to authorities as described below.
Reporting Child Safety Concerns
You can report posts, profiles, and messages directly in the app using the built-in report options on content and profiles.
For child safety concerns, you may also contact us by email at our dedicated safety address:
Use this address for urgent child safety, exploitation, or abuse-related reports.
General support questions may be sent to support@quailsofts.com.
If a child is in immediate danger, contact local law enforcement or emergency services right away — do not rely on the app alone.
Law Enforcement Cooperation
Lystd will:
- Remove prohibited content when identified
- Preserve relevant records where required by law or for legitimate investigations
- Cooperate with law enforcement in response to valid legal process and credible safety threats
- Report suspected child sexual abuse material to appropriate authorities, including the National Center for Missing & Exploited Children (NCMEC) when applicable under U.S. law and industry practice
Law enforcement may contact us through the channels listed on this page; we take child safety reports seriously and prioritize urgent safety issues.
18+ Only
Lystd is intended only for users 18 years of age and older. Accounts suspected of belonging to minors may be suspended or removed. We may require age verification when suspicious activity is detected. Attempts to evade age restrictions may result in permanent bans. Age requirements and safety controls exist to reduce underage misuse of the platform.
Moderation and Enforcement
Lystd uses:
- Automated detection tools
- Human moderation review
- User reporting systems
- Account and content enforcement actions
Enforcement may include content removal, warnings, suspension, and permanent bans. Urgent safety issues are prioritized.
User Safety Tools
In the app, users have access to tools including:
- Report — Report posts, profiles, chats, and behavior that violates policy or feels unsafe.
- Block — Block users to stop further contact and limit visibility.
- Privacy controls — Manage what appears on your profile and how you are discovered.
- Content controls — Share only what you are comfortable making visible on the platform.
- Location privacy — Discovery uses approximate location; you control how distance is shown where settings allow.
- Mutual-interest messaging — Messaging is available after mutual interest, supporting safer, intentional conversations.
Meeting in Person
If you meet someone from Lystd offline:
- Choose a public, well-lit place
- Tell someone you trust where you are going
- Arrange your own transportation
- Trust your instincts and leave if something feels wrong
Lystd does not conduct background checks or verify identities; you interact with others at your own risk.
Location & Content
We use approximate location for nearby discovery; we do not display exact GPS coordinates or share your precise location with other users. Share only photos and content you are comfortable making public; automated systems may help detect policy violations.
Messaging & Interactions
Do not harass, pressure, send unwanted sexual content, spam, or attempt to deceive others. If someone makes you uncomfortable, block and report them.
Related Policies
For more detail, see our Community Guidelines, Privacy Policy, Terms of Service, and EULA.