Moderation
How we keep Chismis safe and constructive.
Chismis is built around open conversation and local community discussion. Moderation exists to protect users from harm while allowing people to speak honestly about issues that matter to them.
Our moderation approach combines automated detection, community feedback, and human review to keep conversations safe and constructive.
1. Automated safety systems
Chismis uses automated tools to detect content that may violate our Community Standards before or after it is posted.
These systems help identify:
- Personal information about private individuals
- Harassment or abusive language
- Threats or dangerous content
- Spam or manipulation attempts
- Illegal or harmful content
In some cases, users may receive a warning before posting if a message appears to violate the Community Standards.
2. Community moderation
Chismis relies on the community to help maintain healthy discussions. Users can interact with content through voting and reporting.
The community moderation tools include:
- Upvotes and downvotes — influence how content appears in feeds.
- Content reports — flag posts for review.
- Visibility adjustments — heavily downvoted posts may appear less prominently.
Community feedback helps surface valuable discussions while limiting harmful content.
3. Content review and enforcement
When content is reported or flagged by automated systems, it may be reviewed by human moderators, supported by internal moderation tools. Moderation decisions are based on our Community Standards.
Possible actions include:
- Removing posts or comments
- Limiting visibility of content
- Restricting posting ability
- Temporarily suspending accounts
- Permanently banning accounts for severe violations
4. Account penalties
Repeated violations of the Community Standards may lead to account penalties, including reduced visibility of posts, temporary posting restrictions, or account suspension.
Severe violations — including threats, harassment, or attempts to expose private individuals — may result in immediate account bans.
5. Appeals and review
Moderation decisions are made using a combination of automated systems and human judgment.
If you believe a moderation action was taken in error, you may contact support to request a review. Please include the relevant post information and your account email when submitting an appeal.
6. Transparency and fairness
Our goal is not to restrict opinions or silence debate. Chismis exists to support open conversations about local issues, community experiences, and public topics.
Moderation focuses on preventing harm, protecting user safety, and maintaining respectful discussions.
7. Reporting violations
If you encounter content that violates the Community Standards, please report it using the report feature in the app. Reports help moderators identify and address harmful behavior quickly.