Community Guidelines and Moderation Rules
SynSecurity uses these rules to keep reports, review, reputation, and enforcement consistent while remaining aware of UK law and platform requirements.
1. Purpose and scope
These guidelines explain how SynSecurity handles reports, staff review, sanctions, appeals, and record keeping. The goal is to provide a clear and predictable process so community members understand what happens when a report is submitted and why a decision is made.
They apply to user reports, attached evidence, moderation notes, reputation decisions, and any enforcement action taken through SynSecurity tools or staff workflows. If content may be illegal, involve immediate harm, or require specialist handling, the matter may be escalated to Discord Trust & Safety, the relevant platform owner, or the appropriate authority.
2. What users must not do
Members should not use SynSecurity or any connected community space to do any of the following:
- Harass, threaten, stalk, blackmail, or intimidate other users.
- Post hate speech, discriminatory abuse, or content that targets protected characteristics.
- Share sexual material involving minors, exploitative material, or content that encourages abuse.
- Impersonate staff, commit fraud, run scams, phish for credentials, or attempt account takeover.
- Release personal data without permission, including doxxing, private images, or confidential information.
- Spread malware, malicious links, exploit instructions, or content intended to compromise systems.
- Evade moderation, raid servers, take part in brigading, or automate abuse through throwaway accounts or bots.
- Reuse copyrighted material without permission where it infringes the rights of the creator or owner.
3. How reports should be made
Reports should be accurate, relevant, and limited to the information needed to understand the issue. Users should include the message, the channel, the username or ID of the person involved where possible, and a short description of why the material matters. Screenshots or logs should be unaltered whenever possible.
False, fabricated, or malicious reports may be treated as abuse of the system. Reports should not be used as a tool for retaliation, harassment, or score settling. Staff may disregard duplicate reports or requests that repeat the same issue without new evidence.
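The reporting expectations above can be sketched as a simple record structure with a basic intake check. This is an illustrative assumption, not SynSecurity's actual schema: the field names, the `Report` class, and the `is_actionable` helper are all hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Report:
    """One user report, mirroring the fields section 3 asks for (illustrative)."""
    reporter_id: str
    reported_user: str      # username or ID of the person reported
    channel: str            # where the content appeared
    message_excerpt: str    # the message or content in question
    reason: str             # short description of why the material matters
    evidence: list[str] = field(default_factory=list)  # unaltered screenshots/log refs
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def is_actionable(report: Report, seen_keys: set[tuple[str, str]]) -> bool:
    """Drop reports with no stated reason, and duplicates that add no new evidence."""
    if not report.reason.strip():
        return False
    key = (report.reported_user, report.message_excerpt)
    if key in seen_keys and not report.evidence:
        return False  # repeat of the same issue without new evidence may be disregarded
    seen_keys.add(key)
    return True
```

A second report about the same message is only kept if it attaches new evidence, matching the duplicate-report rule above.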
4. Moderation process
- The report is logged and assigned a reference so staff can track the case consistently.
- A moderator reviews the evidence, checks context, and decides whether further information is needed.
- The matter is assessed against the published rules, community standards, and any relevant UK law.
- If action is required, staff may warn, restrict, suspend, remove content, or apply another proportionate sanction.
- The outcome is recorded so the decision can be reviewed later if there is an appeal or a follow-up issue.
Where a report raises a serious safety concern, staff may preserve the evidence and seek urgent external assistance rather than handling it only through routine moderation tools.
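The workflow above can be sketched as a minimal triage function. Everything here is an assumption for illustration: the `Case` fields, the severity scale, and the mapping from severity to outcome are hypothetical, not SynSecurity's actual tooling.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Outcome(Enum):
    NO_ACTION = auto()
    WARN = auto()
    SUSPEND = auto()
    ESCALATE = auto()

@dataclass
class Case:
    reference: str      # tracking reference assigned when the report is logged
    severity: int       # illustrative scale: 0 = none ... 3 = serious safety concern
    notes: list[str]    # moderation notes recorded for later review or appeal

def triage(case: Case) -> Outcome:
    """Map the review steps in section 4 onto a proportionate outcome."""
    case.notes.append(f"logged case {case.reference}")
    if case.severity >= 3:
        # serious safety concern: preserve evidence and seek external help
        case.notes.append("evidence preserved; escalated externally")
        return Outcome.ESCALATE
    if case.severity == 2:
        case.notes.append("content removed and user suspended")
        return Outcome.SUSPEND
    if case.severity == 1:
        case.notes.append("warning issued")
        return Outcome.WARN
    case.notes.append("no breach found")
    return Outcome.NO_ACTION
```

Recording a note at every step mirrors the requirement that outcomes stay reviewable if there is an appeal or a follow-up issue.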
5. Appeals, evidence, and retention
Users may request a review of a decision if they believe an error was made or important context was missed. Appeals should be respectful, specific, and supported by the information needed to reassess the case. A good appeal explains what happened, why the original outcome may be wrong, and what additional evidence should be considered.
SynSecurity may retain reports, decision logs, and evidence for as long as needed to maintain community safety, respond to appeals, defend against abuse, and meet legal obligations. If a record is no longer needed, it should be deleted or anonymised where the system design allows it.
6. UK legal context
The following laws and legal principles may be relevant when enforcing these rules. They are included as a practical reference and are not a substitute for legal advice or a full review of the facts of a case.
- Online Safety Act 2023: helps frame duties around harmful and illegal content, platform safety, and escalation.
- Equality Act 2010: supports non-discriminatory moderation and response to harassment linked to protected characteristics.
- Protection from Harassment Act 1997: relevant where conduct becomes repeated, targeted, or distressing harassment or stalking.
- Malicious Communications Act 1988 and Communications Act 2003: can apply to threatening, grossly offensive, or persistently abusive messages sent online.
- Computer Misuse Act 1990 and Fraud Act 2006: cover account abuse, unauthorised access, phishing, scams, and other forms of digital wrongdoing.
- Copyright, Designs and Patents Act 1988: matters where stolen media, leaked content, or repeated infringement is reported.
7. Updates and interpretation
SynSecurity may update these rules to reflect changes in the law, platform policy, or community practice. The latest version should always be published with a clear date so members know which rules are current.
In the event of uncertainty, staff should interpret the guidelines conservatively, focus on safety and fairness, and document the reasoning behind material decisions. When a case sits at the boundary between community policy and law, the safer course is to preserve evidence and escalate appropriately.