1. We promise fairness
We will treat moderation decisions as a serious responsibility. Reports should be reviewed by a human whenever a meaningful enforcement outcome is being considered, and staff should not rely on guesswork or personal bias when evidence is available.
Where a case involves context, dispute, or multiple sides, we will try to consider the whole picture rather than making a snap decision from a single message. We will also keep the door open for appeals where that is safe and appropriate.
2. We promise transparency
We will explain how reports are handled, what evidence is used, and why a decision was made whenever the situation allows it. Our aim is to make the workflow understandable instead of hidden behind vague moderation language.
SynSecurity will keep a record of important outcomes so communities can review what happened later, audit the system, and understand patterns over time. If a policy changes, we will publish the update with a clear date and avoid silently changing the rules that people depend on.
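The record-keeping promise above could be sketched as an append-only log of outcomes. This is an illustrative sketch only; the field names and schema are assumptions, not SynSecurity's actual system.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class AuditEntry:
    """One moderation outcome, kept so communities can review it later.
    All field names here are illustrative, not a real SynSecurity schema."""
    case_id: str
    outcome: str          # e.g. "warning", "mute", "no_action"
    reviewed_by_human: bool
    recorded_at: str      # ISO 8601 timestamp, UTC, so changes carry a clear date

def record_outcome(log: list, case_id: str, outcome: str, human: bool) -> None:
    entry = AuditEntry(
        case_id=case_id,
        outcome=outcome,
        reviewed_by_human=human,
        recorded_at=datetime.now(timezone.utc).isoformat(),
    )
    # Append-only: past entries are never edited, so patterns stay auditable.
    log.append(json.dumps(asdict(entry)))
```

The point of the timestamped, append-only shape is that later readers can see what was decided and when, rather than reconstructing it from memory.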
3. We promise privacy by design
We will collect only the information needed to run the service, review reports, and protect communities. We will not use personal data for unrelated advertising, and we will not sell it to third parties.
- Minimisation: we avoid collecting extra data when a smaller record is enough to resolve the case.
- Retention control: we keep records only for as long as they are needed for safety, audit, or legal reasons.
- Security: access to moderation records should be restricted to people who actually need it.
- UK GDPR mindset: we aim to follow the principles of lawfulness, fairness, transparency, accuracy, and accountability.
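Retention control of the kind described above can be made mechanical: each record carries a purpose, and records past their purpose-specific window are purged unless under a legal hold. The retention periods and field names below are purely illustrative assumptions, not SynSecurity's actual policy values.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention windows; real values would be set by policy review.
RETENTION = {
    "safety": timedelta(days=90),
    "audit": timedelta(days=365),
}

def is_expired(record: dict, now: datetime) -> bool:
    """A record expires once its purpose-specific window has passed,
    unless it is under a legal hold (a legal reason to keep it)."""
    if record.get("legal_hold"):
        return False
    window = RETENTION[record["purpose"]]
    return now - record["created_at"] > window

def purge(records: list, now: datetime) -> list:
    """Keep only records that still have a live safety, audit, or legal reason."""
    return [r for r in records if not is_expired(r, now)]
```

Making the window depend on the stated purpose, rather than keeping everything forever, is what turns "retention control" from a promise into a checkable rule.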
4. We promise accessibility and inclusion
We will try to keep our language plain, our pages readable, and our moderation process accessible to the widest possible audience. If someone needs a different format or extra support to understand a decision, we should try to help where we reasonably can.
We will not knowingly ignore harassment, discrimination, or abuse aimed at protected characteristics. Our approach is intended to fit alongside the Equality Act 2010 and the general expectation that communities should not be policed in a way that is unfair or discriminatory.
5. We promise responsible automation
Automation can help with consistency, but it should not replace judgment when the outcome is serious. We will use automation to support moderation, not to hide it. Where a decision has a significant effect, there should be an understandable path for human review.
We will not use automation to make communities feel like they are being processed by a black box. If a threshold, reputation score, or queue rule is in use, it should be explainable and documented so staff can justify the result if challenged later.
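A documented, explainable queue rule of the kind described above might look like the sketch below: the rule returns its reasoning alongside the action, so staff can justify the result if challenged. The threshold value and names are hypothetical and do not come from SynSecurity.

```python
# Hypothetical documented rule: escalate to human review once a post has
# accumulated this many distinct reports. The value 3 is an assumption
# for illustration only.
ESCALATE_AT = 3

def triage(report_count: int) -> tuple:
    """Return (action, explanation) so the decision is never a black box."""
    if report_count >= ESCALATE_AT:
        return ("escalate_to_human",
                f"{report_count} reports met the documented threshold of {ESCALATE_AT}")
    return ("hold_in_queue",
            f"{report_count} reports below the documented threshold of {ESCALATE_AT}")
```

Note that the automation here only routes the case; the serious outcome is still left to the human reviewer, which matches the human-review path promised above.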
6. We promise secure handling
We will use reasonable technical and organisational measures to protect reports, logs, and account information from unauthorised access, disclosure, alteration, or loss. If a security problem is discovered, we will investigate it promptly and limit the damage as far as we can.
We will also track the minimum standards expected under UK data protection law. That means thinking about who can see records, how long they stay available, and whether the service is collecting more than it should.
7. We promise to act lawfully
If content appears to involve illegal activity, serious harassment, exploitation, fraud, unauthorised access, or other harmful conduct, we will treat it seriously and escalate where appropriate. In the UK context, that may mean considering laws such as the Online Safety Act 2023, the Protection from Harassment Act 1997, the Malicious Communications Act 1988, the Communications Act 2003, the Computer Misuse Act 1990, the Fraud Act 2006, and the Copyright, Designs and Patents Act 1988.
This page does not list every possible law, and it does not replace legal advice. It is a statement that SynSecurity will not knowingly operate a moderation system that ignores relevant UK legal duties or encourages harmful conduct.
8. What we will not do
- We will not sell personal data or use it for unrelated advertising.
- We will not hide the fact that records are being kept when records are required for safety or audit.
- We will not let staff decisions rely on a single rushed message where more context is available.
- We will not knowingly tolerate harassment, discrimination, scams, or account abuse.
- We will not refuse a reasonable appeal process simply because a decision has already been made.
9. How we keep this promise current
We will review these commitments when the service changes, when the law changes, or when community feedback shows that a promise needs to be sharpened. If we cannot keep a promise in practice, we should change the page or change the service rather than leaving the wording misleading.
SynSecurity exists to make community moderation more understandable and more trustworthy. That only works if the process remains open, proportionate, and honest about how decisions are made.