Content moderation process
content-moderation-process
Domain: content-moderation
Type: mixed
Description
Content moderation is now a regulated function rather than a voluntary product choice: the DSA, the UK Online Safety Act, Singapore's Online Criminal Harms Act, India's IT Rules 2021, and Australia's Online Safety Act 2021 have each turned what platforms used to do as a matter of trust-and-safety practice into a documented obligation with audit hooks.

A working moderation program has five pieces:
- a published content policy that defines prohibited content with enough specificity that a user can predict whether their post will be actioned;
- an intake path for user reports (the complaint-handling system above);
- a graduated enforcement ladder that scales response to severity (warning, content removal, account restriction, account termination);
- a decision log that retains both the action and the reasoning for the regulatory record;
- an internal complaint-review route for users who want to contest a decision.

Moderation accuracy is judged on two axes simultaneously: false negatives (illegal content that stayed up) draw enforcement under DSA Article 16 and OSA duties of care, and false positives (lawful content wrongly removed) draw enforcement under DSA Article 17 and the OSA's freedom-of-expression duties. Optimizing for one without measuring the other is the recurring failure mode.
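The enforcement ladder, decision log, and two-axis accuracy measurement described above can be sketched in code. This is a minimal illustration, not a reference implementation: the names (`Action`, `Decision`, `accuracy_axes`) and the QA-sample shape are assumptions invented for this sketch, and a real system would persist the log and track rates over much larger review samples.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

# Graduated enforcement ladder: response scales with severity.
class Action(Enum):
    NO_ACTION = 0
    WARNING = 1
    CONTENT_REMOVAL = 2
    ACCOUNT_RESTRICTION = 3
    ACCOUNT_TERMINATION = 4

@dataclass
class Decision:
    """One decision-log entry: the action taken plus the reasoning,
    retained for the regulatory record (hypothetical schema)."""
    content_id: str
    action: Action
    policy_clause: str   # which published-policy rule was applied
    reasoning: str
    decided_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def accuracy_axes(reviewed):
    """Measure both error axes from a QA sample of
    (decision, ground_truth) pairs, where ground_truth is True
    if the content actually violated policy.
    False negatives: violating content left up.
    False positives: lawful content actioned."""
    fn = sum(1 for d, violating in reviewed
             if violating and d.action is Action.NO_ACTION)
    fp = sum(1 for d, violating in reviewed
             if not violating and d.action is not Action.NO_ACTION)
    return {"false_negatives": fn, "false_positives": fp}
```

Reporting both counts from the same review sample is what keeps the program from the failure mode the description names: a team optimizing only removal recall would see its false-positive count climb here, and vice versa.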
Required by (8 regulations)
- Marco Civil
Lei nº 12.965, de 23 de abril de 2014 (Marco Civil da Internet), regulated by Decreto nº 8.771, de 11 de maio de 2016
- CSL
Cybersecurity Law of the People's Republic of China (adopted November 7, 2016, effective June 1, 2017)
- DSA
Articles 14-23 — content moderation, reporting, transparency.
Regulation (EU) 2022/2065 of the European Parliament and of the Council (Digital Services Act)
- IT Rules 2021
Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, issued under the Information Technology Act, 2000 (Act No. 21 of 2000), as amended by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023
- NetzDG
NetzDG German notice-and-action timing.
Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz – NetzDG), BGBl. I 2017, S. 3352, as amended
- Anti-Cyber Crime Law
Royal Decree M/17, Anti-Cyber Crime Law, issued 8/3/1428 AH (March 26, 2007)
- Section 230
US safe-harbor preconditions.
47 U.S.C. § 230
- UK OSA
Online Safety Act duties of care.
Online Safety Act 2023 (c.50)
Fulfilled by (5)
- tremau · partial · medium effort · $$
- hive · partial · low effort · $$
- In-house build · high effort
- webpurify · partial · low effort · $$
Image / text / video moderation as a managed service.
- crisp · partial · low effort · $$
AI-driven content moderation with human escalation.
ClearLaunch does not accept payment from vendors.
Evidence formats
- community guidelines
- moderation playbook
- transparency report
- appeals log