ClearLaunch

Non-consensual intimate imagery (NCII) takedown procedure

ncii-takedown-procedure · Domain: content-moderation · Type: process

Description

The federal TAKE IT DOWN Act (2025) is the first US statute to impose a fixed takedown deadline for non-consensual intimate imagery (NCII), including AI-generated synthetic imagery, on platforms that host user-generated content. The substantive rule: an accessible reporting mechanism that any user can invoke without an account, validation of the report, and removal of the content within 48 hours of a valid request. The statutory definition of NCII covers photographic and AI-synthesized imagery alike (the deepfake provision closed the gap the most prominent state laws had left open), and the reporting surface has to be reachable from the platform's main interface rather than buried in a help center.

Adjacent obligations attach when the imagery involves a minor: 18 U.S.C. § 2258A requires independent reporting to the National Center for Missing and Exploited Children's CyberTipline, on a faster clock, with chain-of-custody preservation of the reported content for law enforcement. The operational system has to handle both pathways without conflating them, because the minor pathway carries criminal-evidence preservation duties that the adult pathway does not.

Recordkeeping of the report, the validation steps, the takedown action, and the timeline is what the statute uses to evaluate compliance after the fact, and it is what defends against the private right of action the statute creates for affected individuals.
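The dual-pathway intake described above can be sketched in code. This is a minimal illustration, not a statutory interpretation; the class and function names (`NCIIReport`, `process_valid_report`) and the log format are assumptions, not any vendor's API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class NCIIReport:
    """One validated NCII report. Field names are illustrative."""
    report_id: str
    received_at: datetime
    involves_minor: bool
    audit_log: list[str] = field(default_factory=list)

def process_valid_report(report: NCIIReport) -> datetime:
    """Record the statutory deadline and route to the correct pathway."""
    # TAKE IT DOWN Act clock: removal within 48 hours of a valid request.
    deadline = report.received_at + timedelta(hours=48)
    report.audit_log.append(f"validated; removal due {deadline.isoformat()}")
    if report.involves_minor:
        # 18 U.S.C. 2258A pathway: separate CyberTipline report plus
        # chain-of-custody preservation of the content for law enforcement.
        report.audit_log.append("escalated: CyberTipline report; content preserved")
    else:
        # Adult pathway: takedown only, no criminal-evidence preservation duty.
        report.audit_log.append("adult pathway: takedown only")
    return deadline
```

Keeping the two pathways in separate branches, each writing to the same audit log, mirrors the point above: one intake, two non-conflated obligations, one timeline record.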

Applicability

Applies when: business model role is intermediary or mixed AND features include ugc, social-features, ephemeral-content, synthetic-media, live-streaming, or file-sharing.

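The applicability rule reduces to a simple boolean check over two fields. A minimal sketch, assuming a role string and a feature set; the names here are illustrative, not ClearLaunch's actual schema:

```python
# Feature values copied from the applicability rule above.
TRIGGER_FEATURES = {
    "ugc", "social-features", "ephemeral-content",
    "synthetic-media", "live-streaming", "file-sharing",
}

def ncii_control_applies(business_role: str, features: set[str]) -> bool:
    """Role is intermediary or mixed AND at least one trigger feature is present."""
    return business_role in {"intermediary", "mixed"} and bool(features & TRIGGER_FEATURES)
```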

Required by (3 regulations)

  • Marco Civil

    Lei nº 12.965, de 23 de abril de 2014 (Marco Civil da Internet), regulated by Decreto nº 8.771, de 11 de maio de 2016

  • IT Rules 2021

    Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, issued under the Information Technology Act, 2000 (Act No. 21 of 2000), as amended by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023

  • TAKE IT DOWN

    TAKE IT DOWN Act §3-§4: 48-hour takedown of NCII (including AI-generated synthetic intimate imagery) on receipt of a valid request, accessible reporting mechanism, enhanced minor-involving protections, recordkeeping.

    Pub. L. No. 119-xxx (TAKE IT DOWN Act of 2025)

Fulfilled by (6)

  • stop-ncii · partial · low effort · $
    StopNCII.org (operated by SWGfL): adult-NCII hash-matching service that platforms can integrate to detect known intimate imagery at upload time.
  • take-it-down-ncmec · partial · low effort · $
    Take It Down (NCMEC): hash-matching service for NCII depicting persons who were under 18 when the imagery was created.
  • photodna · partial · low effort · $
    Microsoft PhotoDNA: perceptual hashing that matches known CSAM and can extend to NCII corpora; widely deployed across UGC platforms.
  • hive · partial · medium effort · $$
    Hive AI moderation classifiers cover NCII / CSAM / synthetic-imagery detection at scale.
  • thorn-safer · partial · medium effort · $$
    Thorn's Safer platform: CSAM detection + content moderation tooling that supports NCII workflows.
  • In-house build · high effort
    An in-house implementation needs a reporting form, ticketing, a 48-hour SLA tracker, minor-involved escalation, and recordkeeping; most operators integrate at least one hashing service as well.
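To show where a hash-matching integration sits in the upload path, here is a deliberately simplified sketch. Real services such as StopNCII and PhotoDNA use perceptual hashes that survive resizing and re-encoding; the exact SHA-256 below stands in only for shape, and the function name is an assumption.

```python
import hashlib

def matches_known_ncii(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Check an upload against a corpus of known-NCII hashes.

    Illustrative only: production systems use robust perceptual hashing
    (e.g. PhotoDNA-style), not an exact cryptographic digest.
    """
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes
```

A platform would call this at upload time and route a match into the same takedown/escalation pipeline that user reports feed.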

ClearLaunch does not accept payment from vendors (see Methodology).

Evidence formats

  • user-facing NCII reporting mechanism (accessible from content surfaces + help center)
  • takedown request log with intake → action timestamps
  • 48-hour SLA compliance audit
  • minor-involved content escalation log + CyberTipline report receipts
  • synthetic-imagery (deepfake) detection + handling SOP
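The 48-hour SLA compliance audit listed above can be run directly against a takedown request log with intake and action timestamps. A minimal sketch; the function name and log shape are assumptions:

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=48)  # TAKE IT DOWN Act removal deadline

def sla_breaches(takedown_log: list[tuple[str, datetime, datetime]]) -> list[str]:
    """Given (request_id, intake_ts, action_ts) rows, return IDs that missed the SLA."""
    return [rid for rid, intake_ts, action_ts in takedown_log
            if action_ts - intake_ts > SLA]
```

Running this periodically over the log produces exactly the after-the-fact evidence the statute evaluates: which requests were actioned in time, and which were not.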

ClearLaunch provides legal information based on publicly available regulatory sources. It does not constitute legal advice and does not create an attorney-client relationship. Consult a licensed attorney in your jurisdiction before making compliance decisions.


Built by Neel Patel, in-house game counsel. Games touch more compliance domains at once than anything else in tech. That's what ClearLaunch was designed around.

Operated by a Washington-licensed attorney. Not licensed in California or other US states. Data reviewed through March 2026.

© 2026 ClearLaunch · Terms · Privacy