
The TAKE IT DOWN Act: What Every Creator Needs to Know Before May 2026

Complete breakdown of the TAKE IT DOWN Act for content creators -- how it differs from DMCA, the 48-hour takedown rule, AI deepfake coverage, and what to do right now.

On May 19, 2025, President Trump signed the TAKE IT DOWN Act into federal law. It passed the Senate by unanimous consent and the House 409-2. For once, Washington actually agreed on something.

And for content creators, it could change how you fight leaked content -- in a big way.

Here is what the law actually does, how it is different from DMCA, what the May 2026 deadline means, and what you should be doing about it right now.

What the TAKE IT DOWN Act Actually Is

The name is an acronym that stands for "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks." Yeah, they worked hard on that one.

In plain English: it is now a federal crime to share someone's intimate images or videos without their consent. That includes real content and AI-generated deepfakes. And it requires online platforms to take that content down within 48 hours of being notified.

Before this law, there was no federal statute specifically criminalizing non-consensual intimate imagery. All 50 states and DC now have their own revenge porn laws (South Carolina was the last to pass one, in May 2025), but they vary wildly -- different definitions, different penalties, inconsistent enforcement. If your content leaked across state lines or onto platforms based in different states, you were navigating a patchwork of rules that did not always help.

Now there is one federal standard that covers the entire country.

How It Is Different from DMCA

If you have dealt with content leaks before, you probably know about DMCA takedowns. They have been the go-to tool for creators. But the TAKE IT DOWN Act works differently in some important ways.

DMCA protects copyrighted works. The TAKE IT DOWN Act protects people.

With DMCA, you are filing because someone stole your copyrighted content. You need to prove you own it. You need to follow specific legal formatting. And critically -- you have to put your real name, address, and phone number on every takedown notice. That information gets sent to the person hosting your stolen content.

The TAKE IT DOWN Act does not require that. It is not about copyright ownership. It is about consent. If intimate images of you are published without your permission, you can request removal regardless of who technically took the photo or recorded the video. That distinction matters -- especially for creators whose content was filmed by someone else, or whose likeness was used to create AI deepfakes.

Here is a side-by-side comparison:

|                             | DMCA                                               | TAKE IT DOWN Act                                        |
|-----------------------------|----------------------------------------------------|---------------------------------------------------------|
| What it protects            | Copyright (your intellectual property)             | Privacy/consent (your intimate images)                  |
| Who can file                | Copyright holder only                              | The depicted person (or their representative)           |
| Legal basis                 | You own the copyright to the content               | The content depicts you and was shared without consent  |
| Removal timeline            | "Expeditious" (no fixed deadline)                  | 48 hours (strict deadline)                              |
| Covers AI deepfakes         | Not well -- deepfakes may not be copyrightable     | Yes -- explicitly covers AI-generated imagery           |
| Counter-notice              | Yes -- the accused can dispute and restore content | None -- no mechanism to contest removal                 |
| False claim penalties       | Yes -- perjury charges                             | None -- no penalties for false claims                   |
| Criminal penalties          | No (civil law)                                     | Yes -- up to 2-3 years imprisonment                     |
| Can you sue platforms       | Yes (you can take them to court)                   | No (must go through FTC enforcement)                    |
| Covers non-intimate content | Yes -- all copyrighted works                       | No -- intimate imagery only                             |

That 48-hour timeline alone is a major shift. Under DMCA, "expeditiously" is open to interpretation and platforms routinely take weeks. Under this law, 48 hours means 48 hours.

What Gets Covered

The law covers three categories:

1. Real Intimate Images Shared Without Consent

If someone shares your intimate photos or videos without your permission -- whether it is a subscriber who leaked your OnlyFans content, an ex who posted revenge content, or a hacker who distributed stolen files -- this law makes it a federal crime. The person who shared it faces up to 2 years in prison (3 years if the victim is a minor).

The law specifically states that consenting to the creation of intimate content does not equal consenting to its publication. This is directly relevant for creators: the fact that you created content for a subscription platform does not mean anyone has the right to share it elsewhere. Your paid content behind a paywall is still protected.

2. AI-Generated Deepfakes

This is where the law breaks genuinely new ground. If someone creates a realistic AI-generated intimate image or video using your likeness -- without your consent -- that is now a federal crime too.

Before this law, deepfake victims had very limited legal recourse. Only about 30 states had laws specifically addressing sexual deepfakes, and enforcement was inconsistent. Now it is federal.

For creators, this is increasingly relevant. AI tools are getting better and cheaper. Deepfake intimate content is a growing problem -- and until now, there was not a clear legal tool to fight it at the federal level.

3. Threats to Publish

This one gets overlooked but it matters: even threatening to share someone's intimate content is now a federal crime when done for the purpose of intimidation, coercion, extortion, or to cause mental distress. If someone is holding your content hostage -- "pay me or I'll leak this" or "I'll post your photos if you don't do X" -- that threat alone carries up to 18 months in prison (30 months if the victim is a minor).

Before this law, you had to wait until someone actually published your content before you had a clear federal path to act. Now the threat itself is a criminal offense. If you are being blackmailed or extorted with your content, document the threats and contact law enforcement. You have federal backing now.

The 48-Hour Takedown Requirement

This is the part that will affect your day-to-day experience the most.

Starting May 19, 2026, every "covered platform" -- meaning any website or app that hosts user-generated content -- must have a system in place to:

  1. Accept removal requests from people (or their representatives) who appear in non-consensual intimate content
  2. Remove the content within 48 hours of receiving a valid request
  3. Make reasonable efforts to find and remove identical copies of the same content on their platform
  4. Display a clear, conspicuous explanation of how to submit a removal request

A valid request is simpler than a DMCA notice. You need:

  • Your signature (physical or electronic)
  • Enough information to help the platform find the content
  • A good-faith statement that the content was shared without your consent
  • Your contact information

That is it. No perjury statement. No proof of copyright ownership. No legal name and home address exposed to the infringer.
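If you want to keep your requests consistent while platforms build their forms, the four required elements can be assembled into a simple template. The sketch below is hypothetical wording of my own, not language from the statute, and it is not legal advice -- adapt it to each platform's own submission process once one exists.

```python
from datetime import date


def build_removal_request(name_or_pseudonym, contact_email, content_urls, signature):
    """Assemble a hypothetical NCII removal request containing the four
    elements the TAKE IT DOWN Act requires: a signature, information
    sufficient to locate the content, a good-faith statement of
    non-consent, and contact information."""
    urls = "\n".join(f"  - {u}" for u in content_urls)
    return (
        "Removal request under the TAKE IT DOWN Act\n"
        f"Date: {date.today().isoformat()}\n\n"
        f"Content to be removed (URLs):\n{urls}\n\n"
        "I state in good faith that the intimate visual depictions at the "
        "URLs above depict me and were published without my consent.\n\n"
        f"Contact: {contact_email}\n"
        f"Signature: /s/ {signature or name_or_pseudonym}\n"
    )


request = build_removal_request(
    name_or_pseudonym="J. Doe",
    contact_email="takedowns@example.com",
    content_urls=["https://example.com/leaked/123"],
    signature="J. Doe",
)
print(request)
```

Note that you can sign with a pseudonym and a dedicated email address -- nothing here requires your legal name or home address, which is the whole point.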

Which Platforms Are Covered

The law defines "covered platforms" broadly. Any public website, online service, or mobile app that primarily provides a forum for user-generated content falls under this law. That includes:

  • Social media platforms (Twitter/X, Instagram, TikTok, Reddit)
  • Adult content sites (tube sites, forums, aggregators)
  • Messaging platforms with public channels (Telegram, Discord)
  • File-sharing services
  • Any site that hosts or curates non-consensual intimate content

Exceptions: email providers, internet service providers, and platforms that only publish their own editorially curated content (not user-generated).

For creators, this is a big deal. Sites that currently ignore your DMCA notices may be forced to comply with takedown requests under this law. The enforcement mechanism is the FTC -- the Federal Trade Commission -- which can treat non-compliance as an unfair or deceptive trade practice. That carries real regulatory teeth.

One legal detail worth knowing: the law is structured so that CDA Section 230 does not block enforcement. Section 230 is the law that usually shields platforms from liability for what their users post. The TAKE IT DOWN Act has a specific carve-out -- platforms cannot hide behind Section 230 to avoid complying with the 48-hour takedown requirement. That is a meaningful legal shift.

Also worth noting: platforms get safe harbor protection for good-faith removals. If a platform removes content based on a valid request and it later turns out the request was wrong, the platform is protected from being sued by the person whose content was taken down. This is actually good for creators -- it means platforms have less reason to hesitate when they receive your removal request.

Will every offshore piracy site suddenly start complying? Probably not. But this gives platforms less room to hide behind vague "we'll get to it" responses, and it gives enforcement agencies a clearer mandate to act.

What This Means for Creators Right Now

The Good

You have a powerful new legal tool. For content that falls under "non-consensual intimate imagery" -- which includes most leaked OnlyFans/Fansly content -- you now have a faster, simpler removal path than DMCA alone.

The 48-hour deadline is real. Once platforms implement their takedown systems (by May 2026), you should see much faster response times on removal requests.

Deepfakes are covered. If someone uses your face to create AI-generated intimate content, you have a federal law backing your removal request.

Your privacy is better protected. The notice requirements are less invasive than DMCA. You do not need to expose your real address to file.

Platforms must actively look for copies. It is not enough to remove the one link you reported. They need to make reasonable efforts to find and remove identical copies.

The Honest Caveats

The law is not a magic fix for everything. A few things to keep in mind:

It only applies to intimate imagery. If someone leaks your non-intimate content (SFW photos, written posts, audio files), this law does not cover it. You still need DMCA for those.

Enforcement on offshore sites will be limited. Sites operating outside U.S. jurisdiction will be harder to reach, just like with DMCA. The FTC can act against U.S.-based platforms, but piracy forums hosted overseas are a different story.

The May 2026 deadline has not hit yet. Platforms have until May 19, 2026 to set up their systems. Until then, the criminal provisions are active (the person who shared your content can be prosecuted), but the 48-hour takedown requirement is not yet enforceable.

No penalty for false claims (yet). Unlike DMCA, the law does not include consequences for bad-faith takedown requests. Some legal experts and advocacy groups have raised concerns about potential abuse -- someone filing false claims to remove legitimate content. This may be addressed in future amendments.

You cannot sue platforms directly under this law. Under DMCA, you can take a platform to court yourself. The TAKE IT DOWN Act does not give individuals that option -- enforcement goes through the FTC. That means if a platform ignores your request, you report it to the FTC rather than filing a lawsuit. (You can still sue under DMCA for copyright, and many state laws offer civil remedies. But this specific law relies on government enforcement.)

When to Use DMCA vs. the TAKE IT DOWN Act

This is the practical part. Most OnlyFans and Fansly creators can actually use both -- and that is the strongest position to be in. Here is how to decide.

Use DMCA when:

  • You created the content yourself (you are the photographer or videographer)
  • The leaked content is not intimate -- someone reposted your Instagram photos, your SFW content, or written posts
  • You want the ability to sue in court for damages
  • The platform responds well to DMCA but has not set up TAKE IT DOWN Act systems yet

Use the TAKE IT DOWN Act when:

  • Someone else created the content (a subscriber screenshotted your live stream, a fan recorded a video call)
  • The content is an AI deepfake of you -- you cannot claim copyright over something someone else's AI generated
  • You want faster removal (48-hour hard deadline vs. vague "expeditious")
  • The platform has been ignoring your DMCA requests
  • You want criminal prosecution of the person who shared your content
  • You do not want your real name and address on the takedown notice

Use both when:

  • You created intimate content that was shared without consent (covers most OnlyFans/Fansly leaks)
  • You want maximum legal pressure on the platform
  • One approach is not getting a response, so you try the other

The key insight: if you are a creator who made intimate content that was leaked, you are both the copyright holder (DMCA applies) and the depicted individual who did not consent to publication (TAKE IT DOWN Act applies). Two legal weapons are better than one.
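The decision lists above boil down to two questions: did you create the content, and is it intimate imagery (real or deepfaked) of you? A rough sketch of that logic -- a simplification of the lists above, not legal advice:

```python
def choose_removal_paths(you_created_it: bool, is_intimate: bool, is_deepfake: bool) -> list:
    """Rough decision helper mirroring the lists above: DMCA requires
    copyright ownership; the TAKE IT DOWN Act requires non-consensual
    intimate imagery (real or AI-generated) depicting you."""
    paths = []
    if you_created_it and not is_deepfake:  # you hold copyright in what you filmed
        paths.append("DMCA")
    if is_intimate or is_deepfake:          # NCII, including AI deepfakes
        paths.append("TAKE IT DOWN Act")
    return paths


# Leaked OnlyFans content you filmed yourself: both laws apply
print(choose_removal_paths(True, True, False))
```

For the common case -- intimate content you created that was leaked -- the helper returns both paths, which matches the "two legal weapons" point above.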

And when you add state-level revenge porn and NCII laws on top of that, you actually have three layers of legal protection: federal copyright law (DMCA), federal consent law (TAKE IT DOWN Act), and state criminal and civil laws. Creators have never had this much legal backing before.

What You Should Do Right Now

You do not need to wait until May 2026 to benefit from this law. Here is what to do now:

1. Know Your Rights Under Both Laws

You now have two tools: DMCA (for copyright) and the TAKE IT DOWN Act (for non-consensual intimate images). Many leaked content situations qualify under both. Use whichever gives you the better path -- or use both.

For OnlyFans/Fansly content that is leaked without consent, the TAKE IT DOWN Act may be your stronger option once platforms have their systems in place, because it does not require exposing your personal information and comes with a hard 48-hour deadline.

2. Document Everything

Same advice as always, but even more important now. Screenshot the leaked content with URLs visible. Save dates and times. Keep records. The TAKE IT DOWN Act requires you to provide "information sufficient to enable the covered platform to locate" the content. Good documentation makes your requests airtight.
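One lightweight way to keep that record is an append-only log with the URL, a UTC timestamp, and the path to your screenshot. A minimal sketch -- the filename and fields here are my own choices, not anything the law prescribes:

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("leak_evidence_log.csv")  # hypothetical filename


def log_leak(url: str, screenshot_path: str, notes: str = "") -> None:
    """Append one evidence entry with a UTC timestamp, so every sighting
    of leaked content is recorded with when and where you found it."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["recorded_at_utc", "url", "screenshot", "notes"])
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),
            url,
            screenshot_path,
            notes,
        ])


log_leak("https://example.com/forum/post/987", "shots/post987.png", "full thread visible")
```

A spreadsheet or notes app works just as well -- what matters is capturing the URL, the date, and a screenshot at the time you found it, before the content moves or disappears.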

3. Start Using Both Removal Paths

When you file removal requests, reference the TAKE IT DOWN Act alongside DMCA. Even before the May 2026 compliance deadline, platforms are aware the law exists and many are beginning to set up their systems early. Mentioning the law by name in your requests signals that you know your rights.

4. Watch for Platform Updates

Between now and May 2026, platforms will be rolling out their NCII takedown systems. Keep an eye on the platforms where leaked content is most common -- Telegram, Reddit, Discord, tube sites -- for new reporting tools and removal request forms.

5. Understand What Qualifies

The law covers "intimate visual depictions" -- which includes nudity and sexual content. If your leaked content is intimate in nature (as most leaked OnlyFans/Fansly content is), you are covered. Make sure your removal requests clearly state that the content is non-consensual intimate imagery, not just copyrighted material.

The Bigger Picture

The TAKE IT DOWN Act is not perfect. Legal experts have raised legitimate concerns about the lack of safeguards against false claims and the potential for abuse. Those are real issues that will likely need to be addressed.

But for creators who have been fighting content theft with DMCA as their only federal tool, this is a meaningful step forward. It acknowledges something that should have been obvious a long time ago: sharing someone's intimate content without their consent is not just a copyright issue. It is a violation of their dignity and autonomy. And it should be treated as one.

The 48-hour takedown requirement, the inclusion of AI deepfakes, the simpler notice process, the better privacy protections -- these are practical improvements that will make a real difference in how creators protect their work.

Get Ahead of It

The May 19, 2026 compliance deadline is coming. Between now and then, the best thing you can do is know where your content stands.

Run a free scan at removeonlyleaks.com/freescan -- no credit card, no commitment. See where your content appears across 75M+ sites including Telegram, tube sites, forums, scraper sites, and search engines. Whether you plan to use DMCA, the TAKE IT DOWN Act, or both -- knowing the scope of the problem is always step one.

RemoveOnlyLeaks already files takedowns using every available legal tool, and we will incorporate the TAKE IT DOWN Act's notice-and-removal framework as platforms implement their systems. Verified proof of every removal. Flat pricing. Your identity stays private.

The law is on your side. Make sure you use it.

Your content. Your rights. Now backed by federal law.


RemoveOnlyLeaks is an AI-powered DMCA takedown and content protection service for digital creators. We monitor 75M+ sites 24/7 and provide verified proof of every removal. Learn more at removeonlyleaks.com.

Find out where your content appears

Our free scan checks 75M+ sites -- including Telegram, scraper sites, forums, and search engines. No credit card required.

Run a Free Scan