The House has overwhelmingly passed the Take It Down Act, a bill targeting nonconsensual intimate imagery, including AI-generated "deepfake" pornography, by a vote of 409–2.
The law makes knowingly publishing explicit photos or videos of a person without their consent, whether authentic or AI-generated, a federal crime. It also requires online platforms to remove flagged content within 48 hours of a valid removal request.
The Federal Trade Commission is charged with enforcing the takedown requirement against platforms that ignore removal requests. Lawmakers say this legislation is overdue given the rapid advances in AI-generated imagery.
The bill, supported by President Trump and a rare bipartisan coalition, is being hailed as a historic move to protect human dignity and digital privacy.
Proponents emphasize that deepfake porn disproportionately harms women, public figures, and children, often causing severe social and psychological damage.
One of the bill's House sponsors framed the stakes simply: "This is about drawing a line. No one should wake up to discover their face on a fake pornographic video going viral online without their permission."
Only two representatives opposed the bill, citing concerns about government overreach and free speech. Supporters counter that the bill carefully balances platform responsibility with privacy rights.
The Senate has already passed the Take It Down Act unanimously, so the House vote clears the bill's final legislative hurdle. With strong bipartisan momentum and executive backing, it is expected to be signed into law.
If enacted, this law would mark a major shift in how the U.S. addresses digital exploitation and the misuse of AI technology.