With a vote of 409–2, the House passed the Take It Down Act, a bill designed to address the growing threat of deepfake porn, or nonconsensual AI-generated sexual content.
Under the bill, knowingly producing or disseminating explicit deepfake photos or videos without the subject’s consent would become a federal offense. It would also require online platforms to remove flagged content within 48 hours of notification.
Victims of such content would gain the legal ability to sue creators, distributors, or platforms that disregard takedown requests. Given the speed at which AI image and video generation is advancing, lawmakers say the bill is long overdue.
The Take It Down Act, which has the backing of President Trump and a rare bipartisan coalition, is being praised as a landmark step in protecting human dignity and digital privacy.
Proponents contend that deepfake porn has become an increasingly harmful tool of harassment, disproportionately targeting women, public figures, and children, often with devastating social and psychological consequences.
One of the bill’s sponsors, Rep. Sheila Jackson Lee (D-TX), stated, “This is about drawing a line. No one should wake up to discover their face on a phony pornographic video going viral online without their permission.”
The two representatives who voted against the bill raised concerns about government overreach and possible free-speech implications. Proponents counter that the bill carefully balances privacy rights with platform responsibility.
The Senate is expected to take up the bill within a few weeks. With executive support and strong bipartisan momentum, it is likely to pass, signaling a dramatic shift in how the United States responds to digital exploitation and the misuse of AI.