Texas teen joins Melania Trump in push for anti-revenge porn bill

A North Texas teenager was in Washington, D.C., on Monday to promote a new bill that aims to fight fake pornographic images made by artificial intelligence.

Elliston Berry of Aledo was 14 years old when a classmate took an image off social media and used a computer program to create fake naked photos of her.

Berry joined a roundtable discussion in D.C. with First Lady Melania Trump to share her story and her support for the Take It Down Act, filed by Sen. Ted Cruz (R-Texas).

North Texas teen on AI deepfake pornography


Berry was one of two teenagers who shared stories of their experiences with deepfake pornography.

What they're saying:

The teen says she and her mother tried for months to get the images of her removed, but their efforts were unsuccessful.

"We need to hold Big Tech accountable to take action. I came here today to not only promote this bill, but to fight for the freedom of so many survivors millions of people, male, female, teenage children, kids all are affected by the rise of this image-based sexual abuse. This is unacceptable," said Berry. 

Despite the situation, Berry says she is glad to be able to give victims a voice.

"It is so inspiring to know that my voice is being heard giving hope to not only me, to all the many people that have been affected by this. It is truly so amazing how this awful situation has turned into good,"she said.

Take It Down Act

What we know:

The Take It Down Act is legislation introduced by Sen. Ted Cruz of Texas that makes it unlawful to knowingly publish "non-consensual intimate imagery (NCII)," including "digital forgeries" created with AI software (or deepfake pornography), and requires social media and similar websites "to have in place procedures to remove such content upon notification from a victim," according to a 2024 release from Cruz's office.


RELATED: FBI warn against AI-generated deepfake content created for sextortion schemes

Other legislators who support the bill include Democratic Sen. Amy Klobuchar of Minnesota and Rep. Madeleine Dean of Pennsylvania.

What we don't know:

It’s unknown when the legislation will be passed in the House, but it passed the Senate with bipartisan support during the last session of Congress and again in February.

AI-generated explicit deepfakes on the rise

Big picture view:

Most states have laws protecting individuals from revenge porn, but only 20 states have explicit laws covering deepfake non-consensual intimate imagery (NCII), per a release from Congresswoman Maria Elvira Salazar of Florida.

Researchers tell the Associated Press that the number of explicit deepfakes has increased in the last several years, as the technology used to create these images has become more accessible and easier to use. 

Dig deeper:

Over the past few years, several celebrities have reportedly been victims of deepfake pornographic images that circulated online and on social media. 

Last year, fake sexually explicit and abusive images of singer Taylor Swift were disseminated on the social media platform X, with some shared on Meta-owned Facebook and other social media platforms, according to the Associated Press. 

RELATED: Taylor Swift AI-generated explicit photos spark outrage

Meanwhile, fake and sexualized images of actors Miranda Cosgrove, Jennette McCurdy, Ariana Grande, Scarlett Johansson, and former tennis star Maria Sharapova were shared on multiple Facebook accounts, where they amassed large numbers of likes and shares, CBS reports.

The Source: Information for this story comes from a news conference in support of the Take It Down Act held in Washington, D.C. on March 3, 2025, a news release from Sen. Ted Cruz's office with information on the Take It Down Act, previous LiveNOW from FOX and FOX 4 reporting, the Associated Press, CBS News and CNN.
