WASHINGTON — On Capitol Hill, a North Texas mother and her 15-year-old daughter lobbied for legislation to combat deepfake revenge porn: sexually explicit images created with artificial intelligence to harm or embarrass someone. Elliston Berry traveled to the nation’s capital from Aledo, Texas, to share her story of being victimized by such manipulated images.

What You Need To Know

  • Fifteen-year-old Elliston Berry traveled to the nation’s capital from Aledo, Texas, to share her story as a victim of deepfake revenge porn: sexually explicit images created with artificial intelligence to harm or embarrass someone

  • Elliston and her mother, Anna McAdams, joined Sen. Ted Cruz, R-Texas, and other sponsors for the introduction of the Take it Down Act, a bipartisan bill that would criminalize the publication of non-consensual intimate imagery 

  • The bill would also require social media platforms to develop procedures to remove the images within 48 hours of receiving a valid request from a victim and to take reasonable steps to remove any copies

  • While there is broad bipartisan support in Congress for addressing the issue, there is disagreement about how much responsibility tech companies should bear

“I just really wanted to show and share my story, that way anybody that goes through this is able to know that they're not alone, and that justice will be served and that they will, like, make it through,” Elliston told Spectrum News.

Last fall, as a high school freshman, Elliston woke up one morning to a barrage of texts. 

Someone had used a program to alter innocent photos of her and her friends into sexually explicit images, then shared them on social media.

“It was really fearful, and I was really scared to even go to school or just step out of my boundaries, step out of my circle. Now that we are fighting to get these bills in place, it also creates just so much more relief, as other people that are going through this are also able to get justice,” Elliston said. 

Her mother, Anna McAdams, said part of the frustration was how long it took to get the images taken down.

“We always want to protect our child no matter what, and we'll go to any extent to do that, and I think this AI offense showed me that there was really nothing we could do,” McAdams said. “I couldn't protect her in that moment, and it's out there, and could be out there for the rest of her life, but, you know, we just want to make the best of this and make sure that this doesn't happen to anybody else, or if it does, there's consequences to make somebody stop in their track.”

The student responsible for the images of Elliston was suspended and decided to leave the school. Charges against him will be dropped when he turns 18.

On Tuesday, Elliston and her mom joined Sen. Ted Cruz, R-Texas, and other sponsors for the introduction of the Take it Down Act. The bipartisan bill would criminalize the publication of non-consensual intimate imagery. It would also require social media platforms to develop procedures to remove the images within 48 hours of receiving a valid request from a victim. The companies would also have to take reasonable steps to remove any copies. 

“Big tech companies know how to comply with these requests, because under copyright law, if you put up a scene from 'The Lion King' on social media, you'll see it taken down very, very quickly, because that content is copyrighted, they know how to pull it down,” Cruz told Spectrum News. “It's just, sadly, big tech has demonstrated such arrogance to the victims of these kinds of violations that they just ignore their requests, and so this law will change that.”  

There is broad bipartisan support in Congress for addressing the issue, although lawmakers disagree about how much responsibility tech companies should bear. Other senators have introduced legislation that would empower victims to sue those who manipulate and share images, but Cruz believes more needs to be done. 

“It used to be a few years back, if you tried to Photoshop someone's head on another body, it looked pretty obviously fake, and with AI now you can do it in a way that it's impossible to tell,” Cruz said. “This bill directly addresses deepfake technology used for this, and then secondly, the mandate that big tech take the images down. None of the other bills put that obligation.” 

Elliston and her mother said they are hopeful federal legislation will pass.

“It just felt hopeless, but as we have come so far in getting these laws in place and seeking justice, it shows that you just have to start somewhere, and as long as you have a voice and just try to reach out, it can turn into anything,” Elliston said. 

Under the bill, social media companies that fail to comply would face enforcement action by the Federal Trade Commission.