New York lawmakers are urging Gov. Kathy Hochul to approve a proposed ban on the distribution of pornographic media created using artificial intelligence that includes a person's image without their consent.
The measure is meant to address AI-generated "deepfakes" that can contain fictional and sexually explicit imagery or video content of a person's likeness.
Lawmakers want up to a year in jail and a $1,000 fine for people convicted under the pending measure. They also want to give victims the right to pursue legal action against the perpetrators.
“In a rapidly advancing digital world, anyone can be the target of deepfake abuse, and our state laws need to keep up with these advancements to protect people against this violation and send a clear message that the disturbing trend of non-consensual fake intimate images online will not be tolerated here in New York,” said state Sen. Michelle Hinchey, who sponsored the bill with Assemblywoman Amy Paulin. “Every person should be able to maintain control of their likeness, and I’m proud to sponsor legislation that would ban the non-consensual dissemination of deepfakes and crack down harder on this form of abuse by expanding the ban from intimate partners to anyone who creates and shares explicit images without a person’s consent. I urge the Governor to sign my bill into law immediately to protect people from this exploitation and entitle victims to seek justice.”
The measure is meant to address the emerging problem of increasingly sophisticated artificial intelligence, which is largely used to target women and children. Lawmakers in 2019 approved laws meant to tackle so-called "revenge porn" in New York. But the spread of "deepfake" pornography creates a new set of challenges, lawmakers said.
Lawmakers pointed to a 2021 report that found more than 85,000 deepfake videos deemed to be harmful, a number that has been doubling every six months.