RALEIGH, N.C. — North Carolina State University is investigating deepfakes targeting dozens of students.
A search warrant shows 28 sorority members at N.C. State reported their photos, names and likenesses were used without consent last fall.
According to the warrant, many of the images showed their faces artificially superimposed on nude women performing sexual acts. The images were uploaded to a pornography website.
A detective was able to get the email address associated with the account that uploaded the pictures.
Dev Nag, an AI expert and the CEO of QueryPal, explained how this can happen and how people can protect themselves from deepfakes.
“Even a single image is enough to train the machine to create long videos that look very realistic to most people," Nag said. "I’d say the first thing for anyone involved in social media, and especially college kids, is to educate yourself on what’s possible and where the technology is today, not what it was like last year, and also to educate your friends and family."
He offers four tips, which he sums up as EVPR:
- Educate yourself
- Verify the images and videos you see online
- Protect your content by making your accounts private
- Report inappropriate AI-generated images to the social media sites you use
“We’ve gotten so used to having our images and our videos out there on social media, whether it’s Facebook or Twitter or Instagram or TikTok," Nag said. "It only takes a single image to give someone the fuel to make a deepfake of you. If you can, lock down your social media to just your friends and your family. You don’t have to share it across the world.”
Nag said the technology moves so quickly that AI-generated images look increasingly real.
“We had a huge progression over the last five years in the quality of images and videos that AI’s able to create,” he said. “Any kind of tell or signal or giveaway that you think is going to tell you what AI is today will be wrong tomorrow.”
Spectrum News 1 reached out to the website where the sexually explicit images were posted to learn more about the account behind them but has not received a response.