ROCHESTER, N.Y. — There is growing concern among some artificial intelligence experts that AI-generated deepfakes could affect the 2024 presidential election campaign. It’s a complicated issue, but there are ways to detect whether what you’re seeing is fake.
It doesn’t take long for Kelly Wu to find something that’s just not right.
“It is very scary,” said Wu, a doctoral student at Rochester Institute of Technology's Global Cybersecurity Institute, as she searched on her computer. “Those are the generated images, which are totally fake.”
They’re called deepfakes. Some are relatively harmless, but their use in politics concerns RIT researchers.
“Recently it is becoming even more,” said Wu. “Especially during election time.”
Fake images have received a lot of attention recently. There were deepfake images showing a mugshot of former President Donald Trump. Made-up videos showed Florida Gov. Ron DeSantis dropping out of the presidential race — well before he actually did drop out. The DeSantis campaign released deepfakes showing Trump kissing Dr. Anthony Fauci, the former director of the National Institute of Allergy and Infectious Diseases.
“The goal is indeed to rile people up,” said RIT research scientist Christopher Schwartz.
Schwartz expects deepfakes to be a major issue in a presidential election year. What makes a deepfake distinct from other forms of media content is the role of AI in producing it.
“It may or may not involve real people and places and events,” he said. “It could be totally manufactured.”
Schwartz uses a fake photo of Trump surrounded by African American supporters, depicting an event that never happened, as an example of how deepfakes can be used to show a candidate in a more favorable light. Another fake video making the rounds showed President Joe Biden introducing the National Anthem before breaking into a computer-generated version of “Baby Shark.”
Other deepfakes contain outright disinformation, like a robocall that went to some voters leading up to the New Hampshire primary. A computer-generated voice impersonating President Biden urged Democrats to avoid voting in the primary, claiming, “Voting Tuesday only enables Republicans in their quest to elect Donald Trump again.”
“Audio deepfakes will have a certain level of credibility, if you will, that visual deepfakes won't necessarily have,” said Schwartz. “So, audio poses a very distinct, real threat.”
RIT researchers are developing a tool called DeFake, intended to help journalists determine what’s real and what’s not.
“This feeling of, how could someone believe that? It’s ironically a feeling that’s being preyed on and manipulated,” Schwartz said.
Schwartz is a former journalist and disinformation observer who says video deepfakes are dangerous on a different level, likening them to “a full-on assault to our perception.”
“What concerns me the most is not actually deepfakes,” he said. “But the environment in which they're being introduced.”
An environment, according to Schwartz, in which many people react immediately, believing what they want to believe, whether it’s true or not.
“I think it's that people, rightly or wrongly, feel marginalized and powerless,” he said. “And at some level, they want it all to burn down. And that’s what’s driving this.”
Schwartz suggests people take a critical look at what they see before falling for disinformation. He said deepfakes can target people of any political ideology, whether left, right or centrist. The best advice is to proceed with caution.
“It’s just the reality we're living in,” he said. “Sooner or later, humanity will eventually find a solution to this. We're not at the point yet.”