Artificial intelligence is becoming more prevalent in the everyday lives of New Yorkers, but the technology also carries risks.

One woman experienced the technology's darker side when she received a scam call. The scammer used AI voice cloning to make it sound as though her daughter was calling during an emergency.

“I just couldn’t understand how it was her voice,” said Amy Conley, who received the fraudulent call the day before her daughter's wedding.

“When I answered the phone, I heard sirens and then my daughter, Mary Kate, the bride, crying, saying, 'Mom, I got in a car accident,'” Conley said.

The scammer tried to extort money from Conley by claiming her daughter had caused a car accident and was going to jail. But the voice on the other end of the line sounded exactly like Conley’s daughter.

“I feel like I’m pretty level-headed, but honestly, when you hear the voice of the person in your family and it truly is their voice, everything else kinda goes out the window and you just believe that this is that person," Conley said.

Experts say AI voice cloning makes this possible for anyone. The technology is already prominent in the film and music industries.

“First and foremost, you can license actors' voices from either the actors themselves or the estates of actors who are deceased. We can, for all intents and purposes, bring them back to life,” said Shelly Palmer, advanced media professor at Syracuse University.

It doesn’t take much audio. A scammer can call and ask a couple of questions while recording your answers. With just a few sentences of your speech, they can clone your voice and make it say anything they want.

“You take a piece of audio, you analyze the audio, you basically put together words," Palmer said. "Part of these systems include text-to-speech algorithms so that the actual text can be translated into a language that the voice synthesizer understands.”

Experts say the way to combat scam calls that use voice cloning is simple.

“If you’re worried about someone faking their voice or cloning the voice of a loved one and trying to trick you, what you do is you create a family code word," Palmer said.

“Had we had ours, I could have said when I was talking to Mary Kate, 'what is our family code word?' And if she couldn’t have said it right away, I would’ve known that wasn’t her,” Conley said.

After almost falling for the scam, Conley hopes more people know this can happen.

“I think this is only the beginning with AI and voice cloning, and the more people know the better,” she said.