Podcast
Root Causes 314: AI-based Deepfakes in Real Crimes


Hosted by
Tim Callan
Chief Compliance Officer
Jason Soroko
Fellow
Original broadcast date
July 5, 2023
We have spoken in previous episodes about the potential for deepfakes in real-world crimes. In this episode we discuss a variety of real-world attacks in which deepfakes have played a role. These include fake kidnapping, "sextortion," and a range of spear phishing attacks and social media scams.
Podcast Transcript
Lightly edited for flow and brevity.
So okay. So the first one, you found this article, Jay. This is the Evening Standard, June 15, 2023. Author is Andrew Williams and the headline reads - AI Clones Child's Voice in Kidnapping Scam. And the gist of this is that this woman testified in front of Congress recently that in January of this year, she received a phone call from what purported to be kidnappers of her teenager, and essentially they demanded $50,000 to return what she thought was her kidnapped child. And here's a quote. At one point, she heard her daughter, and her daughter says, “Mom, these bad men have me. Help me. Help me.” And the wrinkle in all of this is that the child was not harmed in any way. The child was off skiing, was happy, thought nothing was wrong. This entire thing was a scam based on a deepfake of the girl's voice.
So number two. This is a June 5, 2023, public service announcement from the FBI, and I'll give you the headline. And again, you can search for it yourself, but the headline reads, Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes. And the gist here is a warning from the FBI that people are taking available public content, which is all very G-rated, manipulating it into deepfakes that are not, and using those deepfakes to do things like extort money from somebody who doesn't want this displayed, or to harass people, or to do various other things to the detriment of the person who gets deepfaked.
And so let's just run through the kinds of scams we're talking about. First is business email compromise. In the past, it would have been: I get an email from what appears to be my CFO’s personal email account saying, we need you to wire this amount of money to this bank account by Monday morning or the lights are gonna go out. And the helpful mid-level finance employee does it to be helpful, and it turns out that those $5,000 weren't really going to the landlord. They were going to a criminal. Well, we all get trained not to fall for that, but now all of a sudden, if I get a voicemail in my mailbox that sounds like my CFO’s voice saying the exact same thing, maybe I'll fall for that. So it's taking these things into a new context.
Or another example would be what they used to call the Western Union attack way back in the day, which was: I get an email from my friend, or what seems to be from my friend, saying, hey, I'm traveling abroad. I just got robbed. I don't have any money or anything. I don't know how to get home. If you can wire $500 to this Western Union office, I'll pick it up and I'll pay you back when I get home. And you want to help your friend out, and you go, oh, you bet. You know, he's good for it. She’s good for it. They're not going to steal $500 from me. Well, they're not the ones who are stealing the $500. It's someone else. That one, you can say, okay, we all learned not to fall for that. We don't fall for the email. We don't fall for the text. But what if it seems to be a voice message from what sounds like my friend?
Or another example of a very similar thing would be the good old-fashioned gift card scam, where again, something that seems to come from my boss or someone high in my company says, look, we've got some really disgruntled customers and I want to make it right with them. Can you just do me a favor, buy a $300 gift card on your favorite online shopping store, send it to this account, and put it through expenses? And we know, we've all been educated and trained: don't fall for that if you get that text. Don't fall for that if you get that email. But once again, if I get what appears to be that voicemail, do I fall for it? And this is now happening. These venerable, well-understood attacks that criminals have used, in some cases for decades, are moving into a new medium because of the new technology, and because it's breaking the context, they are getting new victims.
And then the other example I heard of was, apparently, somebody was doing a movie about the life of Anthony Bourdain and they wanted him narrating a bunch of his own writings, because he had a lot of writings. He wrote a lot of books, and there were a lot of things that are Anthony Bourdain’s words but weren't captured in his voice. And I believe you and I talked about this in the past, Jay, that someone used AI to allow Anthony Bourdain to read his own words in his own movie about himself, even though he wasn't alive to do it.
Now, those are good examples. James Earl Jones being able to use his voice even after he's not able to use it. Everybody being able to enjoy a character from a Star Wars movie that they all remember fondly. Everybody being able to hear Anthony Bourdain saying his own words in his own voice. Those are positive examples, and it goes back to the other theme that you and I touch on a lot, which is, when we use our advances in computers to make things better, which is why we do them, they do make things better, and a lot of things get better. And in the process of those things getting better, vulnerabilities and exploits open up. It winds up being both, and it's always both.

