
Root Causes 360: Joe Biden Deepfake Plays in New Hampshire Primary

A deepfake of Joe Biden's voice made an appearance in robocalls leading up to the New Hampshire primary. We discuss this latest development and its implications.

  • Original Broadcast Date: February 6, 2024

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    We want to talk about a news item. We've been talking a lot about deepfakes, and about deepfakes destroying the basic reliability of what we think of as a recording of real life. That can be a photograph, a video, or an audio recording: you can get things that seem to be these things for real, and they're not. One of the things you and I have been predicting, Jason, is that as these tools become available to a broad variety of users, or actors if you prefer, we will see them showing up in every imaginable exploit you can think of. We've talked in the past about deepfakes being part of new spear phishing schemes and cons. But they're also making their way into the world of politics in a very real way, and in particular the one I wanted to call our attention to came just before the January 23 New Hampshire primary, when a robocall went out to a lot of people with a deepfake of Joe Biden's voice.

  • Jason Soroko

    Yes. And Tim, I bet you and I could create a deepfake of Joe Biden's voice. His might be one of the easiest voices to deepfake. Would you agree?

  • Tim Callan

    Absolutely. There are easy, inexpensive commercial tools you can get. You can create a deepfake with one minute of somebody's voice, and if you want to create an excellent deepfake, the audio is easy to collect. If you can collect an hour of somebody's voice, you can make something that's amazing. Exquisitely good. And of course, with Joe Biden you've got all the voice you could ever want, right? There's so much of his voice out there, recorded for easy consumption. So you can put that together, write a script of what you want said, and it just says it for you. And it's uncanny.
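
[Editor's note: For a sense of how low the bar is, here is a minimal sketch of the pipeline Tim describes, assuming the open-source Coqui TTS package and its XTTS v2 voice-cloning model. The sample file name and script text are hypothetical placeholders; the inexpensive commercial tools he mentions expose essentially the same clone-then-speak workflow.]

    # A minimal sketch of voice cloning with the open-source Coqui TTS
    # package (pip install TTS). "voice_sample.wav" is a hypothetical
    # file holding roughly a minute of the target speaker's voice.
    from TTS.api import TTS

    # Load the XTTS v2 multilingual voice-cloning model.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Synthesize an arbitrary script in the cloned voice.
    tts.tts_to_file(
        text="Any script you like, spoken in the sampled voice.",
        speaker_wav="voice_sample.wav",  # the short reference recording
        language="en",
        file_path="cloned_output.wav",   # the finished audio
    )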

  • Jason Soroko

    We know this because we've done it with our own voices.

  • Tim Callan

    Right.

  • Jason Soroko

    And, you know, the technology pipeline to allow this to happen is now in the hands of absolutely everybody. And that's just a fact. So no surprise that somebody would do this. But I thought, Tim, this example was pretty clever.

  • Tim Callan

    Yeah. I'll just quote it. I've got a transcript of what this robocall said. Quote: “Republicans have been trying to push non-partisan and Democratic voters to participate in their primary. What a bunch of malarkey. We know the value of voting Democratic when our votes count. It's important that you save your vote for the November election. We'll need your help in electing Democrats up and down the ticket. Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again.” So what they were trying to do here was suppress Democratic turnout in the primary, in the belief that keeping those voters home would help a different candidate win.

    And of course, Biden didn't record this. And that primary wasn't close, right? So this robocall didn't affect the outcome in any meaningful way. But lots of races are close. We all know this; there have been plenty of races that were extremely close, especially in the last 30 years. So it's easy to imagine how tools, or maybe techniques is the better word, like this could actually swing a race.

  • Jason Soroko

    I could see it happening, especially when things are extremely close, which is quite often the case nowadays, right?

  • Tim Callan

    Yeah. And I think this goes back to what we've been trying to emphasize in the past, Jay. The reason I like the idea of talking about this today is that in the past we've been saying, look out, this is the new normal. This is a proof point of that, right?

  • Jason Soroko

    Yeah.

  • Tim Callan

    This is an example just like the one we talked about at length in that episode about the mother who testified in front of Congress because she got a fake ransom call and heard her child's voice, but the child hadn't really been kidnapped. It was a fake voice. And this is another example of these things moving into that realm. I think we're all fairly comfortable with the idea of photoshopped pictures.

    So if you see a picture of a celebrity doing something bad, maybe you don't entirely trust it. But most people haven't made the intellectual leap to apply that same skepticism to audio or video, and it can happen there, too. This is a good example of that: if people pick up the phone and get a robocall that sounds like a recorded message from Joe Biden, they don't think Joe Biden is calling them personally; they know it's a recording. But they probably are prepared to believe that this is a sanctioned message that Joe Biden is really behind. Right?

    Because they'll say, oh, he recorded it, he must believe it. And this is the new normal. The world, the public, our mechanisms, our laws, and our elections are all going to have to reckon with how we think about these digital files, which we still refer to as photographs or videos or audio recordings but which very well may not be those things.

  • Jason Soroko

    Tim, I'd like to coin a term here now.

  • Tim Callan

    Please.

  • Jason Soroko

    Everybody stand back. Watch this. You know that term deepfake? It implies that there's something tricky about it. I want to coin the term easy fake. How about that?

  • Tim Callan

    Yeah. They are easy fakes. I think that's a really good point, Jay: deepfakes are not hard. And you're right, the words we choose put color on things, and deepfake makes it sound very, very advanced. This is the reason we recorded an episode a long time ago where I talked about how I hate the use of the word spoof, because spoof underplays the maliciousness of what's being done. Deepfake has an implication built into it that this is super advanced, super sophisticated, that this is what the real rocket scientists do. No! Anybody with the web and a credit card can make deepfakes. It's the easiest thing in the world, and as it becomes that easy, part of what we're going to see is these things showing up in all aspects of our lives. Some of that will be good, but a lot of it won't be.

  • Jason Soroko

    So keep in mind, and I'm talking now to the journalists who cover this stuff: I've seen so much coverage of consortia of camera makers and others saying, hey, we're just going to sign original content from cameras, and who knows, maybe it'll be microphones next. I've got news for you. There's nothing signed original content could do to stop the attack Tim just described. Nothing.

  • Tim Callan

    Right. Exactly. And we talked about this fairly recently. I do think there is a role for signed content, but it's for the debunkers. In this case, if someone could start with an original signed file that they could show originated from a device registered to Joe Biden, and the robocall matched it, then we'd be able to say, it looks like this was a real thing that was said. Maybe in this particular example that wouldn't matter much, but there would be other cases where being able to get back to original source content would have real value in getting to the truth.

    But the point you made very aptly in an episode we did about signed photographs is that most of the people consuming this content are not going to be in a situation where that kind of confirmation is available to them. Signed content very much has a role for professional journalists, people who want to debunk something, researchers, anyone with the expertise, the resources, the time, and the motivation to really prove whether something was created by a computer or captured real events. Most of us, in our daily lives, are not going to have all of those things: the time, the motivation, the resources, the equipment, and the opportunity. Instead, we're just going to have to deal with this new world where the content you see might not be genuine.
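
[Editor's note: The signed-content idea Tim describes can be sketched as follows, assuming Python's cryptography package. The Ed25519 key pair stands in for a signing key embedded in a capture device, and the byte strings are hypothetical stand-ins for real recordings; real consortium schemes such as C2PA bind the signature and provenance metadata into the file itself, but the verification step is conceptually similar.]

    # A minimal sketch of device-signed content, assuming Python's
    # "cryptography" package (pip install cryptography).
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The private key lives inside the capture device; the device maker
    # publishes the matching public key.
    device_key = Ed25519PrivateKey.generate()
    public_key = device_key.public_key()

    # At capture time the device signs the recorded bytes.
    original = b"...captured audio bytes..."  # hypothetical recording
    signature = device_key.sign(original)

    # Later, a journalist or researcher can check whether a file they
    # hold is byte-for-byte what the device captured.
    candidate = b"...captured audio bytes..."
    try:
        public_key.verify(signature, candidate)
        print("Signature valid: this matches the device's original capture.")
    except InvalidSignature:
        print("Signature invalid: altered, or never signed by this device.")

[As Jason notes above, this only helps someone who actually performs the check; the robocall's recipients were never going to verify anything, signed or not.]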

  • Jason Soroko

    Exactly. If you are Getty Images, and you have a bunch of signed images that you can prove in court are yours, and therefore you can collect the money off those images, hallelujah for Getty Images.

  • Tim Callan

    Right.

  • Jason Soroko

    For the rest of us, guess what, people? Tim said it best a few podcasts ago: we now have to doubt everything, or at least question anything we consume in the way of audio, video, images, and so on. I don't think it's about living in a world of fear, but it definitely is a world of verification of some kind. Before you make a hard decision about something, like how to vote, or, geez, you get a phone call from your daughter saying she needs a million dollars wired to her immediately, remember that might not be your daughter. I'm sorry, but that's just the truth now.

  • Tim Callan

    Yeah. You and I talked about this in our forecast episode at the end of December, which wasn't very long ago, where we predicted that this was going to be the year we saw deepfakes making their way into every flavor of fraudulent or deceptive activity you can think of. Here's another example of that, and I think the prediction is proving true. I continue to expect that we're going to watch it happen over the course of this year, and that by the end of the year we're going to look back and say, anywhere somebody could use a deepfake, somebody did use a deepfake.

  • Jason Soroko

    Yeah. It's gonna be a juicy year, Tim.

  • Tim Callan

    It is. All right. Well, thank you, Jay.

  • Jason Soroko

    Thank you.

  • Tim Callan

    This has been Root Causes.