Podcast Jan 26, 2024

Root Causes 357: Signed Digital Photographs

Three major camera manufacturers have joined to create a standard for signed digital images from their cameras.

  • Original Broadcast Date: January 26, 2024

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    All right. So, in our episode 336, we discussed how Nikon and one of the news services, I think it was Getty Images, or maybe it was Reuters, I should probably look that up, had worked together to basically build a system for establishing the provenance of a digital image. And this is basically to get around the deep fake images problem. The basic idea is, you have your digital SLR, and when it captures an image, it digitally signs it with something that's tantamount to a code signing signature. And because it's digitally signed, we get the benefits of digital signatures. We know that it is untampered with, we know that it is genuinely coming from a certain source, it could be time stamped, perhaps geographically stamped if that's something you have and want to enable. All of that could be put in there. And this can perhaps, in the future, create a process whereby we can look at an image and establish definitively that it is a real capture of a real event.
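    [A minimal sketch of the scheme described here, with hypothetical names throughout. A real camera would use an asymmetric private key held in secure hardware plus a certificate chain; this sketch substitutes a symmetric HMAC purely to illustrate the bundle-sign-verify flow over the image bytes and metadata.]

    ```python
    import hashlib
    import hmac
    import json
    import time

    # Hypothetical per-device secret. A real implementation would use a
    # public-key signature, not a shared secret.
    DEVICE_KEY = b"example-device-key"

    def sign_capture(image_bytes: bytes, timestamp: float, location: str) -> dict:
        """Bundle the image hash with metadata and sign the whole package."""
        metadata = {
            "timestamp": timestamp,
            "location": location,
            "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        }
        payload = json.dumps(metadata, sort_keys=True).encode()
        signature = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
        return {"metadata": metadata, "signature": signature}

    def verify_capture(image_bytes: bytes, record: dict) -> bool:
        """Recompute everything: any change to image or metadata fails."""
        if hashlib.sha256(image_bytes).hexdigest() != record["metadata"]["image_sha256"]:
            return False
        payload = json.dumps(record["metadata"], sort_keys=True).encode()
        expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, record["signature"])

    image = b"...raw sensor data..."
    record = sign_capture(image, time.time(), "40.7128N,74.0060W")
    print(verify_capture(image, record))         # True for the original bytes
    print(verify_capture(image + b"x", record))  # False once anything changes
    ```

    [Note that the timestamp and location are just whatever the device reports, which is exactly the clock-tampering caveat discussed later in the episode.]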

    So there's a new development recently, where basically Nikon, Sony and Canon have all come together and are attempting to put together a standard that will do exactly this, so that manufacturers of photographic or video equipment, and the systems that consume and present their output, could all in theory rally around the same standard and allow us to have, you know, digitally signed, known-to-be-actually-captured images.

  • Jason Soroko

    Yeah. I got a few things to say about this, Tim.

  • Tim Callan

    Ok.

  • Jason Soroko

    Yeah. Look, you know, digital signatures, they're great. They're great. The big obvious elephant in the room, and it might not apply here, but I'm gonna say it anyway, is this: if you're presented with an image, and we're not talking about voice or other things here, this is just an image, I'm going to propose to you, Tim, that 99.9999% of the images you ever see are not the ones that come off the camera.

  • Tim Callan

    Right because they're photoshopped or something. They’re cropped.

  • Jason Soroko

    Yep.

  • Tim Callan

    Yeah. Yep.

  • Jason Soroko

    There's probably not a single unedited image that you're ever going to see, whether in print, on the internet, or anywhere else. And as soon as you make a change, well, your digital signature no longer applies to the image.

  • Tim Callan

    Yep. Yep. That’s right.

  • Jason Soroko

    So you have to ask yourself the question, what is this for? And if the answer is, well, it's to prove that a base image came off of a camera, well, there's another problem with that. I'm going to just suppose that it's not going to be all that difficult for a bad guy to make a software implementation of this. You know, they'll go get their own Sony or Nikon camera and basically reverse engineer it so that they can take any image they want and sign it as if it came off a camera.

  • Tim Callan

    Yeah. So you know, I think there's a lot to say about this, right? I can imagine this kind of thing having utility. And it's basically what you said. Imagine there's some scenario where I want to know that this is really the right thing, right? The police go to the crime scene, they want to take pictures, and these pictures are going to be shown in court, and they want to be able to prove that these were the real pictures that were really captured. So you could imagine there are scenarios where this kind of thing definitely has value. You could imagine where things like timestamping and geolocation stamping have value. This really happened on the fifth of December, and I was really at this address, right? You can imagine problems with that. Like if it's simply a matter of changing the clock on my camera before I take the picture, then I didn't really accomplish anything, right? But theoretically people could sit and work on these problems.

    I agree that you and I probably don't see images where these signatures would be valid because, just as a reminder to all the listeners, if you change a single bit, the signature stops working. So all I have to do is crop it. Or edit it in any way. I take out the red eye, and bang, it can no longer be a signed image. And that's a valid point, Jay.
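    [The single-bit sensitivity is easy to see: flipping one bit gives a completely different hash, so any signature computed over the original digest says nothing about the edited file. A small illustration with stand-in bytes:]

    ```python
    import hashlib

    original = b"a perfectly ordinary photo, as raw bytes"

    # Red-eye removal, cropping, recompression: all produce new bytes.
    # Even a one-bit change is enough.
    edited = bytearray(original)
    edited[0] ^= 0x01  # flip a single bit

    print(hashlib.sha256(original).hexdigest()[:16])
    print(hashlib.sha256(bytes(edited)).hexdigest()[:16])
    # The two digests are unrelated, so a signature over the original
    # no longer verifies against the edited image.
    ```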

    The other thing, though, I'll offer is that watermarking is a complete, dismal, abject failure. Watermarking these things is never going to work. The idea was that you're going to make the AI incorporate something steganographically into the image such that software that knows what it's looking for can look and understand that this was made by, let's say, Midjourney, right?

    The trouble with that is that it's so easy to get around. All I have to do is display the image on my screen, do a print screen, take that image and move forward, and now the steganography is destroyed. It's trivial to get around. A schoolchild could get around it. And there isn't really a technical solution for that problem. So even if this digital signature thing is far from a ubiquitous solution that allows me to always know whether every image was really captured from a real-life thing, that's not the same as it being completely valueless. If you see what I'm saying?
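    [The fragility being described can be seen with a toy least-significant-bit watermark. This is a deliberately simplified stand-in, not how production AI watermarks work, but the recapture attack is the same in spirit: re-rendering the image nudges pixel values and the hidden bits don't survive.]

    ```python
    # Toy LSB watermark: hide one bit in the least significant bit of
    # each pixel value (0-255 grayscale).
    def embed(pixels, bits):
        return [(p & ~1) | b for p, b in zip(pixels, bits)]

    def extract(pixels, n):
        return [p & 1 for p in pixels[:n]]

    watermark = [1, 0, 1, 1, 0, 0, 1, 0]
    image = [120, 85, 200, 33, 64, 97, 150, 210]

    marked = embed(image, watermark)
    assert extract(marked, 8) == watermark  # survives an exact digital copy

    # Simulate a "print screen" recapture: the display pipeline shifts
    # every pixel slightly, flipping every least significant bit.
    recaptured = [min(p + 1, 255) for p in marked]
    print(extract(recaptured, 8) == watermark)  # False: watermark destroyed
    ```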

  • Jason Soroko

    Correct. It's not valueless, but the value may be very limited, and as attacks come against this, it may be rendered down to the question of what this is even for.

  • Tim Callan

    Right. And it might be about real specific use cases. Like I can imagine this legal documentation of something I'm gonna use to prosecute someone in court being a use case that holds up and works, right? And a lot of other things wouldn't. I think it also gets into another interesting question, which of course is, when we talk about deep fakes now, we sort of talk about this idea of a whole image that's generated from scratch. But real-life base photographs and videos and audio have been edited by people for years and years, and sometimes that editing can change how you would understand or interpret what's being captured. Now we have tools where we have AI augmentation of that editing. So, I can see a video of live-action people moving around in the real world, and some of those people are real, and some are not. These are things that we can do. So, is that a real image? No. Is that a deep fake? No. It's an AI-edited, AI-augmented version of a real base image.

    So what level of editing is fine? Like we could argue I can take out the red eye and that's not really changing the image in a meaningful way. But you know, what if somebody gives me a tummy tuck and slims out my features? What if they change my eye color? What if they change my hair color? Where do we get to the point where suddenly it's not a picture of me anymore?

  • Jason Soroko

    You know, Tim, all of these things come down to subjectivity. While you were talking, I just came up with the use case for this. You came up with the one of the court case, evidence in court. I think the true killer use case for this, and this explains why certain kinds of players are in this consortium, and why it makes sense that it would be people like Getty Images, and perhaps Reuters, is that the question of who owns the base image is very important to them.

  • Tim Callan

    Sure. Yeah. Rights and royalties?

  • Jason Soroko

    So therefore, who are you actually protecting with this? In other words, the term deep fake, protection from deep fakes, is being used by journalists here. Well, who is being protected? Well, it's Reuters. It's the people who own the intellectual property of the image. They went off and paid a journalist with a camera to go do the work to capture the moment, and they own the moment, right? Think about images from, you know, the red carpet of the Academy Awards coming up. There are going to be images that are bought up by Getty Images, and perhaps even Reuters, and if some newspaper prints even an edited version of one of those, Reuters will be able to go into court and say, that was my photographer who took that photo.

  • Tim Callan

    This is my original. Here’s my photograph and here it’s signed.

  • Jason Soroko

    I think that's the killer use case for this. So be careful with the protecting-from-deep-fakes framing, because it's not protecting you, unless you're Reuters or Getty Images.

  • Tim Callan

    Yeah. I think that's a good and valid point, Jay. And you can imagine this being set up in a way, just like with a TLS certificate or a code signing certificate, where you know who that certificate was assigned to, and it's really that person, and it's reliable. Now, I don't think this has gotten far enough along that we have answers to questions like, ok, are these public certs? Are these being issued by a CA? Is Nikon gonna be its own CA? Do I trust Nikon if it's its own CA? All that stuff's gotta get worked out. But presumably it will. And yeah, I think you're right. I think now that you say it, digital rights is the real motivator here.

  • Jason Soroko

    It is. So journalists who are covering this topic, be careful with your wording. I know you're going for the sensationalist stuff, but you got to curb the wording because protection from deep fakes is only for a particular party here. And hey, you know, that's not a bad thing. It's a good thing. And PKI to the rescue once again, Tim.

  • Tim Callan

    Yeah. PKI to the rescue once again.

    So anyway, this is developing. You know, I think it's interesting because it's how deep fakes and digital signatures all kind of come together. And we should keep tracking it as it develops. If this has legs, and if a standard comes out, and if things like that happen, let's return to it and keep the listeners informed on what's going on.

  • Jason Soroko

    Yeah. We're gonna keep a close eye on this and even deep fake mitigations, Tim, that might help you rather than just Getty Images.

  • Tim Callan

    All right. Thanks a lot, Jay.

  • Jason Soroko

    Thank you.

  • Tim Callan

    This has been Root Causes.