
The White House wants to ‘cryptographically verify’ videos of Joe Biden so viewers don’t mistake them for AI deepfakes::Biden’s AI advisor Ben Buchanan said a method of clearly verifying White House releases is “in the works.”

  • dejected_warp_core@lemmy.world
    9 months ago

    TL;DR: one day, users will see an overlay or notification showing that an image or video is verified as coming from a known source. No extra software required.

    Honestly, I can see this working great in future web browsers. Much like the padlock in the URL bar, we could see an indicator on images that have been verified. The image could display a padlock in the lower-left corner, along with the name of the source, showing that it’s a securely verified asset. “Normal” images would be unaffected. The big problem is putting an indicator on the page that can’t be faked by other means, such as a site simply drawing its own padlock overlay.

    It’s a little more complicated for software like the X or Facebook phone apps, but doable. The catch is that those products must choose to add the feature. Hopefully, the reputational cost of being swamped with unverifiable media will be motivation enough to do so.

    The underlying verification process is complex, but should be similar to existing technology (e.g. GPG). The key is that image and video files typically contain a “scratch pad” area for miscellaneous stuff (metadata). This is where the author can embed a cryptographic signature covering the content itself; the signature can’t cover the metadata bytes that store it, so it signs everything else. The user would never even know it’s there. A rough sketch of the idea is below.
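
    For the curious, here’s a minimal sketch of that idea in Python, assuming an Ed25519 keypair, a PNG input, and the Pillow and cryptography libraries. The file names and the “Signature” metadata key are just placeholders I made up; a real scheme (e.g. C2PA / Content Credentials) would use publisher certificates rather than a bare key, but the shape is the same: sign the content, stash the signature in metadata, verify against a trusted public key.

    ```python
    # Sketch only: sign an image's pixel data and tuck the signature into
    # PNG metadata, then verify it later. Key handling, certificates, and
    # video support are all hand-waved here.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from PIL import Image, PngImagePlugin


    def sign_image(in_path: str, out_path: str, private_key: Ed25519PrivateKey) -> None:
        """Sign the pixel data and embed the signature in a PNG text chunk."""
        img = Image.open(in_path)
        signature = private_key.sign(img.tobytes())  # sign the pixels, not the container
        meta = PngImagePlugin.PngInfo()
        meta.add_text("Signature", signature.hex())  # the metadata "scratch pad"
        img.save(out_path, pnginfo=meta)


    def verify_image(path: str, public_key) -> bool:
        """Re-read the pixel data and check it against the embedded signature."""
        img = Image.open(path)
        sig_hex = img.info.get("Signature")  # text chunks show up in img.info
        if sig_hex is None:
            return False  # unsigned media: the UI would simply show no badge
        try:
            public_key.verify(bytes.fromhex(sig_hex), img.tobytes())
            return True
        except InvalidSignature:
            return False  # content was altered after signing


    if __name__ == "__main__":
        key = Ed25519PrivateKey.generate()  # in reality, a long-lived publisher key
        sign_image("release.png", "release_signed.png", key)   # hypothetical paths
        print(verify_image("release_signed.png", key.public_key()))
    ```

    The important design choice is that the signature covers the pixel data rather than the whole file, so it can live inside the file’s own metadata without invalidating itself. Verification only proves the content is unchanged since signing; deciding whose key to trust is a separate problem.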