
This deepfake video is terrifyingly real: that’s its intent

A Kari Lake deepfake is designed as an education tool and a wake-up call. How will voters know what’s real in seven months?

It’s a little more than seven months until Election Day. If you thought 2016 and 2020 brought an unprecedented onslaught of propaganda and lies, hold on to your hat. This is the year of the hyper-realistic deepfake: images, sound, video.

You may have seen these headlines:

And this is not Kari Lake. 

But it sure as hell looks and sounds like her, especially on the small screen that is a phone. (On a big screen, you may start to see the tells.)

Oh, the budget? Zero, plus a bit of donated time from a software engineer. It’s a project of Arizona Agenda.

Arizona is a key swing state, the “epicenter of American politics.” That’s why journalist Hank Stephenson launched Arizona Agenda, “an insider’s guide to Arizona politics for the political outsider.” That includes those of us who don’t live there.

Legal deception? You betcha. This is ‘murica.

The European Union (EU) has passed an “AI Act” that requires transparency (in other words, you gotta say it’s not real) rather than banning fakes entirely [1].

On the other hand, we are relying on a not-close-to-robust patchwork from the states to constrain automated algorithms. Why? First, it’s a stated goal of the U.S. Supreme Court. Second, that’s what happens when one of the two parties in Congress has a motto of we-refuse-to-govern.

As of 07 February 2024, state lawmakers had introduced 407 AI-related bills, about half of them addressing deepfakes, according to Axios. That’s a 700% increase from February 2023 (roughly eight times the prior year’s count).

See current and enacted legislation by state.

Here are two current examples:

  • New Mexico. NM HB 182, as passed and published on 05 March, criminalizes using synthetic media if the intent is to alter voting within 90 days of an election. (Oh, that we could stop all ads within even 30 days.)
    SECTION 1. Section 1-19-26 NMSA 1978 
    (being Laws 1979, Chapter 360, Section 2, as amended) 
    is amended to read:
    
    "1-19-26.  DEFINITIONS.--As used in the Campaign
    Reporting Act:
    
    C. "artificial intelligence" means a machine-based 
    or computer-based system that through hardware or 
    software uses input data to emulate the structure 
    and characteristics of input data in order to 
    generate synthetic content, including images, 
    video or audio;
    

    The legislation contains a robust definition of “artificial intelligence” but misses the very real problem of prior repetition.

  • Wisconsin. AB 665, as passed and published on 21 March, requires that any campaign audio or video containing “synthetic media” (content produced by “generative artificial intelligence”) include the phrase “Contains content generated by AI” at the beginning and the end of the content.

 

Types of synthetic media

The industry calls these fakes “synthetic media.” The term normalizes disinformation, as though it were as benign as deciding which shirt to buy: silk (“real”) vs. rayon (“synthetic”).

First, it’s relatively easy to put one person’s face on another person’s body in a still photo. See the kidnapped Joe Biden tailgate wrapper that Donald Trump shared on Good Friday.

In case you think that the Biden kidnap meme was “all in fun,” consider the public outrage in 2013 over a similar truck tailgate wrapper depicting a kidnapped young woman. Hint: the Texas company pulled the product.

Second, text-to-voice synthesizers require more computing power and a modicum of technical expertise.

Have a listen to this (fake) confession from Donald Trump (text-to-voice).

 

Finally, there’s fake video. Here’s another faux Kari Lake.

 

See current and enacted legislation by state.

[1] There is no generally accepted definition of “artificial intelligence”; it is a marketing term from the 1950s. Because we’ve punted this disinformation problem to the states, the U.S. won’t have “one” definition this year. Defining the tools and putting guardrails around their use is, at a minimum, an interstate commerce issue. Those are managed at the federal level. But truly it’s a global problem.

“Not the odds, but the stakes.” That’s how Jay Rosen, NYU journalism professor and media critic, thinks news organizations should be covering the 2024 presidential election.

“The stakes, of course, mean the stakes for American democracy,” Rosen told Oliver Darcy, CNN, last year. “The stakes are what might happen as a result of the election.” Rosen continued: “The horse race [odds] should not be the model… It should not be the organizing principle of your campaign coverage.”

In that spirit, I am writing periodic reports focused on the stakes facing voters in this presidential election.

Talk to me: BlueSky | Facebook | Mastodon | Twitter

 
