From Star Trek to the Nancy Pelosi video, video fakery has been with us for a long time, but the stakes are getting higher. Chief Strategy Officer Brad Berens explains in a new column.

_______________________________________________________________________________________

By Brad Berens

The original Star Trek show from the 1960s has proved prescient again and again. Starfleet’s pocket communicators and slightly larger tricorders anticipated smartphones. Hospital beds today with their sensors and screens look a lot like the diagnostic beds in the Enterprise’s sickbay. We talk with Siri, Alexa, and other digital assistants the way the Enterprise crew talked with the ship’s computer.

Star Trek also foreshadowed the deepfake. Deepfakes are convincing, technology-enabled forgeries in which people appear to say things they never said or stand next to people they have never met.

But let’s get back to Star Trek. In a 1967 episode, “Court Martial,” a resentful crew member named Ben Finney tried to frame Captain Kirk for murder by altering video of a moment when Kirk had to jettison part of a laboratory during an ion storm. The altered video showed Kirk pressing the button before Finney had left the lab. During Kirk’s trial, the crew and Kirk’s attorney figured out that Finney was still alive, and the action ended with a fistfight between Kirk and Finney.

What was once the stuff of science fiction, a story set in space in the 23rd century, is now happening here on Earth in 2019. Even on the Enterprise, Finney’s video fakery took skills that few crew members possessed. Today, deepfakes are coming within reach of ordinary people.

A pair of recent articles from The Verge illustrates this rapidly growing phenomenon: the first shows how an artificial intelligence (AI) convincingly cloned the voice of Microsoft co-founder Bill Gates. The second describes new software that lets users change what people in a video are saying just by typing new text.

The most famous recent fake was a doctored video of Speaker of the House Nancy Pelosi, slowed down so that she appeared to slur her words as if drunk; the Daily Beast exposed the video as a fake and revealed the identity of its creator. Simply search the word “deepfakes” to see countless articles on the topic, although you should have a strong stomach: some uses of deepfakes are horrifying.

Why deepfakes are a big problem

Photo fakes are nothing new. Dino Brugioni’s helpful book Photo Fakery: The History and Techniques of Photographic Deception and Manipulation shows that altered photos date back to the 1850s and the Civil War era. Nowadays, with programs like Adobe Photoshop and even some of the software built into smartphones, it has become easy for ordinary folks to edit and alter their photos. Until recently, though, video fakes were harder to pull off. It was easy to misdirect through clever editing and splicing, but outright forgery was more challenging.

Now, as the technology underpinning deepfaked video follows a Moore’s Law trajectory, becoming cheaper to buy and easier to use, more and more people can create or alter video. The result is that we are increasingly unable to believe the evidence of our own eyes, at least when it comes to things that we did not directly experience ourselves.

Deepfakes, in other words, are one weapon in a growing assault on the very notion of reality.

Since the invention of writing, we have known about the gap between reality and accounts of reality. Deepfakes and other forgeries, though, are different because they appear to be transcriptions of reality. It’s hard not to believe them.

In Duck Soup, Chico Marx (masquerading as Groucho’s character) famously quipped, “Who ya gonna believe, me or your own eyes?” When he said it, it was clearly ridiculous. Today, though, if we can’t believe our own eyes, then we may only be able to believe other people. Trust will become more important than ever.

As deepfakes proliferate, we will return to a culture where witnessing, agreeing, and testifying are important. One person saying, “Yes, I was there, and I saw it happen” will become necessary because we will not be able to trust signatures or even videos of people signing documents. Notaries all over the U.S. should celebrate: their businesses will boom.

Ironically, just when face-to-face, shared experiences will be necessary for people to agree on things and record those agreements, we’ve seen a two-decade slump in face-to-face experiences with family and friends.

In the Center’s 2018 “Surveying the Digital Future” report, for example, we see that face-to-face family time declined from 29 hours per week in 2000 to just shy of 18 hours per week in 2018. (2007, perhaps not coincidentally the year the first iPhone hit the market, seems to have been an inflection point in this trend.)

The other irony is that as consensus becomes critical to determining what is real and what isn’t, we will have to pay attention to what’s happening around us. That means looking up from our smartphones, and who wants to do that?

__________

Brad Berens is the Center’s Chief Strategy Officer. He is also principal at Big Digital Idea Consulting.

See all columns from the Center.

June 11, 2019