Retro futures: War Games

Can a 1983 movie thriller about computers and the military tell us anything about drone warfare today?

By Brad Berens

Image created using DALL-E.*

In 1984, my lifelong friend Juliet and I were watching a then-recent movie, War Games, at my parents’ house. This was in the early years of home video: the first Blockbuster store had yet to open, and Tim Berners-Lee was five years away from inventing the World Wide Web. Starring a very young-looking Matthew Broderick and Ally Sheedy (although both were in their early 20s during filming), the movie was about David, a high school videogame nerd, and his friend Jennifer, who hacked into the North American Aerospace Defense Command (NORAD) by accident; they were actually trying to hack into a videogame company to poke around in some upcoming releases.

At one point, Juliet observed of Ally Sheedy, “she has pretty hair.” This mystified me since it had never occurred to me that “pretty” was an adjective that one could use in reference to hair. Years later, when War Games came up, I mentioned Juliet’s evaluation to La Profesora, who at once sagely agreed that, yes, indeed, Ally Sheedy had pretty hair. I still don’t get it, but such is the feminine mystery.

I rewatched War Games on MAX last night. It’s still a lot of fun, and it’s a fascinating example of a retro future: a look forward from way back that tells us a lot about the daylight between where people thought we were going and where we landed. You can get the flavor of War Games, including its molasses-slow pace and its difficulty dramatizing a bunch of people sitting around looking at screens (also the problem with Star Trek: The Motion Picture a few years earlier), from a trailer that seems endless even though it’s only two-and-a-half minutes long:

 

War Games is a movie about machine learning and artificial intelligences making decisions without a human “in the loop”—a phrase we’re hearing a lot these days and one that is also key to the story. In a harrowing opening scene, two soldiers get the order from the White House to launch nukes at Russia. One of them (played by a very young John Spencer of West Wing fame) can’t do it. It turns out to have been a test, one the soldiers failed, as did 20% of the crews who faced it.

Dr. McKittrick (Dabney Coleman) convinces NORAD brass to take humans out of the loop and let his advanced War Operation Plan Response computer (the WOPR) make the decision to launch the nukes when the order comes from DC. (I’ve often wondered if “the WOPR” was a subtle paid product placement from Burger King, but I suspect I’ll never know.)

Everything goes awry when David and Jennifer use a back door to access the WOPR. They think they are playing videogames. The WOPR also thinks it is playing games, but it doesn’t realize that it connects to the actual nukes. When David worriedly types, “is this a game or is it real?” into his terminal, the WOPR replies, “what’s the difference?”

I don’t think I’ll ever have the opportunity, but I’d love to watch my kids’ puzzled faces as they watched War Games: seeing the gigantic amounts of hardware in David’s bedroom, listening to the angry squawks of his dial-up modem, and realizing how hard it was for him to connect his computer to another. They might clutch their iPhones lovingly to their chests at such horror.

War Games is prescient in its consideration of where human accountability ends and algorithmic accountability begins. Today in the Middle East, Israel, Hamas, and Hezbollah are all using drones in their conflicts. Likewise, Ukraine and Russia are both using drones in theirs.

There’s no such thing as a fully autonomous military drone that can make decisions in all conditions without humans in the loop… yet. But even limited autonomy changes the shape of war. Here’s a description of how military drones are developing from Markets and Markets, a research firm and consultancy:

Artificial intelligence has revolutionized target recognition and tracking capabilities in military drones, significantly enhancing their effectiveness in complex operational environments. These advanced systems leverage cutting-edge technologies to process vast amounts of data in real-time, enabling autonomous drones to identify, classify, and track objects of interest with unprecedented accuracy and speed.

“Objects of interest,” here, is an obvious euphemism and a typical evasion for “people the drones will try to kill.” Having a human in the loop is moving from a technological necessity (because the drones aren’t good enough to make their own decisions) to an ethical quandary. In January of 2023, the Department of Defense released a directive on Autonomy in Weapon Systems insisting that humans stay in the loop when it comes to lethal force. But how tight is that loop? And it is doubtful that other countries and organizations will follow such ethical guidelines.

The key difference between War Games and today’s reality is also worth exploring: distribution.

War Games is a tidy story where one big computer featuring a single artificial intelligence is the problem. All the humans tear their hair and scramble to convince the WOPR not to blow up the world.

In today’s far messier reality, we’re dealing with dozens or hundreds or thousands of artificial intelligences at wildly varying levels of capability. Persuasion as an option for resolving algorithmic misunderstandings at scale—i.e., when there are a whole lot more AIs to persuade—seems unlikely.

To this day, in my experience most science fiction does not reflect the proliferation of artificial intelligences that we’re seeing in our everyday lives. I might ask Siri one type of question (because it lives on my wrist in my Apple Watch), Google another type of question, and ChatGPT yet another. The 2013 Spike Jonze movie Her was an exception to SF’s blind spot, as are Martha Wells’ Murderbot Diaries novels and Catherine Asaro’s Major Bhaajan novels.

What will a retro future about human/AI interaction look like in another 40 years?
__________

 

Brad Berens is the Center’s strategic advisor and a senior research fellow. He is principal at Big Digital Idea Consulting. You can learn more about Brad at www.bradberens.com, follow him on Blue Sky and/or LinkedIn, and subscribe to his weekly newsletter (only some of his columns are syndicated here).

 

 

* About the image: I plugged this 38-word prompt into ChatGPT’s latest Strawberry model: “Create an image with a 1980s style large military computer on the left side of the image. On the right side of the image please have dozens of different sized and shaped military drones from the 2020s. Photorealistic.” Strawberry transformed the prompt into a remarkably expanded 261-word prompt with intricate detail, but it could not itself create the image. I then shifted to ChatGPT 4o, which created a first image. That image didn’t have enough drone variety, so I asked for more, resulting in the image above.

 

See all columns from the Center.

October 23, 2024