The Importance of ‘Team Human’ When Mixing AI and Nuclear Weapons
Last week I had the pleasure of moderating a panel on the burgeoning role of artificial intelligence in nuclear weapons.
They let me give some opening remarks at the top to set the tone, and I wanted to share a transcript of my speech with you fine readers. In the speech, I begged the audience, which included sitting American politicians and nuclear policy experts, to say yes to Team Human in the face of pressure to adopt AI technologies into nuclear command, control, and communications systems.
I borrowed this phrase from writer Douglas Rushkoff. It’s the title of one of his books, the name of a podcast, and an all-around philosophy about how to live in the digital age. In short: go with Team Human. It’s an adage I’ve tried to keep in mind as I continue to report on the intersection of nukes and AI.
My transcript is below:
Let me first thank the Center for Arms Control and Non-Proliferation for putting together this event.
Second, let me thank their wonderful tech people for getting us all online and looking our best on short notice.
Ain’t the future grand. Here we all are on a conference call, able to keep working on the problem of nuclear weapons despite losing our venue. A government shutdown cannot hold us back and we can thank technology for that.
But I think we all know that a Zoom call is not quite as good as an in-person meeting, and a lot is lost when a signal changes from analog to digital. There’s something to be said for trying to keep things as human as possible.
That’s a theme of the panel today. Saying yes to Team Human in the face of overwhelming pressure, salesmanship, and not a little fear. We live in an age of miracles and wonder and horror. Artificial Intelligence, in all its varied forms, is here.
I’m going to guess that everyone in this room has either played around with it or knows someone who has. And I am not just talking about LLMs, those systems like ChatGPT and Claude that will have a conversation with you and act as a frontend to much of the web. No, AI is doing a lot more. If you’ve had a document looked over by spellcheck in the past few years, you have used an AI system. I’m pretty skeptical of the tech, but I do use an AI transcription service. It turned an eight-hour job into an eight-minute one. I can’t argue with those results.
But, of course, AI is not just helping us with office work. The IDF is using it to pick targets in Gaza. Palantir has an AI system it says will collate battlefield data and help commanders make fast decisions in an active war zone. SOCOM just bought its first round of AI-controlled turrets.
And so, inevitably, it comes up. This idea that is almost as old as nuclear weapons themselves.
What happens when we mix AI and nukes?
There is, of course, that nightmare scenario that has played out in countless works of fiction. A thinking machine weighs the outcomes and, having done the math, decides to push the nuclear button and end all human life on Earth.
It’s a ludicrous proposition, automating the apocalypse. But it’s not as if no one is talking about it. If you go looking, you will find serious policy proposals from serious think tanks and pundits who say it’s a good idea. Perimeter, Russia’s so-called Dead Hand, exists. Fail-deadly systems are attractive to some.
But I ask – as we talk – that you also think smaller. There are a thousand little ways that AI will be integrated into the world’s nuclear weapons. It may be as simple as a machine learning system scanning repair reports in a Missouri missile silo to automatically order parts or as grand as trading out weary human eyes on early warning systems for tireless, algorithm-connected sensors.
And I want us to remember our history. We have avoided nuclear war because we’ve been lucky. But, also, because we’ve been human.
When a leader orders a nuclear strike, there are dozens of individual humans that sit between that decision and the ultimate launch. A radar tech sees something odd. Commanders relay orders. Missileers turn keys. The responsibility is spread.
Everyone here knows all the stories: the multiple times we’ve come to the brink of nuclear war and stepped back because a human being decided not to pull the trigger.
And this is the real danger of nuclear weapons at the dawn of AI. We shouldn’t worry that AI will make a decision to kill us all. We should worry that, justified by efficiency, in the name of modernization, and in the pursuit of lucrative defense contracts … AI will replace the next Stanislav Petrov.
And so we are here today to grapple with the specifics of this problem. We are here today to talk about what it means to keep a human in the loop. But I also want us to remember why we should keep a human in the loop.
The stakes are high. And humans, for all our myriad faults, have time and time again walked to the very brink and balked. Machines do not balk.
Let me introduce you to the people who are going to help us grapple with this.
Jesse Kirkpatrick is a professor at George Mason University. He is here to walk us through the technical aspects of AI integration with nuclear command, control, and communication.
Then we’ll go to Jacquelyn Schneider. She is the director of the Hoover Wargaming and Crisis Simulation Initiative at Stanford University. She’s going to talk to us about the risks associated with incorporating AI into NC3.
And we’ll end by trying to answer the question: What do we do about all this? It’s a big question, but I know that Cindy Vestergaard will help us navigate it. Vestergaard is the director of Converging Technologies and Global Security at the Stimson Center.
And with that, let’s kick to Jesse.
The entire conference is worth watching. The panel on AI is good, but the second topic on “Using Diplomacy to Avoid Nuclear War” had me on the edge of my seat. Laicie Heeley, the founder of Inkstick Media, moderated it and put questions before Joe Cirincione, Mallory Stewart, and Jon Wolfsthal.
It’s a bad time for people who work on arms control, report on nuclear issues, and hope to live in a world free of nuclear weapons. The trend has been negative for a long time, but my sense from policy people and experts has long been that they earnestly believe things will normalize if we can survive the present moment.
That’s been going on for a decade. Relief washed through my body as I listened to Cirincione — a former guest of the show and a nuclear policy expert — admit some hard truths out loud. Relief may be an odd emotion to feel when hearing something so dire, but his candor was refreshing.
“We are losing. We haven’t just lost, we are losing the structures that we built up — conservatives and liberals, Democrats and Republicans — over all these years,” he said of the state of nuclear arms control. “Those have crumbled. Those are being actively dismantled as we speak. Our goal has to be to minimize those losses, preserve what we can preserve, and prevent the very worst from happening.”
And he made it clear that it happened, in part, because he and other people like him failed. “We have to analyze our own failures,” Cirincione said. “Arms control is not dead, but it is mostly dead. Could these agreements have been stronger? Was our strategy wrong? Why didn’t the Biden administration make any changes at all to our nuclear policies, our nuclear posture, our nuclear budgets? What went wrong there? We don’t have an understanding of that yet. We have to first look at our failures.”
Then he pointed out that nuclear weapons had fallen into the hands of authoritarian leaders. “What we’re doing isn’t working … things are different now … the vast majority of nuclear weapons are controlled by authoritarian leaders,” Cirincione said. “In my view, it is likely that Donald Trump is gonna consolidate this authoritarian regime. What does that mean for nuclear policy? Nuclear doctrine? Most of the people writing about the new nuclear age haven’t even touched this issue, but we have got to consider it.”
Wolfsthal of the Federation of American Scientists was just as candid. “The MAGA ambition to remake America in a very different image is ascendant, so it’s going to be very hard to imagine other forces agreeing to work with them on something lasting,” he said. “I think the time has come for the United States and the Democratic Party to think about shadow government. A shadow secretary of defense, a shadow secretary of state. A shadow national security advisor. Drawn from the Congress, it will be a focal point to talk about these issues.”
As we move through a new nuclear age and a new nuclear arms race, the people who work against Armageddon must change and reckon with the failures of the past. It did my heart good to hear Cirincione and Wolfsthal say as much.