When J. Robert Oppenheimer learned, along with the rest of the world, that Hiroshima had been attacked, he began to regret his role in building the atomic bomb. During a later meeting with President Harry S. Truman, Oppenheimer wept and expressed that regret; Truman called him a crybaby and said he never wanted to see him again. Christopher Nolan hopes that when Silicon Valley audiences see his account of these events in his film "Oppenheimer" (out July 21), they'll see something of themselves there, too.
After a screening of Oppenheimer at the Whitby Hotel yesterday, Nolan was joined by a panel of scientists, as well as Kai Bird, co-author of American Prometheus, the book the film is based on, to talk about the movie. The audience was mostly scientists, who laughed at the film's jokes about physicists' egos, but there were also a few journalists, myself included.
We listened to some too-brief debates about the success or failure of nuclear deterrence, and current Los Alamos director Thom Mason talked about how many current lab employees have cameos in the movie, since much of it was shot nearby. But toward the end of the conversation, Meet the Press host Chuck Todd asked Nolan what he hoped Silicon Valley would learn from the film. "I think what I want them to take away is the concept of accountability," he told Todd.
"Applied to artificial intelligence? That's a terrifying possibility. Terrifying."
Then he clarified: "When you innovate through technology, you have to make sure there is accountability. The rise of companies over the past 15 years has come with constant use of words like 'algorithm' without knowing what they mean in any meaningful, mathematical sense. They just don't want to be held accountable for whatever that algorithm does. It's about accountability. We have to hold people accountable for what they do with the tools they have."
While Nolan didn't mention any specific companies, it's not hard to see what he's talking about. Companies like Google, Meta, and even Netflix rely heavily on algorithms to acquire and retain audiences, and that reliance often has unforeseen, sometimes heinous consequences. Perhaps the most notable and truly horrific example is Meta's contribution to the genocide in Myanmar.
“At least a cautionary tale.”
Right now, while an apology tour is all but guaranteed in the days after a company's algorithm does something horrible, the algorithm itself sticks around. Threads even launched with an exclusively algorithmic feed. Sometimes a company might give you a tool to turn it off, as Facebook did, but these black-box algorithms persist, with little said about all the potentially bad outcomes and plenty said about the good ones.
"When I talk to leading researchers in artificial intelligence, they literally call this their Oppenheimer moment," Nolan said. "They're looking to his story to ask what responsibilities scientists have when developing new technologies that can have unintended consequences."
“Do you think Silicon Valley is thinking that now?” Todd asked him.
"They say that," Nolan replied. "And that's," he chuckled, "that's helpful. At least it's in the conversation. I hope that thought process continues. I'm not saying Oppenheimer's story offers any easy answers to these questions. But at least it serves as a cautionary tale."