The comparison seems to be everywhere these days. “It’s like nuclear weapons,” a pioneering researcher in artificial intelligence has said. Top AI executives have compared their product to nuclear power. And a group of industry leaders warned last week that AI technology could pose an existential threat to humanity, on par with nuclear war.

People have been comparing AI advances to the splitting of the atom for years. But the comparison has become starker amid the launch of AI chatbots and calls by AI creators for national and international regulation, just as scientists called for guardrails governing nuclear weapons in the 1950s. Some experts worry that AI will kill jobs or spread misinformation in the short term; others fear that hyper-intelligent systems could eventually learn to write their own computer code, break the bonds of human control, and perhaps decide to wipe us out. “The creators of this technology tell us they are concerned,” said Rachel Bronson, president of the Bulletin of the Atomic Scientists, which tracks man-made threats to civilization. “The creators of this technology are calling for governance and regulation. The creators of this technology tell us to pay attention.”

Not all experts think the comparison fits. Some point out that the destructiveness of atomic energy is kinetic and proven, while the danger of AI to humanity remains highly speculative. Others argue that almost all technologies, including AI and nuclear power, have advantages and risks. “Tell me a technology that can’t be used for bad, and I’ll tell you a completely useless technology that can’t be used for anything,” said Julian Togelius, a New York University computer scientist who works on AI.

But the comparisons have become so frequent that it can be hard to tell whether doomsayers and proponents are talking about artificial intelligence or nuclear technology. Take the quiz below to see if you can tell the difference.

The quotes above are only a sampling of the debate over AI and nuclear technology. They capture parallels, but also some notable differences: fears of imminent, fiery destruction from atomic weapons, and the fact that advances in AI today are mostly the work of private companies rather than governments.

But in both cases, some of the same people who brought the technology into the world are sounding the alarm loudest. “This is about managing the risks of advancing science,” Bronson, president of the Bulletin of the Atomic Scientists, said of AI. “This is a major scientific breakthrough that requires attention, and there are so many lessons to be learned from nuclear space in that. And you don’t have to equate them to learn from them.”
