Giving an AI control of nuclear weapons: What could possibly go wrong?



By Zachary Kallenborn | February 1, 2022


The “nuclear football” follows the president on trips. It allows the president to authorize a nuclear launch. 

If artificial intelligences controlled nuclear weapons, all of us could be dead.

That is no exaggeration. In 1983, Soviet Air Defense Forces Lieutenant Colonel Stanislav Petrov was monitoring nuclear early warning systems when the computer concluded, with the highest confidence, that the United States had launched a nuclear attack. But Petrov was doubtful: The computer estimated that only a handful of nuclear weapons were incoming, when such a surprise attack would more plausibly entail an overwhelming first strike. He also didn’t trust the new launch detection system, and the radar system didn’t have corroborating evidence. Petrov decided the message was a false positive and did nothing. The computer was wrong; Petrov was right. The false signals came from the early warning system mistaking the sun’s reflection off the clouds for missiles. But if Petrov had been a machine, programmed to respond automatically when confidence was sufficiently high, that error would have started a nuclear war.


https://thebulletin.org/2022/02/giving-an-ai-control-of-nuclear-weapons-what-could-possibly-go-wrong…