
The Block Nuclear Launch by Autonomous AI Act is Bad Legislation

Ohio-class submarine. Image Credit: Creative Commons.

Congress is again on the cusp of undermining the United States’ pursuit of credible nuclear deterrence. The Block Nuclear Launch by Autonomous AI Act is a bad idea. The act would codify the Department of Defense’s existing policy “by ensuring that no federal funds can be used for any nuclear weapon launch by an automated system without meaningful human control.”

This act does not enhance deterrence by ensuring more efficient, effective, and survivable nuclear command, control, and communications (NC3). Instead, it seeks to frustrate efforts across the nuclear enterprise to speed up data processing, out of a fear that machine-learning algorithms will become Skynet and wipe out humanity.

AI Can Enhance Nuclear Stability

According to the 2022 Nuclear Posture Review, current policy is to “maintain a human ‘in the loop’ for all actions critical to informing and executing decisions by the president to initiate and terminate nuclear weapon employment in all cases.” The goal of this Act is to ensure that the next president cannot change this policy.

According to the National Security Commission on Artificial Intelligence, artificial intelligence must be leveraged to “change the way we defend America and deter adversaries,” including its integration into the broader NC3 system. Moreover, “to compete, deter, and, if necessary, fight and win in future conflicts requires wholesale adjustments to operational concepts and technologies.” 

The proposed Act is bad legislation for three main reasons.

First, America must stop telling Russia and China what Washington will not do. Acts such as this undermine credible deterrence, strategic ambiguity, and stability. Credibility is the quality of being believed, and it relies on effectively communicating America’s capability to retaliate and its commitment to doing so. This very point is why the Russians developed their automated Perimeter system in the early 1980s. When activated, this system automatically launches Russian intercontinental ballistic missiles if communications are lost between missile forces and leadership. Perimeter’s deployment changed American targeting strategy, shifting our focus away from leadership targets.  

The Nuclear Posture Review states the “fundamental role of nuclear weapons is to deter a nuclear attack.” The nation averts a nuclear strike by convincing nuclear-armed adversaries not to attack in the first place. Deterrence stability depends on the “threat that leaves something to chance,” because it places that risk on the adversary. Russia, China, or North Korea must risk a devastating nuclear response if they decide to strike first.

Machine-learning algorithms that improve the speed and accuracy of the NC3 system can increase the credibility of America’s deterrent by reducing the adversary’s chance of success. This enhances stability.

Advanced Nuclear Directives

Second, the Act would constrain the automation of the NC3 system even where such automation improves the speed and accuracy of presidential decision-making. Machine-learning tools are only as good as their designers, which is true of everything in the nuclear force. NC3 is a system of systems, linking more than 100 individual systems. Any machine-learning algorithms embedded in it will have specific functions, not god-like control.

Today, advances in technology are compressing the time available to detect an attack, decide on a response, and direct forces. American planners can no longer assume a 30-minute window for Russian ICBMs to arrive, or 15 minutes for submarine-launched ballistic missiles. The advent of hypersonic glide vehicles shrinks detection time significantly.

Even if Russia, for example, believes America is likely to retaliate with a nuclear second strike, the adversary must believe the order to do so will get through. According to former Secretary of Defense James Schlesinger, “To deter you must have a threat that you are prepared to implement, and that your opponent must perceive that you are prepared to implement that threat.” 

Attack-time compression slashes the president’s decision and response time. Given the widespread vulnerability of both the bomber force and the submarine fleet, the role of machine-learning algorithms in making and transmitting rapid decisions is critical. 

Because of our growing nuclear inferiority relative to Russia, and soon China, the United States may need to develop pre-planned nuclear options that reorder the decision-making process from detect-decide-direct to decide-detect-direct, in which the president decides on a nuclear response for given situations before a crisis begins.

The use of machine-learning algorithms in this scenario is no different from an advanced medical directive used to convey a person’s desires to physicians if an individual is incapacitated. Much like Russia’s automated Perimeter system, an American NC3 system embedded with machine-learning algorithms informed by pre-planned presidential decisions can deter Russian, Chinese, or North Korean aggression by its very existence.

Outpacing the U.S. on AI

Third, if one goal of this legislation is to seek similar commitments from Russia and China, America has already failed. The Russians and the Chinese alike are rapidly integrating AI into their command-and-control systems. Their progress depends on the speed of technological advance.  

Eliminating even the possibility of an American AI-like system removes any incentive to bring the Russians and Chinese to the negotiating table. Idealist notions of leading by example are perceived as weakness in strength-based cultures. China and North Korea are growing their nuclear arsenals at alarming rates. Russia, China, and North Korea are all looking for ways to use nuclear blackmail against the United States. 

Banning the use of any artificial intelligence in the NC3 decision system does not solve any real problem. It creates a bogeyman that does not exist. 

Expecting Russia and China to follow our lead is naive at best and dangerous at worst. Chinese President Xi Jinping and Russian President Vladimir Putin will undoubtedly do what is best for their command of nuclear weapons, and we should expect no less. Passing the Block Nuclear Launch by Autonomous AI Act only allows Russia and China to further outpace the United States.   

Col. Curtis McGiffin (U.S. Air Force, Ret.) is Vice President for Education at the National Institute for Deterrence Studies and a visiting professor at Missouri State University’s Department of Defense and Strategic Studies. He spent almost three decades as a nuclear operator, including time flying NAOC. Dr. Adam Lowther is Vice President for Research at the National Institute for Deterrence Studies and the host of ANWA DC’s Nuclecast. He served in the United States Navy and spent much of his career working nuclear issues as an Air Force civil servant. Together, they have more than five decades of experience in uniform and DoD civil service.
