

Apr 26, 2023, 2:38pm EDT
politics

Congress wants to stop robots from detonating nuclear bombs

A decommissioned Titan missile in Arizona.
Flickr/Stephen Cobb

The News


Policymakers working on regulating artificial intelligence say there are far more pressing issues for lawmakers to address than the fictional doomsday of the Terminator movies, in which a rogue weapons system declares nuclear war on humanity.

But also: They should probably nip the Terminator scenario in the bud while they’re at it.


A bipartisan group of members plans to introduce the Block Nuclear Launch by Autonomous Artificial Intelligence Act, which would reinforce existing Department of Defense policy by prohibiting federal funds from being used to change policy around human involvement in the launch of nuclear weapons.

Reps. Ted Lieu, D-Calif., Don Beyer, D-Va., and Ken Buck, R-Colo., are leading the effort in the House. Sen. Ed Markey, D-Mass., will introduce a companion bill in the Senate.

“We know that the Department of Defense does have a classification of autonomous weapons that can launch automatically and we want to put into law that no matter how advanced AI gets, we will never let AI launch a nuclear weapon by itself automatically,” Lieu told Semafor.


They’re not the first to raise the issue. The National Security Commission on Artificial Intelligence, which was authorized by Congress to advise the government on related issues, recommended in its final report that the U.S. “clearly and publicly” affirm that only humans can authorize the employment of nuclear weapons.


Kadia’s view

There are broader plans to regulate AI, and while members haven’t decided what that looks like just yet, there seems to be an appetite to prioritize national security.


The Department of Defense already has a policy that addresses human involvement in decisions around the launch of nuclear weapons, but the desire to codify it signals that Congress is taking the technology more seriously as ChatGPT makes daily headlines.


Members are introducing the bill the same day that Speaker Kevin McCarthy and Minority Leader Hakeem Jeffries are convening a bipartisan briefing by MIT professors on AI.

“This bill is a part of my effort to be as forward looking as possible when it comes to AI and potentially catastrophic applications like nuclear weapons,” Lieu said. “We are still looking towards how best to regulate the industry in general but when it comes to nuclear war, we cannot wait. Humans always need to be a central part of the nuclear launch process.”


Room for Disagreement

Some industry experts worry that obsessing over Terminator-like scenarios is overtaking common sense when it comes to assessing AI risk. “The sky is not falling, and Skynet is not on the horizon,” Daniel Castro, director of the Center for Data Innovation at the industry-funded Information Technology and Innovation Foundation, said in a statement last month criticizing calls for a six-month pause on AI development.


Notable

  • Rep. Ro Khanna, D-Calif., talked to Semafor’s Reed Albergotti this week about his work on AI issues in Congress, where he’s tried to encourage the industry’s growth while looking out for potential excesses to correct. “There has to be a balance,” Khanna said. “There was a reflexive celebration of tech without seeing its blind spots. Then the pendulum swung and there was no recognition of all the good that tech does.”