The head of a U.S. software company said Ukraine’s use of its artificial intelligence technology has forced countries around the world to view the advancements — and their impact on the battlefield — with more urgency.
Speaking at the first international summit on the use of AI by the military, Palantir Technologies co-founder and CEO Alex Karp said the discussion around AI in warfare is no longer just a scholarly ethical debate.
“This has now shifted to: your ability to identify the right technology and implement it will determine what happens on the battlefield,” he said at the REAIM summit in The Hague, organized by the Dutch government and co-hosted by South Korea.
“One of the major things we need to do in the West is realize this lesson is completely understood by China and Russia,” Karp said.
Karp said earlier this month that Palantir, a data analytics company, is “responsible for most of the targeting in Ukraine.” Its services can analyze satellite and social media feeds and help visualize an enemy’s position, allowing Ukraine to target tanks and artillery.
“We are at the very, very early days of AI,” he said Wednesday, warning that governments need to look at what portion of their defense budgets are going to AI advancements, especially as the conversation around AI has shifted in the past six months.
Fifty countries are represented at the two-day conference, where a host of experts, government officials, and military leaders discussed the need for ethics and responsibility as the use of AI becomes more commonplace in armed conflict.
“AI will inevitably change how militaries operate,” U.S. Under Secretary for Arms Control and International Security Bonnie Jenkins said.
Many have called for an international treaty on the use of AI in the military.
“To prevent abuses, we need to establish international guidelines,” Dutch Foreign Affairs Minister Wopke Hoekstra said. “It is crucial that we take action now.”
Room for Disagreement
During a panel discussion with the Dutch chief of defense and a Lockheed Martin executive, the secretary general of Amnesty International pushed back on the idea that it’s possible to remove bias from AI when used in a military context, asserting that it can lead to certain groups being targeted more than others.
“Where have you been for the past 70 years? Wars are dirty. Biases are part of warfare,” Agnès Callamard of Amnesty said. “There is not one clean war. Wars are full of biases. You are creating a technology that can encode biases.”
“We cannot just think in this room that there are only good people that are going to use this intelligence for defense.”