Ethical questions abound as wartime AI ramps up

Paris (AFP), April 10, 2024

Artificial intelligence's move into modern warfare is raising concerns about the risks of escalation and the role of humans in decision-making. AI has shown itself to be faster, but not necessarily safer or more ethical.

UN Secretary General Antonio Guterres said Friday that he was "profoundly disturbed" by Israeli media reports that Israel has used AI to identify targets in Gaza, causing many civilian casualties.

Beyond the "Lavender" software in question and Israeli denials, here is a tour of the technological developments that are changing the face of war.
But such systems can only produce probabilities, and experts warn that mistakes are inevitable.

AI also operates at the tactical level. Swarms of drones -- a capability China appears to be developing rapidly -- will eventually be able to communicate with each other and interact according to previously assigned objectives.

At the strategic level, AI will produce models of battlefields and propose responses to attacks, perhaps even including the use of nuclear weapons.
"The reaction time is significantly reduced. What a human can do in one hour, it can do in a few seconds," he said.

Iron Dome, the Israeli anti-air defence system, can detect an incoming projectile and determine what it is, where it is headed and the potential damage.

"The operator has a minute to decide whether to destroy the rocket or not," said Laure de Roucy-Rochegonde of the French Institute of International Relations.

"Quite often it's a young recruit, twenty years old and not very up-to-speed on the laws of war. One can question how meaningful his control is," she said.
Humans "take a decision which is a recommendation made by the machine, but without knowing the facts the machine relied on", de Roucy-Rochegonde said.

"Even if it is indeed a human who presses the button, this lack of knowledge, as well as the speed, means that his control over the decision is quite tenuous."

AI "is a black box. We don't necessarily understand what it knows or thinks, or how it arrives at these results", said Ulrike Franke of the European Council on Foreign Relations.

"Why does AI suggest this or that target? Why does it give me this piece of intelligence rather than another? If we allow it to control a weapon, it's a real ethical question," she said.
But "the real game changer is now -- Ukraine has become a laboratory for the military use of AI", Accorsi said.

Since Russia invaded Ukraine in 2022, both sides have begun "developing and fielding AI solutions for tasks like geospatial intelligence, operations with unmanned systems, military training and cyberwarfare", said Vitaliy Goncharuk of the Defense AI Observatory (DAIO) at Hamburg's Helmut Schmidt University.

"Consequently the war in Ukraine has become the first conflict where both parties compete in and with AI, which has become a critical component of success," Goncharuk said.
In January, researchers from four American institutes and universities published a study of five large language models (systems similar to the generative software behind ChatGPT) in simulated conflict situations. The study suggested a tendency "to develop an arms race dynamic, leading to larger conflicts and, in rare cases, to the deployment of nuclear weapons".

But the major global powers want to make sure they win the military AI race, complicating efforts to regulate the field. US President Joe Biden and China's President Xi Jinping agreed in November to put their experts to work on the subject. Discussions also began 10 years ago at the United Nations, but without concrete results.

"There are debates about what needs to be done in the civil AI industry," Accorsi said. "But very little when it comes to the defence industry."
All rights reserved. Copyright Agence France-Presse. Sections of the information displayed on this page (dispatches, photographs, logos) are protected by intellectual property rights owned by Agence France-Presse. As a consequence, you may not copy, reproduce, modify, transmit, publish, display or in any way commercially exploit any of the content of this section without the prior written consent of Agence France-Presse.