“Killer robots”: will they be banned?

These are not the drones that deliver your online order. Loaded with cameras, sensors and explosives, their mission is to drive themselves to a target with an algorithm in the driver’s seat. They destroy themselves along with the target, leaving only a pile of electronic debris behind.

Increasingly, weapons like these appear in manufacturers’ promotional material rather than in science fiction films. Starting today, a United Nations conference of 80 countries is meeting in Geneva to debate whether to ban them or at least regulate them more strictly.

Machines that kill humans

Autonomous weapons are, as the name suggests, capable of selecting and attacking targets by themselves. This is unlike piloted drones and other weapons, which are directed by a human operator from afar. Weapon manufacturers are taking advantage of the latest advances in artificial intelligence and machine learning to develop them.

The UN conference calls them “lethal autonomous weapon systems”. Critics call them killer robots. They can take the form of drones, land vehicles or submarines.

Some countries want to ban autonomous weapons, arguing that an algorithm should never decide about life or death. Other countries want autonomous weapons to be regulated, with more or less binding rules of engagement that include some role for human decision-making.

Great powers stand in the way

The UN has met twice a year since 2014 to discuss the issue. The United States, Russia and China are the strongest opponents of an outright ban on autonomous weapons systems or binding rules to govern their use.

Russia blocked the latest meeting, which was scheduled for March, by refusing to accept the agenda. At the time, the Russian invasion of Ukraine was only a few weeks old.

Autonomous weapons will change the future of warfare

Autonomous war crimes

“If an autonomous weapon makes a mistake and possibly commits a war crime, who is responsible?” asks Vanessa Vohs, who researches autonomous weapons at the University of the German Armed Forces in Munich.

For Vohs, responsibility is one of several open questions.

The Geneva meetings do not seem close to answering many of them, and Russia’s war in Ukraine has added to the uncertainty. For some, the war is further evidence that autonomous weapons should be banned. Others see it as yet another sign that banning them is hopeless.

“There is evidence that Russia is using autonomous weapons in this conflict,” said Ousman Noor, who works for the Campaign to Stop Killer Robots. The NGO wants to see these weapons banned. “This could lead to recognition of the urgent need to regulate these weapons before they are sold around the world.”

The US has reportedly sent the Ukrainian military several tactical “kamikaze” drones that can be remotely piloted or find their own target.

Artificial intelligence experts have long warned how easily small armed drones, which any IT student could program, can be produced in large numbers.

“Without the need for a person to service these weapons, you can send tens of thousands, if not millions, of them,” Stuart Russell, an AI researcher, told DW. “We are creating weapons with the potential to be more lethal than an atomic bomb.”

An explosive ordnance disposal and search team deploys a Harris T7 multi-mission robotic system

More money for German armed drones

The war in Ukraine has prompted countries to spend more on their militaries, including investing in the latest weaponry. The extra 100 billion euros ($102 billion) that Germany is borrowing to supplement its defense budget may go in part to buy fleets of armed drones or other advanced weapons systems that use AI.

Observers of the Geneva talks have said Germany’s representatives there have so far been reluctant to take a clear position. Few believe the multilateral discussion will result in a ban or any binding rules.

With reports that autonomous weapons are already being deployed in combat, there is a growing sense of urgency to find a solution.

“That’s why we need new rules,” Vohs said, “before we find ourselves in an apocalyptic scenario where something really goes wrong.”

This article was originally written in German.
