A United Nations meeting this week in Geneva is gathering experts on "killer robots" to wrestle with the legal, policy and ethical issues involved in lethal autonomous weapons — and what role humans will play on the high-tech battlefield of the future.
The second Expert Meeting on Lethal Autonomous Weapons Systems will also consider whether so-called killer robots ought to be permitted in some capacity or banned altogether, CNN reports.
The talks are being held by the United Nations Convention on Certain Conventional Weapons, which has protocols covering non-detectable fragments, mines and booby traps, incendiary weapons, blinding lasers and the explosive remnants of war, CNN notes.
The potential for autonomous systems is huge, Defense One reports.
"Humans are far from perfect, and autonomous systems can help increase effectiveness and accuracy, mitigate against accidents, and even prevent some deliberate war crimes," Defense One writes.
"Given current technological developments, a blended approach that uses autonomy for some tasks and keeps humans 'in the loop' for others is likely to be the best approach on the battlefield."
Though no government has publicly declared it's building autonomous weapons, "there are several reasons why they might start," Defense One reports, including the already widespread use of defensive systems that protect ships, bases, and civilians from aircraft and missiles.
Professor Ron Arkin of Georgia Tech in Atlanta argues killer robots should be regarded more as the next generation of "smart" bombs, CNN reports.
But robots can, and have, failed "quite badly," Defense One notes. "[A]utonomous weapons could perform perfectly 99.99 percent of the time but, in the few instances where they did fail," the failures could be severe, as in 2003, when the U.S. Patriot air defense system shot down two friendly aircraft, killing the pilots.
"Without a human 'in the loop,' an autonomous system that was malfunctioning — or hacked by an enemy — could keep engaging targets until it ran out of ammunition or was destroyed. … In such a scenario, a system failure — caused either by a malfunction, unanticipated interaction with the environment, or a cyber attack — raises the frightening prospect of mass fratricide on the battlefield."
Determining who's responsible for such frightening foul-ups presents even greater challenges.
"In an autonomous weapon, the decision about which specific targets to engage is no longer made by the military professional, but by the system itself … " Defense One writes. "[T]he warfighter is responsible for placing the autonomous weapon into operation, but it is the engineers and programmers who designed the system who are responsible for target selection."
© 2025 Newsmax. All rights reserved.