
The ongoing campaign against lethal autonomous weapons systems in warfare

Military applications of robotics and AI are on the verge of deployment on the battlefield. Dubbed Lethal Autonomous Weapons Systems (AWS), these could be the future of warfare, and experts continue to warn against their potential for destruction and disruption on a global scale.

AWS can include anything from unmanned missile systems and drones to tomorrow’s intelligent robot warriors: machine learning is a versatile technology that has spurred the imagination of defense strategists and arms manufacturers.

An independent database lists 284 autonomous weapons systems already operational in many countries across the globe. Notably, most are developed by the top five arms exporters: Russia, China, the United States, France and Germany. Many of these are capable of moving, identifying and engaging targets with little to no human input.

In the future, the defense industry hopes to use AI to confer more complex abilities such as planning, communicating and even establishing mission objectives. This goal is particularly troubling: AI does not reason like a human mind and may pursue different priorities, shaped by how it is programmed and the data it learns from. This could potentially lead to disproportionate attacks, logical loops and unprovoked military actions, even in the absence of a technical failure.

The discussion on the ethics of AWS is getting louder as this reality sets in. While most countries agree that AWS require some degree of “meaningful human control” or a similar safeguard, there is as yet no universal definition of AWS. Furthermore, AWS span many degrees of autonomy, and the permissibility of each is itself a major topic of the ongoing discussion.

Aware that the implications of AWS are not only technological, but also political and ethical, many notable voices are increasingly calling for international cooperation aimed at implementing autonomy limits, or an outright ban on AWS, before their deployment in real conflicts begins.

In an open letter to the UN released on August 21, 2017, Elon Musk and 115 other signatories expressed their concerns about the future of mankind, demanding stringent regulation of AWS. The release of the letter was prompted by the cancellation of a long-awaited first meeting of a UN-appointed Group of Governmental Experts on AWS.

“Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close,” warns the open letter.

Read More: Microsoft’s Acquisition of AI Startup Maluuba is like Hera’s Gift to Pandora

This is the latest of repeated calls for preventive measures against a military arms race of unmanned weaponry.

However much we might prefer to envision a zero-bloodshed future in which robots destroy each other during military conflicts, a more likely scenario is that of swarms of lethal machines at the disposal of individual or political interests. With no human cost to the offensive party, and no restriction on mass production beyond the monetary one, their potential for disrupting the post-WWII defense-oriented international stability is unprecedented.

Read More: ‘AI will represent a paradigm shift in warfare’: WEF predicts an Ender’s Game-like future

The famous Santayana quote, “Those who cannot remember the past are condemned to repeat it,” underlies the entire discussion on AWS regulation. The world is reflecting on the future of AI-controlled weapons under the shadow of past catastrophes such as Hiroshima and Nagasaki, the culmination of the last great arms race before the Cold War.

A nuclear strike is no longer seen as an easy one-hit win; since 1945, atomic bombs have served merely as deterrents to conflict, because large-scale nuclear retaliation could obliterate life on Earth. For this reason, keeping humans in the loop at every step during wartime is not only sensible but necessary.

After all, an AI programmed to win would have no qualms about pressing that big red button.



Hailing from the Caribbean coast of Colombia, Daniel is a writer and freelance translator with a background in biology. When not word-smithing, you will probably find him chasing frogs somewhere around the tropical belt.
