With the ability to hover over a target at high altitudes while awaiting the opportune moment to strike, a “kamikaze” drone acts as a faceless assassin that can be deployed anywhere in the world.
UVision Air Ltd, an Israeli company specializing in lethal loitering systems, aka “kamikaze” or “suicide” drones, has now entered the US market.
The company’s semi-autonomous Hero line of loitering munitions systems can loiter above a target and strike precisely when the opportunity arises, even if the target appears for only an instant.
Kamikaze drones have been “evolving” in Israel since the mid-1970s, according to Aviation Week, and they are used by military, police, and paramilitary units all over the world for reconnaissance and combat missions, according to Newsweek.
Today, users and producers of suicide drones in the US include companies such as Raytheon and AeroVironment, and UVision is already working with the former.
“UVision has partnered with a few US Companies and developed a very successful cooperation with Raytheon, for the adaptation of our advanced lethal loitering systems to the specific demands of the US Army,” said UVision CEO Noam Levitt, in a statement on February 7, 2019.
“The new company [UVision-USA] will enable us to further improve our high level, rapid response to our US customers, and we hope in the future also to broaden our production operations in the US, which will provide both local jobs and transfer of technology (TOT),” he added.
With its entrance into the US market, UVision-USA will act as an independent US company and will help improve accessibility and availability for US customers, according to a UVision press release.
The Defense Advanced Research Projects Agency (DARPA) is envisioning a future of warfare where soldiers engaged in urban combat will interact with upwards of 250 autonomous robots through a military tactic known as swarming.
Read More: DARPA envisions soldiers swarming with 250 robots simultaneously in urban combat
Swarming is a tactic in which a military unit engages an adversary from all directions simultaneously, either with fire or in force, according to a report by the RAND National Defense Research Institute.
Swarm tactics depend on devolving power to small units and on a capacity to interconnect those units, which has only recently become feasible thanks to the information revolution.
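To make the geometry of the tactic concrete, here is a minimal, purely illustrative Python sketch; every name and number in it is my own assumption rather than anything from DARPA or RAND. A set of agents is spread evenly around a target and each closes in at the same time, so the “adversary” faces pressure from all directions at once.

```python
import math

def swarm_positions(n_agents, radius, target=(0.0, 0.0)):
    """Place n_agents evenly on a circle around the target,
    so a converging 'attack' arrives from all directions at once."""
    positions = []
    for i in range(n_agents):
        angle = 2 * math.pi * i / n_agents
        x = target[0] + radius * math.cos(angle)
        y = target[1] + radius * math.sin(angle)
        positions.append((x, y))
    return positions

def step_toward(pos, target, speed):
    """Move one agent a fixed distance straight toward the target."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

# 250 hypothetical agents (the figure DARPA envisions) start 10 units out
# and converge on the target simultaneously over 20 time steps.
agents = swarm_positions(250, radius=10.0)
for _ in range(20):
    agents = [step_toward(a, (0.0, 0.0), speed=0.5) for a in agents]
```

The point of the toy model is simply that coordination, not individual firepower, is what overwhelms a defender: every agent acts on the same simple rule, yet the defender must answer all of them at once.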
In October 2018, Raytheon, the American company working with Israel’s UVision, delivered the first Space Enabled Effects for Military Engagements, or SeeMe, satellite to DARPA. The satellite, assembled on the company’s advanced missile production lines, will provide greater situational awareness to soldiers on the ground.
“Ground troops can’t always get immediate access to the larger, military and commercial satellites,” said Dr. Thomas Bussing, Raytheon Advanced Missile Systems vice president, in a statement at the time.
“These smaller, SeeMe satellites will be dedicated to soldiers, providing them with real-time images from space when they’re needed most,” he added.
The connections between DARPA, Raytheon, and UVision point to a future of warfare where soldiers and autonomous drones swarm together — forming a trustworthy partnership on the battlefield.
But that’s not even the freaky part. As weird as it sounds, DARPA has also demonstrated humans telepathically piloting drones through a brain-computer interface (BCI).
Read More: Brain-computer interface allows for telepathic piloting of drones
“As of today, signals from the brain can be used to command and control … not just one aircraft but three simultaneous types of aircraft,” said Justin Sanchez, Director of the Biological Technologies Office at DARPA, at the Agency’s 60th-anniversary event in Maryland on September 6, 2018.
“The signals from those aircraft can be delivered directly back to the brain so that the brain of that user [or pilot] can also perceive the environment. It’s taken a number of years to try and figure this out,” he added.
Reading into the chain of events, it appears that DARPA wants soldiers communicating with autonomous drones in real time on the battlefield. The soldiers wouldn’t necessarily have to operate the drones manually; rather, the communication would be for coordination and intelligence purposes.
Read More: A program to keep Prometheus out of machine learning systems
Add autonomous kamikaze drones into the mix and you get a cocktail of deep ethical concerns that may not go down smoothly with human rights advocates.
Anything with the power to kill will obviously raise ethical and human rights concerns. Robotic features, in some form or another, have appeared in weapons systems used in military operations going back to at least the First World War.
According to the Council on Foreign Relations (CFR), former US President Barack Obama conducted over 540 drone strikes for “counterterrorism” purposes, but the report does not mention kamikaze drones specifically.
Hugh Gusterson, Professor of Anthropology and International Affairs at George Washington University, wrote in his study on “Drone Warfare”:
“Drone strikes in Yemen, Somalia and Pakistan take the form of pure drone warfare. Here drones operate outside generally recognized warzones and hunt their prey alone, or in conjunction with small networks of spies or special forces on the ground who help develop targets. Although the leaders of Yemen and Pakistan gave quiet permission for many of these strikes, they denied that they had done so and even, on occasion, publicly condemned them.”
“In 2016, the Obama Administration finally released an official estimate that drone strikes in Somalia, Yemen, Pakistan, and Libya had killed between 64 and 116 civilians, though this claim was met with widespread skepticism by journalists who had been covering the drone campaign.”
Needless to say, drones are a coveted asset among the world’s militaries, but perhaps the gravest concern among human rights advocates is the prospect of drones being used as Autonomous Weapons Systems (AWS), that is, becoming fully autonomous in a “fire and forget” scenario.
Right now, UVision’s Hero line of kamikaze drones is “semi-autonomous”: a human operator makes the ultimate call on whether or not to kill. A fully autonomous kamikaze drone, by contrast, would be so complex that a human operator could not understand what the drone is calculating.
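One loose way to picture that distinction is as a human-in-the-loop gate in the engagement logic. The sketch below is hypothetical and hedged: it is not UVision’s actual control software, and every function name and threshold in it is an assumption for illustration only.

```python
def engage(target_confidence, operator_approves):
    """Illustrative human-in-the-loop gate: the system may identify and
    track a target on its own, but only an explicit human decision
    releases the weapon. Removing the operator check is, roughly, what
    'fully autonomous' or 'fire and forget' would mean."""
    THRESHOLD = 0.9  # hypothetical confidence bar before the operator is even asked

    if target_confidence < THRESHOLD:
        return "abort: low confidence"
    if not operator_approves():
        return "hold: awaiting human authorization"
    return "engage"

# Example: the system is confident, but the human operator declines,
# so nothing happens.
print(engage(0.95, operator_approves=lambda: False))
```

The ethical debate is essentially about whether that middle check stays in the loop, and whether a human could meaningfully exercise it when the system’s internal reasoning is opaque.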
Read More: ‘We paid little attention to vulnerabilities in machine learning platforms’: DARPA
In her master’s thesis, “Autonomous weapon systems that decide whom to kill,” Erika Steinholt Mortensen of the Arctic University of Norway gives a scenario of a potential AWS drone strike going wrong:
“The AWS arrives at the destination, and starts to seek after the target. As the AWS approaches it, you realize that the interface is too comprehensive for you to fully understand what is going on in its software. Consequently, the operation may fail because of one of the following scenarios: […]”
Read More: ‘AI will represent paradigm shift in warfare’: WEF predicts an Ender’s Game-like future
According to the World Economic Forum (WEF) Global Risks Report 2017, we will see a shift towards AWS, whose attacks “will be based on swarming, in which an adversary’s defense system is overwhelmed with a concentrated barrage of coordinated simultaneous attacks.”
RAND warns that “swarming may also be welcomed by many actors around the world as a way to reshape global competition and assemble social forces to overturn the existing order of world power led by the United States.”
Throughout history, swarm tactics have been used by everyone from Genghis Khan to Alexander the Great, and in major military operations up to the present day. The difference between past and present is the rapid advancement of technology.
“For American political and military leaders, understanding the rise of swarming should lead to reappraisals of both our mass-oriented, industrial-age way of war, and of the statist focus of our diplomacy. In the future, we shall have to learn to fight nimbly against an array of armed adversaries who will likely do all they can to avoid facing us head-on in battle,” according to RAND.