
From Predators to Replicators: Evolution of drones and ethical side of autonomous warfare

Concerned about a possible conflict with China, the Pentagon is looking to a new kind of weapon to counter the larger People’s Liberation Army: drones in very large numbers.

In August 2023, the US Defense Department introduced the Replicator, a programme aimed at deploying thousands of affordable, potentially AI-driven systems, including self-piloting ships, large robotic aircraft and swarms of small kamikaze drones. The goal is to use these low-cost machines in large numbers to overwhelm Chinese forces.

Earlier this month, two key Pentagon offices revealed that they had selected four innovative weapons manufacturers, out of more than 100 applicants, for a new drone programme: Anduril Industries, Integrated Solutions for Systems, Leidos Dynetics and Zone 5 Technologies. The new robot planes will represent a change from the Defense Department’s old drones, which the Defense Innovation Unit (DIU) describes as “too complex” and “difficult to build”.

Test flights for this programme are scheduled for later this year.

The companies working on the ‘Enterprise Test Vehicle’ (ETV) must show that their drone can fly over 500 miles (about 805 kilometres) and carry a ‘kinetic payload’. The emphasis is on creating weapons that are affordable, quick to produce and modular. This was highlighted in a 2023 request for proposals (RFP) and a recent announcement from the Air Force Armament Directorate and the DIU, the Pentagon’s branch for speeding up new technology adoption. Many experts believe the ETV initiative is connected to the Replicator programme.

Selecting ‘Loitering Munitions’

The aim, according to information from The Intercept, is to select one or more types of drone, often called kamikaze drones or ‘loitering munitions’, that can be mass-produced quickly when needed. It is not clear whether all the prototypes will be kamikaze drones, which are designed to fly into their targets and detonate, delivering their payload on impact. Unlike traditional, reusable drones, these are single-use and intended for specific missions where precision and direct impact are critical. They are often used against high-value assets and can be launched from various platforms on the ground or in the air.

The latest drones will probably be smaller than the MQ-1 Predator and MQ-9 Reaper drones that were widely used as missile-firing weapons during the early war on terror. For the past 25 years, uncrewed Predators and Reapers, controlled by military personnel on the ground, have been responsible for civilian deaths in countries such as Afghanistan, Libya, Syria and Yemen.

The new drones will be more versatile and will include a version that can be dropped or launched from cargo planes. But the obvious risk is that they could be used far more widely, and experts are concerned that mass-producing new, low-cost, lethal drones will result in more civilian deaths.

Priyanka Motaparthy, director of the Project on Counterterrorism, Armed Conflict and Human Rights at Columbia Law School’s Human Rights Institute, told The Intercept that the main danger is that these drones could be used on a much larger scale, raising concerns about harm to civilians. The key questions, she said, are whether the drones could endanger civilians and how those risks will be evaluated.

Human-Operated Drones: Tragic Mistakes

US drones have traditionally needed human operators to carry out deadly strikes, and this has often led to tragic mistakes. However, progress in artificial intelligence is making it more likely that uncrewed aircraft in various countries’ militaries will be able to choose their own targets. Russia’s electronic jamming in the Ukraine war has already pushed a move towards autonomous drones, which can stay on target and complete their mission even if they lose contact with their human operators.

Last year, the Ukrainian drone company Saker announced that its fully autonomous Saker Scout drone was using AI to identify and attack 64 different types of Russian military targets. Ukraine has been using up to 10,000 inexpensive drones each month to counter the Russian military’s larger forces. Pentagon officials view Ukraine’s drone strategy as a model for countering China’s larger military. Deputy Secretary of Defense Kathleen Hicks, who is involved in the programme, has said that the Replicator was built to counter China’s biggest advantage, its sheer numbers.

Last month, the Pentagon announced plans to speed up deployment of the Switchblade-600, an AeroVironment-built kamikaze anti-armour drone. This drone, which has been widely used in Ukraine, loiters overhead until it locates a target. Adm. Samuel Paparo, commander of the Indo-Pacific Command (INDOPACOM), stated that this was an essential move to provide the necessary capabilities at the required scale and speed.

The Switchblade-600 is a versatile loitering munition that can fly as far as 40 kilometres and loiter for more than 40 minutes. Weighing approximately 54 pounds (about 24.5 kg), it is equipped with a high-explosive warhead and advanced optics. This portable system offers precision strikes with minimal collateral damage, enhancing battlefield effectiveness. The ‘600’ in Switchblade-600 refers to its capability to carry a 600-series warhead, a larger and more powerful payload than that of its predecessor, the Switchblade-300, making it suitable for destroying more substantial and fortified threats on the battlefield.

At a recent NATO conference, Alex Bornyakov, Ukraine’s Deputy Minister for Digital Transformation, spoke of using AI and a network of sound sensors to target a Russian ‘war criminal’ with an autonomous drone. According to him, the computer vision involved is effective and has already been demonstrated.

Debate over Autonomous Weapons

The use of autonomous weapons has been debated for more than ten years. Since 2013, the ‘Stop Killer Robots’ campaign, which now includes over 250 organizations, such as Amnesty International and Human Rights Watch, has been pushing for a legally binding treaty banning autonomous weapons. The campaign aims to prevent the development and use of fully autonomous weapons that can make decisions to kill without human intervention, and to establish international law prohibiting such technology.

The Pentagon’s rules from last year say that fully and semi-autonomous weapons must follow the laws of war and the Department of Defense’s AI Ethical Principles. The guidelines released in 2020 say only that people should use “good judgement and care” when creating and using AI. However, “care” has never been America’s strong point. For the past 100 years, the US military has carried out airstrikes that often harm civilians. It has mistakenly targeted ordinary people, ignored reports of civilian casualties, dismissed those deaths as “unfortunate, but inevitable”, and failed to take steps to prevent similar incidents or hold its own personnel responsible.

In the first 20 years of the war on terror, the US carried out over 91,000 airstrikes in seven major conflict zones: Afghanistan, Iraq, Libya, Pakistan, Somalia, Syria and Yemen. According to a 2021 report by Airwars, a UK-based group that tracks airstrikes, these attacks killed up to 48,308 civilians.

Many Missed Deadlines for Reports

The Defense Department often fails to meet its deadline for reporting the number of civilians killed in US operations each year, which is the most basic level of accountability. Its report for 2022 was released this April, a year late.

The Pentagon has also defaulted on the May 1 deadline for submitting its 2023 report to Congress. Last month, The Intercept asked Lisa Lawrence, the Pentagon spokeswoman who handles civilian-harm issues, about the delay in the 2023 report and when it is expected to be released. According to The Intercept, a return receipt shows that she read the e-mail but did not respond.

At least one of the new drone models will be mass-produced for the military, depending on evaluations by Special Operations Command, INDOPACOM and other groups. According to the DIU, the winner or winners of the competition will be tasked with advancing the development of a production model that can be manufactured quickly and at scale.

Motaparthy is concerned about increasing drone use without proper accountability. She points out that the Pentagon still does not have a reliable system for tracking civilian harm caused by past US military actions, and worries that, as the use of drones expands, safety measures may be overlooked.

(The author of this article is a Defence, Aerospace & Political Analyst based in Bengaluru. He is also Director of ADD Engineering Components, India, Pvt. Ltd, a subsidiary of ADD Engineering GmbH, Germany. You can reach him at: girishlinganna@gmail.com)

Disclaimer: The views expressed in this article are of the author only.

 
