Thursday, November 21, 2024

How drone systems are changing warfare

By Nana Brink

At first glance, the YouTube presentation by DARPA looks a lot like an amateur video. Young men – some in camouflage – holding tablets are flying drones. About the size of a man’s palm, they whirr around like a swarm of birds – changing direction on a dime, suddenly dispersing, then reconvening. Cut. The clip then shows the target area, a square. Cut. Hundreds of black dots move onto the square. Cut. The young men look deadly serious. The dots then dissipate. The men smile.

DARPA stands for Defense Advanced Research Projects Agency, and the project the young men in the Pentagon research department are showcasing is the “next generation in autonomous warfare,” according to the usually well-informed journal Jane’s International Defence Review in a November 2019 cover story. The project – Offensive Swarm-Enabled Tactics (OFFSET) – is reportedly one of more than a hundred programs in the civil and military sectors of the US that are working under great time pressure on the development and refinement of drone-swarm technology. DARPA alone has allocated around $2 billion “to develop the next wave of artificial intelligence (AI) technologies.”

This next wave is subject to the highest level of secrecy. The competition over autonomous weapons systems has been underway for quite some time. Drone experts such as Paul Scharre estimate that it will only be a few years before autonomous weapons – also called LAWS, or Lethal Autonomous Weapons Systems – become reality. Critics often use the term “killer robots” to refer to drones as well as unmanned submarines and aircraft.

The advent of AI in weapons technology about a decade ago has completely altered the future of warfare. The cutting edge has long since moved beyond self-propelled robots and remotely piloted flying objects; the vanguard now focuses on systems that act fully independently. LAWS constitute the third revolution in warfare technology: After the invention of gunpowder and the atomic bomb comes the ability of humans to place the decision over who should live and who should die into the hands of autonomously operating machines.

The question is: Are LAWS already available for deployment? Or is the DARPA video still a bit of science fiction? The answer is both. Or, in the words of Jane’s author Andrew White: “The ability to conduct unmanned swarming operations from the air, land and sea continues to gather pace as armed forces seek advanced autonomous technologies to overcome adversaries.”

The development of drones has recently made one fact abundantly clear: The deployment of drones is no longer the sole domain of the great powers. Non-state actors such as terrorist groups and militias are now making use of these “Kalashnikovs of the air,” which are increasingly unleashed in swarms of “killer robots.”

What transpired on Jan. 6, 2018, at the Russian-operated Khmeimim Air Base in Syria was not all that different from the content of the DARPA video. In the early morning hours, a swarm of drones suddenly appeared on Russian Air Force radars. Two days later the Russian defense minister announced that seven drones had been shot down and the remaining six brought under control.

“Islamic extremists” are believed to have planned the attack on the air base in the west of the country. Yet they must have had help. Russian President Vladimir Putin has steered suspicions toward the US, as the technology “could only come from a country that commands a high degree of technological prowess.”

The drones did not reach their target, but Russia’s defense strategy overshot its goal as well. Just a few days after the attack, resourceful reporters from The Daily Beast published an article speculating on the origin of the drones. Examples of projectiles of almost identical construction – roughly two meters wide, controlled via GPS and able to carry explosives – had surfaced on the messaging platform Telegram.

This messaging app, also capable of encrypted correspondence, is popular among IS supporters as well as IS sympathizers and terrorist groups. The drones, which can also be seen in photographs held by the Russian defense ministry, appear to be rather simply constructed. The explosives were fixed to the body of the drones using adhesive tape. These “killer bees” seem to have been cobbled together in a garage from an off-the-shelf drone kit for a few thousand dollars.

But it was not the technology that alarmed military experts; it was rather the swarming. Never before had so many drones been deployed in concert. And the approach used in the attack on Khmeimim Air Base appears to have caught on: Sept. 14, 2019, saw a similar attack on two Saudi Arabian oil processing facilities in Abqaiq and Khurais. Houthi rebels from Yemen claimed responsibility for that strike; their weapons of choice were dozens of “kamikaze drones” that homed in on their targets and crashed into them with precision.

A UN report by the Panel of Experts on Yemen in January 2019 unambiguously showed how drones are manufactured in Iran using Chinese and German motors. These particular drones – referred to as UAV-X by UN weapons experts – can fly 155 mph for over 900 miles. Depending on wind conditions and general capacity, they can carry up to 40 pounds of explosives. And they can also fly en masse.

The swarm itself is the actual weapon. But what makes it so effective that great powers with massive defense industries, such as the US, China, Russia, the UK, Israel and South Korea, are sinking hundreds of millions of dollars into the development of their LAWS?

A glance at the sky can help explain. A swarm of birds seems to rely on instinct to coordinate its movements – a genuinely self-organizing system. And one with several advantages: All members of the swarm are equal; there is no lead bird. They fly without impeding their fellow flyers. Even if parts of the swarm fall away, the remaining mass carries on its trajectory. Put in military terms: While a single aircraft can be shot out of the sky, eliminating an entire swarm is far more difficult.
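The self-organization described above is commonly modeled with Craig Reynolds’ classic “boids” rules – separation, alignment and cohesion – in which each member reacts only to its neighbors and no leader exists. The following sketch is purely illustrative of that principle; it is not drawn from any of the military systems discussed in this article, and all names and constants in it are assumptions.

```python
# Illustrative "boids" model of a self-organizing swarm (after Reynolds, 1987).
# Each member follows three local rules; there is no lead bird, and the
# function works for any remaining number of members (n >= 2).

def step(positions, velocities, dt=0.1):
    """Advance every swarm member one time step in 2D."""
    n = len(positions)
    new_velocities = []
    for i, ((x, y), (vx, vy)) in enumerate(zip(positions, velocities)):
        # Cohesion: steer gently toward the center of the other members.
        cx = sum(p[0] for j, p in enumerate(positions) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(positions) if j != i) / (n - 1)
        vx += 0.01 * (cx - x)
        vy += 0.01 * (cy - y)
        # Alignment: drift toward the average heading of the others.
        ax = sum(v[0] for j, v in enumerate(velocities) if j != i) / (n - 1)
        ay = sum(v[1] for j, v in enumerate(velocities) if j != i) / (n - 1)
        vx += 0.05 * (ax - vx)
        vy += 0.05 * (ay - vy)
        # Separation: veer away from any member that comes too close.
        for j, (px, py) in enumerate(positions):
            if j != i and (px - x) ** 2 + (py - y) ** 2 < 1.0:
                vx -= 0.1 * (px - x)
                vy -= 0.1 * (py - y)
        new_velocities.append((vx, vy))
    new_positions = [(x + vx * dt, y + vy * dt)
                     for (x, y), (vx, vy) in zip(positions, new_velocities)]
    return new_positions, new_velocities
```

Because every rule is purely local, removing members changes nothing structurally – the remaining swarm simply carries on, which is exactly the military appeal described above.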

But how far are manufacturers from completing the development of autonomous drone swarms? Despite the effectiveness of large drones such as the American MQ-9 Reaper – which can fly 300 mph and, armed with Hellfire missiles, was used in the January 2020 assassination of Iranian General Qassim Soleimani – the trend is toward mini-drones such as the Perdix (also the name of a genus of partridges).

Developed by the Pentagon’s Strategic Capabilities Office (SCO), the merely 300-gram, 3D-printed Perdix drones demonstrated in 2016 that flying in swarms is possible. According to William Roper, then director of the SCO: “Due to the complex nature of combat, Perdix are not pre-programmed synchronized individuals, they are a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature.” The SCO claims the small drones are not “kamikaze robots,” but serve rather as instruments of reconnaissance. Like their namesakes in nature, the Perdix drones fly under the radar.

However, it does not require much creativity to imagine how the allegedly harmless “surveillance partridges” could mutate into “killer bees” with explosives under their wings. The viral 2017 video Slaughterbots, produced by AI researcher Stuart Russell together with the Future of Life Institute, shows what this could look like. The 8-minute drama depicts swarms of mini-drones that, by way of AI, become killer machines, using pre-programmed information to recognize and exterminate political opponents and student protesters.

The Future of Life Institute, a self-proclaimed independent research institute in Boston – which has boasted Stephen Hawking and Tesla CEO Elon Musk among its members – is among the most prominent opponents of the further development of LAWS. It demands not only a ban on autonomous weaponry, but indeed an end to collaboration between the military and the private sector.

In a 2018 protest note, Google employees came out against this cooperation: “We believe that Google should not be in the business of war.” And in July 2018, the Future of Life Institute issued a public appeal, signed by leading scientists and businesses in the field of AI, with an unambiguous message: “We call upon governments and government leaders to create a future with strong international norms, regulations and laws against lethal autonomous weapons.”

While UN Secretary-General António Guterres has deemed autonomous weapons “politically unacceptable and morally repugnant” and calls for their ban under international law, the probability of a treaty in the near future is dismally low. The issue of arms control in our current political climate is a nonstarter. Since 2014, the 125 signatory states of the UN Convention on Certain Conventional Weapons (CCW) have been debating drone technologies in Geneva. But in August 2019, government experts were only able to agree to extend the ongoing talks for another two years. The main point of dispute is the regulation, championed by experts on international law, that a human must always have ultimate control of the operation of a weapons system. This is known as the principle of “meaningful human control.”

Although several non-aligned states are pleading for a ban, large countries like Russia and China block any and all attempts at a ban in order to continue the ongoing development of their LAWS. The US is also generally against such a UN provision and makes no secret of its intent to continue using AI in weapons technology without any restrictions. This stance was justified in its Summary of the 2018 Department of Defense Artificial Intelligence Strategy, noting that America’s strategic competitive advantage was at stake: “Other nations, particularly China and Russia, are making significant investments in AI for military purposes. The United States must adopt AI to maintain its strategic position.”

Germany, which along with France advocates a “conciliatory solution,” has targeted a LAWS prohibition in its coalition agreement. Yet as John Reyels of the foreign ministry’s arms control division stressed at a conference hosted by the Green-Party-affiliated Heinrich Böll Foundation, “the optimal case would be a ban, but this is not attainable.” Accordingly, neither Germany nor France is currently urging a ban, aiming instead for what is called a “minimal consensus.” What form this will ultimately take remains to be seen.

All signs point to an intensified arms race, with no end in sight. Or, as Stephen Hawking wrote in his posthumously published book Brief Answers to the Big Questions: “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”

 

THE HISTORY OF DRONE OPERATIONS

 

  • At the beginning of the 1990s, the US deploys the first UAVs (unmanned aerial vehicles) for surveillance during Operation Desert Storm and the Yugoslav Wars.
  • After the attacks of Sept. 11, 2001, the armed drone Predator is deployed in Afghanistan to pursue Taliban leaders.
  • Under US President Barack Obama, targeted assassinations using drones become US policy. The CIA carries out drone strikes in Pakistan, Somalia and Yemen using a “kill list.” The Bureau of Investigative Journalism (TBIJ) documents 563 drone attacks between 2008 and 2016 in the three countries, none of which was or is at war with the US. Civilian casualties from the strikes are estimated at between 384 and 807. According to US government data, between 2,400 and 2,600 militants were killed in these attacks. More precise totals are difficult to establish.
  • Since 2014, restrictions on LAWS have been under negotiation within the framework of the Convention on Certain Conventional Weapons (CCW). Talks have thus far been unsuccessful.
  • Since 2015, terrorist groups have been deploying drones (in Syria, Lebanon and Yemen).
  • In 2019, drones strike Saudi Arabian oil fields. Houthi rebels claim responsibility for the attack.
  • On Jan. 3, 2020, the US uses a drone to kill Iranian General Qassim Soleimani in Iraq – the first time a high-ranking military leader is killed by a drone strike on the soil of a third country.

NANA BRINK
is a Berlin-based freelance journalist for various newspapers and a radio reporter and moderator (Deutschlandfunk). She focuses on global politics and security policy. She is a member of WIIS.de (Women in International Security Deutschland).

Security Briefs