In July 2019, unidentified drones swarmed US Navy destroyers, triggering an alert. In May 2021, Israel reportedly used drone swarms to locate, identify, and attack Hamas militants, in what is likely the first-ever use of drone swarms in combat.
Last month, we reported that Israel had deployed a semi-autonomous robot during the recent Gaza conflict. This robot, named Jaguar, carried a machine gun and was capable of driving to a designated location, returning fire, and even self-destructing if compromised. However, a human operator was still required to initiate fire from the machine gun.
A fully autonomous drone swarm is a different level of technology altogether. It is a networked entity that is not controlled by human operators at all. Operated by artificial intelligence (AI), it can continue its mission even if it loses some drones along the way. The machine learning system is fed data sourced from satellites, reconnaissance drones, and other aerial vehicles, as well as intel collected by ground units.
Over the last couple of years, the Israel Defense Forces (IDF) have been using AI and supercomputers to identify locations of Hamas activity and plan strikes to remove any strategic advantage. According to the IDF, this has allowed them to comb through the gathered data much faster and has reduced the time needed for their missions. Unit 8200, part of Israel's Intelligence Corps, develops algorithms that combine geographical, signal, and human intelligence data to identify these strategic strike points. The IDF has not released the specifics of the autonomous swarm attack.
Israel's technological superiority has definitely helped it in the recent conflict, which reportedly saw over 4,400 rockets fired into Israel. The Iron Dome intercepted over 90 percent of them mid-air.
In March 2020, an autonomous unmanned vehicle was reportedly used in Libya to carry out attacks on Haftar Affiliated Forces. But not everybody is happy about technology-led warfare.
Human Rights Watch runs a campaign against fully autonomous weapons, which it calls “killer robots.” According to its website, “There are serious doubts that fully autonomous weapons would be capable of meeting international humanitarian law standards, including the rules of distinction, proportionality, and military necessity, while they would threaten the fundamental right to life and principle of human dignity.”
Its Campaign to Stop Killer Robots targets countries such as China, Israel, South Korea, Russia, the United Kingdom, and the United States that are engaged in the development, production, and use of fully autonomous weapons.