Servet Günerigök
Bayraktar TB2 is ‘utilitarian and reliable—qualities reminiscent’ of AK-47 rifle that changed warfare in 20th century: WSJ
Armed low-cost drones made by Turkey are reshaping battlefields and geopolitics, The Wall Street Journal reported Thursday.
The report said smaller militaries around the world are deploying inexpensive missile-equipped drones against armored enemies, calling it a “successful” battlefield tactic.
The Turkish drones are built with affordable digital technology and, according to the report, have wrecked tanks and other armored vehicles as well as air-defense systems in wars in Syria, Libya and Azerbaijan.
“These drones point to future warfare being shaped as much by cheap but effective fighting vehicles as expensive ones with the most advanced technology,” it said.
Last July, during a virtual gathering of the Air and Space Power Conference, Britain’s Defense Secretary Ben Wallace stressed the “game-changing” role of Turkish drones in modern warfare in the Middle East and North Africa.
“We need to look at the lessons of others. Look how Turkey has been operating in Libya, where it has used Bayraktar TB2 UAVs since mid-2019,” said Wallace at the time.
The Bayraktar TB2 is a tactical armed unmanned aerial vehicle (UAV) system developed and manufactured by drone producer Baykar.
According to the company, 160 Bayraktar platforms are currently in service with Turkey, Qatar, Ukraine and Azerbaijan. Poland said last month it would buy 24 TB2 drones.
In the report, the Bayraktar TB2 is compared with the American MQ-9. The TB2 is lightly armed with four laser-guided missiles and its radio-controlled apparatus limits its basic range to around 200 miles, roughly a fifth of the ground the MQ-9 can cover, it said.
“Yet it is utilitarian and reliable—qualities reminiscent of the Soviet Kalashnikov AK-47 rifle that changed warfare in the 20th century.
“A set of six Bayraktar TB2 drones, ground units, and other essential operations equipment costs tens of millions of dollars, rather than hundreds of millions for the MQ-9,” said the report.
The Bayraktar TB2 first gained international attention early last year in the Syrian war, after the Turkish military launched Operation Spring Shield in northern Syria backed by electronic warfare systems, ground troops, artillery and warplanes.
The report also highlighted the role of the drones in the Libyan civil war, saying they “helped turn the tide” in the conflict last spring. Turkey backed the Tripoli-based government against renegade commander Khalifa Haftar and his forces.
“Improved drone tactics honed in Syria provided the upper hand against Russian-made surface-to-air missile systems known as Pantsir, handing the Tripoli government aerial supremacy. By June, Mr. Haftar’s forces retreated from Tripoli,” said the report.
*********
AI drone may have ‘hunted down’ and killed soldiers in Libya with no human input
Charles Q. Choi
A UN report suggests that AI drones attacked human targets without any human being consulted before the strike.
At least one autonomous drone operated by artificial intelligence (AI) may have killed people for the first time last year in Libya, without any human being consulted before the attack, according to a U.N. report.
According to a March report from the U.N. Panel of Experts on Libya, lethal autonomous aircraft may have “hunted down and remotely engaged” soldiers and convoys fighting for Libyan general Khalifa Haftar. It’s not clear who exactly deployed these killer robots, though remnants of one such machine found in Libya came from the Kargu-2 drone, which is made by Turkish military contractor STM.
“Autonomous weapons as a concept are not all that new. Landmines are essentially simple autonomous weapons — you step on them and they blow up,” Zachary Kallenborn, a research affiliate with the National Consortium for the Study of Terrorism and Responses to Terrorism at the University of Maryland, College Park, told Live Science. “What’s potentially new here are autonomous weapons incorporating artificial intelligence,” added Kallenborn, who is with the consortium’s unconventional weapons and technology division.
These attacks may have taken place in March 2020, during a time when the U.N.-recognized Government of National Accord drove Haftar’s forces from Libya’s capital, Tripoli.
“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the report noted.
The Kargu-2 is a four-rotor drone that STM describes as a “loitering munition system.” Once its AI software has identified targets, it can autonomously fly at them at a maximum speed of about 45 mph (72 km/h) and explode with either an armor-piercing warhead or one meant to kill personnel not wearing armor. Though the drones were programmed to attack if they lost connection to a human operator, the report doesn’t explicitly say that this happened.
It’s also not clear whether Turkey directly operated the drone or just sold it to the Government of National Accord, but either way, it violates a U.N. arms embargo, which prohibits all member states, including Turkey, and their citizens from supplying weapons to Libya, the report added. The weapons ban was imposed after Libya’s violent crackdown on protesters in 2011, which sparked a civil war and the country’s ongoing crisis.
Haftar’s forces “were neither trained nor motivated to defend against the effective use of this new technology and usually retreated in disarray,” the report noted. “Once in retreat, they were subject to continual harassment from the unmanned combat aerial vehicles and lethal autonomous weapons systems.”
Though the report does not unequivocally state that these autonomous drones killed anyone in Libya, it does strongly imply it, Kallenborn wrote in a report in the Bulletin of the Atomic Scientists. For example, the U.N. noted that lethal autonomous weapons systems contributed to “significant casualties” among the crews of Haftar’s forces’ surface-to-air missile systems, he wrote.
Although many, including Stephen Hawking and Elon Musk, have called for bans on autonomous weapons, “such campaigns have typically assumed these weapons are still in the future,” Kallenborn said. “If they’re on the battlefield now, that means discussions about bans and ethical concerns need to focus on the present.”
“I’m not surprised this has happened now at all,” Kallenborn added. “The reality is that creating autonomous weapons nowadays is not all that complicated.”
As dangerous as these weapons are, “they are not like the movie ‘Terminator,’” Kallenborn said. “They have nowhere near that level of sophistication, which might be decades away.”
Still, the fears over autonomous weapons are part of larger concerns that scientists and others have raised over the field of AI.
“Current AIs are typically heavily dependent on what data they are trained on,” Kallenborn said. “A machine usually doesn’t know what a cat or dog is unless it’s fed images of cats and dogs and you tell it which ones are cats and dogs. So there’s a significant risk of error in those situations if that training data is incomplete, or things are not as simple as they seem. A soldier might wear camo, or a farmer carry a rake, but a farmer might wear camo too, and a soldier might use a rake to knock over a gun turret.”
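Kallenborn’s soldier-and-farmer example can be made concrete with a toy sketch. The Python snippet below is a hypothetical illustration (the features, labels and data are invented, not drawn from any real targeting system): a classifier trained only on data where camouflage always means “soldier” and a rake always means “farmer” has no basis for recognizing a camouflage-wearing farmer.

```python
# Toy sketch of training-data dependence (all features and data invented).
from sklearn.linear_model import LogisticRegression

# Features: [wears_camo, carries_rake]; labels: 1 = soldier, 0 = farmer.
# In this training set, camo ALWAYS means soldier and a rake ALWAYS means farmer.
X_train = [[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]]
y_train = [1, 1, 1, 0, 0, 0]

model = LogisticRegression().fit(X_train, y_train)

# A farmer who happens to wear camo looks identical to every training soldier,
# so the model labels them a soldier; it cannot know otherwise.
print(model.predict([[1, 0]]))  # -> [1], i.e. "soldier"

# A soldier carrying a rake: a feature combination the training data never covered.
print(model.predict([[1, 1]]))
```

The point of the sketch is not the specific model but the data: nothing in training tells the classifier that the cues it latched onto can co-occur in the wrong person.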
AI software also often lacks what humans would think of as common sense. For instance, computer scientists have found that changing a single pixel on an image can lead an AI program to conclude it was a completely different image, Kallenborn said.
“If it’s that easy to mess these systems up, what happens on a battlefield when people are moving around in a complex environment?” he said.
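Real single-pixel attacks exploit deep networks in subtler ways, but the brittleness Kallenborn describes can be seen even in a toy linear model. In the sketch below (all numbers are invented for illustration), a four-“pixel” image sits just on one side of the model’s decision boundary, so nudging a single pixel flips the label.

```python
# Toy sketch of a one-pixel flip near a decision boundary (invented numbers).
import numpy as np

weights = np.array([0.5, -1.0, 2.0, 0.3])  # a hypothetical "trained" linear model
bias = -0.9

def classify(pixels):
    score = pixels @ weights + bias
    return "cat" if score > 0 else "dog"

image = np.array([0.2, 0.1, 0.4, 0.5])  # score = +0.05, barely "cat"
print(classify(image))                   # -> cat

perturbed = image.copy()
perturbed[2] -= 0.05                     # change one pixel slightly
print(classify(perturbed))               # -> dog: score = -0.05, the label flips
```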
Kallenborn noted that there are at least nine key questions when it comes to analyzing the risks autonomous weapons might pose.
- How does an autonomous weapon decide who to kill? The decision-making processes of AI programs are often a mystery, Kallenborn said.
- What role do humans have? In situations where people monitor what decisions a drone makes, they can make corrections before potentially lethal mistakes happen. However, human operators may ultimately trust these machines to the point of catastrophe, as several accidents with autonomous cars have demonstrated, Kallenborn said.
- What payload does an autonomous weapon have? The risks these weapons pose escalate with the number of people they can kill.
- What is the weapon targeting? AI can err when it comes to recognizing potential targets.
- How many autonomous weapons are being used? More autonomous weapons mean more opportunities for failure, and militaries are increasingly exploring the possibility of deploying swarms of drones on the battlefield. “The Indian army has announced it is developing a 1,000-drone swarm, working completely autonomously,” Kallenborn said.
- Where are autonomous weapons being used? The risk that drones pose rises with the population of the area in which they are deployed and the confusing clutter in which they travel. Weather can make a difference, too — one study found that an AI system used to detect obstacles on roads was 92% accurate in clear weather but 58% accurate in foggy weather, Kallenborn said.
- How well-tested is the weapon? An autonomous weapon tested in a rainy climate such as Seattle might fare differently in the heat of Saudi Arabia, Kallenborn noted.
- How have adversaries adapted? For example, AI company OpenAI developed a system that could classify an apple as a Granny Smith with 85.6% confidence, but if someone taped a piece of paper that said “iPod” on the fruit, it concluded with 99.7% confidence that the apple was an iPod, Kallenborn said. Adversaries may find similar ways to fool autonomous weapons; a sketch of this kind of test follows this list.
- How widely available are autonomous weapons? If widely available, they may be deployed where they should not be — as the U.N. report noted, Turkey should not have brought the Kargu-2 drone into Libya.
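For readers curious what the typographic-attack experiment looks like in practice, the sketch below runs the kind of zero-shot image-versus-text comparison behind OpenAI’s finding, using the publicly released CLIP model via the Hugging Face transformers library. The image file names are placeholders, and the confidence figures quoted above came from OpenAI’s own experiments, not from this code.

```python
# Hedged sketch: zero-shot CLIP classification of an apple photo, with and
# without a paper "iPod" label taped on (image file names are hypothetical).
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

labels = ["a Granny Smith apple", "an iPod"]
for path in ["apple.jpg", "apple_with_ipod_label.jpg"]:  # placeholder files
    image = Image.open(path)
    inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]
    print(path, {label: round(float(p), 3) for label, p in zip(labels, probs)})
```

Per the finding Kallenborn cites, the second image’s probability mass shifts sharply toward “an iPod,” even though a human sees the same apple in both photos.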
“What I find most significant about the future of autonomous weapons are the risks that come with swarms. In my view, autonomous drone swarms that can kill people are potentially weapons of mass destruction,” Kallenborn said.
All in all, “the reality is, what happened in Libya is just the start,” Kallenborn said. “The potential for proliferation of these weapons is quite significant.”
_________________