Technological warfare has already reached us

A report published by the United Nations Security Council suggests that, for the first time, a military drone may have autonomously attacked humans


Although the use of autonomous drones seems like something out of an action movie, the reality is that they could be used as lethal weapons in war, and the cost of using them could be too great for society: despite their advanced technology, they are prone to irreparable mistakes.


Despite international efforts to ban autonomous drones, a report published by the United Nations Security Council, covering the period from October 2019 to January 2021, states that they have already been used during the Libyan Civil War. According to the document, the drone in question was the Kargu-2, produced by Defense Technologies Engineering and Trade Inc.


The first recorded case of such an attack took place in March 2020, when Haftar Affiliated Forces described being attacked with drones by the Government of National Accord, which sought to gain an advantage on the battlefield. The more than 500-page document reads: “Logistics convoys and retreating Haftar Affiliated Forces were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.”


It is later noted that, because of the use of these drones as weapons, the Haftar Affiliated Forces suffered significant casualties, as "they did not have any real protection against remote air attacks." This is the first time the use of autonomous drones has been documented; the drones were reportedly supplied by Turkey to the Libyan forces, in violation of the UN arms embargo on Libya, under which no member state may provide arms to the country.


Kargu-2: Why is it a threat?


The Kargu-2, from the Turkish arms company Defense Technologies Engineering and Trade Inc., better known as STM, was first introduced in 2017. It is a rotary-wing attack drone and loitering munition system designed, according to the company, for use in asymmetric warfare and counterterrorism operations.


It can attack static or moving targets using its onboard processing capabilities and machine-learning algorithms. It weighs approximately 7 kg, can stay in the air for 30 minutes, and flies at up to 145 km/h, and newer generations of the drone have been improved to be more efficient.


These drones can be operated manually or autonomously through a coordinate system: the operator chooses the location the drone will travel to, and once there it executes the actions for which it was sent. In the military this process is known as "fire and forget," meaning that once the operator launches the drone, it does all the work on its own while the operators relocate or carry out other activities.


Experts are concerned that the artificial intelligence these weapons operate on could fail and cause irreparable havoc, since it is the weapons themselves that decide who lives or dies. This raises the question of how drones could distinguish between combatants and civilians, which has led to a series of complex ethical conversations.


For this reason, ways of prohibiting these lethal autonomous weapons have been sought; however, world powers such as Russia, the United States, South Korea, Israel, and Australia have blocked UN negotiations to ban them.


What do the experts say?


Prominent figures have taken positions on the use of autonomous weapons. In an interview with The Daily Telegraph, Brad Smith, president of Microsoft, stated that killer robots are "unstoppable" and that using them carries risks on the scale of nuclear weapons, so he believes a new Geneva Convention adapted to the technological world is needed to establish norms for the use of these weapons that protect civilians and soldiers.


For his part, Elon Musk, CEO of Tesla, believes that artificial intelligence could turn against us, which is why he has spoken out against the use of autonomous weapons. Along with 116 experts, he signed a letter in 2017 asking the UN to ban these weapons and the use of autonomous killer robots.


“Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will allow armed conflicts to be fought on a larger scale than ever before, and on timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We don't have much time to act. Once this Pandora's box is opened, it will be difficult to close,” the letter concludes.


Organizations campaigning against these weapons have also been created, such as the Campaign to Stop Killer Robots, a coalition of non-governmental organizations (NGOs) that has worked since 2012 for a complete ban on autonomous weapons.
