Autonomous Weapon Systems: The Urgent Need for Regulation
This opinion piece is part of the ‘Global Governance Futures 2025’ programme, which brings together young professionals to look ahead and recommend ways to address global challenges.
There are many in the world who do not support the development and use of self-determining weapons endowed with artificial intelligence. The arguments made by these critics are based on ethical, legal, moral and policy concerns. I cannot claim to be able to make a case for or against all their contentions, but as a lawyer, I am prepared to say that it is possible to have autonomous weapon systems (AWS) comply with international law, particularly the law of armed conflict. And if these technologies are indeed the future of our conflict zones, we need to start regulating their development, production and use now.
AWS, or ‘killer robots’ as they are popularly known, are understood to be weapon systems capable of selecting targets and using force without any human intervention beyond the programming phase. They are different from automated or semi-autonomous weapon systems, which execute pre-programmed actions in relatively controlled environments.
As we examine autonomous weaponized technologies for the far-reaching repercussions they are expected to have on the well-being of populations and on the world order, we find that most governments, especially those that do not (yet) possess the relevant technical expertise, lack the experience or knowledge needed to advocate for responsible laws and policies governing their use and possession. Yet nearly all are engaging vigorously in discussions on AWS. As recently as November 2014, states party to the United Nations Convention on Certain Conventional Weapons discussed the need for deliberations on lethal autonomous weapon systems, which raise critical legal and ethical questions. This is because, regardless of their prevailing levels of expertise, countries realize that without regulation of the production and use of such technologies, they are a time bomb waiting to explode, one that will change the character of warfare for good. As such, demands to ban the development, production and use of AWS are also becoming more resolute.
Although we have, as of now, no evidence that any fully autonomous weapon systems are in use or even in existence, with rapid technological advances in the field and an active concentration of interest and funding in the area, the possibility of their coming into being is very real. Given that simple forms of AWS, including active protection systems for ground vehicles, already exist, and that major powers are actively funding the development of systems such as the United States’ X-47B unmanned combat air system, China’s Lijian (Sharp Sword) and Russia’s Mikoyan-Skat, all of which possess varying degrees of autonomy, it is plausible to imagine a future where AWS are commonly employed in warfare.
And why not? Despite an enduring body of law defining the law of armed conflict, violations of international humanitarian law by human beings are frequent. Advocates of AWS propose that if robots could be programmed to respect international humanitarian law, humanity would stand a better chance of prevailing in war zones where survival instincts and urges to kill and rape often get the better of soldiers. Moreover, it is very unlikely that states will consider banning AWS from combat at a time when the potential of such weaponry has yet to be fully comprehended. Critics, on the other hand, argue that by advocating for AWS, we expose ourselves to the risks of proliferation, of such weapons falling into the wrong hands, and of other threats we cannot even envisage at the moment. They also argue that international humanitarian law in its current form is, in all likelihood, insufficient to carry us through increasingly complex, futuristic situations.
As machine learning improves, I can imagine an incremental application of autonomous systems in different contexts, including as tools for conducting and controlling wars. Further, as we enter an age in which states’ needs to protect their territory and nationals are multiplying, I suspect that the use of autonomous systems in zones of conflict is very unlikely to be banned.
The time, then, is ripe to propose a new legal framework that regulates not just the use, but also the production, accumulation and distribution of AWS. In the current environment of uncertainty and imbalance of interests and knowledge, the law has a tremendous opportunity either to level the playing field or to perpetuate asymmetry. As long as we keep in sight the shortcomings that may creep in, particularly given the dynamic and unanticipated nature of the technology, while abiding by the obligation to reinvent and the commitment to gradually evolve codes of conduct for AWS based on the cardinal principles of the law of armed conflict, the development, production and use of AWS can be meaningfully regulated.
So how should we go about engineering such a legal framework? Decisions on the nature of laws, the specificity of their provisions and the frequency of revisions must be preceded by constructive dialogue at the international as well as the domestic level. First, meaningful legislation will require shared definitions of categories or thresholds of autonomy (formalizing terms such as ‘fully autonomous’ and ‘semi-autonomous’) and of types of technology, which will itself prove a contentious and protracted process. Further, a joint understanding of the principle of ‘meaningful human control’ vis-à-vis AWS is required to facilitate the development and enforceability of new international law.

Second, in determining the legality of weapon systems, both weapons law and targeting law will need examination. The extant body of international humanitarian law, particularly Additional Protocol I to the 1949 Geneva Conventions as well as customary international law, already contains rules against the use of indiscriminate weapon systems and those that have the potential to cause superfluous injury. Similarly, the four principles of the law of armed conflict, namely distinction, proportionality, military necessity and limitation, are sufficient to guide the application of AWS.

Third, to address questions of accountability, such a framework will have to include specific standards of liability for the producers of AWS as well as for the commanders who authorize their use. In addition, well-defined and mutually agreed standards of care to be observed by all parties involved in the manufacture, use and transfer of such technology would need to be spelled out, along with the liability actions that could be brought in case of related omissions.
There is merit in expediting the establishment of this legal framework by enacting new laws that address areas of concern apropos AWS, and by affirming the relevance and applicability of appropriate existing laws now, while AWS are still at an early stage of evolution. In doing so, we can ensure that the designers and users of military robots develop and deploy their technology in ways that conform to legal standards. It could also forestall a situation in which lobbies with vested interests campaign for laws that legalize or validate extant weapon systems. Such an international legal framework, combined with domestic state norms and best practices, can go a long way toward ensuring compliance.
Visible competition in the development of sophisticated weaponry is better than the clandestine AWS arms race taking place today. The more we try to deny the existence and development of such systems and attempt to ban them outright, the more we risk behaving like the proverbial ostrich burying its head in the sand at the first sign of danger. What is needed now is a regulatory framework that squarely addresses the very concerns that make the development, production and use of AWS objectionable to so many important voices.
We have the legal principles in hand; it is now time to facilitate the formation of further laws applying exclusively to AWS, so that the harm they have the potential to cause can be limited and their advantages reaped.
Swati Malik serves as Legal Officer at the United Nations Mission in the Republic of South Sudan. She is a fellow of the Global Governance Futures (GGF 2025) program. The views expressed herein are those of the author and do not necessarily reflect the views of the United Nations.