
The Challenges Posed by Autonomous Weapon Systems to Human Rights and Humanitarian Concerns and Relevant Legal Responses

2023-10-29 | Source: CSHRS
 
ZHANG Yunhan*
 
Abstract: Some predictions suggest that by 2050, autonomous weapon systems may replace humans as the main force on the battlefield. The development of autonomous weapon systems poses risks to human rights and humanitarian concerns and raises questions about how international law should regulate new technologies. From the perspectives of international human rights law and international humanitarian law, autonomous weapon systems present serious challenges in terms of invasiveness, indiscriminate killing, cruelty, and loss of control, which impact human rights and humanitarian principles. Against the backdrop of increased attention to the protection of human rights in China, it is necessary to clarify the existing regulatory framework and fundamental stance regarding autonomous weapon systems and proactively consider and propose countermeasures to address the risks associated with such systems. This will help prevent human rights and humanitarian violations and advance the timely resolution of this issue, which affects the future and destiny of humanity, ultimately achieving the noble goal of universal enjoyment of human rights.
 
Keywords: autonomous weapon systems · international humanitarian law · international human rights law · humanitarian concerns
 
Ⅰ. Introduction
 
The autonomous weapon system refers to any weapon system with autonomous decision-making capabilities,1 which includes a range of auxiliary elements in addition to autonomous weapons, such as sensors, decision-making units, and ammunition. By definition, it is a weapon system that can search, engage, and ultimately strike without human intervention.2 Some human rights advocates have questioned the legality of the indiscriminate use of smart weapons, as AI technology continues to be applied to the military and, in combination with military weapons, poses serious challenges to existing principles and provisions of human rights and humanitarian law.3 Although the legality of such systems has been a major concern under the Convention on Certain Conventional Weapons (CCW) for nearly 10 years,4 and the question of perfecting international humanitarian law remains under the spotlight in the international community,5 autonomous weapon systems lacking human judgment and review also pose a threat of human rights violations.
 
To date, the laws of war have come down to two categories: people and weapons. However, the autonomous weapon system fundamentally differs from people and weapons in the traditional sense. It is neither a person in the real sense nor a mindless weapon; it highly integrates the human capacity for thinking and judgment with the lethality of weapons and can even independently decide whether, when, and how to kill. If human rights norms and humanitarian protection rules are not set for future autonomous weapon systems, people will face huge issues related to human rights and humanitarian concerns.6 For the purpose of protecting the most fundamental demands, i.e., the right to life and other basic human rights, guaranteeing the lawful rights and interests of belligerents, neutral parties, and civilians in a state of war, and preventing such systems from undermining the system of international humanitarian law, human rights and humanitarian protection rules limiting autonomous weapon systems should be formulated as soon as possible. Meanwhile, we should seek broad recognition from the international community, strengthen exchanges and cooperation, and jointly promote the proper settlement, in advance, of this issue concerning the future of mankind. The current discussion of autonomous weapon systems in legal circles mainly focuses on international humanitarian law, but similar conflicts over principles are also implied in international human rights law, which guarantees certain human rights for all mankind, regardless of nationality or local law. Although research on autonomous weapons in the field of human rights is still in its infancy, a few foreign scholars have already considered it. Peter Asaro argues that the human rights to life and due process, and the restrictions that derogate from them, imply a set of specific obligations related to automation and autonomous technologies.7 In considering the possible use of force through autonomous weapons in the context of human rights, Christof Heyns believes that the analysis should not be limited to lethal force but should consider all forms of force, including lethal and less lethal force.8 James Dawes uses the term “extrapolated human rights” from a human rights perspective, building on concerns about autonomous weapon systems to predict the concerns that general artificial intelligence will trigger.9 By contrast, few domestic scholars have considered the challenges of autonomous weapon systems and the legal responses to them from a human rights perspective. Xu Shuang has examined the impact on and violation of human rights arising from the challenges brought by the use of drones as weapons.10 Under the definition of autonomous weapon systems, remotely controlled drones are not included, but Xu Shuang's work probes the negative impact of emerging technologies on the protection of human rights. Building on this research, this paper investigates the challenges posed by autonomous weapon systems to human rights and humanitarian law and explores how to deal with the impact of emerging technologies from the perspectives of international human rights law and international humanitarian law.
 
Ⅱ. The Relationship Between Human Rights and Humanitarian Concerns in Autonomous Weapon Systems
 
International humanitarian law and international human rights law are two different disciplines, but their relationship remains a legal concern, especially because of their key impact on military operations.11 Despite their different perspectives, both are committed to protecting individuals' life, health, and dignity. During armed conflict, international humanitarian law protects certain classes of people from deliberate attacks by belligerents, while international human rights law applies a standard for the use of deadly force that is stricter than the law of war. The warring parties must guarantee against arbitrary deprivation of life under conditions of armed conflict, and armed forces and organized armed groups must consider whether their use of autonomous weapon systems complies with international human rights law.
 
A. The relationship between international human rights law and international humanitarian law
 
Generally, international human rights law refers to the principles, rules, regulations, and systems of international law that promote and guarantee universal respect for and realization of the fundamental rights and freedoms of people, while international humanitarian law mainly includes the principles, rules, regulations, and systems of international law that protect the wounded, the sick, civilians, and other war victims during war or armed conflict.12 There are two mainstream views on their relationship. One holds that international humanitarian law broadly includes human rights law, and that human rights law merely represents a higher stage of development of general humanitarian law. The other is that humanitarian law is derived from the law of war, and that human rights law, as an essential part of the law of peace, takes precedence over international humanitarian law. Judging from their development, however, international humanitarian law and international human rights law complement each other, and their connections and roles are still developing.13
 
According to the International Court of Justice, international humanitarian law is a law designed to regulate hostilities, while international human rights law functions as general law. In its Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons (ICJ Reports, 1996),14 the Court affirmed the application of human rights law in situations of armed conflict and clarified that the protection provided by the International Covenant on Civil and Political Rights does not cease in time of war and that the principle that the right to life may not be arbitrarily deprived also applies to hostilities. Therefore, although international humanitarian law and international human rights law belong to different legal systems, they protect similar principles and interests. In practice, however, while human rights law applies both in peace and in armed conflict, the means to protect human rights through national or international human rights institutions during armed conflict are very limited. The parties implement international human rights law under the supervision of international human rights treaty bodies and the United Nations, whose decisions are authoritative but lack legally binding force, whereas serious violations of international humanitarian law can be punished by the International Criminal Court as war crimes.15 To sum up, the application of international human rights law in armed conflicts and its interrelationship with international humanitarian law provide a preliminary basis for shaping how autonomous weapon systems are developed and used.
 
B. The relationship of international human rights law, international humanitarian law, and autonomous weapon systems
 
The development of autonomous weapon systems has raised questions about how international law should regulate this new technology. Current military robot technology and military standards are based on the premise of “human intervention,” i.e., humans remain the final military decision-makers. But today, major militaries around the world are investing heavily in the research, development, and deployment of autonomous weapon systems, and traditional military powers have raced to develop and apply them in an arms race.16 At present, Israel's closed-loop border defense system is monitored by artificial intelligence, and the system can select the target and time of attack without human control. Besides, owing to the lack of clear international rules and effective supervision, autonomous weapon systems remain a subject of discussion within the CCW while gradually coming within reach of non-state actors, which is bound to increase the potential threat to international security. Taking into account the application of international human rights law and international humanitarian law on this basis, it should be clear that the use of autonomous weapon systems in armed conflict is subject to both bodies of law, with the former prevailing over the latter in the case of a direct conflict between the two legal systems. On the one hand, some situations in an armed conflict may involve only matters of international humanitarian law, some may involve only international human rights law, and some may fall within the intersection of the two areas of international law. On the other hand, obligations under international human rights law and international humanitarian law can coexist in armed conflict, a view supported by the advisory opinion of the International Court of Justice, and also because the two bodies of law share a common purpose: to maximize the protection of individuals.
 
Specifically, there are three situations in which international human rights law may apply to autonomous weapon systems. First, in armed conflict, human rights law complements international humanitarian law on the use of force, and its focus is always on preserving human rights such as the right to life and the right to dignity, even though the rights to liberty and security of person, freedom from inhumane treatment, and access to a fair trial may also be involved. Second, in counter-terrorism and other actions that do not constitute situations of armed conflict: where the threshold of armed conflict is not met, the use of autonomous weapon systems should be regulated only by human rights law, not international humanitarian law. Third, in domestic law enforcement, law enforcement officials may also decide at some point to use autonomous weapon systems with lethal or less lethal weapons if the technology matures and becomes widespread. In such cases, the use of force should be governed by human rights law.
 
In terms of applying international humanitarian law to autonomous weapon systems, the international community has recognized the principles of distinction and proportionality established by that law as a key guarantee for maintaining international peace and respecting humanity. Therefore, autonomous weapons can be permitted only if they comply with the main principles of international humanitarian law. At present, the development and use of autonomous weapons are very likely to conflict with current international humanitarian law, and the international discussion of this issue focuses mainly on three aspects:17 first, whether they comply with the principle of distinction, that is, the distinction between civilian and military objectives; second, whether they comply with the principle of proportionality, that is, limiting collateral civilian casualties as much as possible relative to the military advantage gained; and third, whether the “Martens Clause” is observed, that is, adherence to “humanitarian principles” and the “public conscience.”
 
Ⅲ. The Challenge of Autonomous Weapons Systems to Human Rights and Humanitarian Concerns
 
The rapid development of autonomous weapons and artificial intelligence technology has brought great challenges to the peace and steady development of the international community.18 Autonomous weapon systems have technical advantages, and countries can seize the initiative on the battlefield by improving the accuracy and effectiveness of autonomous attacks on targets; yet for fully autonomous weapon systems, known as killer robots, it is difficult to guarantee that they will not violate human rights such as the right to life and the right to dignity, or even international human rights law. As for the challenge to humanity, such systems are very likely to conflict with existing international humanitarian law, because they can be permitted only if they comply with its main principles.19 Human warfare is expensive and bloody, and replacing large numbers of ground combat vehicles and other weapons with robots seems a way to save money and prevent casualties from influencing political decisions. But when autonomous weapon systems can decide on their own whether to attack people, fighting will no longer resemble traditional forms of warfare, and it is entirely possible that human survival will be treated with the same indifference with which people treat machines. Such systems are likely to exhibit several characteristics: invasiveness, indiscriminate killing, cruelty, and loss of control. A specific analysis of how these features challenge human rights and humanitarian concerns further confirms that autonomous weapon systems have obvious anti-human characteristics and pose huge humanitarian risks. If terrorist organizations gain control of autonomous weapon systems, the safety of innocent people will face even greater threats.
 
A. Challenge of invasiveness
 
The challenge of the invasiveness of autonomous weapon systems is primarily embodied in international human rights law: when the decision to use force against a person is made by a procedure rather than by a person, it inherently violates the right to dignity. In the relationship between humans and machines, all machines are made by humans and serve human survival and development. Even if a machine has developed to a very advanced stage of autonomous intelligence, capable of cognition, thinking, self-generating instructions, and even self-repair and self-reproduction, its relationship with humans is ultimately one of dependence. If, according to the algorithms in autonomous weapon systems, humans are placed in a position determined by inhuman machines, death is simply treated as a goal, and humans no longer have full and independent status. In other words, in the use of autonomous weapon systems, the “human” is in fact reduced to the numbers 0 and 1, rather than being a human being with inherent dignity.
 
As stipulated in the Universal Declaration of Human Rights, “All human beings are born free and equal in dignity and rights.”20 The International Covenant on Civil and Political Rights states: “In accordance with the principles proclaimed in the Charter of the United Nations, recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice, and peace in the world.”21 Although the right to dignity is not listed as a substantive right in the Covenant, it is interwoven with other rights, and human dignity is an essential goal of international human rights law. The concept of dignity rests on the premise that the value of each person is infinite. Kant's concept of dignity gives each person an independent, unique, and irreplaceable value, and insulting a person unjustifiably diminishes this value. Everyone has an inner core that cannot be violated, even if the violation would benefit the public good, because such a violation means people are being used as tools.22 But a dehumanized machine cannot understand the meaning of the use of force against a human being, nor can it do justice to the gravity of the decision.23 Computer code cannot yet accurately describe the complexity and richness of human life and the day-to-day decisions about it, so robots cannot be pre-programmed24 to respond appropriately to the endless scenarios presented by real life and real people. Another problem with procedural presupposition is that the algorithms determining when an autonomous weapon system may unleash force necessarily rest on assumptions. In other words, rule-makers and programmers make fatal decisions in advance based only on hypothetical, theoretical possibilities rather than on real and urgent situations, and in the abstract it is difficult to anticipate every far-reaching decision accurately in advance. Fully autonomous weapon systems outside human control therefore violate the right to dignity of those against whom force is used. If the operation of the computer remains at a low level still controlled by humans, the machine is still a tool in human hands, people use it to enhance their autonomy, and the right to dignity may not be threatened, because the decision to use force is made not by a fully autonomous weapon system but by a human being, and the person attacked is treated in a manner that takes human rights into account. However, once the decision to use force is taken over by autonomous weapons, they are no longer tools in the hands of humans, and this poses a serious challenge to international human rights law.
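The “procedural presupposition” problem can be made concrete with a deliberately simplified sketch. The rule, the features, and the scenario below are entirely hypothetical and are not drawn from any real system; the point is only that every branch of such a rule is an assumption fixed by a programmer in the abstract, long before any encounter, and that the person encountered is reduced to a handful of binary features:

```python
# Hypothetical sketch: a preset engagement rule written in advance.
# The detected person is reduced to binary features (the "0 and 1"
# the text describes); no context a human would weigh is available.

from dataclasses import dataclass

@dataclass
class DetectedPerson:
    carrying_object: bool     # sensor bit: could be a rifle or a farm tool
    near_military_site: bool  # sensor bit: geography, not intent

def engagement_rule(p: DetectedPerson) -> bool:
    """Returns True if the preset rule authorizes force.

    Both branches encode assumptions made in the abstract by a
    programmer; intent, surrender, or mistake cannot be represented.
    """
    return p.carrying_object and p.near_military_site

# A farmer carrying a tool near a checkpoint satisfies the same bits
# as a combatant; the rule cannot tell the difference.
print(engagement_rule(DetectedPerson(carrying_object=True, near_military_site=True)))  # True
```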
 
B. Challenge of indiscriminate killing
 
The challenge of indiscriminate killing by autonomous weapon systems is mainly reflected in international humanitarian law. Although an act of killing results in the loss of the victim's life, a weapon is not prohibited by international law merely because of its lethality or non-lethality.25 Indiscriminate killing concerns the automatic execution of killing missions regardless of conditions, occasions, and targets, in violation of the principle of distinction in international humanitarian law. In its Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons (1996), the International Court of Justice regarded the principle of distinction as a primary principle of international humanitarian law.26 As a new class of weapons with autonomous attack capabilities that could cause indiscriminate proliferation similar to biological viruses, autonomous weapon systems should be required to meet the principle of distinction.
 
Articles 48, 51, and 52 of Additional Protocol I to the Geneva Conventions set out the principle of distinction, making it clear that a distinction must be made between civilians and combatants, so causing harm to civilians, whether intentionally or by mistake, violates international humanitarian law. Scholars who trust the analytical and identification abilities of autonomous weapons say they can better distinguish between civilians and combatants and reduce harm to civilians. Other scholars believe that the battlefield and the people on it are more complex than any written program, that autonomous weapons cannot deal with such complex situations, and that they also face a series of ethical and moral problems. For human soldiers, the main adversary is other people, so the distinction required by law is relatively “simple”: to distinguish enemy military targets from civilian residents and civilian objects. For autonomous weapon systems, however, in addition to the distinction between military and civilian targets required of human soldiers, a more critical distinction must be added: a distinction between the attributes and intentions of the other side, such as whether a person is a civilian engaged in hostilities or a civilian engaged in armed self-defense, and whether a combatant intends to surrender. Some scholars are skeptical that autonomous weapon systems can correctly draw this deeper distinction. For example, when facing enemy personnel who have surrendered, robot soldiers bound by their instructions can hardly observe the human rights protection rule of “sparing the person under the blade”: it is difficult for them to stop an attack immediately, to spare those who give up resistance, and to respect and protect the wounded, the sick, medical and rescue workers, and other groups deserving special protection. The battlefield environment is far more complex and changeable than the preset procedures of autonomous weapons, and the most typical conflict scenarios in today's armed conflicts are asymmetric warfare and urban warfare environments.27 Therefore, artificial intelligence may not yet be capable of using algorithms to accurately distinguish between military and civilian targets while avoiding civilian collateral casualties.
 
C. Challenge of cruelty
 
The challenge of the cruelty of autonomous weapon systems is reflected in both international human rights law and international humanitarian law, because their use of force against people cannot be interfered with by humans. Autonomous weapon systems, free of human physical and emotional weaknesses, take on a ruthless character. The challenges their cruelty poses to international human rights law include, but are not limited to, the right to life and the right to dignity. Whether humans or machines control force, there will be cases where force is used against the wrong people or used excessively, but when autonomous weapon systems go wrong, the harm to human rights is irreversible.
 
The challenge of cruelty can be articulated more clearly through the Martens Clause in international humanitarian law. One of the important principles of international humanitarian law, the Martens Clause was put forward by the Russian international jurist Martens at the First Hague Peace Conference in 1899 and has been confirmed in the preamble of the Convention on the Laws and Customs of War on Land.28 The clause requires that weapon systems conform to international custom, the principles of humanity, and the dictates of public conscience. Specifically, the Martens Clause states: “In the case of matters not provided for in this article, there are principles of international law which, from the customs of civilized peoples, from the principles of humanity, and from the dictates of conscience, shall protect and govern between the two belligerents and their peoples.” It fills the gaps left by international law, so that the application of international humanitarian law is not limited to its express conventions.29 With the continuous improvement of the autonomy of weapon systems and the continuous development of artificial intelligence technology, it is very likely that the decision to strike a target will in the future be handed over to the autonomous weapon system itself; that is, the machine will control the power of life and death over humans. At present, international humanitarian law does not prohibit autonomous weapon systems, and they can be punished only after the system commits war crimes or crimes against humanity. From the standpoint of international humanitarian law, autonomous weapon systems lack the empathy and moral character required of qualified combatants and may present unknown dangers and threats to humanity.
 
D. Challenge of loss of control
 
The challenge of the loss of control of autonomous weapon systems is reflected in both international human rights law and international humanitarian law. The loss of control stems mainly from the “autonomy” of weapon systems. A fully autonomous weapon system differs from semi-autonomous and supervised autonomous weapon systems in that it places humans outside its OODA cycle.30 It can execute higher-level instructions and tasks and can independently choose targets and decide whether to strike, independent of human operation.31 This can lead to violations of the principle of proportionality in international humanitarian law and constitute a violation of the right to life.
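The three levels of autonomy can be illustrated schematically by asking where the human sits relative to the Observe-Orient-Decide-Act cycle. The following sketch is purely illustrative; the function and its parameters are invented for exposition and are not drawn from any real system:

```python
# Schematic sketch of human placement in the OODA cycle at the three
# autonomy levels distinguished in the text. Purely illustrative.

from enum import Enum

class Autonomy(Enum):
    SEMI_AUTONOMOUS = 1   # human IN the loop: a human makes each strike decision
    SUPERVISED = 2        # human ON the loop: machine acts, a human may veto
    FULLY_AUTONOMOUS = 3  # human OUT of the loop: the machine decides alone

def ooda_cycle(level: Autonomy, target_identified: bool,
               human_approves: bool, human_vetoes: bool) -> bool:
    """Returns True if the system strikes (Observe/Orient folded into target_identified)."""
    if not target_identified:                 # Observe + Orient
        return False
    if level is Autonomy.SEMI_AUTONOMOUS:     # Decide rests with the human
        return human_approves
    if level is Autonomy.SUPERVISED:          # Machine decides; human can only interrupt
        return not human_vetoes
    return True                               # FULLY_AUTONOMOUS: no human checkpoint

# At full autonomy, neither human approval nor a human veto is consulted:
print(ooda_cycle(Autonomy.FULLY_AUTONOMOUS, target_identified=True,
                 human_approves=False, human_vetoes=True))  # True
```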
 
On the one hand, the laws of war are designed to prevent excessive civilian casualties and property damage caused by war: the means and methods of warfare used should be proportional to the expected, concrete, and direct military advantage, and excessive attacks and unnecessary injuries are prohibited.32 Articles 51(5)(b) and 57(2)(a) of Additional Protocol I to the Geneva Conventions also set out proportionality requirements in international humanitarian law. Current autonomous weapons technology falls far below the level of military personnel trained over long periods and is not yet capable of uniquely human judgment.33 Whether it conforms to the principle of proportionality therefore remains controversial. Under the principle of proportionality, autonomous weapon systems need to evaluate and quantify various elements and weigh them against the expected military advantage; compared with the principle of distinction, the technical requirements are higher, and how to measure the value of life will be one of the technical difficulties. Besides, given the huge differences in individual capabilities between robots and humans, some inequality should be expected in confrontations between robot soldiers and humans; that is, in combat methods and means, efforts should be made to avoid human casualties. If robot soldiers can disarm enemy personnel with Taser guns, anesthetic bullets, or fiber nets, it is illegal to physically destroy humans by firing self-homing smart bombs, laser beams, or electromagnetic shells; and even if a program calls for the total elimination of a particular person, it is not legally acceptable for robot soldiers that could simply shoot the enemy to torture that person to death. At the heart of the principle of proportionality is the assessment of military necessity, to ensure that the collateral damage caused is not excessive in relation to the expected military advantage.34 The measurement and trade-off between humanitarian protection and military necessity is therefore a value judgment, not a decision that can be made by analysis alone, which is precisely the task autonomous weapons find difficult to accomplish. They may also need to minimize civilian collateral damage when selecting among multiple military targets, a capability that requires autonomous weapons to make moral value judgments.
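The point that proportionality is a value judgment rather than a computation can be seen in a toy inequality. In the hypothetical sketch below, the test becomes decidable only once someone supplies a numerical weight for civilian life (w_life), and that weight has no technically derivable value; the names and figures are invented for illustration:

```python
# Toy sketch of the proportionality test as an inequality. The moral
# weight w_life is exactly what no algorithm can supply on its own.

def proportionality_permits(military_advantage: float,
                            expected_civilian_harm: float,
                            w_life: float) -> bool:
    """Permit the attack only if weighted civilian harm is not excessive
    relative to the expected military advantage (illustrative only)."""
    return w_life * expected_civilian_harm <= military_advantage

# Identical facts, opposite answers, depending on the chosen moral weight:
print(proportionality_permits(10.0, 2.0, w_life=3.0))  # True
print(proportionality_permits(10.0, 2.0, w_life=8.0))  # False
```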
 
On the other hand, the challenge autonomous weapon systems pose to the right to life can be considered in terms of accountability, and the same of course applies to related rights.35 As stipulated in the Universal Declaration of Human Rights, “Everyone has the right to life, liberty and security of person.”36 Although the Universal Declaration of Human Rights is not binding under international law, it is a programmatic document of global human rights norms. The International Covenant on Civil and Political Rights provides: “Every human being has the inherent right to life. This right shall be protected by law. No one shall be arbitrarily deprived of his life.”37 This is also an obligation the state must undertake to respect and guarantee the right to life. The right to life has two components: prevention of arbitrary harm to life, and accountability when such harm occurs.38 It constitutes a violation of the right to life if an aggressor causes loss of life, including through failure to exercise due diligence to prevent it; a lack of accountability for unlawful deaths likewise constitutes a violation of the right to life. If there is insufficient human control over the release of force, the use of autonomous weapon systems may create an accountability vacuum, because accountability presupposes control. As a result, autonomous weapon systems present serious challenges in the context of human rights law: without meaningful human control, robots may lack the technical capability to use force as required by human rights law or to protect life and personal security. Even though impersonal institutions such as states or corporations may be held accountable and made to pay compensation or change their practices, humans play only a secondary role in decision-making owing to the increased autonomy of machines, and because of this lack of control it will be difficult to hold individuals personally accountable. First, even skilled algorithm engineers have trouble predicting such systems' complex decisions, and without transparency in targeting decisions, determining where and how an erroneous attack occurred and how to prevent similar errors is extremely difficult. Second, an autonomous weapon system, as an artificial intelligence system, cannot fully grasp the real meaning of the designer's instructions, or may lack an algorithmic description of some meanings.39 Third, when machines evolve, that is, when they interact with the environment and through autonomous learning achieve functional expansion and capability evolution beyond human prediction, it is difficult to determine whether responsibility belongs to the commander, the technician, the producer, or the machine itself. Last, the International Covenant on Civil and Political Rights states: “Everyone has the right to liberty and security of person.”40 This means that, like the right to life, rights such as liberty and security of person can be violated by a failure to establish accountability.
 
Ⅳ. China’s Perspective on Addressing the Challenge of Autonomous Weapon Systems
 
The ICRC and CCW Parties have held a series of meetings on issues related to autonomous weapon systems and reached some basic consensus in the hope that the degree of autonomy of autonomous weapon systems can be regulated. With the continuous improvement of China’s international status and comprehensive national strength, China’s existing institutional norms and basic positions have been gradually established. On this basis, China should actively think about countermeasures, promote international legislative research and improve legal review, and reduce the threat of such weapons to international peace and civilian safety. Moreover, as China’s voice in the international community gradually increases, China’s normative attempts are more likely to attract the attention of the international community. It will have a positive impact on the stable development of autonomous weapon systems in accordance with international human rights law and international humanitarian law.
 
A. China’s existing institutional norms and basic positions
 
In July 2017, the State Council publicly issued the New Generation of Artificial Intelligence Development Plan, which made clear that the uncertainty of artificial intelligence development brings new challenges that may affect laws and social ethics and challenge the norms of international relations, and which proposed for the first time the strategic goals of “initially establishing artificial intelligence laws and regulations, ethical norms, and policy systems” and “developing AI security assessment and control capabilities.”41 In June 2019 and September 2021, the Ministry of Science and Technology successively issued documents including the New Generation of Artificial Intelligence Governance Principles — Developing Responsible Artificial Intelligence and the New Generation of Artificial Intelligence Ethics Code.42 The former emphasizes eight governance principles: harmony and friendship, fairness and justice, inclusiveness and sharing, respect for privacy, security and controllability, shared responsibility, open collaboration, and agile governance. The latter puts forward six basic ethical requirements: improving human well-being, promoting fairness and justice, protecting privacy and security, ensuring controllability and trustworthiness, strengthening responsibility, and improving ethical literacy; it also sets out 18 specific ethical requirements for activities such as artificial intelligence management, research and development, supply, and use, making the system clearer, more specific, and more standardized. In December 2021, the Ministry of Foreign Affairs announced China's first initiative on regulating military applications of artificial intelligence, namely China's Position Paper on Regulating Military Applications of Artificial Intelligence, submitted to the Sixth Review Conference of the United Nations CCW.43 It maintains that all countries should ensure that new weapons and their means of combat comply with international humanitarian law and other applicable international law, should not use the advantages of artificial intelligence technology to endanger other countries' sovereignty and territorial security, and should establish an international mechanism with universal participation to promote the formation of widely agreed AI governance frameworks and standards. In September 2022, the Foreign Ministry issued the Position Paper on China's Attendance at the 77th Session of the United Nations General Assembly,44 re-emphasizing that norms on AI security governance, especially on the military application of AI, should adhere to the principles of multilateralism, openness, and inclusiveness, and should promote, through dialogue and cooperation, the formation of a broadly agreed framework and standards for AI governance.
 
In addition to policies and positions published at home, China also participated in the 2014 annual meeting of CCW states parties, which discussed the legal, moral, technical, and military dimensions of lethal autonomous weapon systems, and in the 2017 upgrade of the informal “meeting of experts” to a formal Group of Governmental Experts (GGE),45 actively expressing China's position abroad. Taking the most recent GGE meeting as an example, in July 2022, China submitted the China Working Paper on Lethal Autonomous Weapon Systems, reaffirming the December 2021 position paper and discussing the definition and scope of lethal autonomous weapons and the distinction between “acceptable” and “unacceptable” autonomous weapon systems. From a human rights and humanitarian perspective of safeguarding the common security and dignity of mankind and effectively controlling the security, legal, ethical, humanitarian, and other risks posed by artificial intelligence, it proposed regulating the military application of artificial intelligence.46 Taking the most recent meeting of the parties to the CCW as an example, in November 2022, China submitted China's Position Paper on Strengthening the Ethical Governance of Artificial Intelligence, reaffirming its December 2021 position paper. From four aspects (AI governance should adhere to ethics first, strengthen self-restraint, promote responsible use, and encourage international cooperation), it actively advocates the principles of “people-oriented” AI and “AI for good,” calls for enhancing mutual understanding and trust among countries, seeks to ensure that the development of artificial intelligence technology is safe, reliable, and controllable, and promotes the building of a community with a shared future for mankind in the field of artificial intelligence.47
 
B. China’s response to the challenge of autonomous weapon systems
 
Governments have reached some basic consensus under the CCW. According to the reports of the Group of Governmental Experts meetings in recent years, a number of guiding principles have been gradually established, including: (1) in the development, deployment, and use of any new weapon system within the framework of the CCW, it must be determined in each case whether its use is prohibited under international law; (2) lethal autonomous weapon systems should strike a balance between military necessity and humanitarian concerns within the object and purpose of the CCW; (3) international humanitarian law and its principles apply to the development and use of lethal autonomous weapon systems, including the principles of distinction and proportionality in attack; (4) any policy measures discussed and adopted in the context of the CCW should not anthropomorphize emerging technologies in the area of autonomous weapon systems or impede progress in the peaceful uses of intelligent autonomous technologies; (5) the obligations of international humanitarian law fall on states, parties to armed conflict, and individuals, not machines, and human responsibility must be retained for decisions on the use of weapon systems throughout their life cycle; (6) human-machine interaction should be implemented at all stages of the weapon life cycle to ensure the application of international law in the area of lethal autonomous weapon systems, where human judgment is essential; (7) in cases involving emerging technologies in the area of autonomous weapon systems not covered by the CCW and other international treaties, civilians and combatants should always remain protected by the principles of customary international law, proceeding from the dictates of humanity and public conscience; and (8) legal review at the national level of the development, acquisition, and use of new weapons and new means and methods of warfare is an effective method, and countries are free to decide the form of such review.
 
However, even with this consensus, gaps remain on specific issues of international law in the field of autonomous weapon systems, such as their legal status and regulation; the contracting parties are still in dispute, and in particular the formulation of relevant conventions and legal rules has been slow. An analysis of China's existing institutional rules likewise shows that, owing to the rapid development of technology and the lag of the law, written law and legal practice in this field are lacking. The consensus also indicates that, in accordance with the parties' legal obligations under Additional Protocol I to the Geneva Conventions, governments must pay close attention to the specificities of emerging technologies in the field of autonomous weapon systems when conducting legal reviews of weapons. To sum up, in order to cope with the challenges in this field, we should participate in constructing legal rules for autonomous weapon systems. Furthermore, China can consider the future development of autonomous weapon systems from the aspects of legislative research, review, and accountability.
 
First, actively participate in international legislation and promote multilateral cooperation. In the field of autonomous weapon systems, China should conduct exchanges and dialogues with relevant international organizations and countries and actively promote the construction of international legislation while conditions remain relatively peaceful, such as concluding a protocol on autonomous weapon systems under the mechanism of the United Nations CCW.48 When autonomous weapon systems are used in warfare, they need to be governed by law, like any other area of activity, because of the importance of the rules-based international order. Besides, in the context of the contemporary international rule of law, in responding to the challenges posed by autonomous weapon systems to international human rights law and international humanitarian law, we should uphold the guiding principles of the UN Charter and respect state sovereignty. On the basis of respecting the right of all countries to participate in the governance of autonomous weapon systems on an equal footing, we should coordinate countries to reach consensus on ideas, coordinate systems, and take common action to address the challenges posed by autonomous weapon systems. We should be adept at using “non-force” rather than “force,” and adopt multilateral and multi-party participation to establish and improve a global governance system for autonomous weapon systems.
 
Second, we will improve the legal review and accountability mechanism. Much of the concern about autonomous weapons systems is based on the issue of accountability for crimes. From the perspective of international criminal trial practice, the defendant’s subjective criminal intent is an essential component. Some scholars believe that in international law, the international accountability of war criminals is a complicated issue. In the case of autonomous weapon systems, in addition to continuing to involve the responsibility of combatants and their superior commanders, as in traditional armed conflict, the responsibility of designers, manufacturers, and sellers of autonomous weapons will also be involved. Of course, countries that acquire, install, and deploy autonomous weapon systems should also bear corresponding legal responsibilities.
 
At the domestic level, in accordance with Article 36 of Additional Protocol I to the Geneva Conventions, a state has an autonomous obligation to review new weapons and new means or methods of warfare to determine their compliance with international rules. On this basis, China can build a weapons review model and accountability mechanism with Chinese characteristics; for example, the State Council and the Central Military Commission could jointly formulate regulations of the People's Republic of China on the legal review of weapons. As autonomous weapon system technology develops, the damage it can cause will become more serious, making legal constraints all the more necessary. Drawing on regulatory experience gained in practice, China can continue to standardize the development and use of autonomous weapon systems, reflecting its respect for the law and demonstrating its great-power bearing in the international community.
 
At the international level, the application of international human rights law and international humanitarian law can be understood as adopting a specific strategic framework to govern emerging military technologies in accordance with the principle of technological neutrality. Regulation can proceed directly by determining whether a specific implementation or use of the technology would conflict with fundamental principles of international human rights law and international humanitarian law. The development of autonomous weapon systems helps a country improve its political and economic status, check competitors, and build national defense advantages, but countries keep the research and development of this technology secret, making an independent review mechanism difficult to form, so independent third parties are needed to exercise regulatory control. Countries should be required to report regularly to such an agency on the development process of, and major breakthroughs in, the core procedures of their autonomous weapon systems, and the agency should also be able to conduct spot checks.
 
C. Reflections on legal responses to the challenge of autonomous weapon systems
 
In the realm of autonomous weapon systems, China should take seriously and reasonably apply international human rights law and international humanitarian law, which is a low-cost and wise choice that helps the state build and strengthen its legitimacy, minimize resistance in the development process, and gain more recognition and support in the international community.49 The autonomous weapon systems now available are equipped with high-speed sensing systems that can respond to signals from multiple inputs simultaneously50 and conduct comprehensive real-time analysis and monitoring of the battlefield to eliminate the “fog of war.”51 Their autonomy differs from mere automation: the latter can only perform repetitive operations and low-level tasks according to fixed procedures,52 while the former can execute higher-level instructions and tasks,53 operating independently of human operation and capable of choosing targets and deciding whether to strike. As products of the era of artificial intelligence, autonomous weapon systems must be developed so that China's rapid development of its military industry remains in line with the value that we should “care for people's lives, values, and dignity and realize human rights for everyone, which is the common pursuit of human society.”54 Facing such realistic challenges, China should take international human rights law and international humanitarian law as the starting point for its thinking and further consider countermeasures in multiple dimensions to realize “preventive” protection of human rights and the elimination of humanitarian disasters.
 
The first is meaningful human control, from the perspective of international humanitarian law. While some argue that encouraging technological innovation in autonomous weapon systems can significantly reduce collateral damage to civilians, and some even object to legislation banning such systems,55 most countries and scholars have reached a consensus on “meaningful human control,” a new norm for autonomous weapon systems formed at the UN CCW meeting held in Geneva in May 2014, which can be broken down into two elements: “human control” and “meaningful.”56 First, “human control” makes clear that the operator of an autonomous weapon system controls targets and makes strike decisions, which is consistent with basic humanitarian thinking. Second, “meaningful” refers to a final decision made by a human being after weighing various factors, in compliance with the relevant rules of international humanitarian law. Dr. William Boothby, a British scholar, has affirmed the importance of preventive measures and suggested that elements of military action could be limited in order to control targeting and collateral damage.57 Human interfaces with autonomous weapon systems can also provide guidance for achieving “meaningful human control,” for example by directly generating meaningful actions that can be taken or real-time success rates for missions. Therefore, the specific means of “meaningful human control” should be clarified, and incorporating it into relevant laws should be considered, as a legal norm that must be observed in developing and using autonomous weapon systems. As stipulated in common Article 1 of the Geneva Conventions,58 states should respect international humanitarian law and ensure that other parties respect it. The responsibility of the person who decides to use an autonomous weapon system must be retained, as responsibility cannot be transferred to the machine and should be considered throughout the weapon system's life cycle. In fact, banning weapons that are not under human control will not hold back technological development. On the contrary, there is a need to take full advantage of technological advances to advance humanitarian protection and international law and to maintain international peace and security.
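A minimal sketch can make the two elements concrete. In the hypothetical code below (names, fields, and figures are invented for illustration), “human control” appears as the rule that only an explicit operator decision releases force, and “meaningful” appears as the real information (recommended action, estimated success rate, estimated collateral risk) placed before the operator:

```python
# Hypothetical sketch of "meaningful human control": the system may
# recommend, but only an explicit human decision releases force.

from dataclasses import dataclass

@dataclass
class EngagementBrief:
    target_id: str
    recommended_action: str    # e.g., "track", "warn", "strike"
    est_success_rate: float    # real-time estimate shown to the operator
    est_collateral_risk: float # shown so the human can weigh IHL factors

def release_force(brief: EngagementBrief, operator_decision: str) -> bool:
    """'Human control': force is released only on an explicit human
    'strike' decision; the machine never crosses from recommendation
    to action on its own."""
    return operator_decision == "strike"

brief = EngagementBrief("T-042", "track", est_success_rate=0.71, est_collateral_risk=0.18)
print(release_force(brief, operator_decision="hold"))  # False: the machine cannot act alone
```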
 
Second, consider creating an attached human rights bottom line based on international human rights law. In 2012, Human Rights Watch issued a report saying that states need to take precautions and resist developing and using fully autonomous weapon systems in armed conflict.59 Therefore, in order to eliminate any possibility of humans eventually being destroyed by machines, and to guarantee basic human rights such as the right to life and the right to dignity, robots must not be given the right to kill autonomously. Just as the laws of war apply to human beings, “human supremacy” should be made a core element of the legislative spirit and operational rules of any law applicable to robot soldiers, which is also the first principle for protecting human rights in the use of autonomous weapon systems. For autonomous weapon systems, program source code “attached to humans” that never disappears and can never be deleted can be regarded as a human rights bottom line created by humans, and any robot without such an attached program would lose its legal basis. Moreover, from the perspective of weapons review, the international community prohibits the use of indiscriminate weapons, as well as weapons that cause excessive harm or unnecessary suffering. Therefore, China should maintain absolute human control over the programming of autonomous weapon systems, create a human rights bottom line for autonomous weapons, and oppose the development and use of autonomous weapon systems that are out of control.
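What such an “attached to humans” bottom line might mean in program terms can be gestured at with a hypothetical sketch. The class below is illustrative only, not a real safeguard design; it simply shows the two invariants the text describes: the human-override hook must exist for the system to have any legal basis to operate, and every action must pass through it:

```python
# Hypothetical sketch of the "attached to humans" bottom line: a
# human-override hook required at construction and consulted on every action.

class AutonomousSystem:
    def __init__(self, human_override):
        if human_override is None:
            # Without the attachment, the robot "loses its legal basis."
            raise ValueError("no attached human-override program: no legal basis to operate")
        self._human_override = human_override  # the never-deleted attachment

    def act(self, proposed_action: str) -> str:
        # Every proposed action passes through the human hook first.
        if self._human_override(proposed_action):
            return f"halted by human: {proposed_action}"
        return f"executed: {proposed_action}"

# A human veto always prevails over the machine's own decision:
system = AutonomousSystem(human_override=lambda action: action == "strike")
print(system.act("strike"))  # halted by human: strike
```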
 
Ⅴ. Conclusion
 
The growing maturity of artificial intelligence technology has contributed to the rapid progress of technology-aided military weapons, and the autonomous weapon system is one of its representative products. An autonomous weapon system is a new type of weapon that highly integrates the human capacity for thinking and judgment with the lethality of weapons. It poses challenges to human rights, humanity, and the peace and stability of the international community, including but not limited to invasiveness, indiscriminate killing, cruelty, and loss of control. A detailed analysis of these human rights and humanitarian challenges highlights the obvious human rights violations and humanitarian risks of autonomous weapon systems.
 
On February 25, 2022, General Secretary Xi Jinping delivered an important speech while presiding over the 37th group study session of the Political Bureau of the CPC Central Committee, stressing that we should care for people's lives, values, and dignity and realize human rights for everyone, which is the common pursuit of human society.60 Countries are racing to develop and apply autonomous weapon systems in an arms race, and the combination of artificial intelligence technology and military weapons has posed huge challenges to the relevant principles and provisions of existing international human rights law and international humanitarian law. In view of this, China should plan ahead: actively participate in international legislation, boost multilateral cooperation, improve the legal mechanisms of review and accountability, and respond from the perspectives of international human rights law and international humanitarian law, so as to prevent human rights and humanitarian violations and promote the proper and timely resolution of this issue concerning the future of mankind.
 
(Translated by XU Chao)
 
* ZHANG Yunhan ( 张韵涵 ), Doctoral candidate at Jilin University Law School.
 
1. ICRC, “International Humanitarian Law and the Challenges It Faces in Contemporary Armed Conflicts,” Report of the 32nd International Conference of the Red Cross and Red Crescent, ICRC East Asia, 2015 edition, page 42.
 
2. Paul Scharre, Unmanned Armies: Autonomous Weapons and Future Warfare, translated by Zhu Qichao, Wang Shu, and Long Kun (Beijing: World Knowledge Press, 2019), 57-58.
 
3. C. B. Puckett, “In This Era of ‘Smart Weapons,’ Is a State Under a Legal Obligation to Use Precision-Guided Technology in Armed Conflict?” Emory International Law Review 18 (2004): 645-647.
 
4. Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, which took effect on December 2, 1983, and was signed by China at the United Nations Headquarters on September 14, 1981.
 
5. Xu Nengwu and Long Kun, “The Focus and Trends of the Arms Control Debate on Lethal Autonomous Weapon Systems under the United Nations CCW Framework,” International Security Studies 5 (2019): 108-112; Sun Wen, “An Analysis of China's Attitude towards the CCW,” Journal of Xi'an Political Science University 4 (2008): 89-93; Department of Treaties and Law, Ministry of Foreign Affairs of China, “The Debate on Artificial Intelligence Weapons Based on International Humanitarian Law,” Information Security and Communication Security 5 (2019): 25-27.
 
6. Guo Yang, “Controversy on Reserving or Abolishing Killing Robots,” Legal Daily, May 20, 2014.
 
7. Peter Asaro, “On the Prohibition of Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-making,” translated by Han Yang, International Review of the Red Cross: New Technology and War, ICRC East Asia, 2014 edition, page 200.
 
8. Christof Heyns, “Autonomous Weapons Systems and Human Rights Law,” United Nations, accessed June 22, 2022.
 
9. James Dawes, Zhang Wei and Li Bingqing, “Rational Human Rights: Artificial Intelligence and the Future of Humanity,” Journal of China University of Political Science and Law 1 (2022): 289-304.
 
10. Xu Shuang, “Human Rights Challenges posed by Drone Operations and their Responses,” Human Rights 6 (2021): 141-160.
 
11. ICRC, Q&A on International Humanitarian Law, available on the ICRC website, accessed November 10, 2022.
 
12. Zhu Wenqi, Introduction to International Humanitarian Law (Taipei: Jian Hong Press, 1997), 103.
 
13. Wang Kongxiang, Introduction to International Human Rights Law (Wuhan: Wuhan University Press, 2012), 206.
 
14. Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, ICJ Reports, 1996.
 
15. Bai Guimei, Human Rights Law (Beijing: Peking University Press, 2015), 86.
 
16. Ugo Pagallo, “Robots of Just War: A Legal Perspective,” 24 Philosophy & Technology 3 (2011): 301-315.
 
17. Huang Huikang, Major-Country Diplomacy and International Law with Chinese Characteristics (Beijing: Law Press · China, 2019), 135-136.
 
18. Sean Watts, “Autonomous Weapons: Regulation Tolerant or Regulation Resistant,” 30 Temple International & Comparative Law Journal 2 (2016): 182-183.
 
19. Zhu Wenqi, “The War on Terrorism and International Humanitarian Law,” Journal of East China University of Political Science and Law 1 (2007): 126-130.
 
20. Universal Declaration of Human Rights, 1948, Article 1.
 
21. Preamble to the 1966 International Covenant on Civil and Political Rights.
 
22. Nehal Bhuta, Susanne Beck, Robin Geiß, et al., eds., Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge: Cambridge University Press, 2016), 3-20.
 
23. Christof Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life: An African Perspective,” 33 South African Journal on Human Rights 1 (2017): 46-71.
 
24. Bonnie Lynn Docherty, “Shaking the Foundations: The Human Rights Implications of Killer Robots,” Human Rights Watch, accessed June 22, 2022.
 
25. Yuan Juntang and Zhang Xiangyan, Introduction to Weapons and Equipment (Beijing: National Defense Industry Press, 2011), 281-287.
 
26. Article 48 of Additional Protocol I to the Geneva Conventions, 1977: “Parties to a conflict shall at all times distinguish between the civilian population and combatants in order to spare civilian population and property. Neither the civilian population as such nor civilian persons shall be the object of attack. Attacks shall be directed solely against military objectives.”
 
27. Jack M. Beard, “Autonomous Weapons and Human Responsibilities,” 45 Georgetown Journal of International Law 3 (2014): 619.
 
28. Preamble to the Convention on the Laws and Customs of War on Land, 1907: “Until a more complete code of the laws of war has been issued, the High Contracting Parties deem it expedient to declare that, in cases not included in the Regulations adopted by them, the inhabitants and the belligerents remain under the protection and the rule of the principles of the law of nations, as they result from the usages established among civilized peoples, from the laws of humanity, and the dictates of the public conscience.”
 
29. Zhu Wenqi, International Humanitarian Law (Beijing: China Renmin University Press, 2007), 114.
 
30. The basic idea of the OODA cycle is that armed conflict can be seen as a competition between rival parties over who can perform the “Observation-Orientation-Decision-Action” cycle faster and better. The OODA loop is a combination of four letters: Observation, Orientation, Decision, and Action.
 
31. Riza M. Shane, Killing without Heart: Limits on Robotic Warfare in an Age of Persistent Conflict (Nebraska: University of Nebraska Press, 2013), 13.
 
32. Yoram Dinstein, War, Aggression and Self-Defence (Cambridge: Cambridge University Press, 2011), 127.
 
33. Dong Qingling, “The New Ethics of War: Regulating and constraining Lethal Autonomous Weapon Systems,” International Watch 4 (2018): 51-66.
 
34. Benjamin Kastan, “Autonomous Weapons Systems: A Coming Legal ‘Singularity’?” Journal of Law, Technology and Policy (2013): 49.
 
35. Zhang Weihua, “The New Challenge of Artificial Intelligence Weapons to International Humanitarian Law,” Political Science and Law Forum 4 (2019): 144-155.
 
36. Universal Declaration of Human Rights, 1948, Article 3.
 
37. Article 6, paragraph 1, of the International Covenant on Civil and Political Rights, 1966.
 
38. Guo Xingli, “Legal Governance of Unequal Right to Life,” Academia Bimestrie 2 (2021): 109-111.
 
39. Guo Rui, Ethics and Governance of Artificial Intelligence (Beijing: Law Press · China, 2020), 51.
 
40. Article 9, paragraph 1 of the International Covenant on Civil and Political Rights, 1966.
 
41. State Council, “New Generation of Artificial Intelligence Development Plan,” the official website of the State Council, accessed November 20, 2022.
 
42. The Ministry of Science and Technology, “New Generation of AI Governance Principles — Developing Responsible AI,” available on the Ministry of Science and Technology website, accessed November 20, 2022; Ministry of Science and Technology, “Ethics Code for the New Generation of Artificial Intelligence,” available on the Ministry of Science and Technology website, accessed November 20, 2022.
 
43. The Ministry of Foreign Affairs, “China's Position Paper on Regulating the Military Application of Artificial Intelligence,” available on the official website of the Ministry of Foreign Affairs, accessed November 20, 2022.
 
44. The Ministry of Foreign Affairs, “Position Paper on China’s Attendance at the 77th Session of the United Nations General Assembly,” available on the official website of the Ministry of Foreign Affairs, accessed November 20, 2022.
 
45. Leng Xinyu, “Principles of Meaningful Human Control under the Issue of Lethal Autonomous Weapon Systems,” Journal of International Law 2 (2022): 2-3.
 
46. United Nations, “Working Paper of the People's Republic of China on Lethal Autonomous Weapons Systems,” United Nations Office for Disarmament Affairs.
 
47. United Nations, “Position Paper of the People’s Republic of China on Strengthening Ethical Governance of Artificial Intelligence,” United Nations Office for Disarmament Affairs, accessed November 20, 2022.
 
48. Liu Caikuan, “Research on the Application of International Humanitarian Law in the New War Formation-Centered on the Principle of Proportionality,” Journal of Hunan University of Science and Technology (Social Science Edition) 6 (2020): 129-136.
 
49. He Zhipeng, “International Law and the Rise of Great Powers,” Social Science Journal of Jilin University 1 (2017): 82-83.
 
50. Ronald Arkin, “The Case for Ethical Autonomy in Unmanned Systems,” Journal of Military Ethics 9 (2010): 332-334.
 
51. The “fog of war” refers to uncertainty in situational awareness on the battlefield, caused by the instability of that environment and the unavailability of real-time updates.
 
52. Charles François, ed., International Encyclopedia of Systems and Cybernetics (Munich: K. G. Saur, 2004), 51.
 
53. Riza M. Shane, Killing without Heart: Limits on Robotic Warfare in an Age of Persistent Conflict (Nebraska: University of Nebraska Press, 2013), 13.
 
54. Xi Jinping, “Steadfastly Following the Chinese Path to Promote Further Progress in Human Rights,” Qiushi 12 (2022): 4.
 
55. Kenneth Anderson and Matthew Waxman, “Law and Ethics for Robot Soldiers,” Policy Review 176 (2012): 35-49.
 
56. “Killer Robots: UK Government Policy on Fully Autonomous Weapons,” Article 36, accessed June 22, 2022.
 
57. Daniel H. and William Boothby, “Weapons and the Law of Armed Conflict,” European Journal of International Law 2 (2010): 483-486.
 
58. “Each State Party undertakes to respect this Convention in all circumstances and to ensure that it is respected.”
 
59. “Losing Humanity: The Case against Killer Robots,” Human Rights Watch.
 
60. “Steadfastly Following the Chinese Path to Promote Further Progress in Human Rights,” People’s Daily, February 27, 2022.