Q4/2019 - Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS)

Annual Report, Geneva, 25 September 2019

The concrete outcome of the GGE LAWS’ work in 2019 was the set of eleven “Guiding Principles” the Group agreed upon. The Group’s annual activity report, published on 25 September, gives an overview of the issues that are still pending[1]. The two decisive statements among the “Guiding Principles” are that international humanitarian law (in particular the Geneva Conventions of 1949) also governs the use of autonomous weapons systems, and that at the end of the command chain for the use of autonomous weapons there must always be a human being who takes responsibility. Such responsibility cannot be delegated to machines.

The Chair of the GGE LAWS, Ljupčo Jivan Gjorgjinski of North Macedonia, supplemented the formal report of the Group on 8 November 2019 with his own summary of the GGE LAWS’ work in 2019, in which he addressed in detail the question of the expected final outcome. According to the Chair’s summary, there is currently no consensus on what the final outcome of the GGE LAWS’ work should be. The proposals span a broad spectrum, ranging from the demand for a standalone convention under international law, through a declaration or moratorium that is not binding under international law, to an informal continuation of the discussion without concrete results[2].

For 2020, the GGE LAWS has scheduled two meetings. In 2021, it must submit a report to the 6th CCW Review Conference, under whose umbrella the GGE LAWS operates. At that conference, the future legal form of the results of the discussions may be determined.

[1] Report of the 2019 session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Geneva, 25 September 2019: “Guiding Principles: It was affirmed that international law, in particular the United Nations Charter and International Humanitarian Law (IHL) as well as relevant ethical perspectives, should guide the continued work of the Group. Noting the potential challenges posed by emerging technologies in the area of lethal autonomous weapons systems to IHL, the following were affirmed, without prejudice to the result of future discussions: (a) International humanitarian law continues to apply fully to all weapons systems, including the potential development and use of lethal autonomous weapons systems; (b) Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines. This should be considered across the entire life cycle of the weapons system; (c) Human-machine interaction, which may take various forms and be implemented at various stages of the life cycle of a weapon, should ensure that the potential use of weapons systems based on emerging technologies in the area of lethal autonomous weapons systems is in compliance with applicable international law, in particular IHL. In determining the quality and extent of human-machine interaction, a range of factors should be considered including the operational context, and the characteristics and capabilities of the weapons system as a whole; (d) Accountability for developing, deploying and using any emerging weapons system in the framework of the CCW must be ensured in accordance with applicable international law, including through the operation of such systems within a responsible chain of human command and control; (e) In accordance with States’ obligations under international law, in the study, development, acquisition, or adoption of a new weapon, means or method of warfare, determination must be made whether its employment would, in some or all circumstances, be prohibited by international law; (f) When developing or acquiring new weapons systems based on emerging technologies in the area of lethal autonomous weapons systems, physical security, appropriate non-physical safeguards (including cyber-security against hacking or data spoofing), the risk of acquisition by terrorist groups and the risk of proliferation should be considered; (g) Risk assessments and mitigation measures should be part of the design, development, testing and deployment cycle of emerging technologies in any weapons systems; (h) Consideration should be given to the use of emerging technologies in the area of lethal autonomous weapons systems in upholding compliance with IHL and other applicable international legal obligations; (i) In crafting potential policy measures, emerging technologies in the area of lethal autonomous weapons systems should not be anthropomorphized; (j) Discussions and any potential policy measures taken within the context of the CCW should not hamper progress in or access to peaceful uses of intelligent autonomous technologies; (k) The CCW offers an appropriate framework for dealing with the issue of emerging technologies in the area of lethal autonomous weapons systems within the context of the objectives and purposes of the Convention, which seeks to strike a balance between military necessity and humanitarian considerations.”, in:
https://www.unog.ch/80256EE600585943/(httpPages)/5535B644C2AE8F28C1258433002BBF14?OpenDocument
[2] Chair’s summary of the discussion of the 2019 Group of Governmental Experts on emerging technologies in the area of lethal autonomous weapons systems, Geneva, 8 November 2019: “There were several concrete policy options for addressing the challenges raised by emerging technologies in the area of lethal autonomous weapons systems before the Group. 26. Some called for negotiation of a legally binding instrument containing prohibitions, regulations, positive obligations or a combination of these; this could take the form of a CCW protocol or a standalone treaty. Some called for a moratorium on the development and use of autonomous weapons in the interim. 27. Some called for negotiation of a political declaration containing non-binding commitments, possibly based on the Guiding Principles, and possibly leading to a non-binding code of conduct. 28. Some called for improved implementation of legal weapons reviews, as well as information sharing by States on best practices or an annual review mechanism through the CCW. 29. Some held that no further legal measures were needed, if the view that IHL is fully applicable and sufficient to deal with any possible challenges raised by LAWS is considered. 30. There was a new proposal this year for the GGE to negotiate a non-legally binding technical outcome comprising a compilation of existing applicable international law and identifying associated good practices for States, which could follow the approach of the 2008 Montreux Document regarding private military and security companies during armed conflict. It was noted that this may have similarities to the proposed code of conduct mentioned previously. 31. A call was made for technical, military, and legal expert working groups to share best practices for responsible use and development, and for using those expert inputs to continue developing, refining, and elaborating the eleven Guiding Principles recommended by the GGE for adoption by the High Contracting Parties. 32. Along the lines of working groups, and in discussing the future work of the Group, a broader notion of discrete working groups or work streams was explored. Many delegations found utility in the notion of defining discrete legal, technological and military work streams, bearing in mind ethical considerations, and noting the need for cross-fertilization across the three and the desirability of each stream incorporating relevant technical expertise. Consensus could not, in the end, be reached regarding the details of this idea, with delegations agreeing, instead, to take forward the notion of working on the legal, technological and military aspects, with days of official work of the GGE focused on each aspect. 33. It was noted that the policy options are not mutually exclusive. Many delegations continued to affirm the suitability of the CCW for considering the implications of emerging technologies in the area of LAWS. Efforts were thus made to find common ground. 34. Regardless of the legal or political nature of any eventual instrument, the question of how to define the type and degree of human responsibility, judgement or control that would be required or appropriate was further explored as an important element in any policy option”, in: https://www.unog.ch/80256EE600585943/(httpPages)/5535B644C2AE8F28C1258433002BBF14?OpenDocument