Q2/2019 - Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE-LAWS)

7 May 2019, Chairs Non-Paper

After the conclusion of the 3rd GGE-LAWS meeting at the end of March 2019, the chair of the group, the Indian ambassador Amandeep Singh Gill, circulated a non-paper on 7 May 2019 that summarises the current state of the discussion on six pages. Contested points are set in square brackets throughout the paper.

The paper is based on the general consensus of the GGE-LAWS that international law in the form of the UN Charter of 1945 and international humanitarian law (the law of armed conflict) in the form of the Geneva Conventions of 1949 must be regarded as the basis for dealing with all issues related to the development and use of lethal autonomous weapons systems.[1]

The Group also unanimously agrees that responsibility for the use of such weapons systems must not be delegated to machines, but must remain in the hands of human beings who can also be held accountable for their actions.[2]

The non-paper lists a number of principles, guidelines and norms to be taken into account in the development and use of lethal autonomous weapons systems, but leaves open the form in which these norms should be made binding under international law. A moratorium on the development of such weapons systems, as advocated by some members of the GGE-LAWS, is not addressed in the non-paper. Instead, the paper refers to the "dual use" nature of LAWS and argues that the possibility of misuse of autonomous systems must not lead to a general ban on the peaceful use of these technologies.[3]

  1. [1] Draft Conclusions – Chair's Non-Paper, Geneva, 7 May 2019: "It was affirmed that international law, in particular the United Nations Charter and international humanitarian law (IHL) as well as relevant ethical perspectives, should guide the continued work of the Group. Noting the potential challenges posed by emerging technologies in the area of lethal autonomous weapons systems to IHL, the following were affirmed, without prejudice to the result of future discussions: Ethical and moral considerations, particularly in relation to human dignity, continue to guide the work of the Group." https://www.unog.ch/80256EDD006B8954/(httpAssets)/D8C04EC71F502A77C12583F400476619/$file/Draft+Conclusions+(%D1%84).pdf
  2. [2] Draft Conclusions – Chair's Non-Paper, Geneva, 7 May 2019: "Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines. This should be considered across the entire life cycle of the weapon system. i. Human judgement informed by knowledge of the operational context is essential in order to ensure force is used in compliance with international law, and in particular IHL; ii. Human responsibility over the use of force can be exercised through political direction in the pre-development stage and across the life-cycle of the weapon system, including: research and development; testing, evaluation, and certification; deployment, training, and command and control; use and abort functions; and post-use assessment; iii. The extent and quality of human-machine interaction in the operation of a weapon system based on emerging technologies in the area of lethal autonomous weapons systems should be informed by a range of factors, such as the operational context, the characteristics and capabilities of the weapon system, the performance and reliability of specific functions in the weapon system, and how human-machine interaction has been implemented in other parts of the life-cycle of the weapon system; iv. The totality of human-machine interaction must allow for the use of the weapon system consistent with applicable international law, in particular international humanitarian law; v. Adherence to the IHL principles of distinction, proportionality and precautions in attack rely on qualitative judgements based on contextual knowledge that can only be made by humans; vi. The use of force requires human agency and human intention. It is humans that apply IHL and are obliged to respect it. In particular, the target acquisition and engagement functions, which involve comprehensive assessments of a given situation (taking into account technical, legal, political, military and ethical considerations), require the exercise of human judgement. Human operators and commanders need to understand, inter alia, the operational environment, since the use of force is contextual, and how the weapon system is likely to interact with the operating environment, in order to be able to ensure their use of force is consistent with applicable international law; vii. Human control, understood as translating human judgement and assessments into operations by design and in use of weapons systems, over the use of force must be exercised over the different stages of the life-cycle of all weapon systems, including those that employ autonomy." https://www.unog.ch/80256EDD006B8954/(httpAssets)/D8C04EC71F502A77C12583F400476619/$file/Draft+Conclusions+(%D1%84).pdf
  3. [3] Draft Conclusions – Chair's Non-Paper, Geneva, 7 May 2019: "Discussions and any potential policy measures taken within the context of the CCW should not hamper progress in or access to peaceful uses of intelligent autonomous technologies. Given the dual use nature of the underlying technologies, it is important to promote responsible innovation and to avoid hampering progress in or access to peaceful uses of related technologies." https://www.unog.ch/80256EDD006B8954/(httpAssets)/D8C04EC71F502A77C12583F400476619/$file/Draft+Conclusions+(%D1%84).pdf