Q2/2019 - Group of Governmental Experts on Lethal Autonomous Weapon Systems (GGE-LAWS)

7 May 2019, Chair's Non-Paper

After the conclusion of the 3rd session of the GGE-LAWS at the end of March 2019, the chair of the group, the Indian ambassador Amandeep Singh Gill, circulated a Non-Paper on 7 May 2019 that summarizes the current state of the discussion in six pages. The paper also marks, in square brackets, all points that remain controversial.

The paper rests on the general consensus within the GGE-LAWS that international law, in the form of the UN Charter of 1945, and international humanitarian law (the law of armed conflict), in the form of the Geneva Conventions of 1949, are to be regarded as the basis for addressing all questions related to the development and use of lethal autonomous weapon systems[1].

A second consensus is that responsibility for the use of such weapon systems must not be delegated to machines, but must remain in the hands of humans who can be held accountable for their actions[2].

The Non-Paper lists a series of principles and norms to be taken into account in the development and use of lethal autonomous weapon systems, but leaves open in what form these norms are to be made legally binding under international law. A moratorium on the development of such weapon systems, favored by some members of the GGE-LAWS, is not addressed in the Non-Paper. Instead, the paper points to the "dual use" character of LAWS and argues that the possibility of misuse of autonomous systems must not lead to a general ban on the peaceful use of these technologies[3].

  [1] Draft Conclusions - Chair's Non-Paper, Geneva, 7 May 2019: "It was affirmed that international law, in particular the United Nations Charter and international humanitarian law (IHL) as well as relevant ethical perspectives, should guide the continued work of the Group. Noting the potential challenges posed by emerging technologies in the area of lethal autonomous weapons systems to IHL, the following were affirmed, without prejudice to the result of future discussions: Ethical and moral considerations, particularly in relation to human dignity, continue to guide the work of the Group." https://www.unog.ch/80256EDD006B8954/(httpAssets)/D8C04EC71F502A77C12583F400476619/$file/Draft+Conclusions+(%D1%84).pdf
  [2] Draft Conclusions - Chair's Non-Paper, Geneva, 7 May 2019: "Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines. This should be considered across the entire life cycle of the weapon system. i. Human judgement informed by knowledge of the operational context is essential in order to ensure force is used in compliance with international law, and in particular IHL., ii. Human responsibility over the use of force can be exercised through political direction in the pre-development stage and across the life-cycle of the weapon system, including: research and development; testing, evaluation, and certification; deployment, training, and command and control; use and abort functions; and post-use assessment, iii. The extent and quality of human-machine interaction in the operation of a weapon system based on emerging technologies in the area of lethal autonomous weapons systems should be informed by a range of factors, such as the operational context, the characteristics and capabilities of the weapon system, the performance and reliability of specific functions in the weapon system, and how human-machine interaction has been implemented in other parts of the life-cycle of the weapon system, iv. The totality of human-machine interaction must allow for the use of the weapon system consistent with applicable international law, in particular international humanitarian law, v. Adherence to the IHL principles of distinction, proportionality and precautions in attack rely on qualitative judgements based on contextual knowledge that can only be made by humans, vi. The use of force requires human agency and human intention. It is humans that apply IHL and are obliged to respect it. In particular, the target acquisition and engagement functions, which involve comprehensive assessments of a given situation (taking into account technical, legal, political, military and ethical considerations) requires the exercise of human judgement. Human operators and commanders need to understand, inter alia, the operational environment, since the use of force is contextual, and how the weapon system is likely to interact with the operating environment, in order to be able to ensure their use of force is consistent with applicable international law, vii. Human control, understood as translating human judgement and assessments into operations by design and in use of weapons systems, over the use of force must be exercised over the different stages of the life-cycle of all weapon systems, including those that employ autonomy." https://www.unog.ch/80256EDD006B8954/(httpAssets)/D8C04EC71F502A77C12583F400476619/$file/Draft+Conclusions+(%D1%84).pdf
  [3] Draft Conclusions - Chair's Non-Paper, Geneva, 7 May 2019: "Discussions and any potential policy measures taken within the context of the CCW should not hamper progress in or access to peaceful uses of intelligent autonomous technologies. Given the dual use nature of the underlying technologies, it is important to promote responsible innovation and to avoid hampering progress in or access to peaceful uses of related technologies." https://www.unog.ch/80256EDD006B8954/(httpAssets)/D8C04EC71F502A77C12583F400476619/$file/Draft+Conclusions+(%D1%84).pdf