Part Two Precautions, 6 The Other Side of Autonomous Weapons: Using Artificial Intelligence to Enhance IHL Compliance

Peter Margulies

From: The Impact of Emerging Technologies on the Law of Armed Conflict

Edited By: MAJ Ronald T.P. Alcala, Eric Talbot Jensen

From: Oxford Public International Law (http://opil.ouplaw.com). © Oxford University Press, 2021. All Rights Reserved. Date: 20 October 2021

Subject(s):
Armed conflict — Protective measures

The role of autonomy and artificial intelligence (AI) in armed conflict has sparked heated debate. The resulting controversy has obscured the benefits that autonomy and AI offer for compliance with international humanitarian law (IHL). Compliance with IHL often hinges on situational awareness: information about a possible target's behavior, about nearby protected persons and objects, and about conditions that might compromise the planner's own perception or judgment. This chapter argues that AI can assist in developing situational awareness technology (SAT) that makes target selection and collateral damage estimation more accurate, thereby reducing harm to civilians. SAT complements familiar precautionary measures such as taking additional time and consulting with more senior officers, which are subject to three limiting factors: contingency, imperfect information, and confirmation bias. The chapter breaks down SAT into three roles. Gatekeeper SAT ensures that operators have the information they need. In each of these three roles, SAT can help fulfill IHL's mandate of “constant care” in the avoidance of harm to civilian persons and objects.
