The role of autonomy and artificial intelligence (AI) in armed conflict has sparked heated debate. The resulting controversy has obscured the benefits of autonomy and AI for compliance with international humanitarian law (IHL). Compliance with IHL often hinges on situational awareness: information about a possible target's behavior, nearby protected persons and objects, and conditions that might compromise the planner's own perception or judgment. This chapter argues that AI can assist in developing situational awareness technology (SAT) that will make target selection and collateral damage estimation more accurate, thereby reducing harm to civilians. SAT complements familiar precautionary measures such as taking additional time and consulting with more senior officers. These familiar precautions are subject to three limiting factors: contingency, imperfect information, and confirmation bias. This chapter breaks down SAT into three roles. Gatekeeper SAT ensures that operators have the information they need. In each of these roles, SAT can help fulfill IHL's mandate of “constant care” in the avoidance of harm to civilian persons and objects.