Military Use of Artificial Intelligence under International Humanitarian Law: Insights from Canada

Authors: Mahshid Talebian Kiakalayeh

Abstract:

Since artificial intelligence (AI) technologies can be used by both civilians and soldiers, it is vital to consider the consequences of military as well as civilian uses of AI; indeed, many of the same technologies are dual-use. This paper explores the military uses of AI and assesses their compliance with international legal norms. Developments in AI have not only changed the military's capacity to conduct complex operations but have also heightened legal concerns. A potential vacuum in the legal principles governing the military use of AI indicates the need for further study of compliance with International Humanitarian Law (IHL), the branch of international law that governs the conduct of hostilities. While the capabilities of new military AI systems continue to advance at a remarkable pace, this body of law seeks to limit the methods of warfare and to protect civilians who are not participating in an armed conflict. Implementing AI in the military realm raises potential issues, including ethical and legal challenges. For instance, if an AI system can perform warfare tasks without any human involvement, humanitarian debates arise as to whether such technology can distinguish between military and civilian targets. This is mainly because fully autonomous military AI systems do not appear capable of the legal and ethical judgment that IHL principles require. The paper takes as a case study Canada's compliance with IHL in the area of AI and the related legal issues likely to arise as the country continues to develop military uses of AI.

Keywords: Artificial intelligence, military use, International Humanitarian Law, Canadian perspective.

