Ethical Boundaries of AI-Driven Autonomous Weapons
DOI: https://doi.org/10.31617/3.2026(142)10

Keywords: autonomous weapons, artificial intelligence, international humanitarian law, ethical standards, military technologies.

Abstract
The rapid development of artificial intelligence technologies and their active implementation in the military sphere are leading to multidimensional challenges encompassing ethical, legal, technological, and security aspects. Particular attention is drawn to the phenomenon of autonomous weapons, specifically combat systems capable of independently making decisions regarding the use of force without direct human involvement. Such an approach radically changes the understanding of the nature of modern armed conflicts, while at the same time generating risks of losing human control, violating international humanitarian law, and blurring the lines of legal and moral responsibility.
The relevance of this study lies in the need to develop clear mechanisms for the ethical and legal regulation of autonomous combat systems in conditions where international law has not yet established unified standards and national legislation remains fragmented and unsystematic. The research hypothesis is that the absence of defined frameworks for the use of autonomous weapons creates the risk of uncontrolled use of force, the consequences of which could be catastrophic for civilian populations, global stability, and the international order.
The methodology is interdisciplinary, combining comparative analysis of international legal acts, an assessment of ethical concepts of autonomous decision-making, systematic risk analysis, and a critical examination of the discourse on civilian and democratic control over military technologies. Particular attention is paid to Ukrainian legislation and its compliance with international standards, taking into account the provisions of the Military Security Strategy of Ukraine.
The results of the study indicate that Ukraine’s current regulatory framework only partially governs the field of autonomous weapons, while international conventions do not contain sufficiently specific provisions regarding fully autonomous systems. The need to develop ethical codes, ensure transparency of decision-making algorithms, create accountability mechanisms, and harmonise national legislation with modern international approaches is substantiated. It is emphasised that the integration of ethical and legal control mechanisms is a key condition for minimising potential threats, guaranteeing human rights, and ensuring global security in the context of a new military reality.
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
