by Bibi Imre-Millei
September 2025
In Our North, Strong and Free, the Canadian government has prioritized the adoption of emerging technologies, including unmanned or uncrewed aerial systems (UAS). As part of my PhD project, I have conducted 42 interviews with Canadian army and air force members who use UAS, or drones, in some capacity. One topic we discussed was the ethical implications of UAS use, and operators' opinions on the impact these technologies have (or do not have) on military members.
This piece is meant to open a conversation about some of the questions the CAF might want to ask in developing a comprehensive ethics education for its UAS operators. As outlined below, participants held divergent views on what drone use meant for Canadian state identity and how this related to ethics. The views of technology users must be considered in shaping the future moral character of drone use in the CAF.
Background
- Over the last few years, Canada has been working to revive a largely dormant ‘drone program’ across all branches. I interviewed members of the army (n=33) and the air force (n=9).
- Army drone use is based primarily in the artillery and armoured corps, with plans to build similar capabilities in the infantry.
- The army currently uses the CU-179 Raven-B (made by AeroVironment) and the CU-172 Blackjack (made by Insitu, a Boeing subsidiary).
- As of 2022, the air force has been working to stand up a team to operate a modified version of the MQ-9B (Reaper) UAS, which will be named the Guardian in the Canadian context.
- The army and air force have different tasks and cultures in general and when it comes to UAS. The army uses shorter-range technologies for surveillance, reconnaissance, and targeting/target acquisition. The air force, on the other hand, plans to use long-range UAS for patrolling the Arctic, along with other tasks.
- Those who use UAS in the army do not necessarily receive specific ethical education or training related to UAS use, apart from laws and regulations which govern who can be surveilled and when, along with different air space considerations. In the air force, the unit responsible for integrating the Guardian is attempting to integrate ethics into its training for operators.
Findings
The type of technology a country uses has implications for state identity within the international system. Participants understood this and were able to place their own comments on ethics within this context. When it came to the ethical implications of various types of UAS use, participants tended to fall into the following groups. There is some overlap between these groups, as people can hold multiple opinions at the same time, especially on complex and loaded topics such as ethical decision-making and the role of, and need for, certain technologies.
In relation to armed use:
- Armed use as necessary: Many participants felt that Canada was falling behind allies and adversaries, and believed Canada needed to catch up to match adversary capabilities. Participants often mentioned Russia, and some terrorist groups, in this conversation. These participants felt there was a moral imperative to match adversaries in order to safeguard Canadian lives. Often (but certainly not always), these participants were less concerned about other moral imperatives, as preserving Canadian lives by matching capabilities, including by arming UAS, was seen as the highest imperative.
- Armed use as more of the same: This was the largest group of participants. In their ethical reasoning, these participants viewed the use of armed drones as similar to capabilities Canada already has, pointing to examples such as long-range targeting, mortars, grenades, and crewed/manned planes that drop munitions. Their reasoning relied on the idea that it does not matter (usually within the bounds of the laws of war and international humanitarian law) what kills an enemy, so long as they are killed with the proper planning, procedures, chains of custody for decision-making, and targeting guidelines in place.
- Caution when it comes to armed use: Some participants, including some who expressed opinions in line with the preceding two groups, felt there were several considerations around armed use that had to be discussed. These included preparation for the different forms of trauma service members may experience, ensuring compliance with the law of armed conflict and international humanitarian law, and more practical matters such as staffing and equipment maintenance.
- Armed use as misaligned with Canadian values: A small group of participants believed that armed use of UAS was a step too far and misaligned with Canadian values. This group viewed the Canadian military as a peacekeeping and defensive force and were unhappy with expeditionary aspirations or military scaling up. They perceived armed drone use as out of line with a more moderate military.
In relation to the integration of artificially intelligent (AI) capabilities into UAS:
- AI as necessary: This group aligned with the first group in the previous section, espousing the same opinions but with respect to AI. The two groups tended to contain the same people, though this was not always the case.
- AI as suspicious or scary: Most participants approached the integration of AI into UAS with suspicion. Many found the idea of an AI making targeting decisions worrisome, a bad idea, or disturbing, and felt it was better for humans to be the ones to make mistakes, since there is then the possibility of accountability. Many participants also noted the importance of empathy and human emotional connection in wartime decision-making, something they believed AI did not or could not have. However, different participants drew different personal lines for what they thought was appropriate.
- AI as misaligned with Canadian values: Like the last group in the previous section, this group urged caution and claimed they would be unhappy if the CAF were to integrate AI more than had already been done due to a perceived misalignment with Canadian military values.
Implications for Canada
The opinion split among participants is useful for understanding what kind of attitudes to technological use exist among users. It also shows that those who operate these technologies in the CAF have thoughts about the moral character of the CAF, and their own personal goals and moral lines when it comes to envisioning what the CAF should look like. As UAS use is still emergent in the CAF, it is important to think about the moral character we want this group of drone operators to take on, from both military and political perspectives.
- What kind of ethical training should be included for those who work with UAS?
- As more types of UAS are (re)introduced (quadcopters, long-range UAS), the CAF must also consider how to break down ethical training by technological type, and where the boundaries lie. Can the NATO classifications be considered good enough for Canadian purposes?
- Further, what types of arms treaties should Canada sign on to when it comes to the use of UAS and indeed AI? How should UAS ethical training reflect this?
This is a challenge that must be met soon, as elections and wars throw Canada into an ever more complicated world. These are not small questions. They have implications for Canada’s role in the world, as well as global standards for human rights, which Canada currently considers itself a champion of.
About the Author
Bibi Imre-Millei is a PhD student at the Department of Political Science at Lund University in Sweden. For her thesis, she researches drone operator identity and community from a critical perspective. Bibi is also a graduate research fellow at the Centre for International and Defence Policy at Queen’s University.
Canadian Global Affairs Institute
The Canadian Global Affairs Institute focuses on the entire range of Canada’s international relations in all its forms, including trade, investment and international capacity building. Successor to the Canadian Defence and Foreign Affairs Institute (CDFAI, which was established in 2001), the Institute works to inform Canadians about the importance of having a respected and influential voice in those parts of the globe where Canada has significant interests due to trade and investment, origins of Canada’s population, geographic security (and especially security of North America in conjunction with the United States), social development, or the peace and freedom of allied nations. The Institute aims to demonstrate to Canadians the importance of comprehensive foreign, defence and trade policies which both express our values and represent our interests.
The Institute was created to bridge the gap between what Canadians need to know about Canadian international activities and what they do know. Historically Canadians have tended to look abroad out of a search for markets because Canada depends heavily on foreign trade. In the modern post-Cold War world, however, global security and stability have become the bedrocks of global commerce and the free movement of people, goods and ideas across international boundaries. Canada has striven to open the world since the 1930s and was a driving factor behind the adoption of the main structures which underpin globalization such as the International Monetary Fund, the World Bank, the World Trade Organization and emerging free trade networks connecting dozens of international economies. The Canadian Global Affairs Institute recognizes Canada’s contribution to a globalized world and aims to inform Canadians about Canada’s role in that process and the connection between globalization and security.
In all its activities the Institute is a charitable, non-partisan, non-advocacy organization that provides a platform for a variety of viewpoints. It is supported financially by the contributions of individuals, foundations, and corporations. Conclusions or opinions expressed in Institute publications and programs are those of the author(s) and do not necessarily reflect the views of Institute staff, fellows, directors, advisors or any individuals or organizations that provide financial support to, or collaborate with, the Institute.