
Members of the Multinational Brigade launch a CU-172 Blackjack drone during Exercise RESOLUTE WARRIOR in Latvia on November 6, 2024.
Canadian Forces Combat Camera: Multinational Brigade Imagery Technician
A Triple Helix publication
by Bibi Imre-Millei
November 2025
Table of Contents
- Introduction
- Background
- The DEP as Applied to UAS Use
- A Blueprint for UAS Ethical Education?
- Conclusion
- About the Author
- Canadian Global Affairs Institute
Introduction
While the Canadian Armed Forces (CAF) has used unmanned or uncrewed aerial systems (UAS), or ‘drones’, on and off in various settings for the past three decades, it is unclear whether and how ethical questions about their use are being raised in military settings. This is not surprising, as Canada has been criticized for lagging on regulations and ethical guidelines for emerging technologies in general. The CAF has also still not developed the section on ‘Ethics in Operations’ in its Code of Values and Ethics. Both the Canadian government and the CAF have stated goals of strengthening Canadian defence through technological innovation. Our North, Strong and Free envisions a CAF that can deploy drones with surveillance and strike capabilities as well as counter them. The Pan-Domain Command and Control Concept Paper also anticipates a future with data gathered from drones and a reinvestment in emerging technology.
In the face of conflicts such as Russia’s second invasion of Ukraine, decision-makers have been trying to emulate the ingenuity and flexibility of the Ukrainian response. Those I interviewed and observed for my PhD data collection, from Corporals to Lieutenant-Colonels with policy influence, were highly focused on Ukraine, even if Canada’s strategy has a broader focus. But in Ukraine we are also seeing violations of the Law of Armed Conflict and International Humanitarian Law involving drones on both sides. These include the targeting of civilians and civilian infrastructure, but also the treatment of wounded combatants and the photographing of prisoners of war.
So, is the information on ethical decision-making that CAF members receive adequate for the integration of emerging technologies while maintaining these international standards?
In this piece, I focus specifically on the current Defence Ethics Program (DEP), argue that it is not equipped to deal with the scenarios UAS operators might face, and point to a potential blueprint for better addressing the ethical considerations of drone operation. Wasilow and Thorpe’s 2019 article in AI Magazine offers a perspective on ethical education based on the DEP that could serve as a starting point because it engages with a program already used by the CAF. I use data from interviews conducted for my PhD to identify the gaps in the DEP and to think through what could be useful for UAS operators to learn. While Wasilow and Thorpe’s blueprint may not be perfect, the CAF must start somewhere if it is to responsibly use the types of emerging technologies it is currently integrating.
Background
For my PhD, I have interviewed 42 Canadian army (n=33) and air force (n=9) members who use, repair and maintain, or oversee the use of UAS in some capacity.
Within the army, the artillery and armoured corps use UAS the most extensively, flying the CU-179 Raven-B (made by AeroVironment) and the CU-172 Blackjack (made by Boeing). Conversations with infantry officers indicate active plans to integrate UAS such as the TEAL 2 made by TEAL Drones. I also conducted interviews with current and former air force members, including some of those working to integrate an armed UAS, a version of the MQ-9B named the Guardian for the Canadian context.
In interviews with army and air force members, many indicated that our discussions about the ethics of the technology they were using were new to them. It was difficult for them to think of potential scenarios which could breach ethical guidelines in the Canadian context; the most commonly cited example concerned regulations against filming civilian sites during training exercises and on deployment, for the privacy of civilians. Other flight safety considerations, such as adequate sleep and controlled substance use, were also commonly raised. But many found it easier to discuss ethics when I began to introduce questions about integrating artificial intelligence (AI). Here, their concerns centred on accountability for decision-making, and on making sure that Canada was on par with adversaries. More about the views of the current and former servicemembers I interviewed can be found in my briefing note on a similar topic.
Air force and army members were concerned with similar ethical issues, as well as mental health challenges for operators; this was especially true of the air force members directly in charge of setting up the Guardian program. However, there was a lack of deeper engagement with the principles espoused in the DEP. Instead, there was a general consensus of trust in Canadian regulations and ethics, without nuanced reflection on what those regulations and ethics should actually entail.
The training for those who will operate the Guardian, including potential ethical guidelines, was still in development at the time of my interviews. However, there are training courses that members who use the Raven and Blackjack take. I was told these courses are skills-based, but I have not been able to review the materials to verify this. The courses contain practical operational knowledge, as well as materials on flight safety and general flight regulations, as army members are not always taught these in other settings. I was informed there is no specific ethical education for army UAS operators beyond this, but that they participate in yearly refreshers on the Law of Armed Conflict and International Humanitarian Law just as other servicemembers do. They are also supposed to engage with the DEP.
The DEP as Applied to UAS Use
The DEP has gone through multiple updates and changes over its decades of existence, with a current focus on scenario-based exercises that ask members of the defence team to work through situations with various ethical implications. At its base, the DEP is built on three ethical principles and five specific values, each with expected behaviours.
Ethical principles and expected behaviours are:
- Respect the dignity of all persons
- Serve Canada before self
- Obey and support lawful authority
Specific values and expected behaviours are:
- Integrity
- Loyalty
- Courage
- Stewardship
- Excellence
These principles and values are supported by the Canadian Armed Forces (CAF) Ethos: Trusted to Serve, which is meant to reflect Canadian values. It is also supported by the Code of Values and Ethics which, as previously stated, does not yet include guidance on ethics in operations.
As Scoppio and Covell noted when mapping military education in 2016, there was already a lack of resources for military education at that time, as well as a lack of willingness to change existing practices, a lack of pedagogical focus, and a haphazard integration of technology that could assist learning. Such issues have only worsened: CAF members continue to complain of a lack of time and resources for proper training and education, including in my own conversations with them over the last six years of researching the CAF. However, Prime Minister Carney’s promises of reinvestment in defence could present an opportunity for better ethics education in the future.
In this environment, the DEP plays a key role in the ethical education of CAF members. The DEP is intended to be given as a one-day course taught by commanding officers. Servicemembers are then meant to socialize the DEP along with other ethical principles and values, and the CAF is making a variety of efforts to make the ethical education process work better. However, there are currently no assessments or evaluations that would allow monitoring of whether and how CAF members live up to espoused values.
At the time of my first analysis in May 2025, there were 151 DEP Professional Conduct Scenarios. When I re-checked in October 2025, there were 150, as one had been removed. I read the summaries for all 151 scenarios available as of May 2025 on the Defence Ethics website and flagged those that might be specifically relevant to UAS operation for further analysis. Most scenarios are, of course, relevant to a UAS operator’s day-to-day military life; my aim, however, was to determine whether scenarios existed in the DEP that addressed the specific challenges UAS operators may face, and to analyse those scenarios.
I found that many of the 151 scenarios were not genuine ethical conundrums but matters of procedure or rule-breaking. Whether an officer can expense a breakfast on a trip when a hotel breakfast was already included may be an interesting question for a CAF accountant, but it is not the most pressing question of military ethics. This is not to say there were no interesting and timely scenarios. Many were deeply relevant to the ethical problems of military life, such as scenarios where a victim/survivor of harassment does not want to report, or where CAF members are not authorized to kinetically engage with enemies but may be forced to do so. The deployments discussed in the scenarios mirror conditions in Africa and the Middle East. While some such deployments still exist, and may exist again in the future, there is little coverage of deployments to places like Latvia (which have a very different set of goals and challenges), where army UAS operators in particular tend to be deployed today. Training deployments, such as the deployment to Jordan, are also underrepresented in the scenarios.
When it comes to scenarios which could be relevant for UAS operators specifically, I found nine. Only eight remain as of October 2025, because the scenario that was removed was one of those I selected. Three focused on alcohol consumption, three on rules of engagement with enemies on deployment (only two of these remain online), and three on mistakes, accidents, or cutting corners. The nine ethical scenarios I found can be described as follows.
In the category of alcohol consumption, one scenario describes drinking and then an accident and injury occurs, another describes a more general pattern of problems with alcohol going unaddressed on deployment, and the third scenario describes a situation where alcohol is expressly forbidden on a specific deployment and consumed anyway, leading to drunkenness. These three scenarios ask questions about whether and how to report such incidents and demonstrate potentially unforeseen consequences.
This emphasis on alcohol in particular was something I also saw at the military schools I visited, where posters on the wall focused on safety regulations around operating heavy machinery and flight safety often with a direct reference to alcohol and sleep. Alcohol, other substances (usually cannabis), and sleep were also common discussions in my interviews.
In the second category, rules of engagement, the scenarios were more demanding. One describes a CAF member who shoots a belligerent who poses no direct threat to them, in apparent revenge for the killing of an aid worker. This scenario, titled ‘Too Quick on the Draw,’ was present in May 2025 when I conducted my initial analysis, but had been removed as of October 2025. I have left it linked in case it is reinstated. Another describes a commanding officer who asks military members to stay on the base when they receive a call for help from Canadian citizens. A third describes CAF members who kill a captured belligerent in retribution for the killing of their fellow CAF members.
I chose these scenarios for their relevance to UAS operators specifically, as they describe instances very similar to those outlined in UAS operator biographies (such as Martin’s, McCurley’s, and Velicovich’s) and in interviews with long-range UAS operators in the UK and the US (such as the books by Clark, Lee, Phelps, and Shoker). All three scenarios are reminiscent of situations prevalent in this literature, where UAS operators can see conflict occurring or about to occur, can see potential and actual casualties among civilians, aid workers, and fellow troops, and are not authorized to engage. The literature shows this kind of scenario to be potentially traumatic and morally injurious, and so it should be prepared for. Chamayou outlines this potential for moral injury from a theoretical perspective, and Campo builds on the idea that distance itself can be morally injurious through servicemember experiences. Armour and Ross focus more broadly on the wellbeing of those who use drones, and Chappelle and colleagues have researched both burnout and post-traumatic stress.
Further, the idea of desired or actual retribution is also common in the literature. Long-range UAS operators often feel deeply attached to the troops on the ground they support, meaning that the potential to break with rules of engagement or even commit war crimes (or the desire to do so) may arise when they see people they feel attached to being targeted, hurt, or killed. Those I interviewed echoed this attachment to troops on the ground, drawing especially on experiences in Afghanistan. This also links to a potential for dehumanizing the enemy, which the literature agrees long-range drones, at least, facilitate. Asaro, as well as Adams and Barrie, comments on the effects that bureaucratization in the context of drones has on dehumanization, whereas Chamayou, Olhson and Labuski, and Wilcox all comment on gendered and racialized dynamics which can lead to dehumanization. However, dehumanization of the enemy is only touched on in passing in the DEP scenarios.
The final set of scenarios I found relevant to UAS operators concerns accidents, mistakes, and reporting. In one, a sergeant does not want to report an oil spill to an environmental officer after accidentally hitting a barrel, because he believes he has adequately cleaned it up. In another, a military member realizes several of her peers have cut corners when maintaining vehicles and doing safety checks; when she reports the issues, she is first ignored and then faces reprisal. The last scenario concerns a technical error and flight safety mistake made on a course. The military member who made the mistake tells his friend but says he will not report it, even though his friend encourages him to, because he does not want to jeopardize his progress on the course. These scenarios are relevant to UAS operators, those who repair and maintain UAS, and those who oversee UAS use.
The latter two scenarios, on ensuring safety in technical settings, are in line with what UAS operators and technicians generally discussed with me in interviews about the importance of flight safety protocols and proper maintenance. For operators who use the Raven, all three scenarios are particularly important, as the Raven lands by crashing into the ground and must be reassembled and maintained by the operators themselves after each flight. The first scenario, on environmental protection, matters because of the Raven’s frequent crashes as well as the potential for other UAS used by the CAF to crash. Some of these crashes may involve environmental damage or pollution, and it is also important to teach the need to retrieve crashed technologies in general, and to follow proper protocol when they cannot be retrieved. These issues do, however, seem to be discussed regularly among UAS operators, based on my interviews.
While the scenarios mentioned here from the DEP Professional Conduct Scenarios are all important, none of them is actually tailored to a UAS scenario, and only nine of the 151 listed in May 2025 (eight of the 150 now online) speak to the specific issues that UAS operators may face. Canada has been criticized for being behind on regulations and ethical guidelines for emerging technologies in general, and there has been a lack of development on ethics in CAF operations, so it is no surprise that the CAF has room for improvement here. But there is a blueprint for such improvement.
A Blueprint for UAS Ethical Education?
In 2019, Wasilow and Thorpe published an article on AI, robotics, and ethics from a CAF perspective, using the DEP model as a starting point. The authors outlined an ethical framework of twelve points that pairs the DEP principles and values with key ethical issues such as privacy, bias, safety and security, accountability and responsibility, reliability, and trust. Their article can serve as a starting point because it uses a program already in use by the CAF and builds critical thinking about military technology into the DEP, instead of starting from scratch.
Wasilow and Thorpe ask highly relevant ethical questions for CAF UAS operators which are not currently addressed in the DEP such as (in summarized and edited form):
- Would UAS use increase risk-taking behaviour in conflict or lower the barrier to entry for conflict in general? Does UAS use encourage the targeting of civilians due to these potential escalatory factors?
- What is the role of both emotionality and lack of emotionality and its relation to ethical decision making in conflict in the context of using both UAS and AI? How does this impact both outcomes in conflict and the mental health of operators?
- How do accountability mechanisms function and change when using UAS and AI?
- How is data captured by UAS stored and protected? Who can access and use it and for what? Can enemies exploit it?
- Does working with new technologies have impacts for unit cohesion and team dynamics? Could distrust in technologies by military members cause problems during conflict?
- What happens when military members ‘disagree’ with the decisions of, or data provided by, certain technologies, or with the use of certain technologies in general?
Wasilow and Thorpe raise many more questions as well, and provide a template DEP scenario relating to swarming UAS in particular. Some of these questions arose organically in my discussions with participants, although I did not ask them about the DEP directly. For example, some participants were distrustful of AI, and some even of UAS technology, due to a perceived high possibility of mistakes and accidents. Participants were also concerned with the emotionality of conflict and worried about the integration of AI into UAS technology and into decision-making, as they believed human emotionality helps make ethical decisions. Participants were also deeply concerned with accountability when it came to integrating AI into UAS technology and decision-making, but wondered less about this with UAS in general, though accountability issues can also arise with UAS alone. I have summarized some of these views in a previous briefing note.
Conclusion
Wasilow and Thorpe’s article is not a solution on its own, but it should be considered required reading for anyone in a leadership position in the UAS context, and for anyone involved in the creation and execution of UAS training programs. While it was written six years ago, it remains highly relevant. Despite Wasilow and Thorpe’s use of the DEP, and their concrete recommendations and sample scenario, it seems from an outside perspective that these kinds of questions have yet to be integrated into the DEP or into the UAS operator training available in the army or air force.
The DEP itself is not perfect, as this piece has argued. It has gaps and is overly focused on bureaucratic process over ethical reasoning. But updating the DEP scenarios using Wasilow and Thorpe’s suggestions can help build on a blueprint that makes it more likely that ethical principles are followed in the use of UAS, as the DEP is a document servicemembers are trained on and one that is culturally relevant to the CAF. As Carney promises greater investments in defence, perhaps defence ethics, and specifically the DEP, can benefit from this funding renewal.
Many UAS operators I interviewed pointed to Ukraine as the future of warfare and a model for drone warfare. But if the CAF is looking at Ukraine, it is looking at a bad example. Russia’s invasion has been brutal and bloody, with claims of violations of International Humanitarian Law involving drones on both sides. While Canadian military members can look to Ukrainians for their bravery, grit, and resilience, Ukrainians have also been put into a position Canada should strive never to be in. It is not the fault of the Ukrainians that Russia invaded, but Canada can learn from what has happened and be better equipped for the next conflict. One way to do this is through proper preparedness, including a genuine ethical education for UAS operators, who are likely to be on the front lines.
About the Author
Bibi Imre-Millei is a PhD student in the Department of Political Science at Lund University in Sweden. For her thesis, she researches drone operator identity and community from a critical perspective. Bibi is also a graduate research fellow at the Centre for International and Defence Policy at Queen’s University.
Canadian Global Affairs Institute
The Canadian Global Affairs Institute focuses on the entire range of Canada’s international relations in all its forms including trade, investment and international capacity building. Successor to the Canadian Defence and Foreign Affairs Institute (CDFAI, which was established in 2001), the Institute works to inform Canadians about the importance of having a respected and influential voice in those parts of the globe where Canada has significant interests due to trade and investment, origins of Canada’s population, geographic security (and especially security of North America in conjunction with the United States), social development, or the peace and freedom of allied nations. The Institute aims to demonstrate to Canadians the importance of comprehensive foreign, defence and trade policies which both express our values and represent our interests.
The Institute was created to bridge the gap between what Canadians need to know about Canadian international activities and what they do know. Historically Canadians have tended to look abroad out of a search for markets because Canada depends heavily on foreign trade. In the modern post-Cold War world, however, global security and stability have become the bedrocks of global commerce and the free movement of people, goods and ideas across international boundaries. Canada has striven to open the world since the 1930s and was a driving factor behind the adoption of the main structures which underpin globalization such as the International Monetary Fund, the World Bank, the World Trade Organization and emerging free trade networks connecting dozens of international economies. The Canadian Global Affairs Institute recognizes Canada’s contribution to a globalized world and aims to inform Canadians about Canada’s role in that process and the connection between globalization and security.
In all its activities the Institute is a charitable, non-partisan, non-advocacy organization that provides a platform for a variety of viewpoints. It is supported financially by the contributions of individuals, foundations, and corporations. Conclusions or opinions expressed in Institute publications and programs are those of the author(s) and do not necessarily reflect the views of Institute staff, fellows, directors, advisors or any individuals or organizations that provide financial support to, or collaborate with, the Institute.


