The latest figures from the Pentagon indicate that the total number of COVID-19 cases among members of the U.S. military has topped 60,000 since the onset of the pandemic. COVID-19 and other similar outbreaks could become an increasingly important consideration in the calculus of future military deployments. They could add impetus to the Pentagon’s development of lethal autonomous weapons (LAWs) or at least be cited as a perfect reason to do so. This could, in turn, have significant implications for the future of both U.S. military operations in the Middle East and the U.S. military presence in the region, which has long been the subject of political disagreement in Washington.
Already, there are numerous land-, air-, and sea-based weapons that use artificial intelligence (AI) to perform surveillance and voice recognition, track targets, and independently choose to attack them — what UN Secretary-General António Guterres has referred to as “machines with the power and discretion to take lives without human involvement.”
Implications for US military operations
Incorporating AI and LAWs into new or existing platforms will have enormous implications for U.S. military operations in general. By its very nature, military conflict is a costly undertaking ethically, morally, politically, economically, and psychologically. Altering or altogether removing these costs will change the course of military operations and warfare in profound ways. In recent years, U.S. operators of remotely piloted drones (i.e., drones flown by a human operator) in the Middle East have reportedly been grappling with the altered psychological and moral costs of being far removed from the battlefield. Their experience of killing people halfway across the globe has been described as more akin to a video game than the significantly more difficult psychological and personal experience of battlefield soldiers. The future use of lethal autonomous drones (i.e., drones operating without a human in the loop) could remove human involvement altogether — and along with it many of the associated costs. U.S. leaders have historically needed to secure financial resources, mobilize populations, and risk their credibility when undertaking military operations, especially in the Middle East. The use of LAWs effectively lowers this threshold and makes it easier for them to do so.
There are also significant systemic implications that are more difficult to predict. Across a variety of sectors, system complexity is increasing in ways that are hard to comprehend. The U.S. subprime housing bubble is an excellent example of this from the financial sector: Layers upon layers of complexity created a system laden with risks that few understood or predicted, and that brought it to collapse once stressed. Facebook is another example from the technology sector: Its original vision was to connect people and improve their access to information, furthering democratization across the globe. Instead, its algorithms fueled the polarization, populism, and misinformation we see today.
The impact of autonomous systems on the governance of military operations could be equally disruptive but potentially more difficult to discern. They could affect the traditional cycle of the initiation, plateauing, and conclusion of conflicts by making leaders less willing to consider diplomatic and political means to resolve crises. This trend has already been set in motion by the rise of remotely piloted drones and cyber weapons, which have been widely used in the Middle East. In such instances, military operations no longer have discernible beginnings and endings, but instead consist of a prolonged low-level conflict in which a conclusive victory is never fully realized. It could also mean that the underlying drivers of conflicts are further overlooked. Drone targeting in Yemen and Pakistan may have eliminated immediate terrorist threats, but it has done little to address the underlying causes of terrorism — a bleak reminder that the strategic utility of these weapons is disconnected from the political and economic context.

From an operational perspective, it is also unclear how autonomy would affect command-and-control doctrines and the communication chain during operations, let alone the checks and balances governing the rules of engagement of forces, immunity from prosecution, compliance with humanitarian law, and the accountability of leaders at all levels. There are potentially even domestic U.S. implications to consider: The Department of Defense is among the largest employers, public or private, in the U.S., and while concerns over rushing to outsource jobs to machines are not new, reducing the number of jobs in the U.S. military could further fuel unemployment and disenfranchisement.
Impact on the US regional presence
The impact LAWs could have on the U.S. military presence in the Middle East is also largely unknown. For at least the past 15 years this presence has been a major domestic political issue, and every U.S. presidential election has featured candidates who made reducing the number of troops in the Middle East a top campaign priority. This could mean that U.S. politicians might initially be inclined to embrace LAWs wherever possible and accelerate their development. For their part, the region’s leaders and citizens are likely to find the widespread use of LAWs difficult to accept. Setting aside the ethical and moral outrage, their use could further contribute to the impression of U.S. disengagement from the Middle East and deepen the region’s sense of insecurity. In the same vein, it could also encourage the region to pursue greater self-sufficiency in meeting legitimate defense needs or push its leaders to seek partnerships with other security guarantors.
The U.S. has a long and complicated relationship with the Middle East, especially when it comes to its security role. The introduction over the past decade and a half of remotely piloted drones and cyber weapons, which never held up well under the scrutiny of international law and public opinion, has further complicated the picture. It has been widely argued that in some respects they worked against U.S. interests in the region, despite the immediate short-term gains. The U.S. security role and relationship with the region will almost certainly be further complicated by the introduction of autonomous weapon systems. It is unrealistic to think that the trajectory of their development will change, given the momentum seen in non-military applications and the lack of global trust and cooperation. Nonetheless, it is imperative to shape the conversation about their regulation sooner rather than later, and before military use of the technology proliferates further.
Nasser bin Nasser is a non-resident scholar at MEI and the managing director of the Middle East Scientific Institute for Security based in Amman, Jordan. The views expressed in this article are his own.