
Generative AI may be the next AK-47

Accessibility, portability, simplicity: AI has the potential to change how wars are waged.

A graphic illustration showing a computer screen at the center, with tangled cables hanging from the ceiling above and crowds of people looking at the screen below. Illustration made with the artificial intelligence image generator Midjourney.

At the start of the Cold War, a young man from southern Siberia designed what would become the world’s most ubiquitous assault rifle.


The AK-47, or Avtomat Kalashnikova, was never patented and thus easily reproducible. It is light and highly portable, making it easy to smuggle across borders. The AK-47 is also renowned for its simplicity: With very little training, most combatants, including children, can strip and clean one in minutes.


Decades later, the AK-47 (along with its variants) may be the most widely available military weapon on the planet. Even its inventor, Mikhail Kalashnikov, was surprised by its widespread uptake: “It was like a genie out of the bottle,” he reportedly said. “It began to walk all on its own, and in directions I did not want.”


The AK-47 and other light weapons have changed how conflicts are waged, where, and by whom. Though not designed for harmful purposes, the generative AI tools that have swept into the public spotlight in recent months may follow a similarly destructive path. 


AI has the potential to change how wars are waged on a scale unseen since the rise of weapons like the AK-47. But few people – including most humanitarian agencies – are thinking about how these technologies will shape “forgotten conflicts” and humanitarian crises.


AI engineers and enthusiasts may bristle at comparing a deadly weapon to generative AI, examples of which include GPT-4, text-to-image models like Stable Diffusion and DALL-E 2, and Eleven Labs’ speech synthesis and voice-cloning tools. ChatGPT certainly does: When I asked the language model about this analogy, it replied that my prompt was “potentially offensive” (it did not specify to whom) and that “it is not appropriate or respectful to compare a life-saving technology like AI to a weapon of war”.


There are, however, key similarities: accessibility, portability, and simplicity.


The engineering behind generative AI is complex, but the tools themselves can be used by anyone with internet access. Their use isn’t constrained by state boundaries, nor is it adequately regulated (yet). And, at this stage, many of these tools are still free or virtually free.




In the wrong hands, generative AI could create audio or visual content in support of targeted propaganda and disinformation campaigns, turning the proverbial “fog of war” into a murk so dark and dense that only the most sophisticated tools will spot fact amid the fiction. This could make it harder to bring warring parties to the peace table, or prevent those fleeing conflict from accessing life-saving humanitarian assistance or finding refuge across a border. 


Just as the social media giants of the last decade failed to prevent their networks from being weaponised to spread misinformation and hate, today’s tech firms are not designing AI with conflicts in mind. They shouldn’t be relied on to build in the necessary guardrails to protect against misuse.


Generative AI, we’re told, is meant to support human creativity, generate new forms of art, and simplify marketing. But this fails to account for other incentives driving demand and influencing its design.


As China and the United States battle for supremacy in and through AI, so too do the few firms that have the data, computing power, and capital required to develop the technology.


One notable tech titan recently argued that increased government investment could enable greater technological innovation in support of US national security and defence. As the bioethicist and technology expert Wendell Wallach has warned, the commercial and political incentives underpinning advanced technologies like AI are merging. 


This should be particularly alarming to the many humanitarian agencies that hold neutrality as a guiding principle for their operations and partnerships. These agencies should urgently reflect on how generative AI might affect the 2 billion people who live in the world’s conflict areas.


Responding to my prompts, ChatGPT noted that “efforts must be made to ensure that the development and deployment of such technologies are governed by ethical principles and regulations to prevent their misuse”. 


But such boilerplate responses on ethics leave us with more questions: How will AI be regulated in places where the rule of law is weak? How will tech firms designing AI ensure their tech is “governed by ethical principles”, particularly in light of recent layoffs? What happens when these tools fall into the hands of non-state armed groups? And how will refugees and humanitarian actors protect themselves from the potential harms? 


Whatever the answers, it’s safe to say that the genie is now out of the bottle. 


Edited by Irwin Loy.
