
Speaker Illuminates AI’s Increasing Role in War and Government Surveillance

Photos: Kelly Marsh

As AI becomes a ubiquitous tool of governance and warfare, the rules of engagement are blurring, says Steve Feldstein, a senior fellow in the Democracy, Conflict, and Governance Program at the Carnegie Endowment for International Peace. In a November 13 talk at Vassar, “The Coercive Power of Tech: From AI Surveillance to Drones,” Feldstein explored the role technology plays in international conflicts and domestic control.

Steve Feldstein (right), a senior fellow at the Carnegie Endowment for International Peace, said the race to apply artificial intelligence to international conflicts “is changing the character of war.” At left: Chibuzo Achinivu, Visiting Assistant Professor of Political Science.

The U.S., historically the epicenter of advanced technology—first in government and then in the private sector—is engaged in an internal wrestling match over who controls how it is developed and used, he said. Meanwhile, AI tools are becoming more widely available and affordable in lower-cost markets around the world, and their uptake is disrupting the global order.

Governments use AI applications to gain a strategic advantage over competitors. Rivals do not stand still, Feldstein said: when one side gains a technological edge, expect a rapid response from the other.

Technological innovation is entering the battlefield in a concrete way, and a prime example is playing out in Ukraine. Using drones, Ukraine has held and advanced positions against Russia despite a vastly smaller army. Russia has countered by building joint drone factories with Iran, a much smaller power with the technical know-how to help Russia sharpen its electronic warfare capabilities.

Visiting Assistant Professor Achinivu hosted a discussion with Feldstein in a political science class.

“We’re not just looking at anecdotal examples of a powerful weapon that’s changing the character of war. You can see it in the numbers. You can see it in the deaths, and you can see it in the number of drone events,” Feldstein said.

He cited data from Armed Conflict Location & Event Data (ACLED), an independent conflict monitor, showing 51,000 drone strikes around the world in 2024, up dramatically from 6,000 in 2020. Fatalities from those strikes rose from 11,000 to 40,000 over the same period.

It’s not just formally constituted militaries that are using these tools, he added. As technology development expands beyond major tech centers to smaller markets like Iran and Turkey, the less expensive, so-called “good enough” tools they build put battlefield capability within reach of non-state actors as well.

Houthi strikes on U.S. military vessels in the Red Sea in 2024 constituted “one of the largest maritime battles since World War II,” Feldstein said, with the U.S. Navy spending $1 billion on munitions to fend off assaults.

“We shouldn’t underestimate non-state actors’ ability to acquire and develop these types of technologies, which can rival costly systems and be a threat to military platforms,” he said.

The role technology companies play in international conflict has come under increasing scrutiny, he added, with tech innovations deployed in unanticipated ways. For example, Microsoft struck a deal in 2021 to host sensitive intelligence material for the Israel Defense Forces (IDF) on its cloud servers. When it was revealed that the material included granular surveillance data on Palestinians, used after the October 7 attacks to identify bombing targets in Gaza, Microsoft denied knowledge of that use, saying it had been helping Israel fend off cyberattacks.

“Tech companies often justify these transactions by saying they will help democracies, and if they don’t, authoritarians will prevail. So it reduces ethical issues of the use of these tools to a very simple binary ‘us versus them’ logic,” Feldstein said.

He continued, “War is a business opportunity for big tech, and the IDF partnership is just the start, with big tech rushing headlong into the military defense space with few guardrails.”

Governments are also increasing their use of AI tools for surveillance, censorship, data centralization, and propaganda at home. Facial recognition, for example, lets authoritarian governments extend mass repression in a cheaper, more efficient, and more reliable way. And citizen records consolidated into a single source can be weaponized in any number of ways, such as denying health services or bank access to identified dissidents, or cracking down on immigration through the unauthorized use of taxpayer data.

What drives these scenarios is governments’ predisposition for political repression, not the AI tools themselves, Feldstein pointed out. It is an old problem with a bigger footprint, one that requires a new fight fueled by fresh ideas.

AI’s use in armed conflict and governmental repression is the subject of Feldstein’s upcoming book, Bytes and Bullets: Global Rivalry, Private Tech, and the New Shape of Modern Warfare, to be published in 2026.

This event was sponsored by the Political Science Department, the Data Science and Society Initiative, the International Studies Program, and the Media Studies Program.

Posted November 24, 2025