Operations Epic Fury and Roaring Lion were the names military planners gave to the coordinated attacks the US and Israel launched against Iran in late February 2026. Over the course of the campaign, more than 11,000 targets were struck; Iran retaliated with roughly 2,000 drones and more than 500 ballistic missiles. The scale alone was astounding, but one number stands out: more than 1,000 targets were hit in the first 24 hours, using only 10% of the human analysts that process would previously have required. That is the figure that makes this conflict genuinely novel. The majority of the work was done by the machine. That is a new form of conflict.
At the core of that operation was Palantir’s Maven Smart System, an outgrowth of Project Maven, the algorithmic warfare effort the Pentagon launched in 2017 to apply machine learning to satellite imagery analysis. What began as a tool to help analysts identify objects in drone footage has evolved into something far more consequential: a platform that fuses satellite imagery, radar data, drone video feeds, and signals intelligence into a single interface and generates strike packages almost instantly. According to Palantir’s chief technology officer, the conflict in Iran is “the first large-scale combat operation driven by AI.” That characterization is worth sitting with, because its implications extend far beyond operational efficiency.
| Item | Details |
|---|---|
| Topic | Artificial Intelligence in Modern Warfare and autonomous military systems |
| Key Conflict — Ukraine | Russia-Ukraine war — drones account for 70–80% of battlefield casualties |
| AI Drone Targeting Cost | AI-based targeting added to drones for as little as $25 (approx. 1,000 Ukrainian Hryvnia) |
| FPV Drone Accuracy Improvement | AI integration boosted Ukraine’s FPV strike accuracy from 30–50% up to ~80% |
| Ukraine Drone Data | OCHI system — 15,000+ frontline drone crews, 2 million hours of battlefield footage collected since 2022 |
| Key Conflict — Iran (2026) | Operations Epic Fury & Roaring Lion — largest U.S. military operation since Iraq 2003 |
| Targets Struck — Iran Conflict | Over 11,000 total; more than 1,000 in the first 24 hours using AI-assisted targeting |
| AI System Used | Palantir’s Maven Smart System — fusing satellite, drone, radar, and signals intelligence |
| Maven Accuracy vs. Human | Maven: ~60% accuracy; human analysts: ~84% — raising serious ethical questions |
| Analyst Reduction | Only 10% of human analysts previously required to maintain 1,000 daily strikes |
| Israel AI System | “Lavender” — identified up to 37,000 potential Hamas-linked targets; linked to civilian casualties |
| Key Warning | Austrian FM Alexander Schallenberg: “This is the Oppenheimer moment of our generation” |
| Ukraine Kill Zone Project | “Drone Line” — 15km unmanned kill zone along front, with ambitions to extend to 40km |
The ethical weight is concentrated in the accuracy numbers. Maven’s targeting accuracy is about 60%; human analysts achieve roughly 84%. Military briefings tend to euphemize that difference, but 24 percentage points, at the scale of thousands of strikes, means a significant number of targets were incorrectly identified. During a conflict. By a machine.
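The size of that gap can be made concrete with back-of-envelope arithmetic. The sketch below uses the 11,000-target total and the 60% / 84% accuracy figures from the text; the assumption that accuracy applies uniformly across every strike is a simplification for illustration, not a claim about the actual operation.

```python
def expected_misidentified(total_targets: int, accuracy: float) -> int:
    """Expected number of incorrectly identified targets,
    assuming a uniform accuracy rate across all strikes."""
    return round(total_targets * (1.0 - accuracy))

TOTAL_TARGETS = 11_000  # campaign total cited in the text

maven_errors = expected_misidentified(TOTAL_TARGETS, 0.60)  # machine targeting
human_errors = expected_misidentified(TOTAL_TARGETS, 0.84)  # human analysts

print(maven_errors)                  # 4400
print(human_errors)                  # 1760
print(maven_errors - human_errors)   # 2640 additional misidentifications
```

Under these simplified assumptions, the 24-point gap translates into thousands of additional misidentified targets over the course of the campaign, which is why the difference is hard to dismiss as a rounding error.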
The speed and tactical advantages are real, but so is the question of what happens when the system is wrong and no one is watching closely enough to catch it. During operations in Gaza, Israel’s AI targeting system, Lavender, allegedly identified up to 37,000 possible Hamas-linked targets, accelerating strike decisions in ways that drew harsh criticism from human rights observers and researchers tracking civilian casualties.
None of this is theoretical anymore, and the fastest, most visible evolution has happened in Ukraine. To counter a Russian force with substantial numerical advantages, Ukraine turned to drones: inexpensive, fast, and increasingly guided by AI targeting systems that can be added to an existing platform for as little as $25. Drones now account for 70 to 80 percent of battlefield deaths in that conflict. Since 2022, a Ukrainian nonprofit called OCHI has gathered over 2 million hours of frontline drone footage from more than 15,000 crews.
That data feeds models trained on real combat: real targets, real terrain, real electronic warfare conditions. The result is a battlefield learning loop that runs faster than any conventional weapons development program. AI-assisted targeting lifted Ukraine’s FPV drone strike accuracy from roughly 30 to 50 percent up to about 80 percent. That improvement arrived in a matter of months and changed the tactical calculus on the ground in ways that conventional armies would have needed years to match.
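The practical effect of that accuracy jump is easiest to see in sortie counts. This sketch uses the 30%, 50%, and 80% hit rates from the text and treats each sortie as independent, which is a simplifying assumption for illustration:

```python
import math

def sorties_needed(successful_strikes: int, hit_rate: float) -> int:
    """Average number of sorties required to achieve a target number of
    successful strikes, assuming independent sorties at a fixed hit rate."""
    return math.ceil(successful_strikes / hit_rate)

# Hit rates cited in the text, before and after AI-assisted targeting
for rate in (0.30, 0.50, 0.80):
    print(f"{rate:.0%}: {sorties_needed(100, rate)} sorties per 100 hits")
# 30%: 334 sorties, 50%: 200 sorties, 80%: 125 sorties
```

Going from 30% to 80% accuracy cuts the sorties needed for the same effect by nearly two-thirds, which is why a software update changed the math faster than any new airframe could have.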

Watching all of this unfold in so short a span makes one thing clear: the institutional and ethical frameworks have not kept pace with what the hardware and software can already do. Alexander Schallenberg, Austria’s foreign minister, called the current situation “the Oppenheimer moment of our generation” at a Vienna conference on autonomous weapons. The analogy to nuclear development is imperfect, but it captures something true: both involve technologies whose effects, once widely deployed, outpace the decision-making processes designed to control them.
Among the soldiers fighting in Ukraine, one conviction is near-universal: humans should stay involved, especially in life-or-death decisions. Drone pilots from several Ukrainian units have said so directly, arguing that AI can assist with targeting but should not be trusted with the final call. Whether that principle holds as systems get faster and human review windows shrink remains an open question. The technology is not waiting for an answer.
