Artificial intelligence is going to make drone wars much more deadly. It's already started.
A new research report says AI is dramatically improving the chances of a successful drone strike in the war in Ukraine.
- A new report details how artificial intelligence is changing drone warfare.
- With AI enhancements, the chances of a successful strike improve dramatically.
- Ukrainian drone operators have said that this technology has the potential to be a game-changer.
AI drones on the battlefield in Ukraine are three to four times more likely to hit their target than drones piloted solely by humans, a Ukraine war researcher reports.
The drones enhanced by artificial intelligence are not fully autonomous, but Ukraine sees them as potential game-changers. The technology is rapidly evolving as Kyiv aims to replace soldiers on the battlefield with uncrewed systems, reducing the cost of war in human lives.
AI-enabled autonomous drones are a priority. Last fall, a Ukrainian drone unit commander said developments in autonomy might soon eliminate the need for drone pilots altogether.
Kateryna Bondar, a fellow at the Center for Strategic and International Studies' Wadhwani AI Center, wrote in a new report published Thursday that, as of now, the "deployment of AI is partial in scope, enhancing certain functions and addressing some operational challenges rather than enabling full system autonomy."
Autonomy in drone navigation and targeting is making a major impact, Bondar said, raising strike success rates from roughly 10 to 20 percent to around 70 to 80 percent and making the drones three to four times more likely to hit their targets.
Ukraine purchased roughly 10,000 AI-enhanced drones in 2024. Overall, it acquired nearly 2 million drones, meaning that the vast majority of the drones Ukraine is using to fight off the Russians are still controlled entirely by human operators.
The AI drones are largely limited to final-approach navigation, but they're proving their worth. These systems demand far less skill from pilots, can bypass electronic warfare that would otherwise sever the drone's connection to its operator, and reduce the number of drones needed for mission success. Bondar said only two drones could do what might otherwise take eight or nine.
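Those per-strike figures and the reduced drone counts line up under simple, if idealized, probability arithmetic. The sketch below is not from Bondar's report; it assumes each drone's attempt succeeds independently, which real sorties do not guarantee, and only illustrates why jumping from roughly 15 percent to 75 percent per strike lets two drones rival what eight or nine once did.

```python
def mission_success(p_hit: float, n_drones: int) -> float:
    """Chance that at least one of n attempts hits, each with probability p_hit.
    Independence between attempts is a simplifying assumption."""
    return 1 - (1 - p_hit) ** n_drones

# Rough figures from the report: ~10-20% per strike manually, ~70-80% with AI assistance.
print(f"9 manually piloted drones (p=0.15): {mission_success(0.15, 9):.0%}")  # about 77%
print(f"2 AI-assisted drones (p=0.75):      {mission_success(0.75, 2):.0%}")  # about 94%
```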
All of this is being expedited by Ukraine's drone developers, who are constantly adapting both software and hardware in response to problems seen on the battlefield. Operations like Ukraine's special drone unit, Typhoon, are also helping to push drone innovation forward across the armed forces.
Kyiv's government, too, is pushing for wider adoption of autonomy and AI, which would pave the way for continued development and additional purchases.
Ukraine has seen particular success in adapting small- and medium-sized first-person-view drones for diverse missions thanks to interchangeable equipment and flexible designs.
This ultimately means these drones can shift from surveillance operations to strikes. Interchangeability in hardware and software, Ukrainian drone companies have said, is key to making the systems cheap, scalable, and adaptable to countermeasures.
Bondar noted that Ukraine has been training small AI models on small datasets to avoid overloading the limited processing power available from small, inexpensive chips. Doing so offers the flexibility to adapt quickly to an ever-changing battlespace.
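Bondar's report does not detail specific architectures, but the scale she describes can be illustrated with a hedged sketch: the toy network below is a hypothetical example, not anything Ukraine is known to field, and its layer sizes and input assumptions are invented for illustration. The point is only that a model with a few thousand parameters can fit on a cheap, low-power onboard chip where a large vision model could not.

```python
import torch.nn as nn

class TinyTargetNet(nn.Module):
    """Hypothetical, deliberately tiny classifier sized for a low-cost onboard chip."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, stride=2, padding=1),   # e.g. 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                               # global average pool
        )
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = TinyTargetNet()
print(sum(p.numel() for p in model.parameters()))  # roughly 1,300 weights, versus millions in large vision models
```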
There are further opportunities in AI. In her report, Bondar said that advancements in AI-enabled automated target recognition have given drones the ability to lock onto targets up to two kilometers away in optimal conditions. Unfazed by fatigue or stress, AI also has the potential to see through evasion tactics such as camouflage and decoys.
Autonomy and artificial intelligence in weapons technology are of interest to top militaries. The US, for instance, has been taking notes on drone warfare in Ukraine, spurring developments in integrating AI and drone technologies. But there are real concerns in this space, including ethical worries and fears of creating so-called "killer robots."
Bondar wrote that although the Ukrainians seek autonomy to improve operational effectiveness, "engagement decisions remain squarely in the human domain."
She said that the "current human-in-the-loop practices allow operators to override autonomous functions, ensuring critical ethical and strategic judgments remain under human control."