By Craig S. Smith of Eye on AI
I was in Ukraine in February and wrote this piece before the start of the war with Iran, but its implications are even more relevant today. My interest in lethal autonomous weapons dates back to my time with the National Security Commission on Artificial Intelligence, where full autonomy was debated but largely dismissed as ethically unacceptable.
But in practice, the step to full autonomy is smaller than it sounds. Once a human is no longer actively controlling a system and is only monitoring it with the option to intervene, the shift to removing that human entirely is incremental.
It’s similar to how Iran describes its nuclear program. Uranium enrichment for civilian energy is presented as benign, but once enrichment reaches reactor-grade levels, the remaining technical steps to weapons-grade material are a matter of time and intent, not capability.
It is becoming increasingly difficult to argue that fully autonomous weapons will not arrive. They follow naturally from realities already on the battlefield. What is easier to grasp is the fear they generate. Watch first-person-view footage of a quadcopter chasing a soldier to his inevitable death and the abstraction disappears.
Bundled against Ukraine’s subzero February chill, a man in a gray coat threw what looked like a gray model airplane into the pale blue sky. The buzzing of the drone’s propeller slowly faded as it climbed above snowy fields and barren hedgerows. It looked like a toy.
Oleksandr Liannyi was not playing, however. He was working on a way to make drones far deadlier than they are today.
“It’s mostly about accuracy of positioning, of how the navigation part will perform in different conditions,” said Liannyi, cofounder of NORDA Dynamics, which builds autonomous navigation and targeting modules for military drones.
Liannyi, his colleagues, and other Ukrainian teams have achieved partial autonomy, allowing drones to navigate to and strike human-selected targets on their own. The next step is far more controversial: fully autonomous drones, which could navigate to an active front, hunt for targets, and strike without human input. Empowered to make life-or-death choices, such drones would fundamentally change the nature not only of this war, but of all wars.
“The technology is very close,” Liannyi said later inside a battered white van at the tree line. He noted that a number of intermediate stages still need to be developed before such systems exist and that NORDA Dynamics continues to emphasize human approval in the loop when it comes to the strike decision.
Under International Humanitarian Law, humans can’t pass responsibility for killing to a machine.
But Liannyi argues that even if a human is legally required to approve a lethal strike, autonomous target acquisition will, at the very least, increase the number of drones a single pilot can manage. “The drone can notify you when it sees the target, and then you can pull up the picture and approve it, so you can control lots of drones simultaneously,” he said.
I had come to Ukraine, improbably, with a Silicon Valley startup founder to witness tests of his company’s humanoid robot in a combat setting. But because of its sensitive nature, the robot never made it out of its crate at the airport in Warsaw and, for the same reason, never got past the Polish-Ukrainian border in the middle of a snowy night. It was eventually sent back to California. So I began interviewing people about the growing autonomy of weapons in the current war. That led me to the white van on the edge of a snowy field in Western Ukraine – what the Ukrainians call a “polygon,” after the 19th-century European term for a military training ground.
Beside us in the van, a young blond man in a gray parka sat hunched over a screen, watching a video feed from the drone’s camera. He moved a small white box across the screen with his thumbs on the prongs of a drone controller until he spotted a distant tree and flipped a switch with his finger. The box turned green, a red bar at the top of the screen flashed “ENGAGE,” and he lifted his hands away from the controls as if to emphasize that the drone was now flying on its own.
Almost immediately, the drone banked toward the tree outlined on screen by the green glowing square and, within seconds, was hurtling toward it. A moment before the collision, the man took control of the drone again, sending it swooping back into the sky. “Oho!” he exclaimed. Another man in the van muttered in Ukrainian, “Duzhe kruto,” or “very cool.”
Liannyi and his colleagues were testing new control algorithms that can guide a drone to its intended target without human control, a necessity when pilots lose contact with their drones because the enemy has jammed the radio link. Most of these systems allow drones to fly in complete radio silence for the last half mile to two miles, depending on the weather and the cameras used. Once flying autonomously at roughly a hundred miles per hour, the drone is virtually undetectable by the enemy until it is too late.
Autonomy on a Circuit Board
Inside the drone’s plastic housing is a cheap computer chip soldered to a green circuit board modeled on the Raspberry Pi, a single-board computer originally designed to teach British schoolkids to code. These boards are imported from China, but Ukraine is now developing its own onboard AI, including homegrown boards built by dozens of local companies. NVIDIA’s more powerful Jetson Orin modules are used in some long-range, high-value drones, but they are expensive. Cheaper modules offer enough onboard AI to lock onto a target while keeping the unit cost low enough to lose in combat.
Currently, attack drones are still flown by a human operator, who uses a screen and controls to steer the aircraft, choose a target, and decide when to strike. With partial autonomy from companies such as NORDA Dynamics, the machine can take over the final phase of the attack. Once a human has picked the target and sent the drone toward it, onboard software handles the last stretch of navigation, avoiding obstacles and lining up the final approach. In practice, that means the person still decides who or what can be attacked, but the drone’s autonomy decides exactly how to get there and hit.
Full autonomy would mean the drone, not a human, decides who or what to attack and carries out the strike on its own. The system would search for potential targets, decide which ones fit its programmed rules, and then launch and complete an attack without asking a person for approval.
Such lethal autonomous weapons, called LAWs, would allow warfighters to define a kill box: a geofenced zone in which autonomous drones could hunt, killing any person or destroying any vehicle they find. The box could be as small as a crossroads or as large as 20 square miles of frontline terrain.
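In software terms, the kill-box concept reduces to a geofence containment test: before anything else, a drone checks whether a candidate target lies inside the authorized zone. A minimal sketch of that gate, with all names, coordinates, and the bounding-box simplification being illustrative assumptions rather than details of any fielded system:

```python
# Hypothetical sketch of a geofenced "kill box" check: engagement is gated
# on whether a target's coordinates fall inside an authorized zone.
from dataclasses import dataclass

@dataclass
class KillBox:
    # Axis-aligned bounding box in decimal degrees (illustrative only;
    # a real zone would likely be an arbitrary polygon).
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

def engagement_allowed(box: KillBox, lat: float, lon: float) -> bool:
    # In practice this gate would sit alongside target classification,
    # rules-of-engagement checks, and (today) a human approval step.
    return box.contains(lat, lon)

# A box of frontline terrain, roughly on the scale the article describes.
box = KillBox(lat_min=48.00, lat_max=48.06, lon_min=37.50, lon_max=37.60)
print(engagement_allowed(box, 48.03, 37.55))  # inside -> True
print(engagement_allowed(box, 48.10, 37.55))  # outside -> False
```

The point of the sketch is how little code the boundary itself requires; everything hard about the kill box lives in the classification step the next section describes.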
The Legal Gray Zone
To turn the kill box into reality, drones must be able to distinguish a soldier from a medic, a fleeing civilian from a retreating infantryman, a tank from a tractor, in rain and snow, day and night, and do it well enough so that commanders and lawyers are willing to let them fire without a human making the final decision.
Neither International Humanitarian Law nor Ukrainian law specifically prohibits fully autonomous weapons. They require only that weapons distinguish soldiers from civilians and medics, avoid excessive civilian casualties, and allow humans to halt or adjust attacks as battlefield conditions change. Even U.S. law and military doctrine require only that autonomous weapons be designed so commanders and operators can exercise “appropriate levels of human judgment over the use of force.”
Already, Western officials have moved from talking about a human “in the loop,” meaning a person must actively approve each strike, to a human “on the loop,” meaning a person supervises the system and can intervene to stop an attack. Because of “automation bias,” the tendency for humans to trust machines that have proven accurate in the past, “on the loop” risks humans effectively rubber-stamping machine decisions to keep up with the pace of battle.
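The shift from “in the loop” to “on the loop” can be made concrete: in the first case a strike waits for an explicit yes; in the second it proceeds unless a supervisor vetoes it in time. A toy contrast, with all function and field names being illustrative assumptions, not any real doctrine or system:

```python
# Toy contrast between human "in the loop" and "on the loop" control.
# All names here are hypothetical, for illustration only.

def in_the_loop(strike: dict, human_approved: bool) -> bool:
    # Nothing happens without an explicit, affirmative human decision.
    return human_approved

def on_the_loop(strike: dict, veto_received: bool) -> bool:
    # The machine proceeds by default; the human can only interrupt.
    # Under automation bias, vetoes grow rare and the default wins.
    return not veto_received

strike = {"target": "vehicle", "confidence": 0.97}
print(in_the_loop(strike, human_approved=False))  # False: no yes, no strike
print(on_the_loop(strike, veto_received=False))   # True: silence means go
```

The asymmetry is the whole argument: under “on the loop,” inaction authorizes the strike, which is why supervision at machine speed can collapse into rubber-stamping.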
But autonomy opponents warn about algorithmic errors or hacks that could propagate at machine speed.
“The risks they pose to civilians, friendly forces, and human security in general are staggering,” Dr. Peter Asaro, the Vice Chair of Stop Killer Robots, wrote in an email. “While it may seem expedient in a desperate situation, we need to consider the long-term ramifications of developing these technologies.”
The Asymmetry
Aleksandr Palamarchuk, a soldier with the Azov Brigade who goes by the call sign Paradise, appeared as a ghostly image on the laptop screen in my Kyiv hotel room to talk about where the technology is today. A virtual background of the aurora borealis hid any clues to his whereabouts, which he said was a research and development lab within a hundred miles of the front.
Azov Brigade is a Ukrainian National Guard special forces unit, formed in 2014 as a volunteer militia to fight Russian-backed forces in Donbas. It has since become one of Ukraine’s fiercest combat units while remaining controversial because of its early ties to far-right groups.
“You need to be 100 percent sure it’s an enemy,” Palamarchuk said, noting that the civilians at risk are overwhelmingly Ukrainian, because the war is being fought primarily on Ukrainian soil. (Russian civilians in border regions have also died from Ukrainian strikes, but in far smaller numbers.)
However, Russia doesn’t play by the same rules. A recent report by the Institute for the Study of War, a U.S. nonprofit funded by private donations, concluded that Russian drone strikes against unmistakably civilian targets, from pedestrians to apartment blocks, are meant to depopulate frontline-adjacent areas. It also argues that this approach is being institutionalized in Russian doctrine and practice, creating a frontline red zone where any movement or vehicle is treated as a legitimate target.
Russia has shown a willingness to kill civilians since the outset of the war, from the indiscriminate shootings in the town of Bucha, just west of Kyiv, to continued strikes on residential buildings in the capital itself.
For Palamarchuk, that is the core asymmetry of the war. “It’s much easier for them to make absolutely autonomous missions, because they don’t care about the target type or where they hit,” he said.
Palamarchuk said Ukraine is seeking to counterbalance that asymmetry by developing AI that can reliably distinguish legitimate military targets from civilians. He said Azov is experimenting with drones that can fly entire missions by themselves.
“You just place the drone on the ground, then you create a mission for it, and it takes off by itself,” he said. “Then AI models can recognize targets by themselves.”
Ukraine is being forced to innovate faster than any other army on Earth and is restructuring its military around unmanned operations, including giving drones full autonomy. It is planning for a 15-kilometer-wide zone along the front in which machines, not infantry, do most of the work.
The First Robot Assault
In early December 2024, a Ukrainian brigade executed what analysts describe as the first successful unmanned air and land assault in military history, against Russian positions in the Kharkiv region. The dawn attack was coordinated by remote operators who simultaneously deployed an integrated swarm of aerial and ground robots. Kamikaze ground vehicles and robotic machine-gun platforms advanced on the trenches, supported by heavily armed quadcopter bombers and smaller, nimble kamikaze drones acting as close-air support, while dozens of reconnaissance drones provided a total operational overview. The intense, two-hour robotic strike caught Russian forces off guard and destroyed the targeted positions.
Ukraine is still scaling command and control tools to make that repeatable.
At the same time, Ukrainian forces are running an enormous, iterative experiment in unmanned and AI-enabled warfare, with constant adjustments by drone makers based on feedback from the front lines.
Kyiv has formalized this role through its “Test in Ukraine” policy, which invites companies to push new drones, ground robots, missiles, and other systems straight into combat, then feed performance data back to industry and governments.
Western and particularly U.S. firms are among those whose systems are being tested on the battlefield — everything from long-range strike drones to maritime and loitering drones that wait in an area until a target appears — sometimes with very public failures.
Altius loitering munitions, built by U.S. manufacturer Anduril, repeatedly crashed or failed to hit targets and proved highly vulnerable to Russian electronic jamming. They were ultimately withdrawn from use by Ukrainian forces in 2024. Anduril says it has since revised the Altius system based on Ukrainian feedback, and that updated versions have been redeployed with some Ukrainian units.
Ukraine’s breakneck cycle of battlefield experimentation offers a trove of operational data about what works, what fails, and how adversaries adapt. The country’s Ministry of Defense has created a Universal Military Dataset, among the largest of its kind in the world, which can be used to train other AI tools in Ukraine’s defense arsenal. The dataset contains more than two million hours of drone footage and millions of labeled military objects.
The ministry has also developed an AI system called Avengers, which processes live video streams, automatically detecting, classifying, and flagging enemy equipment. Ukrainian officials say this combination of scale and detailed labeling allows the system to recognize most Russian weapons in live video in just a few seconds.
Avengers is integrated into the country’s command-and-control system so that AI-detected targets appear directly on tactical maps, passed almost instantly to drone pilots.
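The flow the ministry describes, detect in live video, classify, then push to a tactical map, amounts to a thresholded detection pipeline. A schematic sketch of that shape, where the class names, labels, and the 0.9 cutoff are assumptions for illustration and not details of Avengers itself:

```python
# Schematic of a detect-classify-publish pipeline like the one the
# article describes. Names and the threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "tank", "artillery", "truck"
    confidence: float  # classifier score in [0, 1]
    lat: float
    lon: float

def to_map_markers(detections: list, threshold: float = 0.9) -> list:
    # Only high-confidence detections are promoted to the tactical map,
    # where (per the article) drone pilots see them almost instantly.
    return [
        {"label": d.label, "pos": (d.lat, d.lon)}
        for d in detections
        if d.confidence >= threshold
    ]

frame = [
    Detection("tank", 0.96, 48.02, 37.54),
    Detection("truck", 0.41, 48.03, 37.55),  # too uncertain, dropped
]
print(to_map_markers(frame))  # one marker, for the tank
```

Raising or lowering that single threshold is, in miniature, the policy question the next paragraph raises: how much machine confidence counts as a decision.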
While publicly these systems are described as AI-enabled or semiautonomous, with humans nominally in the loop, the line separating that from full autonomy is blurring. A drone can decide to hit a tank, or a commander can pre-authorize that decision so thoroughly that the last human “yes” becomes more of a formality than a true ethical barrier.
The Army of Drones
Much of this innovation was driven by Kateryna Chernohorenko, who served as Ukraine’s Deputy Minister of Defense for Digital Development from 2023 to 2025. She arrived at my hotel looking more like a student than a former government official, wearing sneakers and black pants with a striped dress shirt open over a white T-shirt. Her laptop was covered in defense-themed stickers. Her energy and creativity have made her integral to Ukraine’s war.
One of her ideas was the Army of Drones project, which has centralized procurement and standardized platforms, treating drones as standard equipment rather than ad hoc volunteer gear.
“There was a need to have a systemic look at drones’ capabilities and practice,” she said.
That project channeled civilian crowdfunding and volunteer innovation into a coordinated pipeline that supplies the military with thousands of reconnaissance and strike drones, sets technical requirements, and fields them where they are most needed. It also created training and certification tracks for operators, helping build a professionalized cadre of drone units rather than scattered, self-taught teams.
By setting standards, aggregating orders, and validating new concepts at the front, the Army of Drones has turned Ukraine into a live testbed for military drone innovation and influenced how other countries and defense firms think about scaling unmanned systems for modern, high-intensity warfare.
It has also created a thriving defense sector with hundreds of companies in Ukraine building drones that operate in the air, on the ground, or on water. A recent defense technology expo sponsored by Azov took place at Kyiv’s National Museum of the History of Ukraine in the Second World War, a Soviet-era bunker-like building embedded in the Pechersk hills overlooking the Dnipro river. Above it, a towering stainless-steel figure of Mother Ukraine rises hundreds of feet into the air, arms raised, a sword and a shield lifted over the city.
Inside, dozens of firms presented their products. Among the company representatives at the expo was Marko Kushnir, a director at the Ukrainian drone maker General Cherry, whose name refers to the fruit associated with the region where the company’s founders are from.
General Cherry is one of two Ukrainian companies selected to compete in the Pentagon’s Drone Dominance Program, a $1.1 billion initiative to field large numbers of cheap, effective one-way attack drones for American forces. Both General Cherry and Ukrainian Defense Drones Tech Corp. have demonstrated they can mass-produce drones on short notice. General Cherry is now in talks with several Persian Gulf states about supplying interceptor drones for the war with Iran.
Kushnir visited me later in my hotel, bringing a General Cherry hoodie and other branded swag. He also brought an unarmed Bullet, a nearly three-foot-tall drone shaped like a rocket and built to hunt other unmanned aircraft.
The Bullet is built to knock out Russia’s long-range, fixed-wing kamikaze drones based on Iran’s Shahed and produced under license in central Russia’s Volga region. Known in Russia as the Geran, the rear-propeller drone has become one of Moscow’s primary weapons for striking Ukraine’s energy infrastructure and residential buildings.
“Our drone can understand that it’s a Shahed,” said Kushnir. “It can go to the target without any operator control.”
The Outsiders
Among the most prominent outsiders building for this new battlespace is former Google C.E.O. Eric Schmidt. His military drone company, Swift Beat, produces a line of drones with bee-inspired names. Its flagship is the Bumblebee, a low-cost AI-enabled kamikaze quadcopter that has logged thousands of combat flights against Russian targets in Ukraine. The drone uses onboard cameras and internal motion sensors to navigate by comparing ground features to maps stored in memory, allowing it to operate without GPS, radio signals, or a live data link. Once a pilot designates a target, the AI takes over.
Neither Schmidt nor Swift Beat would comment for this article.
Swift Beat also produces an AI-powered interceptor system designed to hunt and destroy Russian Shahed drones. Called Merops, after the genus of bee-eating birds, it fires fixed-wing drones from mobile launchers and uses onboard machine vision to track and physically ram targets, bypassing radio jamming.
Merops are now being deployed on NATO’s eastern flank. Romania has begun integrating mobile interceptor units into its short-range air-defense networks, and Poland is training military personnel on the system as part of a broader anti-drone shield.
The underlying parts – minicomputers, commercial computer-vision libraries, visual-inertial navigation – are mostly dual-use technology rather than exotic military hardware. What is emerging in Ukraine is not only a new class of weapon, but a new production logic: autonomy assembled from cheap sensors, commercial computers, and battlefield iteration, then scaled fast enough to matter at the front.
Five Levels of Autonomy
While Schmidt is the most prominent technologist building drones for Ukraine, people in the country point to Ukrainian entrepreneur Yaroslav Azhnyuk as the leading expert on autonomy in the drone race.
Azhnyuk is best known in Silicon Valley as the co-founder of Petcube, a startup that makes interactive pet cameras. After Russia’s full-scale invasion, he used his expertise in cameras that detect motion, interpret behavior, and stream video reliably across unstable networks to build AI-driven autonomous systems for drones.
He likens drone autonomy to the five levels of self-driving cars. “Level one is autonomous terminal guidance,” Azhnyuk explained over breakfast at a fashionable gastropub in central Kyiv. “You fly manually, you lock the target, and from that moment the drone can hit it autonomously under all conditions.”
Level two introduces autonomous bombing: the system calculates release timing and performs an escape maneuver. Level three is more controversial: autonomous target recognition and strike decision-making within a defined kill zone.
“The system scans what it sees, recognizes the target, reaches enough confidence, and initiates the strike,” Azhnyuk explained as he ate pork brisket with pink pickled onions.
Level four adds autonomous navigation from launch to the target area without radio or satellite guidance. Level five includes autonomous takeoff and landing, enabling reusable systems rather than one-way missions.
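Azhnyuk’s five levels form an ordered scale, analogous to the SAE levels for self-driving cars. A simple encoding of his framing, with the names and comments paraphrased from his description rather than any official taxonomy:

```python
# Paraphrase of Azhnyuk's five-level framing for drone autonomy,
# encoded as an ordered enum. Level names are this sketch's own.
from enum import IntEnum

class DroneAutonomy(IntEnum):
    TERMINAL_GUIDANCE = 1   # pilot locks target; drone completes the hit
    AUTONOMOUS_BOMBING = 2  # system times release, performs escape maneuver
    TARGET_RECOGNITION = 3  # drone recognizes targets and decides to strike
    FULL_NAVIGATION = 4     # launch-to-target flight, no radio or satellite
    REUSABLE_MISSIONS = 5   # autonomous takeoff and landing, reusable system

# The levels are ordered: each adds capability on top of the last.
print(DroneAutonomy.TARGET_RECOGNITION < DroneAutonomy.FULL_NAVIGATION)  # True
```

The ordering matters because, as the article notes, level three is where the controversy concentrates: it is the first level at which the strike decision itself moves into the machine.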
In his framing, the ethical debate may invert. “Within five to ten years,” he said, “it may become unethical to use weapons without AI,” arguing that autonomous precision systems could cause less collateral damage than purely human-operated alternatives.
Baba Yaga
When Russia invaded in 2022, many Ukrainians pivoted into drone warfare. Pavlo Yelizarov, nicknamed Lasar, was a television producer who bought a smuggled agricultural drone and strapped an anti-tank mine to its undercarriage. That effort evolved into Lasar’s Group, one of the military’s most formidable drone formations.
It was the first to put Starlink satellite terminals onto heavy bomber drones, allowing pilots to operate from secure rear positions via internet-based control links, sidestepping Russian jamming of radio frequencies. The arrangement effectively decoupled the pilot’s physical location from the drone’s, allowing the pilots to remain far in the rear — or indeed be based anywhere in the world.
The group has destroyed more than $13 billion of Russian military equipment, including tanks, each strike documented by onboard video. Its signature platform is a four-rotor heavy bomber that Russian troops have nicknamed Baba Yaga, after a witch in Slavic folklore. The drone, fitted with a Starlink receiver, can carry up to 5 kilograms of munitions and travel as far as 35 kilometers and back, often flying low, at treetop level.
Yet even as Lasar’s Group has refined remote piloting, some of its commanders are looking beyond radio, satellite or fiber-optic connections altogether to a day when drones operate without a human pilot at all.
A major named Yurii, who declined to give his family name for security reasons, oversees training and testing of new engineering solutions within Lasar’s Group. He came to see me in my hotel room wearing military fatigues and a name patch that read “Phoenix,” his radio call sign. He told me that, in his view, the next frontier of drone warfare is full onboard autonomy: once a drone is launched, he said, navigation, targeting, and execution will eventually be autonomous, with no need for a live communication link to a pilot.
“Connectivity can be jammed, so you’ve got to do all of that on the edge,” he said, sitting bolt upright, head shaved and a reddish beard fading white at the point. In other words, the drone must be able to see, orient itself, identify what matters, and act without relying on a distant operator or a remote server.
“This will help us to place our personnel far away from our enemy, without direct contact,” he said. “It will create a war of drones, not a war of humans.”
To move in that direction, Lasar’s Group is developing what Phoenix calls autonomy modules – standardized packages of hardware and software that can be attached to different airframes. “We are building drones, but we are also building the autonomy modules,” he said. The decision-making element is migrating into code.
The Cost
For now, it’s still a war of drones against humans, machines against men, with devastating consequences. Drones now account for over 70% of casualties on both sides.
At a rehabilitation hospital outside Lviv, I met Vyacheslav Kondrashenko, a soldier with Ukraine’s 93rd Separate Mechanized Brigade. A year earlier, he had been carrying a 15-inch-square quadcopter fitted with two sixty-millimeter mortars in the fiercely contested eastern reaches of Donetsk. As he emerged from his dugout into the open, a smaller Russian quadcopter, carrying a munition of its own, struck his right arm and exploded. The blast set off the mortar rounds he was carrying. When the smoke cleared, Kondrashenko – Slava, to his friends – had lost his right arm below the elbow and both legs above the knee. His remaining left hand was rendered useless.
“He was waiting for me,” Slava told me from his wheelchair. “I didn’t have a chance.”
The drone that hit him had been resting on the ground outside the dugout. Somewhere miles away, a Russian operator was watching the entrance through the drone’s video feed, delivered in real time through a fiber-optic cable as thin as fishing line, which had unspooled behind it, draping over fields and trees.
A few days after speaking with Slava, I stood outside the Garrison Church of Saints Peter and Paul in Lviv, the main house of worship for the city’s military. A priest in black-and-gold vestments appeared with a cross, followed by uniformed pallbearers bearing a black coffin on their shoulders. A military band played a funerary dirge.
There are funerals nearly every day in cities across Ukraine. This one was for Taras Novoselskyi, killed on his 47th birthday.
Ukraine’s cities, with their trams, baroque facades, and coffeehouses, can still seem improbably normal until a military coffin passes through. Then the war becomes visible again – not as a weapons system, or a software stack, or a theory of machine autonomy, but as a dead body being carried to the grave.
The procession moved with the choreography of grief. At the town hall, a lone bugler appeared in an upper window. He played “Il Silenzio,” the final call. People stopped to watch. Some crossed themselves. Others simply stood still.
The drive for full autonomy isn’t restricted to Ukraine. Russia has begun equipping its Lancet drone with machine-vision systems that can patrol a designated area, searching for vehicles or other targets that fit a predefined profile.
The war with Iran is accelerating the move toward machine-led killing. Israel has reportedly used AI-assisted targeting in its campaign against Iran, while the Pentagon says the United States is pushing to field swarms of low-cost attack drones and more autonomous systems of its own. Meanwhile, Ukraine has said it will share interceptor drones, training, and counter-drone expertise with the United States and Gulf partners.
There is no public evidence that terrorist groups are building such systems inside the United States. But the technology is spreading, the costs are falling, and U.S. officials have been warning that the homeland drone threat is growing.
I thought of a comment the entrepreneur Azhnyuk made at breakfast the previous day when I asked if the prospect of fully autonomous weapons frightened him. “What I’m terrified about is that we won’t get there as fast as the enemy does.”
Watch: The March Toward Fully Autonomous Weapons
* * *