
Nine years of negotiations, 166 countries in favor, and a deadline seven months away. The autonomous weapons treaty will fail — and the reason why is the best predictor of how global AI regulation will play out over the next decade.
In November 2026, Geneva will host the Seventh Review Conference of the Convention on Certain Conventional Weapons (CCW) — the UN framework that regulates weapons considered excessively injurious or indiscriminate.
The UN General Assembly, the Secretary-General, the International Committee of the Red Cross — all had set 2026 as the deadline for concluding a legally binding treaty on lethal autonomous weapons systems, known as LAWS.
November is seven months away. The treaty doesn't exist. The probability that it will exist before the deadline is realistically zero.
This failure deserves attention — not for pacifist or humanitarian reasons (arguments others are better qualified to make), but because the mechanism of this failure is the same mechanism that will block every other international AI regulation over the next ten years.
Nine years, no definition
Quick background.
The CCW started discussing autonomous weapons in 2013. For three years, informal expert meetings mapped the terrain. In 2016, States Parties created the Group of Governmental Experts on LAWS (the GGE) — a formal body tasked with developing recommendations on the ethical, legal, and humanitarian challenges of autonomous weapons.
The GGE has met annually since 2017. Nine years of meetings. The result: a "rolling text" defining principles — predictability, reliability, human oversight, accountability — on which most States agree.
The problem? After nine years, the group still doesn't have a shared definition of what "meaningful autonomy" means in the context of a weapon system.
Without a definition, no regulation. Without regulation, no treaty.
The votes tell a story
Parallel to Geneva, the UN General Assembly started pushing from New York.
2023: The First Committee approved the first-ever resolution on LAWS. 164 States in favor, 5 against.
2024: A second resolution passed the General Assembly plenary. 166 in favor, only 3 against — Belarus, North Korea, Russia.
2025: A third resolution. 156 in favor, 5 against, 8 abstentions.
The numbers went down. Why?
Because the United States flipped. After supporting both previous resolutions, it voted against in 2025 — joining Russia, North Korea, Belarus, and Israel.
And the language got weaker. The resolution called on CCW parties to:
"Work towards completing the set of elements for an instrument being developed within the mandate of the Group of Governmental Experts, with a view to future negotiation."
Not negotiate. Not conclude. Work towards completing elements. The gap between the ambition of 156 States and the language they managed to pass tells you where the real power lies.
Why one country can block everything
The failure isn't mysterious. It's architectural.
The GGE operates by consensus. Any single participating State can block any proposal. Not a majority against. Not an alliance. One country is enough.
And countries with an interest in blocking do exist. Reaching Critical Will — the disarmament program of the Women's International League for Peace and Freedom — identifies six countries that have systematically obstructed the negotiations:
Russia, Israel, India, Australia, the Republic of Korea, and the United States.
Not a random group. It's the set of countries with major investments in autonomous weapons R&D, or with strategic alliances linking them to countries that do. None wants a treaty imposing binding obligations on systems it's actively developing.
Russia has been particularly effective at procedural sabotage: requesting terminological clarifications at every iteration, delaying operational definitions, invoking the need for "further technical discussion." Both Russia and the US have proposed changes to the rolling text that significantly weakened drafts put forward by the Chair.
The result: nine years, no shared definition, no treaty.
The alternative path — and why it probably won't work
The deadlock in Geneva has produced an interesting institutional split.
The First Committee of the General Assembly operates in New York, not Geneva. It decides by majority, not consensus. That's why the resolutions pass with overwhelming numbers. But they're politically significant and legally weak — they don't bind the States that vote against. Pressure, not obligations.
The structural irony: even the Secretary-General, even the ICRC President, even the 2025 resolution — all refer to the CCW as the place where the treaty should be concluded. The CCW operates by consensus. The institutional design requires that the decisive table be the one where blocking works.
The Austrian initiative
Some States are trying to break the loop. Austria hosted a major conference in Vienna in April 2024, tabled all three UNGA resolutions, and has publicly stated that the GGE cannot fulfill the majority's goals.
The precedents exist:
1997 Anti-Personnel Mine Ban Treaty — created outside consensus fora by a coalition of willing States.
2017 Treaty on the Prohibition of Nuclear Weapons — same playbook, two decades later.
September 2025 — 42 States declared themselves ready to negotiate LAWS on the basis of the rolling text.
But this path has three problems.
Problem 1: Who's missing from the table.
A LAWS treaty without the USA, Russia, China, Israel, and India — that is, without the countries producing most autonomous weapons systems — regulates practically nothing.
A political act, not a technical one.
Problem 2: You can't inspect software the way you inspect a minefield.
The landmine precedent works because mines are localizable physical objects. You can verify whether a factory makes them, whether a country stockpiles them, whether a territory has been cleared.
LAWS are distributed systems: hardware, software, algorithms, operational doctrine.
The line between an autonomous weapon and a drone with AI-assisted targeting is a configuration parameter, not a physical characteristic.
And that software can be updated remotely after deployment. Verification at scale may be impossible.
Problem 3: Fractured governance.
Breaking from the CCW means accepting two non-aligned regimes governing the same weapons, with States operating under different rules depending on which treaty they've signed.
This isn't hypothetical — it's exactly what happened with nuclear weapons, where the NPT and the TPNW coexist without reconciliation.
Why this matters beyond weapons: the civilian AI parallel
Here's the reason this process deserves attention well beyond disarmament.
The deadlock on LAWS is structurally identical to the deadlock emerging on global civilian AI governance. Same architecture, same blocking dynamics, same likely outcome.
Dominant powers resist constraints. On weapons: countries with advanced military AI resist limits on strategic technology. On civilian AI: the USA and China resist binding international regimes that could erode their competitive advantage.
Multilateral bodies produce declarations, not rules. On weapons: the CCW produces rolling text without binding decision-making capacity. On civilian AI: the OECD, the UN, the Council of Europe, the G7, the Bletchley Park Summit — all produce non-binding principles and voluntary frameworks. None has enforcement mechanisms.
Bilateral deals replace missing multilateralism. On weapons: AUKUS and other defense agreements establish rules of military AI cooperation among restricted allies. On civilian AI: the CLOUD Act, chip export restrictions on China, AI Security Cooperation Agreements — all bilateral mechanisms filling a multilateral vacuum. Add to this the pattern of executive orders on AI — signed, rescinded, rewritten depending on who holds the White House — showing how even domestic regulation becomes unstable when it depends on unilateral authority.
The EU tries to regulate everyone. On weapons: the EU is the only actor attempting to regulate the behavior of third-party States, based on territorial effect. On civilian AI: the European AI Act does exactly the same thing — regulating systems that operate in EU territory, regardless of where they were designed.
The likely outcome is the same on both planes. Shared principles at the rhetorical level. Real obligations only within the jurisdictional borders of whoever is willing to enforce them. And bilateral rules among allies, effectively replacing the multilateralism that never materialized.
What this means for a European company
This isn't abstract. Italy is directly in this game.
The Italian government has earmarked €3.2 billion for unmanned military systems in its 2025–2027 defense plan. Leonardo — Italy's largest defense contractor — is developing:
Stricks — a silent, flying-wing tactical drone with autonomous route execution
Falco Xplorer — persistent surveillance platform
A €2.4 billion partnership with Turkey's Baykar to deploy armed UAVs with kinetic capabilities
Rheinmetall Italia is building AI-assisted air defense systems. Italy participates in the European LEAP initiative for low-cost autonomous swarming drones, and in the Franco-European nEUROn stealth UCAV project.
These are not fringe programs. This is a NATO country, an EU founding member, and the eurozone's third-largest economy investing billions in exactly the autonomous weapons systems the failed treaty was supposed to regulate.
The supply chain is deeply Italian.
Lombardy has one of the highest concentrations of defense-related manufacturing in Europe. Hundreds of SMEs across the region produce components, subsystems, optics, electronics, and precision mechanics for Leonardo, Elettronica, and other defense primes.
Many of these components are dual-use by nature. The same sensor, the same embedded processor, the same navigation module can end up in a civilian drone or in a weapons system.
The absence of a LAWS treaty means there is no binding international framework distinguishing between these destinations at the point of manufacture.
This isn't a geopolitical abstraction. It's a supply chain problem.
The concrete risks:
American military AI will continue to be developed without international constraints, with increasingly wide civilian spillover.
European subcontractors will face growing pressure to participate in supply chains governed by rules they had no part in writing.
Dual-use technologies — computer vision, autonomous optimization, reinforcement learning — power civilian applications subject to much stricter European regulation. The compliance gap between what a component was designed for and what it's eventually used for has no international framework to close it.
Regulatory asymmetry: European vendors operate under stricter rules than American and Chinese competitors. Same technology, different regulatory cost. Speed and cost asymmetries compound over time.
Infrastructure dependency on non-European cloud, foundation models, and chips keeps increasing — precisely as the rules governing that infrastructure become objects of geopolitical negotiation.
The signal
The failure of the LAWS treaty isn't isolated. It's a predictor.
It tells us that the multilateral system that produced the Nuclear Non-Proliferation Treaty in the sixties, the conventions banning biological and chemical weapons, and the Montreal Protocol on ozone — that system will not produce functional equivalents for the technologies of the twenty-first century.
The rules that matter will be written elsewhere:
In Pentagon procurement.
In supply contracts between Google and the federal government.
In the technical clauses of the EU AI Act.
In the rulings of California state courts.
In Brussels' willingness or unwillingness to fine Meta, Apple, Google.
The practical implications are straightforward:
Diversify dependencies. Contracts that allow switching vendors.
Architectures that don't bind business logic to a single API.
Documentation that demonstrates compliance even when rules change retroactively.
Jurisdictional coverage — not because it's elegant, but because rules change in different places at different speeds.
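The "architectures that don't bind business logic to a single API" point can be made concrete with a minimal sketch: business logic depends on an abstract interface, and each vendor sits behind its own adapter, so switching providers is a construction-time choice rather than a rewrite. Every class and method name below is a hypothetical placeholder, not any real vendor SDK.

```python
from abc import ABC, abstractmethod


class CompletionProvider(ABC):
    """Vendor-neutral interface. Business logic depends only on this."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class VendorAAdapter(CompletionProvider):
    """Hypothetical adapter for one vendor's API (stubbed here)."""

    def complete(self, prompt: str) -> str:
        # A real adapter would call vendor A's SDK; the stub just tags the output.
        return f"[vendor-a] {prompt}"


class VendorBAdapter(CompletionProvider):
    """A second adapter: swapping vendors means swapping one object."""

    def complete(self, prompt: str) -> str:
        return f"[vendor-b] {prompt}"


def summarize(report: str, provider: CompletionProvider) -> str:
    """Business logic: it knows the interface, never the vendor."""
    return provider.complete(f"Summarize: {report}")


if __name__ == "__main__":
    # The same business function runs unchanged against either vendor.
    print(summarize("Q3 numbers", VendorAAdapter()))
    print(summarize("Q3 numbers", VendorBAdapter()))
```

The point of the indirection is exactly the asymmetry described above: when rules or prices change in one jurisdiction, the switch happens in one constructor call, not across the codebase.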
The LAWS treaty won't be signed in November.
It's a predictor. Not a tragedy.
Fabio Lauria
CEO & Founder, ELECTE
Every week we explore AI without the hype — with data, analysis and an independent perspective.
Sources
The CCW process and the GGE on LAWS:
UNODA: Group of Governmental Experts on LAWS (2026) | UNODA: Briefing by the Chair of the CCW GGE on LAWS | Reaching Critical Will: CCW Report | Digital Watch: GGE on LAWS | Reaching Critical Will: CCW Report Vol. 12 No. 3
UNGA resolutions:
Human Rights Watch: UN Vote Should Spur Action on Treaty (2023) | Human Rights Watch: UN Vote Should Spur Treaty Negotiations (2024) | Arms Control Association: UN Moves to Expand Autonomous Weapons Discussions | Stop Killer Robots: 156 States Support UNGA Resolution (2025) | ASIL: LAWS & International Law — Growing Momentum Towards a New Treaty
State positions and voting records:
Automated Decision Research: United States | Automated Decision Research: Austria | Stop Killer Robots: September 2025 GGE Joint Statement
Geopolitical analysis:
Arms Control Association: Geopolitics and the Regulation of Autonomous Weapons Systems | ICT4Peace Foundation: Geneva Talks on Autonomous Weapons | Usanas Foundation: Regulating LAWS in a Fractured Multipolar Order | RUSI: The Way Forward on Autonomous Weapons after the Vienna Conference | Article36: Opportunities after the UNGA Resolution

If you found this analysis useful, please share it with someone who might be interested. And if you'd like to see how ELECTE uses AI to automate data analysis and reporting, visit electe.net.
