Scientists warn AI could trigger accidental nuclear strike as global arms race intensifies across the US, China, and Russia

Imagine a future where the decision to launch nuclear weapons isn’t made by presidents or generals — but by a computer.

It sounds like science fiction, but leading experts are warning that it’s becoming dangerously close to reality.

The Stockholm International Peace Research Institute (SIPRI) has just dropped a sobering report, warning that we’re not just seeing a return of the nuclear arms race — we’re seeing it evolve into something much more high-tech and unpredictable.


Nuclear Stockpiles Are Growing Again

For decades, it looked like the world was moving away from nuclear warheads. Global stockpiles were shrinking, treaties were in place, and cooler heads were mostly prevailing.

But now, SIPRI says that decline has stopped — and in some places, it’s going in reverse.

Countries like China, India, Pakistan, North Korea, and Israel are all expanding their nuclear programs.

In total, the world holds over 12,200 nuclear warheads, and more than 9,600 of those are part of active military forces.

Around 2,100 are on high alert — ready to launch from submarines, aircraft, or missiles.


The Rise of AI in Nuclear Decision-Making

What’s new — and terrifying — is how artificial intelligence (AI) is starting to enter the picture.

Governments see AI as a tool to make faster decisions, especially during crises when time is tight.

But scientists are urging caution, warning that relying on machines to make split-second nuclear choices is a dangerous gamble.

“If the decision to launch nuclear weapons is ever fully handed over to AI, we’d be approaching true doomsday scenarios,” said Dan Smith, SIPRI’s director.

Sure, AI can process more data in less time than any human.

But what it gains in speed, it might lose in judgment — and that’s not something the world can afford when nukes are on the line.


Lessons from the Past: When One Man Saved the World

The report points to moments in history where humanity was only minutes away from disaster.

One example is Lieutenant Colonel Stanislav Petrov, the Soviet officer who, in 1983, received a false alert that U.S. missiles were incoming.

If he had blindly trusted the system and passed the warning up the chain of command, it could have triggered a Soviet counterstrike.

But Petrov paused, thought critically, and decided it didn’t make sense — and that decision likely saved millions of lives.

“People like Petrov might not exist in the future if AI systems act too quickly for humans to intervene,” Smith warned.


A Dangerous Arms Race Heats Up

SIPRI’s 2025 report says the world is clearly entering a new nuclear arms race, but this time with a digital twist.

Countries aren’t just building more weapons — they’re also racing to develop smarter, faster AI tools to control them.

China, in particular, is making headlines for expanding rapidly.

Since 2023, it’s been adding about 100 nuclear warheads every year, bringing its total up to 600 in 2025.

By the 2030s, China could have as many intercontinental ballistic missiles (ICBMs) as Russia or the U.S.


The End of Disarmament and the Return of Fear

Perhaps most disturbing is SIPRI’s conclusion: the era of reducing nuclear weapons — a trend since the Cold War — is over.

Instead, we’re seeing a return to sharper nuclear rhetoric, abandoned arms agreements, and fewer diplomatic safeguards.

Even traditional nuclear powers like Russia and the U.S., which together hold around 90% of the world’s nukes, are now modernizing their arsenals rather than shrinking them.


Conflict in the Middle East Raises the Stakes

These warnings come at a tense time globally.

Just last week, Israel launched attacks on Iran’s nuclear facilities, sparking fears that a wider conflict — even a third world war — could be looming.

The White House, while not directly involved, confirmed that President Donald Trump will decide in the next two weeks whether to launch a U.S. military strike against Iran’s nuclear program.

Iran doesn’t have nuclear weapons yet, but its allies Russia and China definitely do — and that adds a chilling layer of global risk.


Technology vs. Humanity — The Final Frontier?

While there’s no denying that AI can bring efficiency and power to military decision-making, SIPRI is urging world leaders not to get swept up in the speed.

The report stresses that a future where AI decides when to go to war is one where accidents are more likely — and more deadly.

As Dan Smith puts it:

“AI has a wide range of strategic utility… but the careless adoption of AI could significantly increase nuclear risk.”


What’s Next?

With the world’s most powerful nations locked in a secretive, high-tech arms race, and diplomatic safeguards breaking down, the SIPRI report sounds like more than just a warning — it’s a wake-up call.

The choice ahead is clear: Will world leaders stay in control of the nuclear button — or will they let machines take the lead?