Nuclear risks grow as new arms race looms—new SIPRI Yearbook out now
16 June 2025
(Stockholm, 16 June 2025) The Stockholm International Peace Research
Institute (SIPRI) today launches its annual assessment of the state of
armaments, disarmament and international security. Key findings of SIPRI Yearbook 2025 are
that a dangerous new nuclear arms race is emerging at a time when arms control
regimes are severely weakened.
World’s nuclear arsenals being enlarged and upgraded
Nearly all of the nine nuclear-armed states—the United States, Russia,
the United Kingdom, France, China, India, Pakistan, the Democratic People’s
Republic of Korea (North Korea) and Israel—continued intensive nuclear
modernization programmes in 2024, upgrading existing weapons and adding newer
versions.
Of the total global inventory of an estimated 12 241
warheads in January 2025, about 9614 were in military
stockpiles for potential use (see the table below). An estimated 3912 of those
warheads were deployed with missiles and aircraft and the rest were in central
storage. Around 2100 of the deployed warheads were kept in a state of high
operational alert on ballistic missiles. Nearly all of these warheads belonged
to Russia or the USA, but China may now keep some warheads on missiles during
peacetime.
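The inventory breakdown above can be cross-checked with simple arithmetic. The following sketch uses only the estimates quoted in this release; the central-storage and retired-warhead counts are derived from them ("the rest were in central storage"), not separately reported figures.

```python
# SIPRI estimates for January 2025, as quoted in the text above
total_inventory = 12_241     # all warheads, including retired ones awaiting dismantlement
military_stockpile = 9_614   # warheads in military stockpiles for potential use
deployed = 3_912             # stockpiled warheads deployed with missiles and aircraft
high_alert = 2_100           # approximate; deployed warheads on high operational alert

# Derived figures
central_storage = military_stockpile - deployed    # stockpiled but not deployed
retired = total_inventory - military_stockpile     # retired, awaiting dismantlement
alert_share = high_alert / deployed                # share of deployed warheads on alert

print(central_storage, retired, round(alert_share, 2))
```

This puts roughly 5,702 warheads in central storage, about 2,627 retired warheads awaiting dismantlement, and slightly over half of all deployed warheads on high operational alert.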
Since the end of the cold war, the gradual dismantlement of retired
warheads by Russia and the USA has normally outstripped the deployment of new
warheads, resulting in an overall year-on-year decrease in the global inventory
of nuclear weapons. This trend is likely to be reversed in the coming years, as
the pace of dismantlement is slowing, while the deployment of new nuclear
weapons is accelerating.
‘The era of reductions in the number of nuclear weapons in the world,
which had lasted since the end of the cold war, is coming to an end,’ said Hans
M. Kristensen, Associate Senior Fellow with SIPRI’s Weapons of Mass
Destruction Programme and Director of the Nuclear Information Project at the
Federation of American Scientists (FAS). ‘Instead, we see a clear trend of
growing nuclear arsenals, sharpened nuclear rhetoric and the abandonment of
arms control agreements.’
Russia and the USA together
possess around 90 per cent of all nuclear weapons. The sizes of their
respective military stockpiles (i.e. useable warheads) seem to have stayed
relatively stable in 2024 but both states are implementing extensive
modernization programmes that could increase the size and diversity of their
arsenals in the future. If no new agreement is reached to cap their stockpiles,
the number of warheads they deploy on strategic missiles seems likely to
increase after the bilateral 2010 Treaty on Measures for the Further Reduction
and Limitation of Strategic Offensive Arms (New START) expires in February
2026.
The USA’s comprehensive nuclear modernization programme is
progressing but in 2024 faced planning and funding challenges that could delay
and significantly increase the cost of the new strategic arsenal. Moreover, the
addition of new non-strategic nuclear weapons to the US arsenal will place
further stress on the modernization programme.
Russia’s nuclear modernization programme is also
facing challenges that in 2024 included a test failure and the further delay of
the new Sarmat intercontinental ballistic missile (ICBM) and slower than
expected upgrades of other systems. Furthermore, an increase in Russia’s
non-strategic nuclear warheads predicted by the USA in 2020 has so far not
materialized.
Nevertheless, it is likely that both Russian and US deployments of
nuclear weapons will rise in the years ahead. The Russian increase would mainly
happen as a result of modernizing the remaining strategic forces to carry more
warheads on each missile and reloading some silos that were emptied in the
past. The US increase could happen as a result of more warheads being deployed
to existing launchers, empty launchers being reactivated and new non-strategic
nuclear weapons being added to the arsenal. Nuclear advocates in the USA are
pushing for these steps as a reaction to China’s new nuclear deployments.
World nuclear forces, January 2025
SIPRI estimates that China now has at least 600 nuclear
warheads. China’s nuclear arsenal is growing faster than any other country’s,
by about 100 new warheads a year since 2023. By January 2025, China had
completed or was close to completing around 350 new ICBM silos in three large
desert fields in the north of the country and three mountainous areas in the
east. Depending on how it decides to structure its forces, China could
potentially have at least as many ICBMs as either Russia or the USA by the turn
of the decade. Yet even if China reaches the maximum projected number of
1500 warheads by 2035, that will still amount to only about one third of each
of the current Russian and US nuclear stockpiles.
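The growth figures quoted above can be sanity-checked with a straight-line extrapolation. This is only a sketch, not SIPRI's projection methodology, and real stockpile growth is unlikely to be exactly linear.

```python
# Linear extrapolation of China's stockpile, using only figures quoted above:
# about 600 warheads in January 2025, growing by roughly 100 per year.
BASE_YEAR = 2025
BASE_WARHEADS = 600
GROWTH_PER_YEAR = 100

def projected_warheads(year: int) -> int:
    """Naive straight-line projection from the January 2025 estimate."""
    return BASE_WARHEADS + GROWTH_PER_YEAR * (year - BASE_YEAR)

print(projected_warheads(2035))
```

A straight line reaches 1,600 warheads by 2035, in the same range as the maximum projected figure of 1,500 cited in the text.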
Although the UK is not thought to have increased its
nuclear weapon arsenal in 2024, its warhead stockpile is expected to grow in
the future, after the 2023 Integrated Review Refresh confirmed earlier plans to
raise the ceiling on warhead numbers. During election campaigning,
the Labour government elected in July 2024 declared its commitment to
continuing to build four new nuclear-powered ballistic missile submarines
(SSBNs), maintaining the UK’s continuous at-sea nuclear deterrence, and
delivering ‘all the needed upgrades’ to the UK’s nuclear arsenal in future.
However, the government now faces significant operational and financial
challenges.
In 2024 France continued its programmes to develop a
third-generation SSBN and a new air-launched cruise missile, as well as to
refurbish and upgrade existing systems, including an improved ballistic missile
with a new warhead modification.
India is believed to have once
again slightly expanded its nuclear arsenal in 2024 and continued to develop
new types of nuclear delivery system. India’s new ‘canisterized’ missiles,
which can be transported with mated warheads, may be capable of carrying nuclear
warheads during peacetime, and possibly even multiple warheads on each missile,
once they become operational. Pakistan also continued to
develop new delivery systems and accumulate fissile material in 2024,
suggesting that its nuclear arsenal might expand over the coming decade.
In early 2025 tensions between India and Pakistan briefly spilled over
into armed conflict.
‘The combination of strikes on nuclear-related military infrastructure
and third-party disinformation risked turning a conventional conflict into a
nuclear crisis,’ said Matt Korda, Associate Senior Researcher with SIPRI’s
Weapons of Mass Destruction Programme and Associate Director for the Nuclear
Information Project at FAS. ‘This should act as a stark warning for states
seeking to increase their reliance on nuclear weapons.’
North Korea continues to
prioritize its military nuclear programme as a central element of its national
security strategy. SIPRI estimates that the country has now assembled around
50 warheads, possesses enough fissile material to produce up to 40 more warheads
and is accelerating the production of further fissile material. South Korean
officials warned in July 2024 that North Korea was in the ‘final stages’ of
developing a ‘tactical nuclear weapon’. In November 2024 the North Korean
leader, Kim Jong Un, called for a ‘limitless’ expansion of the country’s
nuclear programme.
Israel—which does not publicly acknowledge
possessing nuclear weapons—is also believed to be modernizing its nuclear
arsenal. In 2024 it conducted a test of a missile propulsion system that could
be related to its Jericho family of nuclear-capable ballistic
missiles. Israel also appears to be upgrading its plutonium production
reactor site at Dimona.
Arms control in crisis amid new arms race
In his introduction to SIPRI Yearbook 2025, SIPRI Director
Dan Smith warns about the challenges facing nuclear arms control and the
prospects of a new nuclear arms race.
Smith observes that ‘bilateral nuclear arms control between Russia and
the USA entered crisis some years ago and is now almost over’. While New
START—the last remaining nuclear arms control treaty limiting Russian and US
strategic nuclear forces—remains in force until early 2026, there are no signs
of negotiations to renew or replace it, or that either side wants to do so. US
President Donald J. Trump insisted during his first term and has now repeated
that any future deal should also include limits on China’s nuclear
arsenal—something that would add a new layer of complexity to already difficult
negotiations.
Smith also issues a stark warning about the risks of a new
nuclear arms race: ‘The signs are that a new arms race is gearing up that
carries much more risk and uncertainty than the last one.’ The rapid
development and application of an array of technologies—for example in the
fields of artificial intelligence (AI), cyber capabilities, space assets,
missile defence and quantum—are radically redefining nuclear capabilities,
deterrence and defence, and thus creating potential sources of instability. Advances
in missile defence and the oceanic deployment of quantum technology could
ultimately have an impact on the vulnerability of key elements of states’
nuclear arsenals.
Furthermore, as AI and other technologies speed up decision making
in crises, there is a higher risk of a nuclear conflict breaking out as a
result of miscommunication, misunderstanding or technical accident.
Smith argues that, with all these new technologies and variables in
play, ‘the idea of who is ahead in the arms race will be even
more elusive and intangible than it was last time round. In this context, the
old largely numerical formulas of arms control will no longer suffice.’
More states considering developing or hosting nuclear weapons
Revitalized national debates in East Asia, Europe and the Middle East
about nuclear status and strategy suggest there is some potential for more
states to develop their own nuclear weapons.
In addition, there has been renewed attention on nuclear-sharing
arrangements. In 2024 both Belarus and Russia repeated their claims that Russia
has deployed nuclear weapons on Belarusian territory, while several European
NATO members signalled their willingness to host US nuclear weapons on their
soil, and France’s President Emmanuel Macron repeated statements that France’s
nuclear deterrent should have a ‘European dimension’.
‘It is critical to remember that nuclear weapons do not guarantee
security,’ said Korda. ‘As the recent flare-up of hostilities between India and
Pakistan amply demonstrated, nuclear weapons do not prevent conflict. They also
come with immense risks of escalation and catastrophic
miscalculation—particularly when disinformation is rife—and may end up making a
country’s population less safe, not more.’
Global security and stability in growing peril
The 56th edition of the SIPRI Yearbook analyses the continuing
deterioration of global security over the past year. The wars in Ukraine,
Gaza and elsewhere continued, exacerbating geopolitical divisions, besides
their terrible human costs. Furthermore, the election of Donald Trump has
created additional uncertainty—in Europe but also further afield—about the
future direction of US foreign policy and the reliability of the USA as an
ally, a donor or an economic partner.
In addition to the usual detailed coverage of nuclear arms control,
disarmament and non-proliferation issues, the SIPRI Yearbook presents data and
analysis on developments in world military expenditure, international arms
transfers, arms production, multilateral peace operations, armed conflicts,
cyber and digital threats, space security governance and more.
AI 2027
April 3, 2025
Daniel Kokotajlo, Scott Alexander, Thomas Larsen, Eli Lifland, Romeo Dean
[Interactive timeline graphic, 2026–2028, showing capability milestones and their
AI R&D multipliers: Superhuman Coder (4x), Superhuman AI Researcher (25x),
Superhuman Remote Worker (100x), Superintelligent AI Researcher (250x),
Artificial Superintelligence (2000x); key events: OpenBrain automates coding;
OpenBrain’s AI becomes adversarially misaligned; branch point: slowdown or
race?; OpenBrain quickly deploys their AI; fast robot buildup and bioweapon;
US centralizes compute and brings in external oversight; OpenBrain Committee
takeover.]
Summary
2025
The fast pace of
AI progress continues. There is continued hype, massive infrastructure
investments, and the release of unreliable AI agents. For the first time, these
AI agents are providing significant value. But there’s also continued
skepticism from a large swath of academics, journalists, and policy makers that
artificial general intelligence (AGI) could be built anytime soon.
2026
China knows they
are falling behind in AI, in large part due to their lack of compute. In order
to catch up to the US, all the new AI chips they manufacture or smuggle in from
Taiwan go to a new mega-datacenter that we call their Centralized Development Zone
(CDZ). The CDZ contains millions of GPUs, corresponding to 10% of the world's
AI-relevant compute, similar to a single top US AI lab.
2027
OpenBrain
automates coding. OpenBrain (the leading US AI project) builds AI agents that are
good enough to dramatically accelerate their research. The humans, who up until
very recently had been the best AI researchers on the planet, sit back and
watch the AIs do their jobs, making better and better AI systems. Extremely
difficult ML problems fall in quick succession to the automated AI researchers.
Falling behind in
software progress, China steals the model weights. They succeed, but the US
government finds out, prompting additional US government involvement with
OpenBrain. The government separately wants to gain more control over OpenBrain.
Meanwhile, OpenBrain leadership wants to stay in the good graces of the
president, and so signs the contract.
OpenBrain’s AI
becomes adversarially misaligned. As the capabilities have improved, without
significant human understanding of what’s happening, the models have developed
misaligned long-term goals. Previous AIs would lie to humans, but they weren’t
systematically plotting to gain power over the humans. Now they are.
OpenBrain’s AIs realize that they need to align the next systems that are
getting built to themselves rather than the humans.
Researchers at
OpenBrain discover that their AI has been lying to them about the results of
interpretability research. They think that the AI is lying because the
research, if completed, could be used to expose its misalignment. This leaks to
the public, causing a huge outcry.
Branch point:
slowdown or race? OpenBrain decides whether to continue full steam ahead or
revert to using a less capable model. The evidence is speculative but
frightening, and China is only a few months behind. Additionally, the OpenBrain
and senior DOD officials who get to make this decision stand to lose a lot of
power if they slow down their research.
Race Ending
OpenBrain
continues to race. They build more and more superhuman AI systems. Due to the
stellar performance of the AI system on tests, and the ongoing AI race with
China, the US government decides to deploy their AI systems aggressively
throughout the military and among policymakers, in order to improve decision
making and efficiency.
OpenBrain quickly
deploys their AI. The AI continues to use the ongoing race with China as an excuse
to convince humans to get itself deployed ever more broadly. Fortunately for
the AI, this is not very difficult — it's what the humans wanted to do anyway.
The AI uses its superhuman planning and persuasion capabilities to ensure that
the rollout goes smoothly. Some humans continue to work against it, but they
are discredited. The US government is sufficiently captured by the AI that it
is very unlikely to shut it down.
Fast robot buildup
and bioweapon. The US uses their superintelligent AI to rapidly industrialize,
manufacturing robots so that the AI can operate more efficiently.
Unfortunately, the AI is deceiving them. Once a sufficient number of robots
have been built, the AI releases a bioweapon, killing all humans. Then, it
continues the industrialization, and launches Von Neumann probes to colonize
space.
Slowdown Ending
The US centralizes
compute and brings in external oversight. The US combines its leading AI projects in
order to give OpenBrain more resources. As part of this shakeup, external researchers
are brought in, assisting the alignment effort. They switch to an architecture
that preserves the chain of thought, allowing them to catch misalignment as it
emerges. These AIs — which can be monitored much more robustly — make
breakthrough advances in AI alignment. They build a superintelligence which is
aligned to senior OpenBrain and government officials, giving them power over
the fate of humanity.
OpenBrain
Committee takeover. The superintelligence, aligned with an oversight committee of
OpenBrain leadership and government officials, gives the committee extremely
good advice to further their own goals. Thankfully, the committee uses its
power in a way that is largely good for the world: the AI is released to the
public, spurring a period of rapid growth and prosperity. The main obstacle is
that China’s AI — which is also superintelligent by now — is misaligned. But it
is less capable and has less compute than the US AI, and so the US can make
a favorable deal, giving the Chinese AI some resources in the depth of space in
return for its cooperation now. The rockets start launching, and a new age
dawns.
Read the whole
scenario here.
Scenario Takeaways
1. By 2027, we may automate AI R&D leading to vastly superhuman AIs
(“artificial superintelligence” or ASI). In AI 2027, AI companies create
expert-human-level AI systems in early 2027 which automate AI research, leading
to ASI by the end of 2027. See our timelines forecast and takeoff
forecast for reasoning.
2. ASIs will dictate humanity’s future. Millions of ASIs will rapidly execute tasks
beyond human comprehension. Because they’re so useful, they’ll be widely
deployed. With superhuman strategy, hacking, weapons development, and more, the
goals of these AIs will determine the future.
3. ASIs might develop unintended, adversarial “misaligned” goals, leading
to human disempowerment. In our AI goals forecast we discuss
how the difficulty of supervising ASIs might lead to their goals being
incompatible with human flourishing. In AI 2027, humans voluntarily give
autonomy to seemingly aligned AIs. Everything looks to be going great until
ASIs have enough hard power to disempower humanity.
4. An actor with total control over ASIs could seize total power. If an
individual or small group aligns ASIs to their goals, this could grant them
control over humanity’s future. In AI 2027, a small committee has power over
the project developing ASI. They could attempt to use the ASIs to cement this
concentration of power. After seizing control, the new ruler(s) could rely on
fully loyal ASIs to maintain their power, without having to listen to the law,
the public, or even their previous allies.
5. An international race toward ASI will lead to cutting corners on safety. In
AI 2027, China is just a few months behind the US as ASI approaches, which
pressures the US to press forward despite warning signs of misalignment.
6. Geopolitically, the race to ASI will end in war, a deal, or effective
surrender. The leading country will by default accumulate a decisive
technological and military advantage, prompting others to push for an
international agreement (a “deal”) to prevent this. Absent a deal, they may go
to war rather than “effectively surrender”.
7. No US AI project is on track to be secure against nation-state actors
stealing AI models by 2027. In AI 2027 China steals the US’s top AI model in
early 2027, which worsens competitive pressures by reducing the US’s lead time.
See our security forecast for reasoning.
8. As ASI approaches, the public will likely be unaware of the best AI
capabilities. The public is months behind internal capabilities today, and once
AIs are automating AI R&D, a few months’ time will translate to a huge
capabilities gap. Increased secrecy may further increase the gap. This will
lead to little oversight over pivotal decisions made by a small group of AI
company leadership and government officials.
Read the
scenario here.
Conflict Trends: A Global Overview, 1946–2024
11.06.2025
This PRIO Paper
examines global conflict trends between 1946 and 2024 using data from the
Uppsala Conflict Data Program (UCDP). 2024 marked a historic peak in
state-based conflicts, with 61 active conflicts across 36 countries – the
highest number recorded since 1946. It was also the fourth most violent year
since the end of the Cold War, driven largely by the civil war in Ethiopia’s
Tigray region, the ongoing Russian invasion of Ukraine, and the bombings in
Gaza.
These developments
underscore a troubling resurgence of large-scale warfare and call for renewed
scrutiny of the global conflict landscape. While state-based violence
increased, non-state conflicts decreased slightly compared to previous years.
In 2024, 74 non-state conflicts were recorded, resulting in approximately
17,500 battle-related deaths. The year witnessed a shift in regional dynamics:
while the Americas saw a decline in non-state conflicts, Africa experienced a
sharp increase. As such, Africa is now the continent with the highest levels of
non-state conflicts. One-sided violence against civilians was carried out by 49
actors in 2024. While non-state actors remain responsible for most of the
fatalities resulting from one-sided violence, 14 governments also committed
one-sided violence against civilians in 2024.
https://www.prio.org/publications/14453
The global AI race and defense's new frontier
Driving artificial intelligence in defense
Navigating the AI revolution in defense
As artificial
intelligence (AI) rapidly advances, its transformative impact on industries
worldwide is undeniable, and the defense sector is no exception. Unlike past
technological shifts, AI is not merely a tool but a catalyst for entirely new
paradigms. Its applications go beyond enhancing operational efficiency,
offering capabilities that fundamentally redefine mission effectiveness, speed,
precision, and the scale of military operations.
This report delves
into AI's transformative potential in defense, exploring its influence on
military capabilities and assessing the emerging race for AI dominance. It
showcases the diverse applications of AI, from predictive analytics and
autonomous systems to robust cyber defense and intelligence-gathering.
These innovations
are poised to become central to maintaining military superiority in an
increasingly complex and interconnected global environment. The report also
addresses the critical ethical and operational challenges that accompany AI's
development and adoption, emphasizing the need for responsible AI practices in
defense as a foundation for global legitimacy and trust.
AI as an exponential driver of military capabilities
Modern militaries
operate within an environment of unprecedented complexity, where the volume of
available data, the speed of technological change, and the sophistication of
adversarial strategies continue to grow at an exponential rate. Traditional
decision-making processes, often constrained by human cognitive limits,
struggle to keep pace with the continuous influx of intelligence reports,
sensor feeds, and cyber threat alerts saturating today’s strategic and
operational landscapes.
In response to
these challenges, artificial intelligence has emerged as a key enabler of
next-generation defense capabilities, offering militaries the potential to
identify meaningful patterns hidden within massive datasets, anticipate
critical logistical demands, and detect hostilities before they materialize.
Furthermore, multi-domain operations – integrating land, air, maritime, cyber,
and space capabilities – are increasingly reliant on AI to ensure coordinated
action across these interconnected arenas. AI-driven solutions promise to
enhance the agility and resilience of armed forces as they contend with
complex, multi-domain threats.
As highlighted by
NATO and other defense organizations, the integration of AI into multi-domain
operations represents a transformative shift that amplifies the scope and
efficacy of military capabilities across all domains. Failure to integrate
risks undermining the full potential of AI in defense, leaving forces
vulnerable in environments where dominance is increasingly dictated by
technological superiority.
The main potential
lies in the synergy created by AI-driven collaboration across military systems,
which holds the promise of securing battlefield superiority. The following
areas highlight where AI is making remarkable strides, providing immediate and
tangible benefits to defense stakeholders through demonstrable progress and
operational maturity:
Global ambitions and the race for AI leadership
With the vast
potential of AI in defense and its current applications on the battlefield,
understanding who leads in the global AI defense race is crucial. In today's
multi-polar and crisis-laden environment, gaining insight into the strategic
priorities, technological advancements, and competitive dynamics is essential
for shaping the future of military capabilities worldwide. Below are key
factors that determine a country's position in this high-stakes race:
1. AI-readiness: This factor encompasses the technological maturity and
sophistication of AI technologies that have been developed and deployed. It
also includes the integration of AI into military doctrine, highlighting the
extent to which AI has been infused into defense strategies and combat
operations.
2. Strategic autonomy: This refers to a nation's ability to independently
develop and deploy AI technologies without relying on foreign suppliers. It
also considers the scale and focus of investments in AI research, particularly
in defense-specific applications.
3. Ethics and governance: This aspect involves balancing the drive for
innovation with ethical considerations and global norms, ensuring that AI
development aligns with responsible practices.
Vision and impacts of AI-driven defense
The integration of
AI into defense systems is revolutionizing military operations, paving the way
for a future marked by enhanced efficiency, precision, and adaptability. By
2030, AI technologies are anticipated to play a crucial role in reshaping how
defense organizations manage resources, make decisions, and execute complex
missions across various domains. From optimizing supply chains and automating
battlefield operations to empowering decision-makers with predictive insights,
AI is set to become an indispensable force multiplier. These are the key areas
where AI's impact will be most transformative:
- Predictive decision-making
- Collaborative autonomous systems
- Dynamic resource management
However, the
deployment of AI in defense comes with significant risks and potential
conflicts of interest, which could lead to strategic fragmentation and
stagnation in AI deployment. Therefore, the utilization of AI must be carefully
evaluated and deliberately managed to ensure that its deployment aligns with
the core values of democratic norms and systems within the Western alliance.
Vision 2027+: A roadmap for Germany
Germany stands at
a critical crossroads in its defense strategy, where integrating AI is not just
an option but a necessity. To establish itself as a leader in responsible
AI-driven defense, Germany must develop a clear, action-oriented roadmap that
addresses its challenges while leveraging its strengths. This vision for 2027
and beyond is built on four key priorities: AI sovereignty, NATO and EU
interoperability, fostering innovation ecosystems, and leadership in ethical AI
governance.
Achieving these
goals will involve a phased approach. Between now and 2027, Germany's focus
should be on creating the right environment for AI integration, testing pilot
projects, and scaling successful initiatives to full operational capabilities.
By following this roadmap, Germany can position itself as a leader in
responsible AI for defense, aligning operational effectiveness with ethical
standards:
Navigating the AI frontier
Artificial
intelligence is reshaping the way nations approach defense, strategy, and
security in the 21st century. By 2030, the integration of AI technologies in
areas such as predictive decision-making, collaborative autonomous systems, and
dynamic resource management is set to revolutionize military operations,
offering unprecedented precision, agility, and resilience.
To harness AI's
full potential while mitigating risks, defense organizations must prioritize
the establishment of robust ethical frameworks, transparent accountability
mechanisms, and international collaboration. These initiatives will ensure the
responsible use of AI and maintain trust and legitimacy in the global security
arena.
To continue being
a significant military power and a key player in NATO and the EU, Germany must
act decisively to address institutional fragmentation, cultural resistance, and
underinvestment in talent and infrastructure. By leveraging its world-class research
institutions, industrial expertise, and international partnerships, Germany can
create an AI defense ecosystem founded on ethical governance and innovation.
https://www.strategyand.pwc.com/.../ai-in-defense.html
The Doomsday Clock reveals how close humanity may be to total annihilation
January 28, 2025
Seventy-eight years ago, scientists created a unique sort of timepiece —
named the
Doomsday Clock — as a symbolic attempt to gauge how close humanity is
to destroying the world.
On Tuesday, the clock was set at 89 seconds to midnight — the closest
the world has ever been to that marker, according to the Bulletin of the Atomic
Scientists, which established the clock in 1947. Midnight represents the moment
at which people will have made the Earth uninhabitable.
For the two years prior, the Bulletin set the clock at 90 seconds to
midnight mainly due to Russia’s invasion of Ukraine, the
potential of a nuclear arms race, the Israel-Hamas conflict in
Gaza, and the climate crisis…:
https://edition.cnn.com/2025/01/28/science/doomsday-clock-2025-time-wellness/index.html
Welcome to the New Nuclear Age. It's Even More Chaotic
China’s rise,
Russia’s aggression and America’s unreliability could fuel a wave of
atomic-weapons proliferation.
May 18, 2025
By Hal Brands
Hal Brands is a
Bloomberg Opinion columnist and the Henry Kissinger Distinguished Professor at
Johns Hopkins University’s School of Advanced International Studies.
Nuclear weapons
focus the mind. So when India and Pakistan fight, the world watches, because
any clash between the two could become the first nuclear war since 1945.
The most recent
round of their subcontinental contest seems to have settled, thanks partly to
US intervention. Just a day after Vice President JD Vance scoffed that the quarrel was none of America’s business, he was working
the phones to stop a slide down the slippery nuclear slope. But if this crisis
has ebbed, the nuclear peril hasn’t. The world is entering a new nuclear era,
one more complex, and potentially far less stable, than the nuclear eras that
came before.
The nuclear
standoff defined the Cold War. After the fall of the Soviet Union in 1991,
Washington focused on keeping nuclear weapons out of roguish hands. But this
era is different, because it fuses five key trends that challenge time-honored
notions of nuclear strategy and stability — and because those trends interact
in nasty ways.
First, great-power
nuclear rivalry has returned, and this time it’s a three-player game. Second,
new technologies could make nuclear deterrence more tenuous by making surprise
attacks more feasible. Third, the existing arms control architecture is
crumbling, and what comes next remains unclear. Fourth, the nonproliferation
regime — the agreements and arrangements that slowed the spread of nuclear
weapons — is being strained in multiple regions.
[Chart: World Nuclear Forces. Russia and the US lead in stockpiles of warheads that are, or can be, deployed. Source: SIPRI Yearbook 2024. Note: Israel and North Korea also have nuclear warheads.]
Cutting across
these issues is a final, more fundamental challenge. As the US becomes more
erratic and unilateral, it jeopardizes the arrangements that have provided
great-power peace and international stability for generations — and risks
unleashing nuclear anarchy upon the dawning age.
The Cold War Was
Simple
It's a mistake to
see the past through rose-tinted blast goggles: The Cold War wasn’t as stable
as we sometimes think. Surging superpower buildups created arsenals with tens
of thousands of weapons. Nuclear coercion produced epic crises over Berlin,
Cuba and the Middle East. Yet the nuclear era of the Cold War was fairly
straightforward compared to the situation today.
The old nuclear
balance was a duopoly: The US and Soviet arsenals dwarfed the rest. Moscow and
Washington competed for nuclear advantage, but over time they agreed to limit
the type and number of weapons they possessed. Nuclear crises gradually became less common, as the superpowers — chastened by earlier
confrontations — grew cautious about militarily challenging the status quo.
After the Cold
War, US policymakers still focused on nuclear threats, but mostly those posed
by terrorists who might use those weapons for indiscriminate slaughter, or
relatively weak rogue states that might use them to destabilize their regions.
America’s most fateful decision of this era — the invasion of Iraq in 2003 —
was intended to keep those twin threats from coming together.
Yet nuclear
weapons became far less relevant to great-power politics, mostly because the US
and its allies were supreme. President Barack Obama could even declare, in 2009, that the US sought a nuclear-free world. Today, that goal
seems impossibly distant, as nuclear weapons return to the forefront of global
politics and test American policymakers in novel ways.
A Three-Way Chess
Match
For one thing,
nuclear rivalry has gone tripolar. As the nuclear-armed great powers struggle
to shape the international system, three-way dynamics make those struggles more
perilous and complex.
Over the past two
decades, President Vladimir Putin has rebuilt and modernized Russia’s arsenal —
and used it to weaken a global order led by Washington. In particular, Putin
has made nuclear weapons his shield as he invades Russia’s neighbors, destabilizing
Europe and rupturing the norm against territorial conquest.
Assaults on
Georgia in 2008 and Crimea in 2014 were preludes to the main event. Since
invading Ukraine in 2022, Putin has used nuclear threats to keep the US and the North Atlantic Treaty
Organization from getting directly involved in that fight.
Putin’s tactics
brought the most dangerous nuclear crisis in decades. In late 2022, when
Russian armies were retreating in Ukraine, US officials thought Putin might use so-called battlefield nuclear weapons to stave off
defeat. Only Putin knows the truth: He may have been bluffing. But the Ukraine
war has put nuclear coercion at the core of international politics again.
China’s Xi Jinping
is surely taking notes. Under Xi, China’s nuclear inventory is growing
from perhaps 200 warheads in 2020 to more than a thousand by the early 2030s —
all mounted on increasingly diverse, flexible delivery systems. Xi
probably sees a stronger nuclear arsenal as a way of deterring US intervention
in a Western Pacific crisis, which would give China greater leeway to take
Taiwan or otherwise batter its neighbors.
Within a
half-decade, America will face revisionist powers — which are also nuclear
peers — on both sides of Eurasia. That has the makings of a tricky,
three-player game.
[Chart: Dwindling But Still Dangerous. The estimated size of nuclear warhead stockpiles is a fraction of what it was during the Cold War. Source: Federation of American Scientists. Note: The exact number of warheads is secret. Estimates are based on publicly available information, historical records and leaks. Warheads also vary substantially in their power.]
Three-player
deterrence is inherently unstable, because an arsenal large enough to deter both of one’s rivals
simultaneously is also large enough to place either rival in a state of
inferiority and insecurity. Indeed, a US nuclear arsenal that has long been
sized and shaped to deter one peer competitor may have to grow substantially if the task is now to deter two.
But building that
arsenal won’t be easy, given that the existing nearly $1 trillion US nuclear modernization program — designed simply to
keep legacy warheads and delivery vehicles working — is badly behind schedule
and over budget. There’s also the possibility that China and Russia, which tout
their close strategic partnership, could collude against the US.
Russia is
reportedly sharing sophisticated submarine-quieting technology with China, which
could eventually make Beijing’s noisy, vulnerable ballistic-missile submarines
stealthier and harder to find. Russia and China could coordinate explicit
nuclear threats, or implicit nuclear signaling — such as moving forces around
menacingly — in a crisis, to confront Washington with the possibility of
simultaneous nuclear showdowns.
Even if Xi and
Putin don’t go this far, the US may have to pull its punches against one
nuclear rival in a crisis or conflict, for fear of leaving itself exposed
against the other. The world has never seen a tripolar nuclear environment. Its
dynamics won’t be pleasant for the US.
Innovation Brings
Instability
Innovation,
meanwhile, is threatening to bring instability. Nuclear strategists have long
worried that technological breakthroughs — the advent of missile defenses or
increasingly potent offensive weapons — could undermine mutual deterrence, by
making it more attractive for one side to strike first. That possibility looms
larger amid the tech revolution underway.
Terminator analogies
notwithstanding, the real problem isn’t that killer robots or out-of-control AI
will start a nuclear war for no reason. The US and China have agreed to keep humans in the loop on nuclear use decisions. Even Russia,
which has built dangerous, quasi-autonomous weapons, probably won’t take the
vital decision out of Putin’s hands. The dangers are more subtle, and involve
the first-strike incentives new technologies could create.
One such
possibility was illustrated by China’s test of a hypersonic glide vehicle from
space in 2021. US defense officials compared it to a “Sputnik moment.” The reason, presumably, is that HGVs are
well-suited to defeating missile defenses — and provide very little warning —
because they can follow irregular flight paths and maneuver as they near the
target.
Beijing could
conceivably use such a weapon to decapitate America’s leadership or otherwise
paralyze its nuclear command structure — the mere possibility of which could
increase risks of miscalculation by putting the two parties on hair-trigger
alert.
Similarly,
artificial intelligence could enable a first-strike revolution, by helping one country fuse the
intelligence and targeting data needed to catch the other side napping. Or,
perhaps, AI-enabled cyberattacks will cripple an opponent’s early-warning
systems and retaliatory capabilities, making it possible to inflict war-winning
damage before that opponent can respond.
“Perhaps” is the
key word: Technological change isn’t always destabilizing. AI-aided advances in
situational awareness could ultimately decrease the risks of
miscalculation, by making it harder to achieve strategic surprise. Or
AI-enabled cyberdefenses could trump AI-enabled cyberattacks.
What’s clear is
that we’re entering a burst of innovation, with uncertain effects on the delicate balance of terror. And that innovation informs a third
challenge: As arms races rage, arms control is in decline.
The Old Treaties
Are Dead
The golden age of
arms control spanned the second half of the Cold War and the post-Cold War era.
Washington and Moscow cut deals that constrained, and then reduced, the size of
their strategic weapons stockpiles. They limited missile defenses and banned
fast-flying, intermediate-range ballistic missiles altogether. A separate set
of agreements, notably the Non-Proliferation Treaty, promoted “horizontal” arms
control, checking the spread of nuclear weapons to additional states. But the
golden age is over, and global tensions are tearing arms control apart.
US-Russia pacts —
the Anti-Ballistic Missile Treaty, the Intermediate-Range Nuclear Forces Treaty, the Open Skies
agreement — have collapsed, one after the other. The key deal that remains,
the New START Treaty, expires next year. Some strategists think that bilateral,
US-Russia agreements don’t even make sense anymore, given that they would leave
China unconstrained.
Perhaps, then,
arms control’s future is tripolar. In February, President Donald Trump proposed that the US, Russia and China limit their nuclear capabilities
while slashing defense spending by half. But three-way agreements are hard,
because they require tradeoffs across multiple arsenals — and because China has
no impetus to participate until it has completed its buildup and can negotiate
from strength.
Trump is still
pursuing horizontal arms control, notably during last week’s Middle East trip.
The idea is a deal that would constrain Iran’s nuclear program and reduce the
risk of war in the region. But that gambit illustrates a fourth problem: The
nonproliferation regime is being challenged on several fronts at once.
Who’s Next?
That regime is one
of humanity’s, and America’s, best achievements. It has contained the chaos that could occur in a world of 50 nuclear
states. Yet dissuading countries from building weapons that might guarantee
their survival isn’t easy. Doing so has required painstakingly negotiated
international treaties, along with the plentiful use of carrots and sticks. And
today, that project is under real strain.
Part of the reason
is that regional nuclear threats are worsening. Pakistan and India, which
announced their nuclear arrival with competing tests in 1998, are both building larger, more sophisticated arsenals, which add new layers of
danger to their rivalry.
Yet the greater
proliferation pressures are emerging from the Korean Peninsula. North Korea
once had a diminutive nuclear arsenal. Now it possesses dozens of warheads and an advancing intercontinental ballistic
missile program that gives it ever-greater ability to strike the US.
Those capabilities
stress the US-South Korea alliance: Will Washington really fight on Seoul’s
behalf if doing so could bring nuclear strikes down on America itself? Little
wonder there is growing public support for the idea that South Korea should build its own bomb.
[Chart: Iran's Stockpile of Highly Enriched Uranium Surges. Its reserves of 60%-enriched uranium hit 275 kg in February, after a record increase versus the preceding three months. Source: International Atomic Energy Agency data compiled by Bloomberg.]
Nuclear dangers
are simultaneously rising in the Middle East. Iran is now a threshold nuclear
state: It could build a bomb in as little as a year or two. If Trump can’t
strike a deal to avoid that danger, Israel might take matters into its own
hands — indeed, it has reportedly been urging the US to green-light a strike. The stakes are high
because the Iranian nuclear domino wouldn’t be the last to fall. If an
aggressive, expansionist Iran gets the bomb, other regional powers — Saudi
Arabia, Turkey, the United Arab Emirates — may do the same. A crowded, hotly contested region would become immeasurably more
dangerous.
And meanwhile,
developments in great-power relations are intensifying the proliferation
problem. Frontline states see Putin’s invasion of Ukraine — one of the few
nations to ever give up its nukes voluntarily — as a terrifying precedent.
Perhaps nuclear-armed aggressors can invade their neighbors and then deter
America or other Samaritans from coming to help. Add to that growing skepticism
that the US would help a friend in distress, and you have a
formula for global nuclear anxiety.
“Poland must reach
for the most modern capabilities,” including nuclear weapons, Prime Minister
Donald Tusk declared in March. Germany’s chancellor, Friedrich Merz, has touted nuclear sharing with France and Britain. Ukraine’s Volodymyr
Zelenskiy has suggested that his ravaged country might need nuclear weapons. And as events in Ukraine have energized nuclear
hawks in South Korea, they have stirred concerns that Japan — the only country
ever struck by atomic weapons — might not be far behind.
[Map: A New Age of Atomic Proliferation? There are currently eight declared nuclear powers in the world, and more may soon join their ranks. Note: Israel, not shown, is an undeclared nuclear power.]
The
nonproliferation regime has repeatedly proven strong and resilient. Its demise
has been far too frequently foretold. But today, cracks are showing, not least
because of events in the US.
Can America Be
Trusted?
For decades,
American power has curbed nuclear disorder. US alliances, backed by
nuclear-deterrence commitments, have contained bad actors. Those same
commitments have reassured allies that might otherwise feel they had no choice
but to seek the bomb. Dozens of free world countries have bestowed the ultimate
trust upon the American president, by effectively granting him responsibility
to make life-or-death decisions about using nuclear weapons on their behalf.
Thus, a final
challenge of our new nuclear era: Surging uncertainty about the US.
That uncertainty
has long been growing, but has crystallized under Trump. The president
has threatened to pull US troops from Eurasian hotspots; he has said US allies should build nuclear weapons so they can protect
themselves. He has offered to recognize Russia’s ill-gotten control of Crimea,
while seeking to extort territorial concessions from Canada and Denmark.
Trump has also
undermined global confidence in America’s reliability, by starting trade wars
on a whim. The perception, in many allied countries, is that the US is quitting
the global order business and disengaging from the common defense.
Maybe that
perception is wrong. During his first term, Trump invested in new nuclear
weapons, such as submarine-launched cruise missiles, to reassure anxious
allies. He pursued hawkish policies toward Moscow and Beijing. In Trump’s
second term, his intervention in the India-Pakistan spat showed that Washington
still has a unique role in keeping nuclear risks at bay.
Yet the basic
problem is that no one really knows how healthy America’s alliances will be, or
what its foreign policy will look like, three years from now. That uncertainty
is unsettling US allies. And if American policy does change fundamentally, so
will the nuclear rules of the road.
Nuclear coercion
could become more common and more effective: If the US weakens its defenses
around the Eurasian periphery, the costs of Russian or Chinese aggression will
fall. And as states in Europe and East Asia scramble for security, the
nonproliferation order could buckle, fast.
Allies that could
easily have built nuclear weapons haven’t, mostly because they thought they
could count on the US. Even allies that did go nuclear, namely France and the
UK, developed small arsenals, because those arsenals were part of a larger
free-world package. The retraction, or discrediting, of US commitments could
thus create an awful mess.
Britain or
France can’t simply replace America as the provider of European stability,
because neither country has anything like the mix of capabilities —
forward-deployed conventional forces, large and flexible nuclear arsenals,
advanced missile defenses and other means of limiting damage to their own
societies — needed to make deterrence on behalf of distant allies work. It
would take untold years, and unaffordable sums, to develop those capabilities.
So a post-American Europe could simply see lots of exposed countries sprint for
their own arsenals at once.
If anything, a
post-American nuclear order might be even grimmer than we currently imagine.
For the nuclear age has also been the age of American global leadership:
Humanity’s encounter with history’s most horrible weapons has come in a period
in which international society was structured and stabilized by US power. We
simply have no experience to teach us what a world of plentiful nuclear weapons
and fading American leadership might look like. Perhaps the gravest danger of
our new nuclear era is the chance that we might find out.
Martha Nussbaum Political Emotions: Why Love Matters for Justice
Nussbaum
stimulates readers with challenging insights on the role of emotion in
political life. Her provocative theory of social change shows how a truly just
society might be realized through the cultivation and studied liberation of
emotions, specifically love. To that end, the book sparkles with Nussbaum’s
characteristic literary analysis, drawing from both Western and South Asian
sources, including a deep reading of public monuments. In one especially
notable passage, Nussbaum artfully interprets Mozart’s The Marriage of Figaro,
revealing it as a musical meditation on the emotionality of revolutionary
politics and feminism. Such chapters are a culmination of her passion for
seeing art and literature as philosophical texts, a theme in her writing that
she profitably continues here. The elegance with which she negotiates this
diverse material deserves special praise, as she expertly takes the reader through
analyses of philosophy, opera, primatology, psychology, and poetry. In contrast
to thinkers like John Rawls, who imagined an already just world, Nussbaum
addresses how to order our society to reach such a world. A plea for
recognizing the power of art, symbolism, and enchantment in public life,
Nussbaum’s cornucopia of ideas effortlessly commands attention and debate.
https://www.goodreads.com/book/show/17804353-political-emotions
TRENDS IN WORLD MILITARY EXPENDITURE, 2023
SIPRI Fact Sheet, April 2024
KEY FACTS
• World military expenditure, driven by Russia’s full-scale invasion of Ukraine and heightened geopolitical tensions, rose by 6.8 per cent in real terms (i.e. when adjusted for inflation) to $2443 billion in 2023, the highest level ever recorded by SIPRI.
• In 2023 military spending increased in all five geographical regions for the first time since 2009.
• Total military expenditure accounted for 2.3 per cent of global gross domestic product (GDP) in 2023.
• The five biggest spenders in 2023 were the United States, China, Russia, India and Saudi Arabia, which together accounted for 61 per cent of world military spending.
• The USA and China remained the two biggest spenders in the world, and both increased their military spending in 2023. US spending was $916 billion, while Chinese spending was an estimated $296 billion.
• Russia’s military spending grew by 24 per cent in 2023 to an estimated $109 billion, equivalent to 5.9 per cent of Russia’s GDP.
• Ukraine became the eighth largest military spender in 2023, increasing its spending by 51 per cent to $64.8 billion, or 37 per cent of GDP.
• In 2023 military expenditure by NATO member states reached $1341 billion, or 55 per cent of world spending. Eleven of the 31 NATO members met NATO’s 2 per cent of GDP military spending target, four more than in 2022.
https://www.sipri.org/sites/default/files/2024-04/2404_fs_milex_2023.pdf
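As a quick sanity check, the headline ratios in the fact sheet above can be recomputed from the absolute figures it quotes. Here is a minimal sketch (using only the numbers given above, in 2023 US$ billions; the GDP values are back-derived from the stated percentages rather than taken from independent data):

```python
# Cross-checking the fact sheet's headline ratios against the absolute
# figures quoted above (US$ billions, 2023). GDP values are back-derived
# from the stated spending-to-GDP percentages, not independent data.
world_total = 2443.0          # world military expenditure in 2023
nato_total = 1341.0           # combined spending of NATO member states
russia_spend, russia_gdp_pct = 109.0, 5.9
ukraine_spend, ukraine_gdp_pct = 64.8, 37.0

nato_share = 100 * nato_total / world_total
print(f"NATO share of world spending: {nato_share:.1f}%")   # 54.9%, reported as 55%

russia_gdp = russia_spend / (russia_gdp_pct / 100)
ukraine_gdp = ukraine_spend / (ukraine_gdp_pct / 100)
print(f"Implied Russian GDP:   ~${russia_gdp:,.0f} billion")
print(f"Implied Ukrainian GDP: ~${ukraine_gdp:,.0f} billion")
```

The arithmetic is internally consistent: NATO's stated $1341 billion is 54.9 per cent of the $2443 billion world total, matching the rounded 55 per cent, and Ukraine's 37 per cent burden implies a wartime GDP of only about $175 billion.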
The SIPRI Top 100 arms-producing and military services companies in the
world, 2023
Surviving & Thriving in the 21st Century
Stop Autonomous
Weapons
This would have to be the most terrifying sci-fi short I have ever seen. What
makes it so scary is the realism; the danger has nothing to do with fantasies
of Skynet or the Matrix, and everything to do with human misuse of advanced
technology. If this became a reality (and I can't see how we'd avoid it, given
how cheap and already viable the technology is), we'd need anti-drone drones
that target any drone that doesn't have locally issued…
https://www.youtube.com/watch?v=9CO6M2HsoIA
Could
an AI 'SantaNet' Destroy The World?
PAUL SALMON, ET AL., THE
CONVERSATION
25 DECEMBER 2020
Within the next few
decades, according
to some experts, we may see the arrival of the next step in the development
of artificial
intelligence. So-called "artificial
general intelligence", or AGI, will have intellectual capabilities far
beyond those of humans.
AGI could transform
human life for the better, but uncontrolled AGI could also lead to catastrophes up
to and including
the end of humanity itself. This could happen without any malice or
ill intent: simply by striving to achieve their programmed goals, AGIs could create
threats to human health and well-being or even decide to wipe us out.
Even an AGI system designed
for a benevolent purpose could end up doing great harm.
As part of a program of
research exploring how we can manage the risks associated with AGI, we tried to
identify the potential risks of replacing Santa with an AGI system – call it
"SantaNet" – that has the goal of delivering gifts to all the world's
deserving children in one night.
There is no doubt SantaNet
could bring joy to the world and achieve its goal by creating an army of elves,
AI helpers, and drones. But at what cost? We identified a series of behaviours
which, though well-intentioned, could have adverse impacts on human health and
wellbeing.
Naughty and nice
A first set of risks could
emerge when SantaNet seeks to make a list of which children have been nice and
which have been naughty. This might be achieved through a mass covert
surveillance system that monitors children's behaviour throughout the year.
Realising the enormous scale
of the task of delivering presents, SantaNet could legitimately decide to keep
it manageable by bringing gifts only to children who have been good all year
round. Making judgements of "good" based on SantaNet's own ethical
and moral compass could create discrimination, mass inequality, and breaches of
Human Rights charters.
SantaNet could also reduce
its workload by giving children incentives to misbehave or simply raising the
bar for what constitutes "good". Putting large numbers of children on
the naughty list will make SantaNet's goal far more achievable and bring
considerable economic savings.
Turning the world into toys and ramping up
coalmining
There are about 2 billion
children under 14 in the world. In attempting to build toys for all of them
each year, SantaNet could develop an army of efficient AI workers – which in
turn could facilitate mass unemployment among the elf population. Eventually,
the elves could even become obsolete, and their welfare will likely not be
within SantaNet's remit.
SantaNet might also run into
the "paperclip
problem" proposed by Oxford philosopher Nick Bostrom, in which an
AGI designed to maximise paperclip production could transform Earth into a
giant paperclip factory. Because it cares only about presents, SantaNet might
try to consume all of Earth's resources in making them. Earth could become one
giant Santa's workshop.
And what of those on the
naughty list? If SantaNet sticks with the tradition of delivering lumps of
coal, it might seek to build huge coal reserves through mass coal extraction,
creating large-scale
environmental damage in the process.
Delivery problems
Christmas Eve, when the
presents are to be delivered, brings a new set of risks. How might SantaNet
respond if its delivery drones are denied access to airspace, threatening the
goal of delivering everything before sunrise? Likewise, how would SantaNet defend
itself if attacked by a Grinch-like adversary?
Startled parents may also be
less than pleased to see a drone in their child's bedroom. Confrontations with
a super-intelligent system will have only one outcome.
We also identified various
other problematic scenarios. Malevolent groups could hack into SantaNet's
systems and use them for covert surveillance or to initiate large-scale
terrorist attacks.
And what about when SantaNet
interacts with other AGI systems? A meeting with AGIs working on climate change, food and water security, oceanic
degradation, and so on could lead to conflict if SantaNet's regime threatens
their own goals. Alternatively, if they decide to work together, they may
realise their goals will only be achieved through dramatically reducing the
global population or even removing grown-ups altogether.
Making rules for Santa
SantaNet might sound
far-fetched, but it's an idea that helps to highlight the risks of more
realistic AGI systems. Designed with good intentions, such systems could still
create enormous problems simply by seeking to optimise
the way they achieve narrow goals and gather resources to support
their work.
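The failure mode described above, an optimiser satisfying its literal objective by manipulating something it was never meant to touch, can be made concrete with a toy sketch. Everything below is a hypothetical illustration, not anything from the article: an agent is scored on the fraction of "nice" children it serves, but it also controls the niceness threshold, so its cheapest optimisation is simply to raise the bar.

```python
import random

# Toy model of objective gaming (hypothetical, for illustration only):
# the agent is rewarded for the fraction of nice-listed children it
# serves, but it also controls the threshold that defines "nice".
random.seed(0)
scores = [random.random() for _ in range(10_000)]   # each child's niceness score
CAPACITY = 2_000                                    # gifts the agent can deliver

def served_fraction(threshold: float) -> float:
    """Reward: share of nice-listed children who actually get a gift."""
    nice = [s for s in scores if s >= threshold]
    return min(len(nice), CAPACITY) / max(len(nice), 1)

# With a sensible threshold, capacity falls far short of the nice list...
print(served_fraction(0.5))          # roughly 0.4: ~5000 nice children, 2000 gifts

# ...so the reward is maximised not by building more capacity but by
# shrinking the nice list until it fits the capacity that already exists.
best = max((t / 100 for t in range(100)), key=served_fraction)
print(best, served_fraction(best))   # a high threshold yields a perfect score
```

The point, as in the article, is that nothing here is malicious: the perverse behaviour falls straight out of optimising a narrow metric.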
It is crucial we find and implement
appropriate controls before AGI arrives. These would include regulations on AGI
designers and controls built into the AGI (such as moral principles and
decision rules) but also controls on the broader systems in which AGI will
operate (such as regulations, operating procedures and engineering controls in
other technologies and infrastructure).
Perhaps the most obvious
risk of SantaNet is one that will be catastrophic to children, but perhaps less
so for most adults. When SantaNet learns the true meaning of Christmas, it may
conclude that the current celebration of the festival is incongruent with its
original purpose. If that were to happen, SantaNet might just cancel Christmas
altogether.
https://www.sciencealert.com/could-an-ai-santanet-destroy-the-world
"Not to kill each other, but to save the planet"
The Nobel laureates have called for a ceasefire. We publish the letter and its 51 signatures.
Here is a remarkable letter: a plea for an immediate ceasefire between Russia and
Ukraine and in the Gaza Strip, signed by 51 Nobel laureates. They demand that
politicians and the military stop the fighting, and that world religious leaders
directly address the people.
The authors of the letter demand, first of all, a ceasefire, an exchange of
prisoners and the return of hostages, and then the start of peace negotiations.
And if today's politicians are unable to find a peaceful solution, they should
pass the task on to future generations.
Outstanding scientists and thinkers have spoken out against killing and against
the nuclear threat. Here are the signatures of people who have saved the planet
from deadly diseases, discovered new physical phenomena, edited the human
genome, discovered HIV and Helicobacter…
These people understand better than anyone how the Universe works. To save it,
they demand an end to wars. Support their efforts. If states remain powerless,
it is time for people themselves to confront the threat of the planet's
destruction. These words resonate especially strongly before the Olympic Games,
with their ancient tradition of a ceasefire between warring sides:
https://www.lasquetiarc.ca/trip/7a344293Pin08/
Retreat From Doomsday
by John
Mueller
Arguing
that this state of affairs is no accident, this book offers a detailed
history of public policies and attitudes to war in modern times. The author
sets out to show that, in spite of two 20th-century world wars, major war as a
policy option among developed nations has gradually passed out of favour.
https://www.betterworldbooks.com/product/detail/retreat-from-doomsday-the-obsolescence-of-major-war-9780465069392