International laws and treaties are failing to stop a new arms race
- Written by Alexander Gillespie, Professor of Law, University of Waikato
J. Robert Oppenheimer – the great nuclear physicist, “father of the atomic bomb”, and now subject of a blockbuster biopic – always despaired[1] about the nuclear arms race triggered by his creation.
So the approaching 78th anniversary of the Hiroshima bombing invites us to ask how far we’ve come – or haven’t come – since his death in 1967.
The Cold War represented all that Oppenheimer had feared. But at its end, then-US president George H.W. Bush spoke of a “peace dividend[2]” that would see money saved from reduced defence budgets transferred into more socially productive enterprises.
Long-term benefits and rises in gross domestic product could have been substantial, according to modelling[3] by the International Monetary Fund, especially for developing nations. Given the cost of global sustainable development – currently estimated[4] at US$5 trillion to $7 trillion annually – this made perfect sense.
Unfortunately, that peace dividend is disappearing. The world is now spending at least $2.2 trillion annually on weapons and defence. Such estimates are imprecise, but overall defence spending appears to have increased by 3.7% in real terms[5] in 2022.
The US alone spent $877 billion on defence in 2022 – 39% of the world total. With Russia ($86.4 billion) and China ($292 billion), the top three spenders account for 56% of global defence spending.
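Those percentages hang together arithmetically. As a rough check – a back-of-envelope sketch assuming SIPRI's reported 2022 world total of about $2,240 billion, the precise figure behind the "at least $2.2 trillion" above:

$$
\frac{877}{2240} \approx 39\%, \qquad \frac{877 + 292 + 86.4}{2240} = \frac{1255.4}{2240} \approx 56\%.
$$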
Military expenditure in Europe saw its steepest annual increase in at least 30 years. NATO[6] countries and partners are all accelerating towards, or are already past, the 2% of GDP military spending target. The global arms bazaar[7] is busier than ever.
Aside from the opportunity cost represented by these alarming figures, weak international law in crucial areas means current military spending is largely immune to effective regulation.
The new nuclear arms race
Although the world’s nuclear powers agree[8] “a nuclear war cannot be won and must never be fought”, there are still about 12,500 nuclear warheads[9] on the planet. This number is growing, and the power of those bombs is vastly greater[10] than that of the ones dropped on Hiroshima and Nagasaki.
According to the United Nations’ disarmament chief, the risk of nuclear war is greater[11] than at any time since the end of the Cold War. The nine nuclear-armed states (Britain, France, India, Pakistan, North Korea and Israel, as well as the US, Russia and China) all appear to be modernising their arsenals. Several deployed new nuclear-armed or nuclear-capable weapons systems in 2022.
The US is upgrading its “triad” of ground-, air- and submarine-launched nukes, while Russia is reportedly working on submarine delivery of “doomsday[12]” nuclear torpedoes capable of causing destructive tsunamis.
Read more: The Black Sea drone incident highlights the loose rules around avoiding 'accidental' war[13]
While Russia and the US possess about 90% of the world’s nuclear weapons, other countries are expanding their stockpiles quickly. China’s arsenal is projected to grow from 410 warheads in 2023 to perhaps 1,000[14] by the end of this decade.
Only Russia and the US were subject to bilateral controls on the buildup of such weapons, but Russian president Vladimir Putin suspended[15] that arrangement – the New START treaty – in 2023. Beyond the promise of non-proliferation, the other nuclear-armed countries are subject to no other international controls, not even relatively simple measures[16] to prevent accidental nuclear war.
Other nations – those with hostile, nuclear-armed neighbours showing no signs of disarming – must increasingly wonder why they should continue to show restraint and not develop nuclear deterrents of their own.
The threat of autonomous weaponry
Meanwhile, other potential military threats are also emerging – arguably with even less scrutiny or regulation than the world’s nuclear arsenals. In particular, artificial intelligence (AI) is setting off alarm bells.
AI is not without its benefits, but it presents many risks when applied to weapons systems. Developers themselves have repeatedly warned about the unforeseeable consequences[17] and potential existential threat[18] posed by true digital intelligence. As the Center for AI Safety[19] put it:
Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.
Read more: UN fails to agree on 'killer robot' ban as nations pour billions into autonomous weapons research[20]
More than 90 countries have called for a legally binding instrument[21] to regulate autonomous weapons technology, a position supported by the UN Secretary-General, the International Committee of the Red Cross and many non-governmental organisations.
But despite at least a decade of negotiation and expert input[22], a treaty governing the development of “lethal autonomous weapons systems” remains elusive[23].
Read more: War in Ukraine accelerates global drive toward killer robots[24]
Plagues and pathogens
Similarly, there is a fundamental lack of regulation governing the growing number of laboratories capable of holding, or of accidentally or deliberately creating, harmful or fatal biological materials.
There are 51 known biosafety level-4 (BSL-4) labs[25] in 27 countries – double the number that existed a decade ago. Another 18 BSL-4 labs are due to open in the next few years.
Read more: Reporting all biosafety errors could improve labs worldwide – and increase public trust in biological research[26]
While these labs, and those at the next level down (BSL-3), generally maintain high safety standards, there is no binding obligation that they meet international standards or submit to routine compliance inspections.
Finally, there are fears the World Health Organization’s new pandemic preparedness treaty, based on lessons from the COVID-19 disaster, is being watered down[27].
As with every potential future threat, it seems, international law and regulation are left scrambling to catch up with the march of technology – to govern what Oppenheimer called[28] “the relations between science and common sense”.
References
- ^ always despaired (news.stanford.edu)
- ^ peace dividend (www.washingtonpost.com)
- ^ modelling (www.imf.org)
- ^ estimated (www.un.org)
- ^ 3.7% in real terms (www.sipri.org)
- ^ NATO (www.nato.int)
- ^ global arms bazaar (sipri.org)
- ^ nuclear powers agree (www.whitehouse.gov)
- ^ 12,500 nuclear warheads (www.sipri.org)
- ^ vastly greater (www.icanw.org)
- ^ risk of nuclear war is greater (press.un.org)
- ^ doomsday (news.usni.org)
- ^ The Black Sea drone incident highlights the loose rules around avoiding 'accidental' war (theconversation.com)
- ^ 1,000 (www.armscontrol.org)
- ^ suspended (www.theguardian.com)
- ^ simple measures (theconversation.com)
- ^ unforeseeable consequences (time.com)
- ^ existential threat (www.theguardian.com)
- ^ Center for AI Safety (www.safe.ai)
- ^ UN fails to agree on 'killer robot' ban as nations pour billions into autonomous weapons research (theconversation.com)
- ^ called for a legally binding instrument (automatedresearch.org)
- ^ negotiation and expert input (docs-library.unoda.org)
- ^ remains elusive (www.stopkillerrobots.org)
- ^ War in Ukraine accelerates global drive toward killer robots (theconversation.com)
- ^ biosafety level-4 (BSL-4) labs (static1.squarespace.com)
- ^ Reporting all biosafety errors could improve labs worldwide – and increase public trust in biological research (theconversation.com)
- ^ watered down (www.bmj.com)
- ^ Oppenheimer called (ahf.nuclearmuseum.org)