When the United States revealed in January that it was testing a more nimble, more precise version of its B61 atom bomb, some observers were immediately alarmed. General James Cartwright, a former strategist for President Obama, warned that “going smaller” could make nuclear weapons “more thinkable” and “more usable.”
What is little known, however, is that for the past 25 years the United States and its allies have routinely used radioactive weapons in battle, in the form of warheads and explosives made with depleted, undepleted, or slightly enriched uranium. While the Department of Defense (DOD) classifies these weapons as “conventional” (non-nuclear), they are both radioactive and chemically toxic. In Iraq, where the United States and its partners waged two wars, the resulting toxic residue covers the country and poisons its people. U.S. veterans, too, are sick and dying.
Scott Ritter, who served as a Marine Corps officer in Iraq and later as a United Nations weapons inspector, told me, “The irony is we invaded Iraq in 2003 to destroy its non-existent WMD [weapons of mass destruction]. To do it, we fired these new weapons, causing radioactive casualties.”
The weapons were first used in 1991 during Desert Storm, when the U.S. military fired guided bombs and missiles containing depleted uranium (DU), a byproduct of the uranium-enrichment process. The DOD particularly prized these weapons because DU’s extreme density, delivered at high speed and heat, let them blast through tanks and bunkers.
Within one or two years, grotesque birth defects spiraled: babies born with two heads, or missing eyes, hands, and legs, or with stomachs and brains inside out.