To err is human; to admit it superhuman. – Doug Larson
A history of the world could be written as a series of
errors. Such a thematic approach would miss very little of importance.
Accidents are not solely a human prerogative. My cat
frequently misjudges her leaps, and I’ve ended up on the ground more than once because
a horse tripped or spooked at nothing. However, only humans can make mistakes
on an industrial scale. The BP spill in the Gulf of Mexico
is just one of the latest. The famous Bent Pyramid at Dahshur appears to be a mid-course correction
made when the initial angle proved unstable, which brings to mind images of tumbling
stone blocks and fleeing workers. In 1628 the spanking new Swedish warship Vasa rolled over and sank after sailing
less than a mile on her maiden voyage – she was top-heavy. Later in the century Kronan, another Swedish warship, also
overturned while maneuvering in combat. In 1919 a storage tank burst in Boston, sending a 4-meter wave of sticky molasses through
the streets, killing 21 people and knocking buildings off their foundations. In 1947, a fire
on the SS Grandcamp detonated 2,300 tons of ammonium nitrate, which, along with the consequent fires and explosions,
leveled Texas City, Texas, killing 581 people and injuring some 5,000.
At the Three Mile Island nuclear facility in 1979, a worker goofed by turning
off the back-up cooling system, which had kicked in perfectly when the primary
system malfunctioned; there were no casualties, but there easily could have
been. In 1981, a suspended walkway at the Kansas City Hyatt Regency collapsed, killing 114 people. The most lives lost in a single
industrial accident were in the 1984 Union Carbide disaster in Bhopal, India, where a
release of deadly methyl isocyanate gas killed 3,787. Chernobyl, in Ukraine, exploded in 1986.
None of these was due to a lack of supervision or regulation. On
the contrary, in the case of Chernobyl, the accident occurred precisely because of a botched safety procedure. The Hyatt was built according to code and
inspected by city inspectors. Besides, when safety rules become too onerous,
they backfire; at an irradiation plant near where I live, for example, though
no accident ensued, workers were found to have jammed open safety doors to
bypass the hassle of getting through them every two minutes. Engineering
philosophy today accordingly leans increasingly toward “passive safety” –
safety systems that work without any human intervention, and that
resist human intervention. We try to remove the opportunity for someone to do
something dangerously stupid. Even so, we’ll never be able to eliminate all major
accidents. All one can say is that for any foolproof design, there always is a
fool bigger than the proof.
This doesn’t mean we should throw up our hands. A reasonable
amount of caution can go a long way, but we need to keep in mind that mistakes
are normal. They are a cost of modern industrial life and always will be. It
should be noted, though, that modern life is much safer on balance than the
pre-industrial kind – all those many little accidents in the old days took a far
greater toll than the fewer but bigger accidents of today. To forbid risk is
itself extremely risky.
Still, there are ordinary risks and extraordinary risks. To
risk the lives not just of thousands but of millions, we need to bring governments
into the act. World War One was a colossal accident – a culmination of parallel
errors and misjudgments by government leaders, not one of whom wanted a general
European war. They got one anyway. It killed some 20 million people, devastated
the center of Europe, and made the world safe for fascism. Oops.
The remarkable thing is how often governments get away with
rolling the dice. The brinkmanship in the Cuban missile crisis is an obvious
example. Other times there are consequences, but not as big as they could have
been. Take the Castle Bravo thermonuclear test in 1954.
In 1952, the US had tested a hydrogen bomb, but it was the size of a factory and not a practical
weapon. Edward Teller, with Stanislaw Ulam, was largely responsible for designing a compact deliverable
weapon. He solved a major timing problem (which isn’t relevant to the oops, so I’ll
refrain from going into it here) and a smaller technical problem with D-T (deuterium
and tritium: hydrogen isotopes with one and two neutrons respectively). In
simplest terms, a fission (uranium or plutonium) trigger causes D-T to fuse
into helium, thereby making an even bigger explosion. The trouble is that D-T easily
leaks away. The solution was to replace the D-T with lithium deuteride, a stable solid that stores
easily. When the fission trigger goes off, it emits neutrons which split the lithium
atoms into tritium (and helium); the tritium then fuses with the deuterium around it. (Andrei
Sakharov in the USSR came up with a similar design, beating the US team by a few months.)
The new US design was tested at Bikini Atoll. The US team underestimated how
efficient the conversion of lithium to tritium would be: the designers assumed the lithium-7 making up
most of the fuel would be inert, but it too bred tritium. The Castle Bravo test was expected to yield 5 megatons.
Observers’ initial satisfaction turned to fear as the fireball grew and grew
and continued to grow. The explosion was 15 megatons. The US observation
ships were way too close – scarily close – and were caught in the fallout plume;
personnel scrambled below decks to avoid overexposure. The navy crews and the physicist
team escaped casualties, more by luck than by planning. Less lucky were the 23
crewmen of the Japanese fishing vessel Daigo
Fukuryu Maru, which also was caught by the unexpectedly extensive fallout pattern.
They received heavy radiation doses, from which one of the crew died, causing a
major diplomatic crisis with Japan.
Yet, despite this recent example of an “oops” with a nuke,
in 1962 the Kennedy Administration authorized an amazing test called Frigate Bird. The submarine USS Ethan Allen fired a Polaris SLBM
(submarine launched ballistic missile) in the first and only test of a
ballistic missile with a live thermonuclear warhead. The warhead detonated 2000
meters from the intended target (pretty accurate by the standards of the day)
with a yield of 600 kilotons. The idea, of course, was to demonstrate that the
missiles were not a bluff: they actually worked. Yet, while there was no “oops”
on the occasion of Frigate Bird, the
recklessness of the test is breathtaking. We know full well how unreliable
rockets and guidance systems can be. That warhead could have come down
anywhere. Hawaii was well within range. There was a radio-activated self-destruct device built
into this particular missile for emergencies, but this, too, easily could have
failed.
No one conducts open-air testing of nuclear explosives
anymore, which is fortunate, but the gambles taken by leaders in the capitals
of nations around the world are not limited to weapons and war. They also
include budgets and central banks. Their mistakes are harder to outrun than molasses.
[Photos: Castle Bravo; Frigate Bird]