America’s command-and-control system operated safely during the crisis, Sagan found, and yet “numerous dangerous incidents… occurred despite all the efforts of senior authorities to prevent them.” He’d long believed that the risk of nuclear weapon accidents was remote, that nuclear weapons had been “a stabilizing force” in international relations, reducing the risk of war between the United States and the Soviet Union. “Nuclear weapons may well have made deliberate war less likely,” Sagan now thought, “but the complex and tightly coupled nuclear arsenal we have constructed has simultaneously made accidental war more likely.” Researching The Limits of Safety left him feeling pessimistic about our ability to control high-risk technologies. The fact that a catastrophic accident with a nuclear weapon has never occurred, Sagan wrote, can be explained “less by good design than good fortune.”

* * *

The Titan II explosion at Damascus was a normal accident, set in motion by a trivial event (the dropped socket) and caused by a tightly coupled, interactive system (the fuel leak that raised the temperature in the silo, making an oxidizer leak more likely). That system was also overly complex (the officers and technicians in the control center couldn’t determine what was happening inside the silo). Warnings had been ignored, unnecessary risks taken, sloppy work done. And crucial decisions were made by a commanding officer, more than five hundred miles from the scene, who had little firsthand knowledge of the system. The missile might have exploded no matter what was done after its stage 1 fuel tank began to leak. But to blame the socket, or the person who dropped it, for that explosion is to misunderstand how the Titan II missile system really worked. Oxidizer leaks and other close calls plagued the Titan II until the last missile was removed from its silo, northwest of Judsonia, Arkansas, in June 1987. None of those leaks and accidents led to a nuclear disaster. But if one had, the disaster wouldn’t have been inexplicable or hard to comprehend. It would have made perfect sense.

The nuclear weapon systems that Bob Peurifoy, Bill Stevens, and Stan Spray struggled to make safer were also tightly coupled, interactive, and complex. They were prone to “common-mode failures”—one problem could swiftly lead to many others. The steady application of high temperature to the surface of a Mark 28 bomb could disable its safety mechanisms, arm it, and then set it off. “Fixes, including safety devices, sometimes create new accidents,” Charles Perrow warned, “and quite often merely allow those in charge to run the system faster, or in worse weather, or with bigger explosives.” Perrow was not referring to the use of sealed-pit weapons during SAC’s airborne alerts. But he might as well have been. Promoted as being much safer than the weapons they replaced, the early sealed-pit bombs posed a grave risk of accidental detonation and plutonium scattering. Normal accident theory isn’t a condemnation of modern technological systems. But it calls for more humility in how we design, build, and operate them.

The title of an influential essay on the role of technology in society asked the question: “Do Artifacts Have Politics?” According to its author, Langdon Winner, the answer is yes — the things that we produce are not only shaped by social forces, they also help to mold the political life of a society. Some technologies are flexible and can thrive equally well in democratic or totalitarian countries. But Winner pointed to one invention that could never be managed with a completely open, democratic spirit: the atomic bomb. “As long as it exists at all, its lethal properties demand that it be controlled by a centralized, rigidly hierarchical chain of command closed to all influences that might make its workings unpredictable,” Winner wrote. “The internal social system of the bomb must be authoritarian; there is no other way.”

Secrecy is essential to the command and control of nuclear weapons. Their technology is the opposite of open-source software. The latest warhead designs can’t be freely shared on the Internet, improved through anonymous collaboration, and productively used without legal constraints. In the years since Congress passed the Atomic Energy Act of 1946, the design specifications of American nuclear weapons have been “born secret.” They are not classified by government officials; they’re classified as soon as they exist. And intense secrecy has long surrounded the proposed uses and deployments of nuclear weapons. It is intended to keep valuable information away from America’s enemies. But an absence of public scrutiny has often made nuclear weapons more dangerous and more likely to cause a disaster.
