Eager to reduce the risk of an accidental war and encourage deeper cuts in the Soviet arsenal, President George H. W. Bush announced a month later that the United States would unilaterally make large reductions in its nuclear deployments. It would remove all of the Army’s tactical weapons from Europe, destroy half of the Navy’s tactical weapons and place the rest in storage, take 450 Minuteman II missiles off alert — and end the Strategic Air Command’s ground alert. For the first time since 1957, SAC’s bombers wouldn’t be parked near runways, loaded with fuel and hydrogen bombs, as their crews waited for the sound of Klaxons.

The Soviet Union ceased to exist on Christmas Day, 1991. The following June, the Strategic Air Command disappeared, as well. General Powell and General Butler thought that SAC had outlived its original purpose. The recent war against Iraq had demonstrated the importance of close collaboration between the armed services — and future wars were likely to be fought with conventional, not nuclear, weapons. The Strategic Air Command and its institutional culture no longer seemed relevant. SAC’s aircraft were divided among various Air Force units. America’s land-based missiles and ballistic-missile submarines were assigned to a single, unified command — to be headed, alternately, by an officer from the Air Force or the Navy. The fierce interservice rivalry to control America’s nuclear weapons largely vanished, as those weapons played an increasingly minor role in the Pentagon’s war plans. But many SAC veterans were outraged that what had once been the most powerful organization in the American military was being disbanded. They thought it was a mistake, regarded General Butler as a turncoat, and felt that the legacy of Curtis LeMay was being dishonored.

President Bush told members of his administration not to brag or gloat about the downfall of the Soviet Union, an event with myriad causes that Mikhail Gorbachev had unintentionally but peacefully overseen. General Colin Powell ignored those instructions at the ceremony in Omaha marking the end of the Strategic Air Command. “The long bitter years of the Cold War are over,” Powell said. “America and her allies have won — totally, decisively, overwhelmingly.”

Epilogue

The sociologist Charles B. Perrow began his research on dangerous technologies in August 1979, after the partial meltdown of the core at the Three Mile Island nuclear power plant. In the early minutes of the accident, workers didn’t realize that the valves on the emergency coolant pipes had mistakenly been shut — one of the indicator lights on the control panel was hidden by a repair tag. Perrow soon learned that similar mistakes had occurred during the operation of other nuclear power plants. At a reactor in Virginia, a worker cleaning the floor got his shirt caught on the handle of a circuit breaker on the wall. He pulled the shirt off it, tripped the circuit breaker, and shut down the reactor for four days. A lightbulb slipped out of the hand of a worker at a reactor in California. The bulb hit the control panel, caused a short circuit, turned off sensors, and made the temperature of the core change so rapidly that a meltdown could have occurred. After studying a wide range of “trivial events in nontrivial systems,” Perrow concluded that human error wasn’t responsible for these accidents. The real problem lay deeply embedded within the technological systems, and it was impossible to solve: “Our ability to organize does not match the inherent hazards of some of our organized activities.” What appeared to be the rare exception, an anomaly, a one-in-a-million accident, was actually to be expected. It was normal.

Perrow explored the workings of high-risk systems in his book Normal Accidents, focusing on the nuclear power industry, the chemical industry, shipping, air transportation, and other industrial activities that could harm a large number of people if something went wrong. Certain patterns and faults seemed common to all of them. The most dangerous systems had elements that were “tightly coupled” and interactive. They didn’t function in a simple, linear way, like an assembly line. When a problem arose on an assembly line, you could stop the line until a solution was found. But in a tightly coupled system, many things occurred simultaneously — and they could prove difficult to stop. If those things also interacted with each other, it might be hard to know exactly what was happening when a problem arose, let alone know what to do about it. The complexity of such a system was bound to bring surprises. “No one dreamed that when X failed, Y would also be out of order,” Perrow gave as an example, “and the two failures would interact so as to both start a fire and silence the fire alarm.”
