If the outbreak of the coronavirus pandemic had been a 1990s disaster movie, we’d have had a brash scientist played by Jeff Goldblum barking commands, and then cut to a montage of test tubes spinning, helicopters taking off, and technicians running down hallways with their lab coats flaring. It wouldn’t be fair to say nothing like that happened earlier this year. Despite initially downplaying the outbreak, the World Health Organization did quickly distribute a test for the virus, and the Centers for Disease Control did promptly issue advisories and assemble research teams. But looking back at those crucial early months, it is striking to note what didn’t happen.

For example, you might have thought health officials would encourage one of the simplest safety protocols—wearing masks in public—that had helped halt previous pandemics. No; the WHO, the CDC, and others all advised against masks at first. You might have assumed that the Food and Drug Administration, which regulates medical equipment, would have sought to remove every barrier to ramping up production of personal protective equipment. Sorry; it took weeks for the FDA to begin streamlining those rules.

Finally, you might have expected our top epidemiologists to ask themselves, What don’t we know? You can’t fault our health experts for not knowing much about a brand-new virus. But you can fault them for not communicating how much they didn’t know. Too often, officials fell back on vaguely reassuring advice based on old paradigms: Just wash your hands, they told us, and don’t touch your face! That advice turned out to be woefully inadequate.

We know now that the coronavirus travels through the air, just as some researchers had begun arguing as early as March. At first, the concern that COVID-19 could go airborne was mostly a hunch based on a few studies. But even if that conclusion was preliminary, why didn’t health organizations warn us that it was possible? A few did, notably in Japan, where the public was advised to avoid crowded indoor spaces. But most other health agencies took months to acknowledge this disturbing possibility.

Were they reckless, or merely incompetent? The answer, I think, isn’t that simple. When I look at how hesitantly the WHO and the CDC dealt with the changing science on COVID-19, I see a pattern one might call the precautionary paradox.

In the 1990s, environmental activists began promoting a concept called the “precautionary principle.” In its simplest form, the principle merely expresses commonsense notions like “Better safe than sorry.” The most frequently cited version of the precautionary principle says this: “When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause-and-effect relationships are not fully established scientifically.”

For example, if some group wants to roll out a new process or technology, the burden is on them to prove that their novel approach is safe. If the science is at all murky, the innovation must wait. In practice, this can be an absurdly high bar. After all, it is impossible to “prove” that some hypothetical risk will never come to pass. At times, the principle has been used to oppose the introduction of innovations—such as genetically modified, vitamin-rich “Golden Rice”—that would actually save lives.

Whether or not they use the term, many safety-oriented organizations apply some version of this principle. Obviously, being careful about making sudden changes is usually a good thing. But admirable caution can harden into inflexible routine, which, ultimately, undermines safety. One of the clearest examples of this paradox actually predates the modern formulation of the principle.

On the night of January 27, 1986, NASA officials held an emergency meeting to discuss some last-minute concerns about the space shuttle Challenger’s launch the next morning. The forecast called for subfreezing weather. Some engineers worried the low temperatures would aggravate a tendency for the craft’s solid-fuel booster rockets to leak small jets of flame during takeoff. But NASA had stringent rules for determining whether or not to proceed with a launch. Any determination that a component was fit to fly had to be backed up with solid documentation. So did recommendations not to fly. Asked for their data on how the booster rockets performed at low temperatures, the engineers had to admit they didn’t have much; the shuttle had never flown in such cold conditions before.

The NASA brass went ahead. Challenger blew up. For years, I, like most observers, assumed NASA officials had recklessly rolled the dice. But during the subsequent hearings on the disaster, a young sociologist named Diane Vaughan began studying the accident. Vaughan was an expert on corporate misconduct and thought the shuttle accident looked like a classic case. “But when I got more deeply into the archival data, I found that they hadn’t violated rules on the eve of the launch,” she told me when I interviewed her several years ago. “They’d actually conformed to the rules.”

Her book, The Challenger Launch Decision, is a classic study in how an organization’s culture can nudge it into disaster. The NASA managers were following a rigorous rulebook, Vaughan discovered. In their view, it would have been reckless to throw out those rules and simply scrub a launch without data to defend that decision. The managers were not the “amoral calculators” that they appeared. In fact, they were “quite moral and rule abiding as they calculated risk,” Vaughan writes. “Following rules, doing their jobs, they made a disastrous decision.”

This is the precautionary paradox in a nutshell. NASA had reached the point where safety rules ossify and begin to limit the flexibility an organization needs to respond to new risks. An institution that does everything by the book will eventually have trouble seeing problems that aren’t in the book.

In health-care terms, COVID-19 was the equivalent of a subfreezing launch day for the space shuttle: an unprecedented situation. This was a time to think differently, to move quickly, to solicit more opinions. For outsiders—especially all of us now judging their actions in hindsight—it’s hard to understand why the WHO, CDC, and FDA didn’t immediately change their approach. But for the professionals inside those organizations, the safety standards they’d developed over decades were close to sacrosanct. Even in a once-in-a-lifetime crisis, it was hard for them to imagine tinkering with their procedural gold standards.

Accepting that COVID-19 might behave differently from better-known respiratory viruses was a high hurdle for the WHO and other health organizations. “They don’t want to talk about airborne transmission because that is going to make people afraid,” said Donald Milton, a University of Maryland environmental health professor. Following the health agencies’ lead, government officials doubled down with unfounded assurances. “Coronavirus is not something that hangs in the air,” New York Mayor Bill de Blasio confidently explained in a joint press conference with his health commissioner in March. “It requires literally the transmission of fluids.…It has to get right on you.” Four months later, 20,000 New Yorkers were dead.

I don’t mean to pick on de Blasio (although it is hard to resist). Most health officials and government leaders made similar mistakes. They stressed what some critics call “hygiene theater” but failed to alert the public to the dangers of crowded indoor settings and the need for better ventilation. In early July, Milton led a group of 239 scientists who published an open letter begging the WHO and other agencies to take the airborne risk seriously. The agencies finally relented. By then, of course, COVID had been racing through nursing homes, prisons, and other dense environments for months.

The WHO, the CDC, and other agencies, like NASA, had strict protocols against making decisions without overwhelming data to back them up. In both cases, officials took what, in their rule-bound institutional cultures, seemed to be the more conservative approach: Stay the course until we get more data. The irony is, the actions being discussed—delaying a launch, warning people to open more windows—didn’t pose significant risks. Hewing too closely to their rulebooks, the officials wound up taking the much more dangerous course.

In the end, our health authorities became overcautious about being cautious. And here we are.
