In August 1985, Delta Flight 191 was approaching Dallas/Fort Worth International Airport when the pilots noticed a small thunderstorm forming in their path. As the Lockheed L-1011 passed under the clouds, it suddenly lost airspeed and began plummeting to earth. The resulting crash killed 134 passengers and crew, making it one of the deadliest single-plane accidents in U.S. history at the time. Investigators attributed the incident to wind shear caused by a microburst, the sudden downdraft of cold air that a thunderstorm can release. Although wind shear was suspected as the cause of many previous accidents, the phenomenon was not well understood, and pilots and air traffic controllers didn’t fully appreciate the risks.
The detailed analysis of the Flight 191 crash finally proved just how deadly thunderstorms could be for planes flying at low altitudes. Efforts to prevent further accidents tackled both human and technological factors. Pilots and air traffic controllers began working harder to avoid storms. Flight-training programs were revised. The Federal Aviation Administration installed Doppler radar systems—which can detect downdrafts—at major airports. It worked. The U.S. has not had a commercial aviation crash due to wind shear since 1994.
The American aviation industry provides the textbook on learning from disasters. The steady improvement in aviation safety is the paradigm other high-risk industries strive to follow. The process involves several key steps: collecting as much data from an incident as possible (which is why airliners carry “black box” flight recorders); assigning the investigation to neutral experts; giving those analysts time to research the accident without interference from industry or government officials; and, finally, conscientiously implementing solutions.
This system of studying and learning from disasters depends on a certain social compact. The rest of society needs to let the investigation process work, wait for the facts to come out, and accept the conclusions. That’s the way things used to work, for the most part. Sure, there have always been and will always be people who rush to point fingers at their political enemies or float wild conspiracy theories. But after most disasters—say, the 1979 Three Mile Island nuclear accident, or the Exxon Valdez oil spill in 1989—responsible investigations came up with useful, detailed explanations of what went wrong.
Such investigations usually reveal that accidents almost never have a single, obvious cause, but rather emerge from the complex interplay between technological and human factors. Preventing future accidents, therefore, requires addressing a range of problems across the whole system, not focusing on a single broken part or supposed villain. Sober and system-wide safety reforms can be very effective. The U.S. has not suffered another serious nuclear accident or major oil-tanker spill in the decades since the well-studied Three Mile Island and Exxon Valdez incidents. Three Mile Island happened 46 years ago; Exxon Valdez, 36 years ago.
But is that model of accident investigation still operative? Recent disasters suggest it is breaking down. There are several causes. For one, information travels so fast today that political blame-casting and conspiratorial thinking can overwhelm the more deliberate process of fact-finding. Within minutes after the January 29 midair collision over the Potomac, one could watch this process in real time on X. “That helicopter flew straight toward the plane,” one poster wrote. “This can’t have been an accident.” Others instantly jumped to the conclusion that diversity, equity, and inclusion rules must have been to blame.
I want to be clear: There’s nothing wrong with asking such questions. In any accident investigation, nothing—including conspiracy scenarios—should be ruled out. And we know that the FAA, for a time, employed bizarre DEI tests specifically designed to weed out qualified non-minority air traffic control candidates.
But you never want investigators—or the public, for that matter—to focus exclusively on a single explanation for a disaster. A safety reform that addresses just one “cause” of a catastrophe, while ignoring all others, may do little more than promote a false sense of security. Now, it’s one thing when random X users toss out their pet theories after an accident (I’ve done this myself). It’s something else when political leaders insist they know who or what to blame in the immediate aftermath. President Trump’s confident assertion that DEI rules explain the Washington crash may have a grain of truth. But when the nation’s leader expresses such opinions, it puts pressure on everyone down the ladder to steer the investigation in the approved direction. That’s not the way to get to the bottom of a complex problem.
Most likely, the D.C. crash will turn out to have had many overlapping causes: an overworked air traffic controller, helicopter pilots disoriented by complex visual data, too many aircraft sharing a crowded air corridor, and complacency on the part of people in charge. But let’s wait for the final report, shall we?
After the devastating recent fires in Los Angeles, political leaders had a field day assigning blame to their preferred bêtes noires. President Trump put the onus on Governor Gavin Newsom’s water-management policies, which he claimed had left Los Angeles dry. “All they have to do is turn the valve,” he said. Newsom, as always, stressed climate change. “If you don’t believe in science, believe your own damn eyes,” he retorted. In fact, both claims are tangential. L.A.’s problem wasn’t a lack of water so much as a lack of local infrastructure to deliver enough water to hillside neighborhoods. And while warmer weather might make fires worse at the margins, the region has always been prone to devastating fires. Much bigger factors make some California residents especially vulnerable to fires: poor woodland management, too many flammable plantings and structures in fire zones, L.A.’s lax preparedness, and artificially reduced insurance rates that lure homeowners into high-risk areas. These problems have been evident for decades. But California’s leaders keep refusing to learn the lessons of previous fires.
Sometimes we fail to learn from disasters because our leaders, like those in California, don’t want to change. Other times, people just don’t trust the experts in charge. This isn’t simply a moral failing on the part of an ungrateful public. Lately, our expert institutions have too often proved to be downright untrustworthy. What lessons should we have learned from the Covid disaster? To keep schools and churches closed? To stay six feet apart? Our public health officials could have carefully investigated the disease and openly shared their findings with the public. Instead, Anthony Fauci and others told white lies, twisted scientific research to support politically driven policies, and covered up their links to the Wuhan lab that almost certainly leaked the virus.
It’s hard to make safer choices when we don’t receive accurate information in the first place. As COMMENTARY’s Abe Greenwald often says, “We don’t get answers anymore.” After negative events, authorities sometimes suppress information because they don’t trust the public to handle unsettling or politically inconvenient facts. And the media rarely pursue these awkward stories. What’s the real story behind the 2017 mass shooting in Las Vegas? Who planted pipe bombs in Washington, D.C., prior to January 6, 2021? What was the deal with those Chinese spy balloons? What have authorities learned about the “trans cult” allegedly linked to the recent shooting of a U.S. Border Patrol agent and other killings? We may never know.
I wish we could go back to the days when the people in charge trusted us with accurate, actionable information. And when we could trust them to tell us the truth.