When in August 2022 the FBI searched Mar-a-Lago, Donald Trump’s Florida residence, for sensitive materials the former president had improperly removed from the White House, a chorus of outrage emanated from liberal pundits and experts on document classification.

“Donald Trump Is Not Above the Law,” a New York Times editorial thundered, solemnly praising Attorney General Merrick Garland’s valiant “search of the premises to recover documents because of concerns that their disclosure could compromise ‘clandestine human sources’ of intelligence.” Former FBI agent and legal analyst Asha Rangappa tweeted that “unauthorized possession or storage of classified documents are, by definition, a threat to national security” and suggested to Business Insider that Trump “intentionally wanted to leave these sources and methods exposed.” Oona Hathaway, a Yale Law School professor and former Pentagon special counsel, told NPR that Trump appeared to possess top-secret materials, including “the kind of information that is the most likely to do damage to the U.S. government.”

But when, five months later, President Biden was revealed to have stored classified documents of his own in nonsecure parts of various private residences, the narrative shifted from individual presidential culpability for mishandling sensitive information to a broader indictment of the “overclassification” of those materials in the first place.

Hathaway asserted that experts who’ve been “thinking about classification have recognized for a very long time that the system is out of control.” Rangappa, while acknowledging Biden’s misdeeds, wrote that “it seems pretty clear that there needs to be some reforms to how classified documents are accounted for when administrations leave office.” The Times editorial board kept mum, but it commissioned an op-ed from a historian who wrote that “the system for protecting secrets vital to national security has spun out of control.”

Yet while this narrative switch proved as predictable as it was cynical, one must ask: Do the critics of overclassification have a point? Has government information been unduly protected to the detriment of American citizens?

Matthew Connelly, a Columbia history professor and the author of that Times op-ed, certainly thinks so. In The Declassification Engine, his bracing and impassioned tour of the history of classification and his recapitulation of his team’s efforts to balance the scales, Connelly persuasively but imperfectly urges fundamental change in how the government thinks about secrets.

As a counterweight to what he calls “the dark state,” Connelly proposes, and has partially implemented, his eponymous declassification engine: “If we assembled millions of documents in databases,” he posits, “and harnessed the power of high-performance computing clusters, it might be possible to train algorithms to look for sensitive records requiring the closest scrutiny and accelerate the release of everything else.”


What’s so bad about overclassification anyway?

To answer this question, Connelly takes the reader on a historical secrecy tour, imparting important lessons along the way.

He begins with Pearl Harbor, a surprise attack that Connelly believes could have been averted absent the failure of our intelligence apparatus to share intercepted Japanese communications with commanders in Hawaii. He argues that the debacle demonstrated that “secrets are often kept to hide incompetence” and that “our leaders think we can’t handle the truth and wouldn’t support their plans if we knew what they were.” He also asserts that intense nuclear secrecy backfired, as the gravest breaches were committed by those, such as Klaus Fuchs and the Rosenbergs, who chafed at the government’s insistence that it maintain a monopoly on nuclear knowledge.

In addition, Connelly chronicles numerous indiscretions at the National Security Agency, charged with keeping “secrets about the secrets.” When the agency’s building was first constructed at Fort Meade, Maryland, in the 1950s, highly sensitive blueprints, building specifications, and computer locations were hardly protected at all. In essence, the secrets about the secrets about the secrets were unduly exposed. He also documents how, in the 1970s, a Minuteman missile operations officer opened a launch checklist and found the Strategic Air Command code authorizing the launch of 10 nuclear missiles, each bearing three warheads. Instead of residing with authorities in Washington, the code was printed out directly on the card, and, astoundingly, it was 00000000—yet more evidence of what Connelly labels “the rot of incompetence.”

Undue secrecy can also stifle innovation. Connelly points to various military projects, kept tightly under wraps, that squandered tens of billions of dollars, including a titanium exoskeleton, a robotic mule, the X-20 “Dyna-Soar” space glider intended to circle Earth, and the B-70 Valkyrie—a $13 billion behemoth meant to cruise 15 miles up at 2,000 miles per hour that never entered service. Too often the Pentagon has cloaked in secrecy projects that would never have received funding had they seen the light of day. The same can be said for clandestine government-conducted experiments, including seeding clouds to control the weather, detecting nuclear fallout in the Pacific and at Hanford, Washington, measuring radioactivity by injecting African-American newborns with radioactive iodine, and testing the effects of hallucinogenic drugs on psychiatric prisoners: None of these morally repugnant endeavors could have been carried out openly.

Overclassification, of course, has afflicted both Democratic and Republican administrations, and every modern presidential candidate from Truman to Trump has vowed to increase transparency—only to double down on secrecy once the exigencies of the office set in. Connelly takes care to note that President Obama prosecuted more leakers under the Espionage Act than all previous administrations combined. So it wasn’t shocking that, when in 2015 Connelly and his team approached the head of the Intelligence Advanced Research Projects Activity (IARPA) to gauge her interest in developing their declassification engine, she demurred, citing an “insufficient return on investment.”

Indeed, the sheer volume of government records staggers the mind: more than 28 million cubic feet of paper files alone, the equivalent in volume to 26 Washington Monuments. As for digital files, a single (unnamed) intelligence agency reported in 2012 that every 18 months, it produced a petabyte of classified data, enough to circle the equator if printed out and placed in ordinary file cabinets.

The time and money required to classify and securely maintain these documents are similarly astronomical. In 1998, the Department of Energy undertook a review of 200 million pages of publicly available documents and withdrew 6,640 pages from circulation. At roughly $28 million in all, the effort came to a risible $4,236 per withdrawn page, on par with the cost per page of a Gutenberg Bible.

The key, writes Connelly, is to “distinguish the kind of information that really does require safeguarding from the information that citizens urgently need in order to hold their leaders to account,” as this represents “the only way to uphold both national security and democratic accountability.”

Connelly’s team sought to do exactly that with 117,000 declassified documents available at various presidential libraries. They first ran a “brute-force” program on the data that ranked these documents on the basis of “dirty words” such as “secret,” “classified,” “sensitive,” and the like. They then segregated 613 specific documents bearing stamps of “RESTRICTED DATA” and trained a computer to recognize patterns in those plainly sensitive documents that could be applied to the rest of the data set. Indeed, they reported identifying 90 percent of similarly highly sensitive documents in the batch with a 30 percent false-positive rate.
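The two-stage approach described above can be sketched in a few lines. In this toy version, a “dirty words” count provides the brute-force ranking, and a vocabulary profile built from seed documents stamped RESTRICTED DATA flags other documents with similar wording. The word list, the overlap scoring, and the sample documents are all illustrative assumptions, not Connelly’s actual pipeline:

```python
import re
from collections import Counter

# Stage 1: "brute-force" ranking by occurrences of sensitivity markers.
# The exact word list here is an illustrative assumption.
DIRTY_WORDS = {"secret", "classified", "sensitive", "restricted"}

def dirty_word_score(text):
    """Count occurrences of sensitivity markers in a document."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return sum(1 for t in tokens if t in DIRTY_WORDS)

# Stage 2: build a vocabulary profile from a seed of plainly sensitive
# documents (those stamped RESTRICTED DATA) and score other documents
# by how heavily their vocabulary overlaps with that seed.
def train_profile(seed_docs):
    counts = Counter()
    for doc in seed_docs:
        counts.update(set(re.findall(r"[a-z]+", doc.lower())))
    return counts

def sensitivity(doc, profile, n_seed):
    tokens = set(re.findall(r"[a-z]+", doc.lower()))
    if not tokens:
        return 0.0
    # Average seed-document frequency of the document's tokens.
    return sum(profile[t] for t in tokens) / (len(tokens) * n_seed)

seed = ["RESTRICTED DATA atomic weapon yield secret",
        "RESTRICTED DATA secret warhead design data"]
profile = train_profile(seed)

docs = ["memo on embassy luncheon schedule",
        "secret atomic warhead yield estimates"]
ranked = sorted(docs,
                key=lambda d: (sensitivity(d, profile, len(seed)),
                               dirty_word_score(d)),
                reverse=True)
print(ranked[0])  # the warhead memo outranks the luncheon memo
```

In the real project, the 613 stamped documents played the role of the toy `seed` list, and a trained model rather than simple vocabulary overlap generalized from them; the reported 90 percent hit rate came with a 30 percent false-positive rate, a trade-off any such ranking threshold has to navigate.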

Similarly, Connelly’s team conducted a meta-analysis of a subset of the 1.4 billion pages of declassified documents composing The Foreign Relations of the United States and concluded that the Suez Crisis, the Indo-Pakistani border wars, and Vietnam accounted for the highest percentages of top-secret documents. Typically, the most heavily redacted FRUS documents correlated with the terms “source,” “arms,” “area,” “mission,” “information,” “officers,” and “base.” But the team discovered that, during the Eisenhower administration, the most heavily redacted documents were associated with “oil,” “day,” “man,” “times,” “companies,” “Arabia,” and “construction.” Employing some sleuthing skills, they linked many of these documents to the 1954 Guatemalan coup, in which a CIA-supported junta ousted the Moscow-friendly president Jacobo Arbenz, and Connelly convincingly suggests that Eisenhower engaged the services of William Pawley, a businessman and former ambassador, to threaten Arbenz with an oil embargo.
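A rough version of such a redaction analysis is easy to sketch as well. The statistic below (a smoothed frequency ratio between heavily redacted documents and the rest of the corpus) and the toy documents are my own illustrative assumptions; the idea is simply to surface terms that appear disproportionately where the censors’ pens were busiest:

```python
import re
from collections import Counter

def term_counts(docs):
    """Aggregate token counts across a list of documents."""
    counts = Counter()
    for doc in docs:
        counts.update(re.findall(r"[a-z]+", doc.lower()))
    return counts

def redaction_markers(redacted_docs, other_docs, min_count=2):
    """Rank terms by how much more often they appear in heavily
    redacted documents than elsewhere. Add-one smoothing avoids
    division by zero for terms absent from the comparison set."""
    red = term_counts(redacted_docs)
    other = term_counts(other_docs)
    vocab = [t for t in red if red[t] >= min_count]
    return sorted(vocab,
                  key=lambda t: (red[t] + 1) / (other[t] + 1),
                  reverse=True)

redacted = ["oil companies Arabia embargo oil",
            "oil embargo companies Arabia"]
plain = ["trade mission briefing", "embassy staffing memo"]
print(redaction_markers(redacted, plain)[:3])  # "oil" ranks first
```

On a real corpus, the `redacted_docs` bucket would hold documents above some redaction-density threshold, and the ratio would pick out period-specific vocabularies, much as “oil” and “Arabia” surfaced for the Eisenhower years.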

But unfortunately, Connelly often gets sidetracked by issues of only tangential importance to his thesis. His chapter-long discourses on the military-industrial complex allow him to wax indignant about the imbalance in civil–military relations, where he fingers “unaccountable spending” and “collusion with private industry” as the key culprits, going so far as to claim that “our elected civilian leadership is not in fact in charge of the military.” He also retreads the worn ground of government surveillance of private citizens, including the NSA’s infamous vacuuming-up of oceans of metadata, a problematic practice that’s prone to abuse but not exactly relevant to declassification. Exaggeration is another flaw, as when he notes that certain archiving failures amount, “in a very literal sense, to the end of history as we know it.” He also gratuitously invokes racial animus as a driving factor of excessive government secrecy, and his narrative suffers from poor organization.

More problematically, though, he never fully articulates a framework for distinguishing between necessary and undue classification, both in creating and in maintaining information. Yes, a computer-aided declassification engine can serve important purposes, but on what basis should it differentiate between, say, covert sources and methods and anodyne government communications? After all, an AI is only as useful as its training instructions, and if those instructions fail to convey sensitivity guidelines adequately, the resulting output will be useless.

Then, too, other than a stray reference to the UK’s Official Secrets Act, Connelly fails to place the U.S. experience in any kind of comparative international perspective. Are other Western countries more or less restrictive than the United States? Does this comparison flatter American exceptionalism or dictate fundamental changes? Do uniquely American causes explain any such dichotomy—and should they?

The overclassification debate implicates profound questions of governance and extends far beyond whether Trump, Biden, or Hillary Clinton should be prosecuted for their indiscretions. As Connelly puts it, instead of asking whether a given government official has a “need to know,” we should instead ask “what do we, the people, need to know to do our job, as citizens, to keep our government accountable?” His book reflects a Herculean effort to answer this question, but far more work remains to be done.
