The Greenbrier Bunker: A Case Study in Secrecy
Less than a year after the Cold War ended, one of its most closely guarded secrets was exposed when the Washington Post revealed the hidden purpose of West Virginia’s Greenbrier Hotel. Beneath the luxury resort lay a massive bunker built to house the United States Congress following a nuclear war, part of the nation’s “continuity of government” program. Congressional leaders had urged the newspaper not to publish the story, but the Post’s executive editor justified the decision as covering a “historically significant and interesting story that posed no grave danger to national security or human life.”
The journalist who uncovered the story, Ted Gup, went further, arguing that the facility had been potentially destabilizing even while secret. Evacuating Congress during a crisis could have signaled to the Soviet Union that the United States was preparing for nuclear war, potentially triggering a first strike. What appeared to be an act of irresponsible journalism was, in Gup’s view, a case study in the value of press oversight of government secrecy.
The Paradox of Information Control
The tension between protecting sensitive information and maintaining an informed public represents one of the most persistent challenges in democratic governance. This tension intensified dramatically after September 11, 2001, when governments worldwide scrambled to restrict access to information that might aid terrorists while simultaneously confronting the reality that much of this information was already publicly available.
The National Infrastructure Protection Center issued advisories urging “risk management” when reviewing materials for web dissemination, warning that “publicly available data should be carefully reviewed.” Government agencies removed thousands of documents from public websites, and the Government Printing Office even asked federal depository libraries to destroy copies of a U.S. Geological Survey CD-ROM about public water supplies, fearing terrorists could exploit the data for chemical or biological attacks.
Open-Source Intelligence and Terrorist Research
The Al Qaeda training manual, excerpts of which were released by the U.S. Justice Department, included a chapter on “Espionage: Information-Gathering Using Open Methods,” demonstrating that terrorist organizations were well aware of the intelligence value of publicly available information. Former CIA counterterrorism analyst Dennis Pluchinsky warned that detailed media coverage of security vulnerabilities effectively provided terrorists with a roadmap for attacks.
However, critics of information restriction argued that censoring public data created a false sense of security while undermining the democratic accountability that ultimately makes societies more resilient. Laura Donohue of Georgetown University wrote in the Washington Post that “censoring science won’t make us any safer,” arguing that security through obscurity was ultimately self-defeating.
The Smyth Report and Nuclear Secrecy
The history of nuclear secrecy illustrates the recurring pattern. In 1945, Princeton physicist Henry DeWolf Smyth published the official account of the Manhattan Project, titled “Atomic Energy for Military Purposes.” The report was intended to establish the boundaries between public and classified nuclear information, but it inadvertently provided foreign weapons programs with a valuable technical roadmap.
Khidhir Hamza, who worked on Saddam Hussein’s nuclear weapons program, later acknowledged that the Smyth Report had been useful to Iraqi weapons scientists. The Soviet nuclear program similarly benefited from publicly available American technical literature, supplementing intelligence gathered through espionage.
The pattern repeated with the Progressive magazine case in 1979, when the U.S. government sought to block publication of an article describing the design principles of a hydrogen bomb. A federal judge granted a preliminary injunction against publication, but the government dropped the case when similar information appeared elsewhere, demonstrating what scholars now call the “Streisand effect”: efforts to suppress information often draw greater attention to it.
Dual-Use Research and Biological Threats
The life sciences presented a particularly acute version of the dilemma. In 2001, Australian researchers accidentally created a lethal mousepox variant while attempting to develop a contraceptive vaccine for pest control. The publication of their methods sparked debate about whether such research should be conducted at all and whether the findings should be published.
The scientific community was divided. Some argued that publication was essential for other researchers to develop countermeasures. Others contended that the findings provided a potential blueprint for creating devastating biological weapons. One of the original researchers, Dr. Ian Ramshaw, appeared to hold both positions simultaneously, arguing that the work “should never have been started in the first place” while also supporting publication of the results because they were “interesting and important.”
Vulnerability Disclosure and Security Testing
Government Accountability Office reports repeatedly demonstrated that security systems were less robust than claimed. GAO undercover teams successfully smuggled simulated weapons through airport screening, penetrated the perimeters of biosafety level 4 laboratories, and identified vulnerabilities in nuclear material detection systems. Each published report simultaneously served the public interest by documenting failures and potentially informed adversaries about exploitable weaknesses.
Journalist Jeffrey Goldberg conducted his own security test in 2008, successfully carrying prohibited items through airport screening in a demonstration published in the Atlantic Monthly. The TSA’s own screening procedures were accidentally exposed in 2009 when the agency posted an improperly redacted manual online, revealing details of its security protocols.
The WikiLeaks Challenge
The massive document releases facilitated by WikiLeaks, beginning with the Afghanistan war logs in 2010 and followed by the Iraq war files and diplomatic cables, brought the debate to a new scale. Hundreds of thousands of classified documents became publicly available, including a cable listing critical infrastructure sites worldwide that the U.S. considered vital to its national security.
Defenders of the leaks argued they exposed war crimes, diplomatic dishonesty, and government overreach. Critics countered that the bulk releases endangered intelligence sources, compromised diplomatic relationships, and provided adversaries with detailed knowledge of American vulnerabilities and capabilities.
Creative Attack Scenarios and Self-Censorship
A more subtle aspect of the dilemma involved whether researchers, journalists, and security professionals should publicly speculate about potential attack methods. Economist Steven Levitt described a simple scenario, involving armed individuals carrying out random shootings across the country, that could cause massive disruption. Terrorism analysts published detailed studies of potential attacks involving radiological dispersal devices, infrastructure sabotage, and biological agents.
The military and intelligence community themselves engaged in such exercises. The Defense Threat Reduction Agency conducted a study called “Thwarting an Evil Genius,” the Army held “Mad Scientist” seminars, and the Department of Homeland Security operated an Analytic Red Cell office. After September 11, the Pentagon even consulted Hollywood screenwriters to brainstorm potential terrorist scenarios.
Al Qaeda’s Ayman al-Zawahiri was quoted as observing that “America itself provides its enemies with the ideas” for attacks, apparently referring to the extensive media coverage of potential vulnerabilities.
The NSA Domestic Surveillance Revelation
The New York Times’ 2005 disclosure that the Bush administration was conducting warrantless domestic surveillance through the NSA illustrated the highest-stakes version of the information dilemma. The administration argued the program was legal and essential, while critics saw it as an unconstitutional violation of civil liberties. The newspaper had actually withheld the story for over a year at the government’s request before ultimately publishing.
Author Gabriel Schoenfeld argued in Commentary magazine that the Times had violated the Espionage Act. Others countered that without the disclosure, an illegal surveillance program would have continued without public knowledge or democratic accountability.
The Volunteer Code of Silence
Perhaps the most instructive historical precedent was the voluntary censorship system implemented during World War II. The Office of Censorship, headed by Associated Press executive Byron Price, relied on the voluntary cooperation of the American press to avoid publishing information that could aid the enemy. The system worked remarkably well, with the press largely adhering to guidelines without legal compulsion.
The success of the wartime model depended on several conditions: a clear and universally recognized threat, a high degree of public trust in government, and a media landscape dominated by a small number of professional outlets that shared a common understanding of the national interest. None of these conditions holds today, making voluntary restraint a far less reliable mechanism.
The Fundamental Tension
The information dilemma ultimately comes down to a question without a clean answer: How does a democratic society protect genuinely sensitive information without creating a secrecy apparatus that enables abuse, corruption, and the erosion of the very freedoms it claims to defend?
History suggests that both excessive secrecy and excessive disclosure carry real costs. The challenge lies not in choosing one extreme over the other, but in developing institutional mechanisms that can make these judgments with some degree of wisdom, accountability, and humility about the limits of anyone’s ability to predict the consequences of information release.
As the 9/11 Commission concluded, the attacks succeeded in part because of a “failure of imagination” within the intelligence community. Whether that imagination is best cultivated through openness or secrecy remains the central and unresolved question in the governance of sensitive information.