Too Many Secrets? The intelligence community’s classification system is broken.
The operation that killed Osama bin Laden this past year took discipline, interagency cooperation, the artful use of new technology, a willingness to accept risk, and steely nerves. I recommend we apply these same skills and fortitude to an equally challenging task: the effective management of U.S. government secrets.
So far the U.S. government has been trying, and failing, to manage this seemingly far simpler task. Senior policymakers seem unwilling or unable to give the subject the attention it deserves, and there will be serious consequences for such inattention. We are riding a tiger: an outdated classification system that threatens either to overwhelm us with data or to deliver vital secrets to our adversaries. Fortunately, fixing this shouldn’t be too hard.
The first and most difficult step is to figure out what’s wrong. For example, some experts have argued the key issue is excessive secrecy, which poses a roundabout security threat because it dumbs down policy debates, denies Americans knowledge and thus undermines accountability. In addition to dredging up surprising examples of flawed classification decisions of the past (such as the World War II-era assessment of the number of annual shark attacks on humans, a number not declassified until 1958), they point to the issue of sheer numbers: In 2010, government agencies reported 224,734 new U.S. government secrets, an increase of 22.6 percent over the prior year. (Numbers are from the Information Security Oversight Office of the National Archives and Records Administration, “2010 Report to the President”, pp. 8-9.)
Such new secrets, what we call original classification decisions, are just the seed corn. Based on these new secrets and all prior ones, 2010 saw 76.6 million derivative classification decisions, involving the incorporation of existing secrets into new documents, videos, speeches and other products. This total was almost double the number of such decisions just two years earlier. A tsunami of electronic secrets is growing exponentially, threatening to break over U.S. taxpayers’ heads. When it does, the size and cost of managing a rescue could prove very painful. At one intelligence agency alone, classified records are growing by approximately 1 petabyte (1 million gigabytes) every 18 months. According to the Information Security Oversight Office at the National Archives, it takes two full-time employees one full year to review just one gigabyte of data. Where is the U.S. government going to find two million full-time employees to review one petabyte, let alone the 18 petabytes or more generated by all our national security agencies?
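The staffing arithmetic behind that question can be checked with a short back-of-the-envelope sketch (the figures are taken from the text; the function name is illustrative):

```python
# Back-of-the-envelope check of the declassification-review workload.
# Assumption (ISOO figure cited in the text): reviewing 1 gigabyte of
# classified records takes 2 full-time employees one full year.

GB_PER_PETABYTE = 1_000_000   # 1 petabyte = 1 million gigabytes
FTE_YEARS_PER_GB = 2          # ISOO review estimate cited above

def review_workforce(petabytes: float) -> float:
    """Full-time employees needed to review the backlog in one year."""
    return petabytes * GB_PER_PETABYTE * FTE_YEARS_PER_GB

print(f"{review_workforce(1):,.0f}")   # one petabyte  -> 2,000,000 employees
print(f"{review_workforce(18):,.0f}")  # 18 petabytes -> 36,000,000 employees
```

At 18 petabytes the required workforce, 36 million people, would exceed the entire federal civilian payroll many times over, which is the point of the rhetorical question above.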
The costs associated with keeping this backlog are also mounting. According to the U.S. National Archives, merely keeping secrets classified cost more than $10.17 billion in 2010, and those costs continue to skyrocket. The costs of declassification will be even greater.
The problem has led some experts to argue for more bureaucracy. The authors of one report see skewed incentives and incompetent or wrong-headed officials at the core of the problem.1 To fix things, they advocate introducing new processes and people to strengthen oversight of every one of those 76.6 million derivative classification decisions. Overclassifiers, once found, would be punished with fines.