NSA Let Snowden Crawl All Over Its Intranet Unchecked
All it took for Edward Snowden to grab roughly 1.7 million classified documents from the National Security Agency's network was an open-source Web crawler and a few scripts, according to a New York Times report on Sunday. An investigation of Snowden's activities at the NSA outpost in Hawaii apparently found that he was able to retrieve millions of classified documents in an automated fashion using what the Times described as "low-cost" software. That software was likely based on the open-source GNU Wget utility.
Intelligence officials would not say what the tool was, but said they believed it was "more powerful" than Wget. The anonymous sources add little to the narrative of Snowden's extraction of secret documents, though they do begin to put a number on the volume of material officials believe he made off with. The real sting of the latest report, however, is that it portrays the NSA's internal IT operations as even more fast-and-loose than previously thought: anyone with admin access might have been able to do what Snowden did.
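Whatever the actual tool was, a Wget-style recursive crawler rests on a very simple loop: fetch a page, extract its links, and queue anything not yet visited until the whole reachable site has been downloaded. The sketch below illustrates that loop over a hypothetical in-memory "intranet" (the page names and contents are invented for illustration; a real crawler would fetch over HTTP):

```python
# Minimal sketch of the breadth-first walk behind a recursive crawler.
# SITE is a stand-in for an intranet: path -> HTML body (hypothetical data).
from collections import deque
from html.parser import HTMLParser

SITE = {
    "/": '<a href="/docs">docs</a>',
    "/docs": '<a href="/docs/a">a</a><a href="/docs/b">b</a>',
    "/docs/a": "report A",
    "/docs/b": '<a href="/">home</a> report B',
}

class LinkParser(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

def crawl(start="/"):
    seen, queue, fetched = set(), deque([start]), []
    while queue:
        path = queue.popleft()
        if path in seen or path not in SITE:
            continue
        seen.add(path)
        fetched.append(path)        # "download" the page
        parser = LinkParser()
        parser.feed(SITE[path])     # extract links, enqueue unseen ones
        queue.extend(parser.links)
    return fetched

print(crawl())  # visits every reachable page exactly once
```

The point the reporting makes is that nothing in this loop is sophisticated: given credentials that can read the pages, the crawler mechanically enumerates everything they link to.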