I’m no conspiracy theorist; my feet are way too firmly planted in the world of testable observations to fall for that level of crazy. Still, I do love it when we, the public, get to see the inner workings of secretive programs, government or otherwise. Part of it is sheer voyeurism, but if I’m truthful the things that really get me are the big technical projects, things that, had they been done without the veil of secrecy, would be wondrous in their own right. The fact that they’re hidden from public view just adds to the intrigue, making you wonder why such things needed to be kept secret in the first place.

One of the first things that comes to mind is the HEXAGON series of spy satellites: high resolution observation platforms launched during the Cold War whose imagery still rivals that of satellites launched today. It’s no secret that all space-faring nations have fleets of satellites up there for such purposes, but the fact that the USA was able to keep the exact nature of the entire program secret for so long is quite astounding. The technology behind it is what really intrigued me though, as it was years ahead of the curve in terms of capabilities, even if it didn’t have the longevity of its fully digital progeny.

Yesterday, however, a friend sent me this document from the Electronic Frontier Foundation which provides details on something called the Presidential Surveillance Program (PSP). I was instantly intrigued.

According to William Binney, a former senior official at the National Security Agency, the PSP is in essence a massive data gathering program with possible intercepts at all major fibre terminations within the USA. The system simply siphons off all incoming and outgoing data, which is then stored in massive, disparate data repositories. This in itself is a mind-boggling endeavour, as the amount of data that transits the Internet in a single day dwarfs the capacity of most large data centres. The NSA then ramps it up a notch by being able to recover files, emails and all sorts of other data based on keywords and pattern matching, which implies heuristics on a level that’s simply mind-blowing. Of course this is all I’ve got to go on at the moment, but the idea itself is quite intriguing.
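To make that idea a bit more concrete, here’s a purely illustrative sketch of what keyword and pattern based retrieval over stored records looks like in principle. The record format, keywords and regexes below are entirely my own invention; nothing public describes how the PSP actually indexes or queries anything, and the real thing would need heuristics far beyond a toy filter like this.

```python
import re

# Hypothetical intercepted records; the structure is assumed for illustration only.
RECORDS = [
    {"type": "email", "body": "Meeting about the shipment on Friday"},
    {"type": "chat",  "body": "Can you wire the funds to the usual account?"},
    {"type": "email", "body": "Lunch next week?"},
]

KEYWORDS = {"shipment", "wire"}                    # naive keyword list
PATTERNS = [re.compile(r"\baccount\b", re.I)]      # naive regex patterns

def matches(record):
    """Return True if a record trips any keyword or pattern."""
    body = record["body"].lower()
    if any(kw in body for kw in KEYWORDS):
        return True
    return any(p.search(record["body"]) for p in PATTERNS)

flagged = [r for r in RECORDS if matches(r)]
print(f"{len(flagged)} of {len(RECORDS)} records flagged")
```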

For starters, creating a network that’s able to handle a direct tap on a fibre connection is no small feat in itself. When the fibres terminating at the USA’s borders are capable of speeds in the gigabytes-per-second range, the infrastructure required to handle that is non-trivial, especially if you want to store the data afterwards. Storing that amount of data is another matter entirely, as most commercial arrays begin to tap out in the petabyte range. Binney’s claims start to seem a little far-fetched here as he states there are plans up into the yottabyte range, but he concedes that current incarnations of the program couldn’t hold more than tens of exabytes. Barring some major shake-up in the way we store data I can’t fathom how they’d manage to create an array that big. Then again, I don’t work for the NSA.
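Just to put those numbers in perspective, here’s a quick back-of-envelope calculation. The link rate is my own assumption, not a figure from Binney or the EFF document, but it shows how quickly a full take of even one big pipe piles up, and how far short of a yottabyte it still falls.

```python
# Back-of-envelope storage arithmetic for a full take of one high-capacity link.
# The 100 Gb/s figure is an assumption for illustration, not a documented number.
link_rate_gbps = 100                 # assumed aggregate link rate, gigabits per second
seconds_per_day = 86_400

bytes_per_day = link_rate_gbps / 8 * 1e9 * seconds_per_day
petabytes_per_day = bytes_per_day / 1e15
exabytes_per_year = bytes_per_day * 365 / 1e18

print(f"~{petabytes_per_day:.1f} PB per day from one 100 Gb/s link")
print(f"~{exabytes_per_year:.2f} EB per year from that single link")
```

One such link works out to roughly a petabyte a day, or a bit under half an exabyte a year, so tens of exabytes across hundreds of taps over several years is at least conceivable; a yottabyte is another nine orders of magnitude beyond that.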

As intriguing as such a system might be, there’s no question that its existence is a major violation of privacy for US citizens and the wider world. Such a system is akin to tapping every single phone and recording every conversation on it, which is most definitely not supported by their current legal system. Just because they don’t use the data until they have a reason to doesn’t make it just either, as all data gathered without suspicion of guilt or intent to commit a crime is illegitimate. I could think of many legitimate uses for the data (anonymised analytical work could prove very useful), but the means by which it was gathered strips any such purpose of its legitimacy.
