NSA shows how big ‘big data’ can be, by Frank Konkel.
If big data was cheap and easy and always resulted in an abundance of relevant insights, every agency and organization would do it.
The fact that so few federal agencies are engaging with this new technology – zero out of 17 in a recent MeriTalk survey – only highlights the challenges inherent in what recent intelligence leaks show the National Security Agency is trying to do.
NSA reportedly collects the daily phone records of hundreds of millions of customers from the largest providers in the nation, as well as a wealth of online information about individuals from Internet companies like Facebook, Microsoft, Google and others.
To put the NSA’s big data problems into perspective, Facebook’s 1 billion worldwide users alone generate 500 terabytes of information per day – about as much data as a digital library containing all books ever written in any language. Worldwide, humans generate 6.1 trillion text messages annually, and Americans alone make billions of phone calls each year.
Even if the NSA takes in only a small percentage of the metadata generated daily by those major companies and carriers in its efforts to produce foreign signals intelligence and thwart terrorists, the information contained therein would be a vast sea of data.
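To get a rough sense of the scale, here is a back-of-the-envelope sketch in Python. The 500 terabytes per day and 6.1 trillion texts per year come from Konkel's piece; the capture rate, U.S. call volume, and bytes-per-record figures are my own illustrative assumptions, not anything reported.

# Rough estimate of how fast "a small percentage" of metadata adds up.
# Article figures: Facebook ~500 TB/day, ~6.1 trillion texts/year worldwide.
# Assumptions (mine, for illustration): 1% capture of Facebook and text data,
# ~3 billion U.S. calls/day with full records, ~200 bytes per metadata record.

TB = 10**12  # bytes in a terabyte (decimal)

facebook_bytes_per_day = 500 * TB      # article figure
texts_per_year = 6.1e12                # article figure
calls_per_day_us = 3e9                 # assumption
bytes_per_record = 200                 # assumption
capture_rate = 0.01                    # assumption: only 1% is ingested

daily_bytes = (
    facebook_bytes_per_day * capture_rate
    + (texts_per_year / 365) * bytes_per_record * capture_rate
    + calls_per_day_us * bytes_per_record   # the court order covers all records
)

print(f"Per day:  {daily_bytes / TB:,.1f} TB")
print(f"Per year: {daily_bytes * 365 / (1000 * TB):,.1f} PB")

Even with those deliberately conservative assumptions, it works out to several terabytes a day and a couple of petabytes a year – plenty to drown in without the analytic capacity to match.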
Frank’s line bears repeating: “If big data was cheap and easy and always resulted in an abundance of relevant insights, every agency and organization would do it.”
Especially in light of misleading news stories like Intercepted communications called critical in terror investigations, by Tim Lister and Paul Cruickshank.
Sure, if you start from someone already known or under surveillance, intercepting communications can be valuable.
But that wasn’t what the NSA was doing, at least judging from the court order to monitor all Verizon customers. Unless they are all in cahoots with terrorists? Seems unlikely.
So why is the NSA gathering data it can’t effectively analyze?
Suggestions?