Counterterrorism center increases data retention time to five years, by Mark Rockwell.
From the post:
The National Counterterrorism Center, which acts as the government’s clearinghouse for terrorist data, has moved to hold onto certain types of data for up to five years to improve its ability to keep track of it across government databases.
On March 22, NCTC implemented new guidelines allowing a much lengthier retention period for “terrorism information” in federal datasets, including datasets that contain non-terrorism information. NCTC had previously been required to destroy data on citizens within three months if no ties to terrorism were found. Those rules, according to NCTC, limited the effectiveness of the data, since in some instances the ability to link across datasets over time could help track threats that weren’t immediate, or immediately evident. According to the center, the longer retention time can aid in connecting dots that aren’t evident when the initial data is collected.
On March 22, Director of National Intelligence James Clapper, Attorney General Eric Holder, and National Counterterrorism Center (NCTC) Director Matthew Olsen signed the updated guidelines, designed to allow NCTC to obtain and more effectively analyze certain data in the government’s possession to better address terrorism-related threats.
I looked for the new guidelines but apparently they are not posted to the NCTC website.
Here is the justification for the change:
One of the issues identified by Congress and the intelligence community after the 2009 Fort Hood shootings and the Christmas Day 2009 bombing attempt was the government’s limited ability to query multiple federal datasets and to correlate information from many sources that might relate to a potential attack, said the center. A review of those attacks recommended the intelligence community push for the use of state-of-the-art search and correlation capabilities, including techniques that would provide a single point of entry to various government databases, it said.
“Following the failed terrorist attack in December 2009, representatives of the counterterrorism community concluded it is vital for NCTC to be provided with a variety of datasets from various agencies that contain terrorism information,” said Clapper in a March 22 statement. “The ability to search against these datasets for up to five years on a continuing basis as these updated Guidelines permit will enable NCTC to accomplish its mission more practically and effectively than the 2008 Guidelines allowed.”
OK, so for those two cases, what evidence would having search capabilities over five years’ worth of data have uncovered? Even with the clarity of hindsight, no one has shown what data could have been uncovered.
The father of the attacker reported his son’s intentions to the CIA on November 19, 2009. That’s right: 36 days before the attack.
Building a bigger haystack is a singularly ineffectual way to fight terrorism. It will generate more data and more IT systems, along with the personnel to staff and sustain them, all of which serve agency-building goals, not counterterrorism goals.
Cablegate was the result of a “bigger haystack” project. Do you think we need another one?
Topic maps and other semantic technologies can produce smaller, relevant haystacks.
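To make that concrete, here is a minimal sketch of the topic map idea: records from separate datasets are merged whenever they share a subject identifier, so analysts query one topic per real-world subject instead of scattered duplicate rows. The dataset names, identifiers, and people below are all hypothetical, and this is an illustration of subject merging, not a full TMDM implementation.

```python
# Sketch: merge records that share a subject identifier into one topic,
# so the "haystack" holds one entry per subject, not one per raw record.
from collections import defaultdict

def merge_subjects(records):
    """Union records that share any identifier; return merged topics."""
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    owner = {}  # identifier -> first record index that claimed it
    for i, rec in enumerate(records):
        parent[i] = i
        for ident in rec["ids"]:
            if ident in owner:
                union(owner[ident], i)  # shared identifier: same subject
            else:
                owner[ident] = i

    # Collect every record into the topic for its union-find root.
    topics = defaultdict(lambda: {"ids": set(), "names": set(), "sources": set()})
    for i, rec in enumerate(records):
        t = topics[find(i)]
        t["ids"] |= set(rec["ids"])
        t["names"].add(rec["name"])
        t["sources"].add(rec["source"])
    return list(topics.values())

# Three rows from three hypothetical datasets, linked by overlapping ids:
rows = [
    {"source": "visa-db",    "ids": ["passport:A123"],            "name": "J. Q. Public"},
    {"source": "watch-list", "ids": ["passport:A123", "wl:7701"], "name": "John Q. Public"},
    {"source": "airline-db", "ids": ["wl:7701"],                  "name": "J. Public"},
]
print(merge_subjects(rows))
```

Run against the example rows, the three records collapse into a single topic carrying all the identifiers, names, and sources: the same facts, but one entry to search instead of three.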
I guess that is the question:
Do you want more staff and a larger budget, or the potential to combat terrorism? (The latter is only a potential, given that US intelligence can’t intercept bombers on 36 days’ notice.)