Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

November 15, 2016

Researchers found mathematical structure that was thought not to exist [Topic Map Epistemology]

Filed under: Epistemology,Mathematics,Philosophy,Topic Maps — Patrick Durusau @ 5:04 pm

Researchers found mathematical structure that was thought not to exist

From the post:

Researchers found mathematical structure that was thought not to exist. The best possible q-analogs of codes may be useful in more efficient data transmission.

In the 1970s, a group of mathematicians started developing a theory according to which codes could be presented at a level one step higher than the sequences formed by zeros and ones: mathematical subspaces named q-analogs.
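As an aside, the q-analogs mentioned in the quote replace counting subsets with counting subspaces. A sketch of the Gaussian binomial coefficient, the q-analog of the ordinary binomial coefficient, which counts the k-dimensional subspaces of an n-dimensional vector space over a field with q elements (my own illustration, not from the post):

```python
def gaussian_binomial(n: int, k: int, q: int) -> int:
    """Count k-dimensional subspaces of an n-dimensional space over GF(q).

    This is the q-analog of the binomial coefficient:
    [n choose k]_q = prod_{i=0}^{k-1} (q^(n-i) - 1) / (q^(i+1) - 1).
    """
    if k < 0 or k > n:
        return 0
    numerator = denominator = 1
    for i in range(k):
        numerator *= q ** (n - i) - 1
        denominator *= q ** (i + 1) - 1
    return numerator // denominator  # the quotient is always an exact integer

# The 2-dimensional subspaces of a 4-dimensional space over GF(2):
print(gaussian_binomial(4, 2, 2))  # → 35
```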

While “things thought to not exist” may pose problems for ontologies and other mechanical replicas of truth, topic maps are untroubled by them.

As the Topic Maps Data Model (TMDM) provides:

subject: anything whatsoever, regardless of whether it exists or has any other specific characteristics, about which anything whatsoever may be asserted by any means whatsoever

A topic map can be constrained by its author to be as stunted as early 20th century logical positivism or have a more post-modernist approach, somewhere in between or elsewhere, but topic maps in general are amenable to any such choice.

One obvious advantage of topic maps is that the characteristics of things “thought not to exist” can be captured as they are discussed, so that those discussions merge with the ones that follow the discovery that the things “thought not to exist” really do exist.

The reverse is also true: topic maps can capture the characteristics of things “thought to exist” that are later “thought not to exist,” along with the transition from “existence” to presumed non-existence.

If existence to non-existence sounds difficult, imagine a police investigation in which preliminary statements change or are replaced by other statements. You may want to capture the prior statements, no longer thought to be true, along with their relationships to the later ones.
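That revision history is exactly what TMDM-style merging can carry. A simplified sketch in Python — the dictionary layout and identifier names are my own illustration, not the TMDM's API, and real merging also covers topic names and more:

```python
def merge_topics(topics):
    """Merge topics that share any subject identifier (a simplified
    sketch of TMDM merging)."""
    merged = []
    for topic in topics:
        for existing in merged:
            if existing["identifiers"] & topic["identifiers"]:
                # Same subject: union the identifiers, keep all statements.
                existing["identifiers"] |= topic["identifiers"]
                existing["statements"] += topic["statements"]
                break
        else:
            merged.append({"identifiers": set(topic["identifiers"]),
                           "statements": list(topic["statements"])})
    return merged

# A subject discussed while "thought not to exist" ...
before = {"identifiers": {"q-analog-codes"},
          "statements": ["conjectured not to exist"]}
# ... merges with the discussion that follows its discovery.
after = {"identifiers": {"q-analog-codes"},
         "statements": ["shown to exist (2016)"]}
print(merge_topics([before, after]))
```

Both assertions about the subject survive the merge, which is the point: nothing is discarded when the world changes its mind.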

In “real world” situations, you need epistemological assumptions in your semantic paradigm that adapt to the world as experienced and not limited to the world as imagined by others.

Topic maps offer an open epistemological assumption.

Does your semantic paradigm do the same?

November 14, 2016

U.S. Navy As Software Pirates

Filed under: Cybersecurity,Intellectual Property (IP),Software — Patrick Durusau @ 9:13 pm

Navy denies it pirated 558K copies of software, says contractor consented by David Kravets.

From the post:

In response to a lawsuit accusing the US Navy of pirating more than 558,000 copies of virtual reality software, the Navy conceded Monday that it had installed the software on “hundreds of thousands of computers within its network” without paying the German software maker for it. But the Navy says it did so with the consent of the software producer.

I suspect that “consent” here means that Bitmanagement Software modified its product to remove installation restrictions in hopes the U.S. Navy would become utterly dependent upon the software and only then “notice” the Navy had licensed only 38 copies.

Nice try but sovereigns have been rolling citizens for generations.

The complaint and the government’s answer are both amusing reads.

The lesson here is you are responsible for protecting your property. Especially when exposing it to potential thieves.

Tor Risks for Whistleblowers

Filed under: Cybersecurity,Security,Tor — Patrick Durusau @ 7:57 pm

Exclusively Relying on Tor Risks Detection and Exposure for Whistleblowers by Michael Best.

Eighteen (18) slides to remind you that just using Tor can leave you vulnerable to detection and exposure.

Depending on who you are exposing, detection may be hazardous to your freedom or even your life.

Unfortunately, like other forms of cybersecurity, avoiding detection and exposure requires effort. Effort that is rare among casual users of computers.

Depending upon your risk factors, you and your colleagues should review security practices on a regular basis.

I would include these slides and/or an adaptation of them as part of that review.

Pointers to regular security practice review cheatsheets?

Leaking and Whistleblowing in the Trump Era

Filed under: Cybersecurity,Government,Privacy,Security — Patrick Durusau @ 5:47 pm

In the Trump Era, Leaking and Whistleblowing Are More Urgent, and More Noble, Than Ever by Glenn Greenwald.

From the post:

For the past 15 years, the U.S. Government under both parties has invented whole new methods for hiding what they do behind an increasingly impenetrable wall of secrecy. From radical new legal doctrines designed to shield their behavior from judicial review to prosecuting sources at record rates, more and more government action has been deliberately hidden from the public.

One of the very few remaining avenues for learning what the U.S. Government is doing – beyond the propaganda that they want Americans to ingest and thus deliberately disseminate through media outlets – is leaking and whistleblowing. Among the leading U.S. heroes in the War on Terror have been the men and women inside various agencies of the U.S. Government who discovered serious wrongdoing being carried out in secret, and then risked their own personal welfare to ensure that the public learned of what never should have been hidden from it in the first place.

Many of the important consequential revelations from the last two administrations were possible only because of courageous sources who came forward in this way. It’s how we learned about the abuses of Abu Ghraib, the existence of torture-fueled CIA “black sites,” the Bush warrantless eavesdropping program, the wanton slaughter carried out in Iraq and Afghanistan, the recklessness and deceit at the heart of the U.S. drone program, the NSA’s secret construction of the largest system of suspicionless, mass surveillance ever created, and so many other scandals, frauds, and war crimes that otherwise would have remained hidden. All of that reporting was possible only because people of conscience decided to disregard the U.S. Government’s corrupt decree that this information should remain secret, on the ground that concealing it was designed to protect not national security but rather the reputations and interests of political officials.

For that reason, when the Intercept was created, enabling safe and productive whistleblowing was central to our mission. We hired some of the world’s most skilled technologists, experts in information security and encryption, to provide maximum security for our journalists and our sources. We adopted the most advanced programs for enabling sources to communicate and provide information to us anonymously and without detection, such as Secure Drop. And we made an institutional commitment to expend whatever resources are necessary to defend the right of a free press to report freely without threats of recrimination, and to do everything possible to protect and defend our sources who enable that vital journalism.

Over the past two years, we have published several articles by our security experts on how sources (and others) can communicate and provide information to us in the safest and most secure manner possible, to minimize the chances of being detected. We’ve published interviews with other experts, such as Edward Snowden, on the most powerful tools and methods available for securing one’s online communications. As our technologist Micah Lee explained, no method is perfect, so “caution is still advised to those who want to communicate with us without exposing their real-world identities,” but tools and practices do exist to maximize anonymity, and we are committed to using those and informing the public about how to use them in the safest and most effective manner possible.

Considering the damage done to the Constitution by George W. Bush and Barack Obama, leaking/whistleblowing in the Trump era is not “more urgent, and more noble….”

That is to say leaking/whistleblowing is always urgent and noble.

Think about the examples Greenwald cites. All are from the Bush and Obama administrations with nary a hint of Trump.

Exposing murder, torture, war crimes, lying to allies, Congress and the American public. And that’s just the short list. The margin of this page isn’t large enough to enumerate all the specific crimes committed by both administrations.

By all means, let’s encourage leaking and whistleblowing in the Trump era, but don’t leak timidly.

Government officials, staffers, contractors and their agents (double or otherwise), have freely chosen to participate in activities hidden from the public. Hidden because they are ashamed of what they have done (think CIA torturers) and/or fear just prosecution for their crimes (waging wars of aggression).

Leak boldly, insist on naming all names and all actions being described.

Secrecy hasn’t prevented excesses committed in secret; perhaps severe and repeated consequences from bold leaks will.

Leak early, often and in full.

PS: We should not rely exclusively on insiders to leak information.

Hackers have an important role to play in creating government transparency, with or without the government’s consent.

November 13, 2016

Orwell: The surveillance game that puts you in Big Brother’s shoes [Echoes of Ender’s Game?]

Filed under: Cybersecurity,Games,Privacy — Patrick Durusau @ 8:40 pm

Orwell: The surveillance game that puts you in Big Brother’s shoes by Claire Reilly.

From the post:

“Big Brother has arrived — and it’s you.”

As CNET’s resident privacy nark, I didn’t need much convincing to play a game all about social engineering and online surveillance.

But when I stepped into my role as a new recruit for the fictional Orwell internet surveillance program, I didn’t expect to find the rush of power so beguiling, or unsettling.

Developed by German outfit Osmotic Studios, Orwell sees you working as a new recruit in a surveillance agency of the same name, following a series of terrorist attacks in Bonton, the fictional capital of The Nation. As an agent, you are responsible for scraping social media feeds, blogs, news sites and the private communications of the Nation’s citizens to find those with connections to the bombings.

You start with your first suspect before working through a web of friends and associates. You’re after data chunks — highlighted pieces of information and text found in news stories, websites and blogs that can be dragged and uploaded into the Orwell system and permanently stored as evidence.

The whole game has a kind of polygon graphic aesthetic, making the news clippings, websites and social media feeds you’re trawling feel close to the real thing. But as with everything in Orwell, it’s viewed through a glass, darkly.

If you are a game player, this sounds wickedly seductive.

If you’re not, consider: what if someone weaponized Orwell so that what appear to be “in the game” hacks are hacks in the “real world”?

A cybersecurity “Ender’s Game” where the identities of targets and the consequences of attacks are concealed from the hackers?

Are the identities of targets or the consequences of attacks your concern? Or is credit for breaching defenses and looting data enough?

Before reaching that level of simulation, imagine changing from the lone/small group hacker model to a more distributed model.

Where anonymous hackers offer specialized skills, data or software in collaboration on proposed hacks.

Ideas on the requirements for such a collaborative system?

Assuming nation states get together on cybersecurity, such a system could match or even outperform their efforts.

Outbrain Challenges the Research Community with Massive Data Set

Filed under: Contest,Data,Data Mining,Prediction — Patrick Durusau @ 8:15 pm

Outbrain Challenges the Research Community with Massive Data Set by Roy Sasson.

From the post:

Today, we are excited to announce the release of our anonymized dataset that discloses the browsing behavior of hundreds of millions of users who engage with our content recommendations. This data, which was released on the Kaggle platform, includes two billion page views across 560 sites, document metadata (such as content categories and topics), served recommendations, and clicks.

Our “Outbrain Challenge” is a call out to the research community to analyze our data and model user reading patterns, in order to predict individuals’ future content choices. We will reward the three best models with cash prizes totaling $25,000 (see full contest details below).

The sheer size of the data we’ve released is unprecedented on Kaggle, the competition’s platform, and is considered extraordinary for such competitions in general. Crunching all of the data may be challenging to some participants—though Outbrain does it on a daily basis.

The rules caution:


The data is anonymized. Please remember that participants are prohibited from de-anonymizing or reverse engineering data or combining the data with other publicly available information.

That would be a more interesting question than the ones presented for the contest.

After the 2016 U.S. presidential election we know that racists, sexists, nationalists, etc., are driven by single factors, so assuming you have good tagging, what’s the problem?

Yes?

Or is human behavior not only complex but also variable?
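Either way, a count-based click-through-rate baseline is the usual first submission in contests like this one. The record layout below is my own illustration, not the actual Kaggle schema:

```python
from collections import defaultdict

def ctr_baseline(train, display):
    """Rank the candidate documents in one display by historical
    click-through rate, with add-one smoothing for unseen documents."""
    clicks = defaultdict(int)
    views = defaultdict(int)
    for doc_id, clicked in train:
        views[doc_id] += 1
        clicks[doc_id] += clicked
    # Smoothed CTR: (clicks + 1) / (views + 2), i.e. a uniform prior,
    # so a never-seen document scores 0.5 rather than crashing or winning.
    score = lambda d: (clicks[d] + 1) / (views[d] + 2)
    return sorted(display, key=score, reverse=True)

train = [("d1", 1), ("d1", 1), ("d1", 0), ("d2", 0), ("d2", 0)]
print(ctr_baseline(train, ["d2", "d3", "d1"]))  # → ['d1', 'd3', 'd2']
```

Beating a baseline like this is where the actual modeling of “complex but variable” behavior begins.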

Good luck!

Intro to Image Processing

Filed under: Image Processing,Image Recognition,Image Understanding,OpenCV — Patrick Durusau @ 5:03 pm

Intro to Image Processing by Eric Schles.

A short but useful introduction to some, emphasis on some, of the capabilities of OpenCV.

Understanding image processing will make you a better consumer and producer of digital imagery.
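A feel for what is happening under the hood helps. Here is a pure-Python sketch of the horizontal Sobel gradient filter, one of the basic operations OpenCV exposes (with far better performance) as `cv2.Sobel`:

```python
# Horizontal Sobel kernel: responds strongly to vertical edges.
SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def sobel_x(image):
    """Correlate a grayscale image (a list of rows) with the Sobel-x
    kernel. Border pixels are skipped, so the output is 2 smaller in
    each dimension."""
    h, w = len(image), len(image[0])
    out = [[0] * (w - 2) for _ in range(h - 2)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    acc += SOBEL_X[ky][kx] * image[y + ky - 1][x + kx - 1]
            out[y - 1][x - 1] = acc
    return out

# A hard vertical edge: dark on the left, bright on the right.
image = [[0, 0, 255, 255]] * 4
print(sobel_x(image))  # strong positive response along the edge
```

Knowing that an “edge map” is just weighted differences of neighboring pixels makes it easier to judge what such processing can and cannot reveal about an image.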

To its great surprise, the “press” recently re-discovered that government isn’t to be trusted.

The same is true for the “press.”

Develop your capability to judge images offered by any source.

November 12, 2016

10 Reasons to Choose Apache Solr Over Elasticsearch

Filed under: ElasticSearch,Lucene,LucidWorks,Solr — Patrick Durusau @ 9:24 pm

10 Reasons to Choose Apache Solr Over Elasticsearch by Grant Ingersoll.

From the post:

Hey, clickbait title aside, I get it, Elasticsearch has been growing. Kudos to the project for tapping into a new set of search users and use cases like logging, where they are making inroads against the likes of Splunk in the IT logging market. However, there is another open source, Lucene-based search engine out there that is quite mature, more widely deployed and still growing, granted without a huge marketing budget behind it: Apache Solr. Despite what others would have you believe, Solr is quite alive and well, thank you very much. And I’m not just saying that because I make a living off of Solr (which I’m happy to declare up front), but because the facts support it.

For instance, in the Google Trends arena (see below or try the query yourself), Solr continues to hold a steady recurring level of interest even while Elasticsearch has grown. Dissection of these trends (which are admittedly easy to game, so I’ve tried to keep them simple), show Elasticsearch is strongest in Europe and Russia while Solr is strongest in the US, China, India, Brazil and Australia. On the DB-Engines ranking site, which factors in Google trends and other job/social metrics, you’ll see both Elasticsearch and Solr are top 15 projects, beating out a number of other databases like HBase and Hive. Solr’s mailing list is quite active (~280 msgs per week compared to ~170 per week for Elasticsearch) and it continues to show strong download numbers via Maven repository statistics. Solr as a codebase continues to innovate (which I’ll cover below) as well as provide regular, stable releases. Finally, Lucene/Solr Revolution, the conference my company puts on every year, continues to set record attendance numbers.

Not so much an “us versus them” piece as tantalizing facts about Solr 6 that will leave you wanting to know more.

Grant invites you to explore the Solr Quick Start if one or more of his ten points capture your interest.

Timely because with a new presidential administration about to take over in Washington, D.C., there will be:

  • Data leaks as agencies vie with each other
  • Data leaks due to inexperienced staffers
  • Data leaks to damage one side or in retaliation
  • Data leaks from foundations and corporations
  • others

If 2016 was the year of “false news” then 2017 is going to be the year of the “government data leak.”

Left unexplored, except for headline-suitable quips found with grep, leaks may not amount to much.

On the other hand, using Solr 6 can enable you to weave a coherent narrative from diverse resources.

But you will have to learn Solr 6 to know for sure.
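As a first taste, a Solr 6 select query is just an HTTP GET. The core name `leaks` and the field names below are assumptions for illustration, not a real index:

```python
from urllib.parse import urlencode

def solr_select_url(base, core, query, rows=10, fields="*,score"):
    """Build a Solr /select URL requesting JSON results."""
    params = urlencode({"q": query, "rows": rows, "fl": fields, "wt": "json"})
    return f"{base}/solr/{core}/select?{params}"

url = solr_select_url("http://localhost:8983", "leaks", 'title:"data leak"')
print(url)
# Against a running Solr instance you would then fetch and decode it:
#   import json, urllib.request
#   docs = json.load(urllib.request.urlopen(url))["response"]["docs"]
```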

Enjoy!

Preventing Another Trump – Censor Facebook To Protect “Dumb” Voters

Filed under: Censorship,Free Speech,Government,Journalism,News,Politics,Reporting — Patrick Durusau @ 9:01 pm

Facebook can no longer be ‘I didn’t do it’ boy of global media by Emily Bell.


Barack Obama called out the fake news problem directly at a rally in Michigan on the eve of the election: “And people, if they just repeat attacks enough, and outright lies over and over again, as long as it’s on Facebook and people can see it, as long as it’s on social media, people start believing it….And it creates this dust cloud of nonsense.”

Yesterday, Zuckerberg disputed this, saying that “the idea that fake news on Facebook… influenced the election…is a pretty crazy idea” and defending the “diversity” of information Facebook users see. Adam Mosseri, the company’s VP of Product Development, said Facebook must work on “improving our ability to detect misinformation.” This line is part of Zuckerberg’s familiar but increasingly unconvincing narrative that Facebook is not a media company, but a tech company. Given the shock of Trump’s victory and the universal finger-pointing at Facebook as a key player in the election, it is clear that Zuckerberg is rapidly losing that argument.

In fact, Facebook, now the most influential and powerful publisher in the world, is becoming the “I didn’t do it” boy of global media. Clinton supporters and Trump detractors are searching for reasons why a candidate who lied so frequently and so flagrantly could have made it to the highest office in the land. News organizations, particularly cable news, are shouldering part of the blame for failing to report these lies for what they were. But a largely hidden sphere of propagandistic pages that target and populate the outer reaches of political Facebook are arguably even more responsible.

You can tell Bell has had several cups of the Obama kool-aid by her uncritical acceptance of Barack Obama’s groundless attacks on the “fake news problem.”

Does Bell examine the incidence of “fake news” in other elections?

No.

Does Bell specify which particular “fake news” stories should have been corrected?

No.

Does Bell explain why voters can’t distinguish “fake news” from truthful news?

No.

Does Bell explain why mainstream media is better than voters at detecting “fake news?”

No.

Does Bell explain why she should be the judge over reporting during the 2016 Presidential election?

No.

Does Bell explain why she and Obama consider voters to be dumber than themselves?

No.

Do I think Bell or anyone else should be censoring Facebook for “false news?”

No.

How about you?

November 11, 2016

“connecting the dots” requires dots (Support Michael Best)

Filed under: Government Data,Politics,Transparency — Patrick Durusau @ 9:45 pm

Michael Best is creating a massive archive of government documents.

From the post:

Since 2015, I’ve published millions of government documents (about 10% of the text items on the Internet Archive, with some items containing thousands of documents) and terabytes of data; but in order to keep going, I need your help. Since I’ve gotten started, no outlet has matched the number of government documents that I’ve published and made freely available. The only non-governmental publisher that rivals the size and scope of the government files I’ve uploaded is WikiLeaks. While I analyze and write about these documents, I consider publishing them to be more important because it enables and empowers an entire generation of journalists, researchers and students of history.

I’ve also pressured government agencies into making their documents more widely available. This includes the more than 13,000,000 pages of CIA documents that are being put online soon, partially in response to my Kickstarter and publishing efforts. These documents are coming from CREST, which is a special CIA database of declassified records. Currently, it can only be accessed from four computers in the world, all of them just outside of Washington D.C. These records, which represent more than 3/4 of a million CIA files, will soon be more accessible than ever – but even once that’s done, there’s a lot more work left to do.

Question: Do you want a transparent and accountable Trump presidency?

Potential Answers include:

1) Yes, but I’m going to spend time and resources hyper-ventilating with others and roaming the streets.

2) Yes, and I’m going to support Michael Best and FOIA efforts.

Governments, even Trump’s presidency, don’t spring from ocean foam.

[Image: Sandro Botticelli, La Nascita di Venere (The Birth of Venus)]

The people chosen to fill cabinet and other posts have histories, in many cases government histories.

For example, I heard a rumor today that Ed Meese, a former government crime lord, is on the Trump transition team. Hell, I thought he was dead.

Michael’s efforts produce the dots that connect past events, places, people, and even present administrations.

The dots Michael produces may support your expose, winning story and/or indictment.

Are you in or out?

The TPP Is Dead! Really Most Sincerely Dead! (Celebration Is In Order!)

Filed under: Government,Intellectual Property (IP),Transparency — Patrick Durusau @ 5:47 pm

Obama Administration Gives Up on Pacific Trade Deal by William Mauldin.

From the post:

The Obama administration on Friday gave up all hope of enacting its sweeping Pacific trade agreement, a pact designed to preserve U.S. economic influence in fast-growing Asia that was buried by a wave of antitrade political sentiment that culminated with Tuesday’s presidential election….

Yes!

I have ranted about the largely secret Trans-Pacific Partnership (TPP) trade agreement on several occasions.

Negotiated entirely in secret and, even worse, designed to be kept secret from the citizens of the signing countries, the TPP contained, among its worst provisions (there were several), those enabling investors to sue sovereign countries whose laws diminished their investments.

Laws like, I don’t know, health warnings on cigarettes, for example.

With the election of Donald Trump, I should say president-elect Donald Trump, the TPP is dead. (full stop)

As the proverb says:

It’s an ill wind that blows nobody any good.

Whatever your feelings about president-elect Donald Trump and any of his decisions/policies as president, the defeat of the TPP is one for the win column.

Hazards and dangers lie ahead, just as they would for any presidency, but take a moment to appreciate this win.

November 10, 2016

Here’s to the return of the journalist as malcontent

Filed under: Journalism,News,Reporting — Patrick Durusau @ 6:39 pm

Here’s to the return of the journalist as malcontent by Kyle Pope.

From the post:

JOURNALISM’S MOMENT of reckoning has arrived.

Its inability to understand Donald Trump’s rise over the last year, ending in his victory Tuesday night, clearly stand among journalism’s great failures, certainly in a generation and probably in modern times.

Reporters’ eagerness first to ridicule Trump and his supporters, then dismiss them, and finally to actively lobby and argue for their defeat have led us to a moment when the entire journalistic enterprise needs to be rethought and rebuilt. In terms of bellwether moments, this is our anti-Watergate.

Already the finger-pointing deconstructions have begun. Yes, social media played a role, enclosing reporters in echo chambers that made it hard, if not impossible, for them to hear contrarian voices; yes, the brutal economics of the news business hurt all our efforts, decimating newsrooms around the country and leaving fewer people to grapple with what was a gargantuan story; and yes, reporters can be forgiven, at least initially, for laughing off a candidate whose views and personality seemed so outside the norm of a serious contender for the White House.

While all those things are true, journalism’s fundamental failure in this election, its original sin, is much more basic to who we are and what we are supposed to be. Simply put, it is rooted in a failure of reporting.

(emphasis in original)

You should read this essay at the start of every day. Even after opposition to and suspicion of every government, corporate, or other statement is second nature.

It’s ironic that Pope points out:

[Trump] already has made clear that he is no friend of the press.

True enough, but the press has made it clear that it has been the friend of government for several administrations.

Regaining the trust of the public is going to be a long and hard slog.

XML Calabash 1.1.13 released…

Filed under: XML,XProc — Patrick Durusau @ 1:23 pm

Norm Walsh tweeted:

XML Calabash 1.1.13 released for Saxon 9.5, Saxon 9.6, and, with all praise to @fgeorges, Saxon 9.7.

You are using XML pipelines (XProc) for processing your XML files.

Yes?

See also: XML Calabash Reference.
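If you haven’t seen XProc, a minimal identity pipeline looks like this (runnable with Calabash as, e.g., `java -jar xmlcalabash.jar pipeline.xpl`; the jar name varies by release):

```xml
<p:declare-step xmlns:p="http://www.w3.org/ns/xproc" version="1.0">
  <p:input port="source"/>
  <p:output port="result"/>
  <!-- Copy the source document to the result unchanged. -->
  <p:identity/>
</p:declare-step>
```

Replace `p:identity` with steps like `p:xslt` or `p:validate-with-xml-schema` and you have a real pipeline.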

Enjoy!

Bursting Your News Bubble

Filed under: Journalism,News,Reporting — Patrick Durusau @ 11:55 am

It would not have helped the Clinton clones (a sense of entitlement makes people tone deaf and fact blind), but C.J. Adams and Izzie Zahorian explore a way to “see” news beyond your usual news bubble.

In If you are reading this, we might be in the same news bubble they write:

In Myanmar we met two journalists who, during a period of military rule, had smuggled newspapers in duffel bags to carry news between their country and the outside world. Their story stuck with us as a sort of personal challenge: these reporters had regularly risked their lives to read just a few pages of news from outside their country; while we, with all our connectivity, rarely make the effort to do the same.

Even with the power of the internet, it can be surprisingly difficult to explore the diversity of global perspectives. Technology has made it easier for everyone to share information, but it hasn’t made us better at finding viewpoints that are distant from our own. In some ways, a duffel bag full of newspapers would include a wider range of perspectives than many of us see on a daily basis.

Search engines, social media and news aggregators are great at surfacing information close to our interests, but they are limited by the set of topics and people we choose to follow. Even if we read multiple news sources every day, what we discover is defined by the languages we are able to read, and the topics that our sources decide to cover. Ultimately, these limitations create a “news bubble” that shapes our perspective and awareness of the world. We often miss out on the chance to connect and empathize with ideas beyond these boundaries.

How to “see” news without your usual filters?


We’ve just released a new experiment related to this idea: a data visualization called Unfiltered.News. The viz uses Google News data to show what daily news topics are being published in every region. Headlines for these topics can be viewed from around the world, with translations provided in 40 languages. We hope the viz can become a useful tool to explore what shapes our different perspectives, and to help users discover topics and viewpoints they would have otherwise missed.

Push this one up to the top of your “sites/technology to explore” stack!

I’m having a mixed experience on Ubuntu 14.04. Chrome fails altogether, reporting no support for WebGL. Firefox displays the sidebar of headlines but not the graph-like presentation of stories.

I also tried to load the site on Windows 7 with IE and got no joy.

It would be understandable (if disappointing) for the site to be optimized for Windows, but excluding Chrome?

It’s a great idea, and I’m hopeful that during this beta shakedown it becomes more widely accessible.

November 9, 2016

Trump Wins! Trump Wins! A Diversity Lesson For Data Scientists

Filed under: Diversity,Government,Politics,Statistics — Patrick Durusau @ 9:50 pm

Here’s Every Major Poll That Got Donald Trump’s Election Win Wrong by Brian Flood.

From the post:

When Donald Trump shocked the world to become the president-elect on Tuesday night, the biggest loser wasn’t his opponent Hillary Clinton, it was the polling industry that tricked America into thinking we’d be celebrating the first female president right about now.

The polls, which Trump has been calling inaccurate and rigged for months, made it seem like Clinton was a lock to occupy the White House come January.

Nate Silver’s FiveThirtyEight is supposed to specialize in data-based journalism, but the site reported on Tuesday morning that Clinton had a 71.4 percent chance of winning the election. The site was wrong about the outcome in major battleground states including Florida, North Carolina and Pennsylvania, and Trump obviously won the election in addition to the individual states that were supposed to vote Clinton. Silver wasn’t the only pollster to botch the 2016 election.

Trump’s victory should teach would-be data scientists this important lesson:

Diversity is important in designing data collection

Some of the reasons given for the failure of prediction in this election:

  1. People without regular voting records voted.
  2. People polled weren’t honest about their intended choices.
  3. Pollsters weren’t looking for a large, angry segment of the population.

All of which can be traced back to a lack of imagination/diversity in the preparation of the polling instruments.
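The second reason alone — answers missing or dishonest from one bloc — is enough to skew an otherwise sound poll. A deterministic sketch with made-up numbers (not data from any actual poll):

```python
def poll_estimate(groups):
    """Estimate a candidate's support from poll responses, where each
    group is (population_share, support_rate, response_rate).
    Voters who stay silent simply drop out of the sample."""
    responding = sum(share * resp for share, _, resp in groups)
    supporting = sum(share * sup * resp for share, sup, resp in groups)
    return supporting / responding

# Made-up electorate: half the voters back the candidate, but his
# supporters answer pollsters half as often as everyone else.
groups = [(0.5, 1.0, 0.4),   # supporters, reluctant to respond
          (0.5, 0.0, 0.8)]   # opponents, happy to respond
true_support = sum(share * sup for share, sup, _ in groups)
print(true_support, poll_estimate(groups))  # 0.5 vs. about 0.33
```

Half the electorate reads as a third of it: the instrument is fine, the sample isn’t — which is exactly the kind of blind spot a more diverse polling team might have anticipated.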

Ironic isn’t it?

Strive for diversity, including people whose ideas you find distasteful.

Such as vocal Trump supporters. (Substitute your favorite villain.)

November 8, 2016

We Should Feel Safer Than We Do

Filed under: Data Mining,R,Social Sciences — Patrick Durusau @ 5:38 pm

We Should Feel Safer Than We Do by Christian Holmes.

Christian’s Background and Research Goals:

Background

Crime is a divisive and important issue in the United States. It is routinely ranked as among the most important issue to voters, and many politicians have built their careers around their perceived ability to reduce crime. Over 70% of Americans believe that crime is increasing, according to a recent Gallup poll, but is that really the case? I seek to answer this question in this post, as well as determine if there is any clear correlation between government spending and crime.

Research Goals

-Is crime increasing or decreasing in this country?
-Is there a clear link between government spending and crime?

provide an interesting contrast with his conclusions:

From the crime data, it is abundantly clear that crime is on the decline, and has been for around 20 years. The reasons behind this decrease are quite nuanced, though, and I found no clear link between either increased education or police spending and decreasing crime rates. This does not mean that such a relationship does not exist. Rather, it merely means that there is no obvious correlation between the two variables over this specific time frame.

In his background, Christian says:

Over 70% of Americans believe that crime is increasing, according to a recent Gallup poll, but is that really the case? I seek to answer this question in this post,…

Christian presumes, without proof, a relationship between: public beliefs about crime rates (rising or falling) and crime rates as recorded by government agencies.

Which also presumes:

  1. The public is aware that government collects crime statistics.
  2. The public is aware of current crime statistics.
  3. Current crime statistics influence public beliefs about the incidence of crime.

If the central focus of the paper is a comparison of “crime rates” as measured by government with other data on government spending, why even mention the disparity between public “belief” about crime and crime statistics?

I suspect, just as a rhetorical move, Christian is attempting to draw a favorable inference for his “evidence” by contrasting it with “public belief” that is, in this instance, contrary to the “evidence.”

Christian doesn’t offer us any basis for judgments about public opinion on crime one way or the other. Any number of factors could be influencing public opinion on that issue, the crime rate as measured by government being only one of those.

The violent crime rate may be very low, statistically speaking, but if you are the victim of a violent crime, from your perspective crime is very prevalent.

Of R and Relationships

Christian uses R to compare crime data with government spending on education and policing.

The unhappy result is that no relationship is evidenced between government spending and a reduction in crime so Christian cautions:

…This does not mean that such a relationship does not exist. Rather, it merely means that there is no obvious correlation between the two variables over this specific time frame….

This is where we switch from relying on data and explore the realms of “the data didn’t prove I was wrong.”

Since it isn’t possible to prove the absence of a relationship between the “crime rate” and government spending on education/police, no, the evidence didn’t prove Christian to be wrong.

On the other hand, it clearly shows that Christian has no evidence for that “relationship.”

The caution here is that using R and “reliable” data may lead to conclusions you would rather avoid.
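
Christian’s caveat is easy to illustrate: a perfectly deterministic relationship can still show zero linear correlation. A toy sketch in Python (Christian works in R; the numbers here are invented for illustration):

```python
import numpy as np

# A symmetric "spending" series and a "crime" series that depends on it
# exactly, but nonlinearly (all numbers invented for illustration).
x = np.linspace(-1, 1, 201)
y = x ** 2                      # a deterministic relationship, yet...

r = np.corrcoef(x, y)[0, 1]
print(round(r, 3))              # ...the linear (Pearson) correlation is ~0
```

So “no obvious correlation over this time frame” tells you nothing about whether a relationship exists, only that a linear one isn’t visible in this data.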

PS: Crime and the public’s fear of crime are both extremely complex issues. Aggregate data can justify previously chosen positions, but little more.

None/Some/All … Are Suicide Bombers & Probabilistic Programming Languages

The Design and Implementation of Probabilistic Programming Languages by Noah D. Goodman and Andreas Stuhlmüller.

Abstract:

Probabilistic programming languages (PPLs) unify techniques for the formal description of computation and for the representation and use of uncertain knowledge. PPLs have seen recent interest from the artificial intelligence, programming languages, cognitive science, and natural languages communities. This book explains how to implement PPLs by lightweight embedding into a host language. We illustrate this by designing and implementing WebPPL, a small PPL embedded in Javascript. We show how to implement several algorithms for universal probabilistic inference, including priority-based enumeration with caching, particle filtering, and Markov chain Monte Carlo. We use program transformations to expose the information required by these algorithms, including continuations and stack addresses. We illustrate these ideas with examples drawn from semantic parsing, natural language pragmatics, and procedural graphics.

If you want to sharpen the discussion of probabilistic programming languages, substitute in the pragmatics example:

‘none/some/all of the children are suicide bombers’,

The substitution raises the issue of how “certainty” can/should vary depending upon the gravity of results.

Who is a nice person?, has low stakes.

Who is a suicide bomber?, has high stakes.
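
For the curious, the book’s enumeration-based inference can be sketched outside WebPPL. A minimal Python version of enumerating worlds and conditioning on the literal meaning of “some” (three individuals and a uniform prior are my assumptions, not the book’s example):

```python
from itertools import product
from collections import Counter

# Enumerate worlds: each of 3 individuals has the property or not (uniform prior).
# Condition on the literal meaning of "some": at least one has it.
worlds = list(product([0, 1], repeat=3))
consistent = [w for w in worlds if sum(w) >= 1]   # "some" is literally true

posterior = Counter(sum(w) for w in consistent)
total = len(consistent)
for n in sorted(posterior):
    print(n, posterior[n] / total)   # posterior over how many have the property
```

Swap in “suicide bomber” for the property and the stakes of where that posterior mass lands become obvious.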

November 7, 2016

How the Ghana Floods animation was created [Animating Your Local Flood Data With R]

Filed under: R,Visualization — Patrick Durusau @ 9:08 pm

How the Ghana Floods animation was created by David Quartey.

From the post:

Ghana has always been getting flooded, but it seems that only floods in Accra are getting attention. I wrote about it here, and the key visualization was an animated map showing the floods in Ghana, and built in R. In this post, I document how I did it, hopefully you can do one also!

David’s focus is on Ghana but the same techniques work for data of greater local interest.
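
David works in R, but the frame-per-year idea behind any such animation is language-independent. A rough Python sketch of the data side (records invented; rendering each frame to a map and stitching frames into a GIF is left out):

```python
# Hypothetical flood records: (year, town). An animated map is just one
# frame per year; cumulative frames show all floods up to that year.
records = [(2010, "Accra"), (2010, "Kumasi"), (2011, "Accra"),
           (2012, "Tamale"), (2012, "Accra")]

years = sorted({year for year, _ in records})
frames = []
for year in years:
    frames.append((year, [town for y, town in records if y <= year]))

for year, towns in frames:
    print(year, towns)
```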

Election for Sale

Filed under: Government,MapD,Mapping,Politics — Patrick Durusau @ 8:23 pm

Election for Sale by Keir Clarke.

mapsmania2-460

MapD’s US Political Donations map allows you to explore the donations made to the Democratic and Republican parties dating back to 2001. The map includes a number of tools which allow you to filter the map by political party, by recipient and by date.

After filtering the map by party and date you can explore details of the donations received using the markers on the map. If you select the colored markers on the map you can view details on the amount of the donation, the name of the recipient & recipient’s party and the name of the donor. It is also possible to share a link to your personally filtered map.

The MapD blog has used the map to pick out a number of interesting stories that emerge from the map. These stories include an analysis of the types of donations received by both Hillary Clinton and Donald Trump.

An appropriate story for November 7th, the day prior to the U.S. Government sale day, November 8th.

It’s a great map but that isn’t to say it could not be enhanced by merging in other data.

While everyone acknowledges donations, especially small ones, are made for a variety of reasons, consistent and larger donations are made with an expectation of something in return.

One feature this map is missing is what did consistent and larger donors get in return?

Harder to produce and maintain than a map based on public campaign donation records but far more valuable to the voting public.

Imagine that level of transparency for the tawdry story of Hillary Clinton and Big Oil. How Hillary Clinton’s State Department Fought For Oil 5,000 Miles Away.

Apparent Browser Incompatibility: The MapD map loads fine with Firefox (49.0.2) but crashes with Chrome (Version 54.0.2840.90 (64-bit)) (Failed to load dashboard. TypeError: Cannot read property ‘resize’ of undefined). Both on Ubuntu 14.04.

Drip, Drip, Drip, Leaking At Wikileaks

Filed under: Hillary Clinton,Politics,Wikileaks — Patrick Durusau @ 3:58 pm

wikileaks-dnc-460

Two days before the U.S. Presidential election, Wikileaks released 8,200 emails from the Democratic National Committee (DNC), in addition to its daily drip, drip, drip leaking of emails from John Podesta, Hillary Clinton’s campaign chair.

The New York Times, a sometimes collaborator with Wikileaks (The War Logs (NYT)), has sponsored a series of disorderly and nearly incoherent attacks on Wikileaks for these leaks.

The dominant theme in those attacks is that readers should not worry their shallow and insecure minds about social media but rely upon media outlets to clearly state any truth readers need to know.

I am not exaggerating. The exact language that appears in one such attack was:

…people rarely act like rational, civic-minded automatons. Instead, we are roiled by preconceptions and biases, and we usually do what feels easiest — we gorge on information that confirms our ideas, and we shun what does not.

Is that how you think of yourself? It is how the New York Times thinks about you.

There are legitimate criticisms concerning Wikileaks and its drip, drip, drip leaking but the Times manages to miss all of them.

For example, the daily drops of Podesta emails, selected on some “unknown to the public” criteria, prevented the creation of a coherent narrative by reporters and the public. The next day’s leak might contain some critical link, or not.

Reporters, curators and the public were teased with drips and drabs of information, which served to drive traffic to the Wikileaks site, traffic that serves no public interest.

If that sounds ungenerous, consider that, as the game draws to a close, Wikileaks has finally posted a link to the Podesta emails in bulk: https://file.wikileaks.org/file/podesta-emails/.

To be sure, some use has been made of the Podesta emails, my work and that of others on DKIM signatures (unacknowledged by Wikileaks when it “featured” such verification on email webpages), graphs, etc. but early bulk release of the emails would have enabled much more.

For example:

  • Concordances of the emails and merging those with other sources
  • Connecting the dots to public or known only to others data
  • Entity recognition and linking in extant resources and news stories
  • Fitting the events, people, places into coherent narratives
  • Sentiment analysis
  • etc.
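
A concordance, at least, is cheap once the corpus is available in bulk. A minimal keyword-in-context (KWIC) sketch in Python (the email text here is invented for illustration):

```python
# Minimal keyword-in-context concordance over a toy corpus.
emails = [
    "the speech drafts need review before the event",
    "schedule the event and circulate the drafts",
]

def concordance(docs, term, width=2):
    """Return (doc_id, left context, right context) for each hit on term."""
    hits = []
    for doc_id, text in enumerate(docs):
        words = text.split()
        for i, w in enumerate(words):
            if w == term:
                left = " ".join(words[max(0, i - width):i])
                right = " ".join(words[i + 1:i + 1 + width])
                hits.append((doc_id, left, right))
    return hits

hits = concordance(emails, "drafts")
for doc_id, left, right in hits:
    print(doc_id, f"{left} [drafts] {right}")
```

Merging such concordances with other sources is exactly the kind of work a bulk release would have enabled on day one.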

All of that lost because of the “Wikileaks look at me” strategy for releasing the Podesta emails.

I applaud Wikileaks obtaining and leaking data, including the Podesta emails, but a “look at me” strategy impairs the full exploration and use of leaked data.

Is that really the goal of Wikileaks?

PS: If you are interested in leaking without games or redaction, ping me. I’m interested in helping with such leaks.

November 6, 2016

Debate Night Twitter: Analyzing Twitter’s Reaction to the Presidential Debate

Filed under: Data Mining,Government,Twitter — Patrick Durusau @ 5:34 pm

Debate Night Twitter: Analyzing Twitter’s Reaction to the Presidential Debate by George McIntire.

A bit dated content-wise but George covers techniques, from data gathering to analysis, useful for future events. Possible Presidential inauguration riots on January 20, 2017 for example. Or, the 2017 Super Bowl, where Lady GaGa will be performing.

From the post:

This past Sunday, Donald Trump and Hillary Clinton participated in a town hall-style debate, the second of three such events in this presidential campaign. It was an extremely contentious affair that reverberated across social media.

The political showdown was massively anticipated; the negative atmosphere of the campaign and last week’s news of Trump making lewd comments about women on tape certainly contributed to the fire. Trump further escalated the immense tension by holding a press conference with women who’ve accused former President Bill Clinton of abusing them.

With having a near unprecedented amount of attention and hostility, I wanted to gauge Twitter’s reaction to the event. In this project, I streamed tweets under the hashtag #debate and analyzed them to discover trends in Twitter’s mood and how users were reacting to not just the debate overall but to certain events in the debate.

What techniques will you apply to your tweet data sets?
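
As a starting point, here is a toy lexicon-based sentiment pass over timestamped tweets, in the spirit of George’s mood tracking (the tweets and lexicon are made up; George’s actual pipeline streams live Twitter data):

```python
from collections import defaultdict

# Toy lexicon sentiment scoring over timestamped #debate tweets.
LEXICON = {"great": 1, "win": 1, "love": 1, "bad": -1, "lies": -1, "angry": -1}

tweets = [
    ("21:01", "what a great answer"),
    ("21:01", "more lies, so bad"),
    ("21:02", "angry crowd tonight"),
    ("21:02", "love this, a clear win"),
]

by_minute = defaultdict(list)
for minute, text in tweets:
    # strip trailing punctuation so "lies," matches the lexicon entry "lies"
    score = sum(LEXICON.get(w.strip(",.!?"), 0) for w in text.lower().split())
    by_minute[minute].append(score)

for minute in sorted(by_minute):
    scores = by_minute[minute]
    print(minute, sum(scores) / len(scores))   # average mood per minute
```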

November 5, 2016

Freedom of Speech/Press – Great For “Us” – Not So Much For You (Wikileaks)

Filed under: Free Speech,Politics,Social Media,Wikileaks — Patrick Durusau @ 8:33 pm

The New York Times, sensing a possible defeat of its neo-liberal agenda on November 8, 2016, has loosed the dogs of war on social media in general and Wikileaks in particular.

Consider the sleight of hand in Farhad Manjoo’s How the Internet Is Loosening Our Grip on the Truth, which argues on one hand,


You’re Not Rational

The root of the problem with online news is something that initially sounds great: We have a lot more media to choose from.

In the last 20 years, the internet has overrun your morning paper and evening newscast with a smorgasbord of information sources, from well-funded online magazines to muckraking fact-checkers to the three guys in your country club whose Facebook group claims proof that Hillary Clinton and Donald J. Trump are really the same person.

A wider variety of news sources was supposed to be the bulwark of a rational age — “the marketplace of ideas,” the boosters called it.

But that’s not how any of this works. Psychologists and other social scientists have repeatedly shown that when confronted with diverse information choices, people rarely act like rational, civic-minded automatons. Instead, we are roiled by preconceptions and biases, and we usually do what feels easiest — we gorge on information that confirms our ideas, and we shun what does not.

This dynamic becomes especially problematic in a news landscape of near-infinite choice. Whether navigating Facebook, Google or The New York Times’s smartphone app, you are given ultimate control — if you see something you don’t like, you can easily tap away to something more pleasing. Then we all share what we found with our like-minded social networks, creating closed-off, shoulder-patting circles online.

This gets to the deeper problem: We all tend to filter documentary evidence through our own biases. Researchers have shown that two people with differing points of view can look at the same picture, video or document and come away with strikingly different ideas about what it shows.

You caught the invocation of authority by Manjoo, “researchers have shown,” etc.

But did you notice he never shows his other hand?

If the public is so bat-shit crazy that it takes all social media content as equally trustworthy, what are we to do?

Well, that is the question isn’t it?

Manjoo invokes “dozens of news outlets” who are tirelessly but hopelessly fact checking on our behalf in his conclusion.

The strong implication is that without the help of “media outlets,” you are a bundle of preconceptions and biases doing what feels easiest.

“News outlets,” on the other hand, are free from those limitations.

You bet.

If you thought Manjoo was bad, enjoy seething through Zeynep Tufekci’s claims that Wikileaks is an opponent of privacy, sponsor of censorship and opponent of democracy, all in a little over 1,000 words (1069 exact count). Wikileaks Isn’t Whistleblowing.

It’s a breathtaking piece of half-truths.

For example, playing for your sympathy, Tufekci invokes the need of dissidents for privacy. Even to the point of invoking the ghost of the former Soviet Union.

Tufekci overlooks and hopes you do as well, that these emails weren’t from dissidents, but from people who traded in and on the whims and caprices at the pinnacles of American power.

Perhaps realizing that is too transparent a ploy, she recounts other data dumps by Wikileaks to which she objects. As lawyers say, if the facts are against you, pound on the table.

In an echo of Manjoo, did you know you are too dumb to distinguish critical information from trivial?

Tufekci writes:


These hacks also function as a form of censorship. Once, censorship worked by blocking crucial pieces of information. In this era of information overload, censorship works by drowning us in too much undifferentiated information, crippling our ability to focus. These dumps, combined with the news media’s obsession with campaign trivia and gossip, have resulted in whistle-drowning, rather than whistle-blowing: In a sea of so many whistles blowing so loud, we cannot hear a single one.

I don’t think you are that dumb.

Do you?

But who will save us? You can guess Tufekci’s answer, but here it is in full:


Journalism ethics have to transition from the time of information scarcity to the current realities of information glut and privacy invasion. For example, obsessively reporting on internal campaign discussions about strategy from the (long ago) primary, in the last month of a general election against a different opponent, is not responsible journalism. Out-of-context emails from WikiLeaks have fueled viral misinformation on social media. Journalists should focus on the few important revelations, but also help debunk false misinformation that is proliferating on social media.

If you weren’t frightened into agreement by the end of her parade of horrors:


We can’t shrug off these dangers just because these hackers have, so far, largely made relatively powerful people and groups their targets. Their true target is the health of our democracy.

So now Wikileaks is gunning for democracy?

You bet. 😉

Journalists of my youth, think Vietnam, Watergate, were aggressive critics of government and the powerful. The Panama Papers project is evidence that level of journalism still exists.

Instead of whining about releases by Wikileaks and others, journalists* need to step up and provide context they see as lacking.

It would sure beat the hell out of repeating news releases from military commanders, “justice” department mouthpieces, and official but “unofficial” leaks from the American intelligence community.

* Like any generalization this is grossly unfair to the many journalists who work on behalf of the public everyday but lack the megaphone of the government lapdog New York Times. To those journalists and only them, do I apologize in advance for any offense given. The rest of you, take such offense as is appropriate.

Clinton/Podesta Map (through #30)

Filed under: Data Mining,Hillary Clinton,Politics,Wikileaks — Patrick Durusau @ 7:26 pm

Charlie Grapski created Navigating Wikileaks: A Guide to the Podesta Emails.

podesta-map-grapski-460

The listing takes 365 pages to date, so this is just a tiny sample image.

I don’t have a legend for the row coloring but have tweeted to Charlie about the same.

Enjoy!

November 4, 2016

The U.S. Government And Zero-Day Vulnerabilities: …

Filed under: Cybersecurity,Government,Security — Patrick Durusau @ 8:37 pm

The U.S. Government And Zero-Day Vulnerabilities: From Pre-Heartbleed To Shadow Brokers by Jason Healey. (PDF version)

I have seldom seen this many weasel words used by a non-lawyer, at least in one sentence:

We estimate with moderate confidence that the current U.S. arsenal of zero-day vulnerabilities is probably in the dozens.

In fuller context, followed by more weaseling:


We estimate with moderate confidence that the current U.S. arsenal of zero-day vulnerabilities is probably in the dozens. The arsenal is a function of several factors, an equation through which it is difficult to get much higher than 50 or 60. The factors include how many years the United States has been retaining zero days (at least fifteen), how many are retained per year (dozens before 2014 and single digits since), the average number burned per year (say 50 percent), the average life of a zero day once used (approximately 300 days[39]), the average number of zero days discovered by vendors or used by other actors which thereby renders them useless for the United States (25 percent), and the average half-life of a zero-day vulnerability if not used (approximately 12 months). Note that this count critically depends on the “single digit per year” assessment discussed above. This count does not include battlefield and non-commercial systems, non-U.S. systems (such as the TopSec firewall vulnerabilities in the Shadow Brokers’ release), or U.S. government exploits that utilize vulnerabilities that have already been made public. (emphasis in original)
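
To see how those factors keep the arsenal bounded, here is one crude way to combine them. Healey doesn’t publish his exact model, so the yearly discretization and the acquisition numbers below are my guesses; the only point is that compounding attrition caps the stockpile regardless of the precise rule:

```python
# Back-of-envelope model of the zero-day arsenal estimate quoted above.
# Rates are the article's figures; how to combine them is my assumption.
acquired_per_year = [24] * 9 + [6] * 6   # "dozens" before 2014, "single digits" since (assumed)
burn_rate = 0.50                          # fraction used ("burned") per year
loss_rate = 0.25                          # found by vendors / used by other actors

stock = 0.0
for acquired in acquired_per_year:
    stock += acquired
    stock *= (1 - burn_rate) * (1 - loss_rate)   # annual attrition
print(round(stock, 1))   # stays well below the 50-60 ceiling the article cites
```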

The critical lesson I take from Healey is that sovereigns don’t voluntarily disarm to their disadvantage. Ever.

Reciprocity. Isn’t that when you treat others as they treat you?

Governments that put users at risk have no reasonable expectation of any better treatment from others.

Considering that all of the major breaches of the last 24 months involved no zero-day exploits, you have to wonder who the U.S. government intends to hack that is all that clever?

Hire Fancy Bear to send them a GMail phishing email. 😉

PS: Don’t hire the FBI. It took them two weeks and custom software to sort emails. (Clinton/Abedin/Weiner laptop)

BBC News Could Do Better: Scottish witchcraft book published online

Filed under: BBC,Books — Patrick Durusau @ 7:41 pm

Scottish witchcraft book published online

From the post:

The Names of Witches in Scotland, 1658 collection, was drawn up during a time when the persecution of supposed witches was rife.

The book also lists the towns where the accused lived and notes of confession.

It is believed many were healers, practicing traditional folk medicine.

Some of the notes give small insights into the lives of those accused.

It is recorded that the spouse of Agnes Watsone, from Dumbarton, is “umquhile” (deceased).

A majority of those accused of witchcraft were women although the records reveal that some men were also persecuted.

Jon Gilchreist and Robert Semple, from Dumbarton, are recorded as sailors. A James Lerile of Alloway, Ayr, is noted as “clenged”, in other words cleaned or made clean.

While Mr Lerile’s fate is unclear, the term probably meant banishment or death.

I’m glad BBC News drew attention to this volume but the only links in the post go to a very annoying commercial site that has transcribed the work.

🙁

With very little effort, I can send you to images of the original:

Names of the witches (in Scotland) 1658.

Some readers (cough) may find the commercial service useful. OK, but BBC News should include links to originals, especially when those are sans annoying subscription requests.

Weakly Weaponized Open Data

Filed under: Open Data,Transparency,Weaponized Open Data — Patrick Durusau @ 7:06 pm

Berners-Lee raises spectre of weaponized open data by Bill Camarda.

From the post:

open-data-sabotage-460

Practically everybody loves open data, ie “data that anyone can access, use or share”. And nobody loves it more than Tim Berners-Lee, creator of the World Wide Web, and co-founder of the Open Data Institute (ODI).

Berners-Lee and his ODI colleagues have spent years passionately evangelizing governments and companies to publicly release their non-personal data for use to improve communities.

So when he recently told the Guardian that hackers could use open data to create societal chaos, it might have been this year’s most surprising “man bites dog” news story.

What’s going on here? The growing fear of data sabotage, that’s what.

Bill focuses on the manipulation and/or planting of false data, which could result in massive traffic jams, changes in market prices, etc.

In fact, Berners-Lee says in the original Guardian story:


“If you falsify government data then there are all kinds of ways that you could get financial gain, so yes,” he said, “it’s important that even though people think about open data as not a big security problem, it is from the point of view of being accurate.”

He added: “I suppose it’s not as exciting as personal data for hackers to get into because it’s public.”

Disruptive to some, profitable to others, but what should be called weakly weaponized open data.

Here is one instance of strongly weaponized open data.

Scenario: We Don’t Need No Water, Let The Motherfucker Burn

The United States is currently experiencing a continuing drought. From the U.S. Drought Monitor:

drought-us-460

Keying on the solid red color around Atlanta, GA, Fire Weather, a service of the National Weather Service, estimates the potential impact of fires near Atlanta:

atlanta-fire-weather-prediction-clip

Impacted by a general conflagration around Atlanta:

Population: 2,783,418
Airports: 38
Miles of Interstate: 556
Miles of Rail: 2399
Parks: 4
Area: 27,707 Sq. Miles

Pipelines are missing from the list of impacts. For that, consult the National Pipeline Mapping System where even a public login reveals:

fulton-pipelines-460

The red lines are hazardous liquid pipelines, blue lines are gas transmission pipelines, the yellow lines outline Fulton County.

We have located a likely place for a forest fire, have some details on its probable impact and a rough idea of gas and other pipelines in the prospective burn area.

Oh, we need a source of ignition. Road flares anyone?

wsdot-flares-460-clip

From the WSDOT, Winter Driving Supply Checklist. Emergency kits with flares are available at box stores and online.

Bottom line:

Intentional forest fires can be planned from public data sources. Governments gratuitously suggest non-suspicious methods of transporting forest fire starting materials.

Details I have elided over, such as evacuation routes, fire watch stations, drones as fire starters, fire histories, public events, plus greater detail from the resources cited, are all available from public sources.

What are your Weaponized Open Data risks?

The GCHQ Puzzle Book

Filed under: Books,Cryptography,Law — Patrick Durusau @ 9:20 am

The GCHQ Puzzle Book

The Amazon description:

If 3=T, 4=S, 5=P, 6=H, 7=H … what is 8?

What is the next letter in the sequence: M, V, E, M, J, S, U, ?

Which of the following words is the odd one out: CHAT, COMMENT, ELF, MANGER, PAIN, POUR?

GCHQ is a top-secret intelligence and security agency which recruits some of the very brightest minds. Over the years, their codebreakers have helped keep our country safe, from the Bletchley Park breakthroughs of WWII to the modern-day threat of cyberattack. So it comes as no surprise that, even in their time off, the staff at GCHQ love a good puzzle. Whether they’re recruiting new staff or challenging each other to the toughest Christmas quizzes and treasure hunts imaginable, puzzles are at the heart of what GCHQ does. Now they’re opening up their archives of decades’ worth of codes, puzzles and challenges for everyone to try.
(emphasis in original)
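
The first two teasers fall to a lookup once you spot the pattern. The answers below are my reading, not official GCHQ solutions: the numbers are polygon names, and the letters are the planets in order from the Sun:

```python
# 3=T(riangle), 4=S(quare), 5=P(entagon), 6=H(exagon), 7=H(eptagon), so 8=?
polygons = {3: "Triangle", 4: "Square", 5: "Pentagon", 6: "Hexagon",
            7: "Heptagon", 8: "Octagon"}
print(8, "=", polygons[8][0])                     # O

# M, V, E, M, J, S, U are planetary initials; the next is the eighth planet.
planets = ["Mercury", "Venus", "Earth", "Mars",
           "Jupiter", "Saturn", "Uranus", "Neptune"]
initials = [p[0] for p in planets]
print("next letter:", initials[len("MVEMJSU")])   # N
```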

Hard to say if successful completion of the GCHQ Puzzle Book or hacking into GCHQ would be the better way to introduce yourself to the GCHQ.

Depends on which department within GCHQ captures your interest. 😉

Be aware that some pedestrian agencies and their personnel view intrusion into government computers to be crime and punishable as such.

More sophisticated agencies/personnel realize that “…in Jersey, anything is legal so long as you don’t get caught” and/or if you have something of sufficient value to trade.

The “rule of law,” and “letter of the law” stuff is for groundlings. Don’t be a groundling.

Tracking Mall Shoppers With IMSI Numbers (Legally?)

Filed under: Cybersecurity,Security — Patrick Durusau @ 9:00 am

Tweets from a retailer whose initials are A-M-A-Z-O-N remind me daily there are less than 30 days until Black Friday. (Non-U.S. readers, Black Friday is an attempt to build up a sense of personal worth weakened by the prior day’s association with family members. “I shop, therefore my life has meaning.”)

Build your own IMSI slurping, phone-stalking Stingray-lite box – using bog-standard Wi-Fi by John Leyden.

From the post:

Black Hat EU Wi-Fi networks can tease IMSI numbers out of nearby smartphones, allowing pretty much anyone to wirelessly track and monitor people by their handsets’ fingerprints. (emphasis in original)

See John’s post for the details but if only being able to track people by their cellphones sounds ho-hum, think again.

Mall shoppers are tracked by observers, video, and credit card purchases, but what about tracking their physical locations from entry into the mall, through a full day of shopping, until they exit?

Inexpensively, unless you want to triangulate their precise locations.

Assuming the data is centralized for processing, identification of shoppers who previously visited ****, or who just were at the food court, or even individuals, could be provided in real time.
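
Centralized processing of that kind is a few lines of code. A sketch over hypothetical Wi-Fi sightings (the fingerprints, access-point names and timestamps are all invented):

```python
from collections import defaultdict

# Each record is (device_fingerprint, access_point, timestamp): the raw
# output a network of Wi-Fi sniffers would feed to a central collector.
sightings = [
    ("imsi-001", "entrance", 1000),
    ("imsi-002", "entrance", 1010),
    ("imsi-001", "food-court", 1300),
    ("imsi-001", "exit", 2200),
    ("imsi-002", "food-court", 1350),
]

paths = defaultdict(list)
for device, ap, ts in sightings:
    paths[device].append((ts, ap))

# Sorting by timestamp reconstructs each shopper's path through the mall.
for device, hops in paths.items():
    print(device, [ap for _, ap in sorted(hops)])
```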

So far as I know, Wi-Fi networks are legal in all fifty states of the United States.

The presentation and the slides.

Privacy tip: Leave your smartphone in your car.

November 3, 2016

Understanding the fundamentals of attacks (Theory of Exploitation)

Filed under: Computer Science,Cybersecurity,Security — Patrick Durusau @ 8:31 pm

Understanding the fundamentals of attacks – What is happening when someone writes an exploit? by Halvar Flake / Thomas Dullien.

The common “bag of tricks” for hacking, as Halvar refers to it, does cover all the major data breaches of the last 24 months.

No zero-day exploits.

Certainly none of the deep analysis offered by Halvar here.

Still, you owe it to yourself and your future on one side or the other of computer security, to review these slides and references carefully.

Even though Halvar concludes (in part)

Exploitation is programming emergent weird machines.

It does not require EIP/RIP, and is not a bag of tricks.

Theory of exploitation is still in embryonic stage.

Imagine the advantages of having mastered the art of exploitation theory at its inception.

In an increasingly digital world, you may be worth your own weight in gold. 😉

PS: Specifying the subject identity properties of exploits will assist in organizing them for future use/defense.
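
A rough sketch of what “subject identity properties” buy you: records that share an identifying property (here a made-up CVE id) merge into a single subject, in the topic map sense:

```python
# Merge exploit records by a shared identity property (hypothetical CVE ids).
records = [
    {"id": "CVE-2016-0001", "name": "heap overflow in parser"},
    {"id": "CVE-2016-0001", "targets": ["libfoo 1.2"]},
    {"id": "CVE-2016-0002", "name": "use-after-free"},
]

subjects = {}
for rec in records:
    subject = subjects.setdefault(rec["id"], {})   # same id -> same subject
    for key, value in rec.items():
        if key != "id":
            subject.setdefault(key, value)         # accumulate characteristics

for cve in sorted(subjects):
    print(cve, subjects[cve])
```

Partial observations about the same exploit, gathered at different times, end up as one organized subject rather than scattered notes.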

One expert hacker is like a highly skilled warrior.

Making exploits easy to discover/use by average hackers is like a skilled warrior facing a company of average fighters.

The outcome will be bloody, but never in doubt.

Attn: Secrecy Bed-Wetters! All Five Volumes of Bay of Pigs History Released!

Filed under: FOIA,Government,Government Data,Transparency — Patrick Durusau @ 4:06 pm

Hand-wringers and bed-wetters who use government secrecy to hide incompetence and errors will sleep less easy tonight.

All Five Volumes of Bay of Pigs History Released and Together at Last: FRINFORMSUM 11/3/2016 by Lauren Harper.

From the post:

After more than twenty years, it appears that fear of exposing the Agency’s dirty linen, rather than any significant security information, is what prompts continued denial of requests for release of these records. Although this volume may do nothing to modify that position, hopefully it does put one of the nastiest internal power struggles into proper perspective for the Agency’s own record.” This is according to Agency historian Jack Pfeiffer, author of the CIA’s long-contested Volume V of its official history of the Bay of Pigs invasion that was released after years of work by the National Security Archive to win the volume’s release. Chief CIA Historian David Robarge states in the cover letter announcing the document’s release that the agency is “releasing this draft volume today because recent 2016 changes in the Freedom of Information Act (FOIA) requires us to release some drafts that are responsive to FOIA requests if they are more than 25 years old.” This improvement – codified by the FOIA Improvement Act of 2016 – came directly from the National Security Archive’s years of litigation.

The CIA argued in court for years – backed by Department of Justice lawyers – that the release of this volume would “confuse the public.” National Security Archive Director Tom Blanton says, “Now the public gets to decide for itself how confusing the CIA can be. How many thousands of taxpayer dollars were wasted trying to hide a CIA historian’s opinion that the Bay of Pigs aftermath degenerated into a nasty internal power struggle?”

To read all five volumes of the CIA’s Official History of the Bay of Pigs Operation – together at last – visit the National Security Archive’s website.

Even the CIA’s own retelling of the story, The Bay of Pigs Invasion, ends with a chilling reminder for all “rebels” being presently supported by the United States.


Brigade 2506’s pleas for air and naval support were refused at the highest US Government levels, although several CIA contract pilots dropped munitions and supplies, resulting in the deaths of four of them: Pete Ray, Leo Baker, Riley Shamburger, and Wade Gray.

Kennedy refused to authorize any extension beyond the hour granted. To this day, there has been no resolution as to what caused this discrepancy in timing.

Without direct air support—no artillery and no weapons—and completely outnumbered by Castro’s forces, members of the Brigade either surrendered or returned to the turquoise water from which they had come.

Two American destroyers attempted to move into the Bay of Pigs to evacuate these members, but gunfire from Cuban forces made that impossible.

In the following days, US entities continued to monitor the waters surrounding the bay in search of survivors, with only a handful being rescued. A few members of the Brigade managed to escape and went into hiding, but soon surrendered due to a lack of food and water. When all was said and done, more than seventy-five percent of Brigade 2506 ended up in Cuban prisons.

100% captured or killed. There’s an example of US support.

In a less media savvy time, the US did pay $53 million (in 1962 dollars, about $424 million today) for the release of 1113 members of Brigade 2506.

Another important fact is that fifty-five (55) years of delay enabled the participants to escape censure and/or a trip to the gallows for their misdeeds and crimes.

Let’s not let that happen with the full CIA Torture Report. Even the sanitized 6,700 page version would be useful. More so the documents upon which it was based.

All of that exists somewhere. We lack a person with access and moral courage to inform their fellow citizens of the full truth about the CIA torture program. So far.


Update: Michael Best, NatSecGeek advises CIA Histories has the most complete CIA history collection. Thanks Michael!
