Open-Source R software driving Big Data analytics in government by David Smith.
From the post:
As government agencies and departments expand their capabilities for collecting information, the volume and complexity of digital data stored for public purposes is far outstripping departments’ ability to make sense of it all. Even worse, with data siloed within individual departments and little cross-agency collaboration, untold hours and dollars are being spent on data collection and storage with little return on investment in the form of information-based products and services for the public good.
But that may now be starting to change, with the Obama administration’s Big Data Research and Development Initiative.
In fact, the administration has had a Big Data agenda since its earliest days, with the appointment of Aneesh Chopra as the nation’s first chief technology officer in 2009. (Chopra passed the mantle to current CTO Todd Park in March.) One of Chopra’s first initiatives was the creation of data.gov, a vehicle to make government data and open-source tools available in a timely and accessible format for a community of citizen data scientists to make sense of it all.
For example, independent statistical analysis of data released by data.gov revealed a flaw in the 2000 Census results that apparently went unnoticed by the Census Bureau.
David goes on to give some other examples of the use of R with government data.
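To give a concrete sense of what that kind of citizen data science looks like in practice, here is a minimal sketch (my illustration, not code from David's post) of pulling a dataset published through data.gov into R and running the sort of basic sanity checks that can surface anomalies like the census flaw mentioned above. The URL and the "geo_id" column name are placeholders; substitute a real CSV from the data.gov catalog.

# Minimal sketch: load a public CSV into R and run quick sanity checks.
# The URL below is a placeholder, not a real data.gov dataset.
csv_url <- "https://data.gov/path/to/some-dataset.csv"

# read.csv (base R) can read directly from a URL.
dat <- read.csv(csv_url, stringsAsFactors = FALSE)

# Inspect structure and per-column summaries, including NA counts.
str(dat)
summary(dat)

# Simple checks that can flag anomalies in official figures:
# duplicated or missing key values ("geo_id" is a hypothetical column).
sum(duplicated(dat$geo_id))
sum(is.na(dat$geo_id))

Nothing fancy, but that is roughly where independent re-analysis of released government data starts: get the raw file, check it for internal consistency, and only then move on to modeling.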
The US federal government is diverse enough that its IT solutions will be diverse as well, but R will already be familiar to at least some potential clients.
I first saw this at the Revolutions blog on R.