Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

January 19, 2014

Medicare Spending Data…

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 2:11 pm

Medicare Spending Data May Be Publicly Available Under New Policy by Gavin Baker.

From the post:

On Jan. 14, the Centers for Medicare & Medicaid Services (CMS) announced a new policy that could bring greater transparency to Medicare, one of the largest programs in the federal government. CMS revoked its long-standing policy not to release publicly any information about Medicare’s payments to doctors. Under the new policy, the agency will evaluate requests for such information on a case-by-case basis. Although the impact of the change is not yet clear, it creates an opportunity for a welcome step forward for data transparency and open government.

Medicare’s tremendous size and impact – expending an estimated $551 billion and covering roughly 50 million beneficiaries in 2012 – mean that increased transparency in the program could have big effects. Better access to Medicare spending data could permit consumers to evaluate doctor quality, allow journalists to identify waste or fraud, and encourage providers to improve health care delivery.

Until now, the public hasn’t been able to learn how much Medicare pays to particular medical businesses. In 1979, a court blocked Medicare from releasing such information after doctors fought to keep it secret. However, the court lifted the injunction in May 2013, freeing CMS to consider whether to release the data.

In turn, CMS asked for public comments about what it should do and received more than 130 responses. The Center for Effective Government was among the organizations that filed comments, calling for more transparency in Medicare spending and urging CMS to revoke its previous policy implementing the injunction. After considering those comments, CMS adopted its new policy.

The change may allow the public to examine the reimbursement amounts paid to medical providers under Medicare. Under the new approach, CMS will not release those records wholesale. Instead, the agency will wait for specific requests for the data and then evaluate each to consider if disclosure would invade personal privacy. While information about patients is clearly off-limits, it’s not clear what kind of information about doctors CMS will consider private, so it remains to be seen how much information is ultimately disclosed under the new policy. It should be noted, however, that the U.S. Supreme Court has held that businesses don’t have “personal privacy” under the Freedom of Information Act (FOIA), and the government already discloses the amounts it pays to other government contractors.

The announcement from CMS: Modified Policy on Freedom of Information Act Disclosure of Amounts Paid to Individual Physicians under the Medicare Program

The case-by-case determination of a physician’s privacy rights is an attempt to discourage requests for public information.

If all physician payment data, say by procedure, were available in state-by-state data sets, local residents in a town of 500 would know that 2,000 x-rays a year is on the high side, without ever knowing any patient’s identity.
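To make that concrete, here is a minimal sketch of the outlier check, assuming a hypothetical state-level extract with npi, procedure and claim_count columns (illustrative names, not CMS’s actual schema):

```python
import pandas as pd

# Hypothetical state-level extract: one row per physician per procedure.
# Column names are illustrative, not CMS's actual schema.
df = pd.read_csv("medicare_payments_va.csv")  # npi, procedure, claim_count

xrays = df[df["procedure"] == "x-ray"]

# Flag physicians whose annual volume is far above the statewide norm.
mean, std = xrays["claim_count"].mean(), xrays["claim_count"].std()
outliers = xrays[xrays["claim_count"] > mean + 3 * std]

print(outliers[["npi", "claim_count"]])
```

No patient identities required: aggregate counts per provider are enough to spot the anomalies.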

If you are a U.S. resident, take this opportunity to push for greater transparency in Medicare spending. Be polite and courteous but also be persistent. You need no more reason than an interest in how Medicare is being spent.

Let’s have an FOIA (Freedom of Information Act) request pending for every physician in the United States within 90 days of the CMS rule becoming final.

It’s not final yet, but when it is, let slip the leash on the dogs of FOIA.

January 18, 2014

Pay the Man!

Filed under: Publishing,Transparency — Patrick Durusau @ 11:07 am

Books go online for free in Norway by Martin Chilton.

From the post:

More than 135,000 books still in copyright are going online for free in Norway after an innovative scheme by the National Library ensured that publishers and authors are paid for the project.

The copyright-protected books (including translations of foreign books) have to be published before 2000 and the digitising has to be done with the consent of the copyright holders.

National Library of Norway chief Vigdis Moe Skarstein said the project is the first of its kind to offer free online access to books still under copyright, which in Norway expires 70 years after the author’s death. Books by Stephen King, Ken Follett, John Steinbeck, Jo Nesbø, Karin Fossum and Nobel Laureate Knut Hamsun are among those in the scheme.

The National Library has signed an agreement with Kopinor, an umbrella group representing major authors and publishers through 22 member organisations, and for every digitised page that goes online, the library pays a predetermined sum to Kopinor, which will be responsible for distributing the royalties among its members. The per-page amount was 0.36 Norwegian kroner (four pence), which will decrease to three pence when the online collection reaches its estimated target of 250,000 books.

Norway has discovered a way out of the copyright conundrum, pay the man!

Can you imagine the impact if the United States were to bulk license all of the Springer publications in digital format?

Some immediate consequences:

  1. All citizen-innovators would have access to a vast library of high quality content, without restriction by place of employment or academic status.
  2. Taking over the cost of Springer materials would act as additional funding for libraries with existing subscriptions.
  3. It would even out access to Springer materials across the educational system in the U.S.
  4. It would reduce the administrative burden on both libraries and Springer by consolidating all existing accounts into one account.
  5. Springer could offer “advanced” services in addition to basic search and content for additional fees, leveraged on top of the standard content.
  6. Other vendors could offer “advanced” services for fees leveraged on top of standard content.

I have nothing against the many “open access” journals but bear in mind the vast legacy of science and technology that remains the property of Springer and others.

The principal advantage I would pitch to Springer is that availability of its content under bulk licensing would result in other vendors building services on top of that content.

What advantage is there for Springer? Imagine that you can be either a road (content) or a convenience store (an app built on content) next to the road. Which one gets maintained longer?

Everybody has an interest in maintaining and even expanding the road. By becoming part of the intellectual infrastructure of education, industry and government, even more than it is now, Springer would secure a very stable and lucrative future.

Put that way, I would much rather be the road than the convenience store.

You?

January 14, 2014

Home Invasion by Google

Filed under: Data Integration,Privacy,Transparency — Patrick Durusau @ 2:58 pm

When Google closes the Nest deal, privacy issues for the internet of things will hit the big time by Stacey Higginbotham.

From the post:

Google rocked the smart home market Monday with its intention to purchase connected home thermostat maker Nest for $3.2 billion, which will force a much-needed conversation about data privacy and security for the internet of things.

It’s a conversation that has seemingly stalled as advocates for the connected home expound upon the benefits in convenience, energy efficiency and even the health of people who are collecting and connecting their data and devices together through a variety of gadgets and services. On the other side are hackers and security researchers who warn how easy some of the devices are to exploit — gaining control of data or even video streams about what’s going on in the home.

So far the government, in the form of the Federal Trade Commission, has been reluctant to make rules and is still gathering information. A security researcher told the FTC at a Nov. 19 event that companies should be fined for data breaches, which would encourage companies to design data protection into their products from the beginning. Needless to say, industry representatives were concerned that such an approach would “stifle innovation.” Even at CES an FTC commissioner expressed a similar sentiment — namely that the industry was too young for rules.

Stacey writes a bit further down:

Google’s race to gather data isn’t evil, but it could be a problem

My assumption is that Google intends to use the data it is racing to gather. Google may not know or foresee all the potential uses for the data it collects (sales to the NSA?), but it has been said: “Data is the new oil.” (But see Big Data Is Not the New Oil by Jer Thorp.)

Think of Google as a successful data wildcatter, which in the oil patch resulted in heirs wealthy enough to attempt to corner the world silver market.

Don’t be misled by Jer’s title; he means to decry the c-suite use of a phrase read on a newsstand cover. Later he writes:

Still, there are some ways in which the metaphor might be useful.

Perhaps the “data as oil” idea can foster some much-needed criticality. Our experience with oil has been fraught; fortunes made have been balanced with dwindling resources, bloody mercenary conflicts, and a terrifying climate crisis. If we are indeed making the first steps into economic terrain that will be as transformative (and possibly as risky) as that of the petroleum industry, foresight will be key. We have already seen “data spills” happen (when large amounts of personal data are inadvertently leaked). Will it be much longer until we see dangerous data drilling practices? Or until we start to see long term effects from “data pollution”?

An accurate account of our experience with oil, as far as it goes.

Unlike Jer, I see data continuing to follow the same path as oil, coal, timber, gold, silver, gemstones, etc.

I say continuing because scribes were the original data brokers. And enjoyed a privileged role in society. Printing reduced the power of scribes but new data brokers took their place. Libraries and universities and those they trained had more “data” than others. Specific examples of scientia potentia est (“knowledge is power”) are found in: The Information Master: Jean-Baptiste Colbert‘s Secret State Intelligence System (Louis XIV) and IBM and the Holocaust. (Not to forget the NSA.)

Information, or “data” if you prefer, has always been used to advance some interests and used against others. The electronic storage of data has reduced the cost of using data that was known to exist but was too expensive or inaccessible for use.

Consider marital history. For the most part, with enough manual effort and travel, a person’s marital history has been available for the last couple of centuries. Records are kept of marriages, divorces, etc. But accessing that information wasn’t a few strokes on a keyboard and perhaps an access fee. Same data, different cost of access.

Jer’s proposals, and others I have read, are all premised on people forgoing power, advantage, profit or other benefits from obtaining, analyzing and acting upon data.

I don’t know of any examples in history where that has happened.

Do you?

Access to State Supreme Court Data

Filed under: Government,Law,Law - Sources,Transparency — Patrick Durusau @ 10:10 am

Public access to the states’ highest courts: a report card

The post focuses on the Virginia Supreme Court, not surprisingly, since it comes from the Open Virginia Law project.

But it also mentions Public Access to the States’ Highest Courts: A Report Card (PDF), which is a great summary of public access to state (United States) supreme court data. With hyperlinks to relevant resources.

The report card will definitely be of interest to law students, researchers, librarians, lawyers and even members of the public.

In addition to being a quick synopsis for public policy discussions, it makes a great hand list of state court resources.

An earlier blog post pointed out that the Virginia Supreme Court is now posting audio recordings of oral arguments.

Could be test data for speech recognition and other NLP tasks or used if you are simply short of white noise. 😉

January 12, 2014

Transparency and Bank Failures

Filed under: Finance Services,Open Data,Transparency — Patrick Durusau @ 11:40 am

The Relation Between Bank Resolutions and Information Environment: Evidence from the Auctions for Failed Banks by João Granja.

Abstract:

This study examines the impact of disclosure requirements on the resolution costs of failed banks. Consistent with the hypothesis that disclosure requirements mitigate information asymmetries in the auctions for failed banks, I find that, when failed banks are subject to more comprehensive disclosure requirements, regulators incur lower costs of closing a bank and retain a lower portion of the failed bank’s assets, while bidders that are geographically more distant are more likely to participate in the bidding for the failed bank. The paper provides new insights into the relation between disclosure and the reorganization of a banking system when the regulators’ preferred plan of action is to promote the acquisition of undercapitalized banks by healthy ones. The results suggest that disclosure regulation policy influences the cost of resolution of a bank and, as a result, could be an important factor in the definition of the optimal resolution strategy during a banking crisis event.

A reminder that transparency needs to be broader than open data in science and government.

In the case of bank failures, transparency lowers the cost of such failures for the public.

Some interests profit from less transparency in bank failures and other interests (like the public) profit from greater transparency.

If bank failure doesn’t sound like a current problem, consider Map of Banks Failed since 2008. (Select from Failed Banks Map (under Quick Links) to display the maps.) U.S. only. Do you know of a similar map for other countries?

Speaking of transparency, it would be interesting to track the formal, financial and social relationships of those acquiring failed bank assets.

You know, the ones that are selling for less than fair market value due to a lack of transparency.

December 21, 2013

Google Transparency Report

Filed under: Marketing,Search Behavior,Search Data,Search History,Transparency — Patrick Durusau @ 5:32 pm

Google Transparency Report

The Google Transparency Report consists of five parts:

  1. Government requests to remove content

    A list of the number of requests we receive from governments to review or remove content from Google products.

  2. Requests for information about our users

    A list of the number of requests we received from governments to hand over user data and account information.

  3. Requests by copyright owners to remove search results

    Detailed information on requests by copyright owners or their representatives to remove web pages from Google search results.

  4. Google product traffic

    The real-time availability of Google products around the world, historic traffic patterns since 2008, and a historic archive of disruptions to Google products.

  5. Safe Browsing

    Statistics on how many malware and phishing websites we detect per week, how many users we warn, and which networks around the world host malware sites.

I pointed out the visualizations of the copyright holder data earlier today.

There are a number of visualizations of the Google Transparency Report and I may assemble some of the more interesting ones for your viewing pleasure.

You certainly should download the data sets and/or view them as Google Docs Spreadsheets.

I say that because while Google is more “transparent” than the current White House, it’s not all that transparent.

Take the government takedown requests, for example.

According to the raw data file, the United States has made five (5) requests on the basis of national security, four (4) of which were for YouTube videos and one (1) was for one web search result.

Really?
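If you want to check that figure yourself, the raw removal-requests file can be tallied in a few lines. A sketch, assuming the CSV uses Country, Reason and Product columns (assumptions to verify against the file you actually download):

```python
import pandas as pd

# Column names are assumptions; inspect the downloaded file first.
removals = pd.read_csv("government-removal-requests.csv")

us_natsec = removals[(removals["Country"] == "United States")
                     & (removals["Reason"] == "National Security")]

print(len(us_natsec))                       # total requests
print(us_natsec["Product"].value_counts())  # e.g. YouTube vs. Web Search
```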

And for no government request is there sufficient information to identify what the government sought to conceal.

Google may have qualms about information governments want to conceal but that sounds like a marketing opportunity to me. (Being mindful of your availability to such governments.)

June 3, 2013

CIA, Solicitation and Government Transparency

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 8:26 am

IBM battles Amazon over $600M CIA cloud deal by Frank Konkel, reports that IBM has protested a contract award for cloud computing by the CIA to Amazon.

The “new age” of government transparency looks a lot like the old age in that:

  • How Amazon obtained the award is not public.
  • The nature of the cloud to be built by Amazon is not public.
  • Whether Amazon has started construction on the proposed cloud is not public.
  • The basis for the protest by IBM is not public.

“Not public” means opportunities for incompetence in contract drafting and/or fraud by contractors.

How are members of the public or less well-heeled potential bidders supposed to participate in this discussion?

Or should I say “meaningfully participate” in the discussion over the cloud computing award to Amazon?

And what if others know the terms of the contract? CIA CTO Gus Hunt is reported as saying:

It is very nearly within our grasp to be able to compute on all human generated information,

If the proposed system is supposed to “compute on all human generated information,” so what?

How does knowing that aid any alleged enemies of the United States?

Other than the comfort that the U.S. makes bad technology decisions?

Keeping the content of such a system secret might disadvantage enemies of the U.S.

Keeping the contract for such a system secret disadvantages the public and other contractors.

Yes?

June 2, 2013

White House Releases New Tools… [Bank Robber’s Defense]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 9:27 am

White House Releases New Tools For Digital Strategy Anniversary by Caitlin Fairchild.

From the post:

The White House marked the one-year anniversary of its digital government strategy Thursday with a slate of new releases, including a catalog of government APIs, a toolkit for developing government mobile apps and a new framework for ensuring the security of government mobile devices.

Those releases correspond with three main goals for the digital strategy: make more information available to the public; serve customers better; and improve the security of federal computing.

Just scanning down the API list, it is a very mixed bag.

For example, there are four hundred and ten (410) individual APIs listed; the National Library of Medicine has twenty-four (24) and the U.S. Senate has one (1).
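Counting APIs per agency is one quick way to see just how mixed the bag is. A sketch, assuming you have exported the catalog to a CSV with agency and api_name fields (hypothetical names; the actual catalog format may differ):

```python
import csv
from collections import Counter

# Hypothetical export of the API catalog; field names are assumptions.
with open("api_catalog.csv", newline="") as f:
    counts = Counter(row["agency"] for row in csv.DictReader(f))

for agency, n in counts.most_common():
    print(f"{agency}: {n}")
```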

Defenders of this release will say we should not talk about the lack of prior efforts but focus on what’s coming.

I call that the bank robber’s defense.

All prosecutors want to talk about is what a bank robber did in the past. They never want to focus on the future.

Bank robbers would love to have the “let’s talk about tomorrow” defense.

As far as I know, it isn’t allowed anywhere.

Question: Why do we allow avoidance of responsibility with the “let’s talk about tomorrow” defense for government and others?

If you review the APIs for semantic diversity I would appreciate a pointer to your paper/post.

May 25, 2013

Deepbills

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 5:56 pm

Cato’s “Deepbills” Project Advances Government Transparency by Jim Harper.

From the post:

But there’s no sense in sitting around waiting for things to improve. Given the incentives, transparency is something that we will have to force on government. We won’t receive it like a gift.

So with software we acquired and modified for the purpose, we’ve been adding data to the bills in Congress, making it possible to learn automatically more of what they do. The bills published by the Government Printing Office have data about who introduced them and the committees to which they were referred. We are adding data that reflects:

– What agencies and bureaus the bills in Congress affect;

– What laws the bills in Congress affect: by popular name, U.S. Code section, Statutes at Large citation, and more;

– What budget authorities bills include, the amount of this proposed spending, its purpose, and the fiscal year(s).

We are capturing proposed new bureaus and programs, proposed new sections of existing law, and other subtleties in legislation. Our “Deepbills” project is documented at cato.org/resources/data.

This data can tell a more complete story of what is happening in Congress. Given the right Web site, app, or information service, you will be able to tell who proposed to spend your taxpayer dollars and in what amounts. You’ll be able to tell how your member of Congress and senators voted on each one. You might even find out about votes you care about before they happen!

Two important points:

First, transparency must be forced upon government (I would add businesses).

Second, transparency is up to us.

Do you know something the rest of us should know?

On your mark!

Get set!

Go!

I first saw this at: Harper: Cato’s “Deepbills” Project Advances Government Transparency.

May 21, 2013

Searching on BillTrack50

Filed under: Government,Law,Transparency — Patrick Durusau @ 6:58 am

How to find what you are looking for – constructing a search on BillTrack50 by Karen Suhaka.

From the post:

Building a search on BillTrack50 is fairly straightforward, however it isn’t exactly like doing a Google search. So there’s a few things you need to keep in mind, which I’ll explain in this post. There’s also a few tips and tricks advanced users might find useful. Any bills that are introduced later and meet your search terms will be automatically added to your bill sheet (if you made a bill sheet).

Tracking “thumb on the scale” (TOTS) at the state level? BillTrack50 is a great starting point.

BillTrack50 provides surface facts, to which you can add vote trading, influence peddling and other routine legislative activities.

May 20, 2013

U.S. Senate Panel Discovers Nowhere Man [Apple As Tax Dodger]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 4:47 pm

Forty-seven years after Nowhere Man by the Beatles, a U.S. Senate panel discovers several nowhere men.

A Wall Street Journal Technology Alert:

Apple has set up corporate structures that have allowed it to pay little or no corporate tax – in any country – on much of its overseas income, according to the findings of a U.S. Senate examination.

The unusual result is possible because the iPhone maker’s key foreign subsidiaries argue they are residents of nowhere, according to the investigators’ report, which will be discussed at a hearing Tuesday where Apple CEO Tim Cook will testify. The finding comes from a lengthy investigation into the technology giant’s tax practices by the Senate Permanent Subcommittee on Investigations, led by Sens. Carl Levin (D., Mich.) and John McCain (R., Ariz.).

In additional coverage, Apple says:

Apple’s testimony also includes a call to overhaul: “Apple welcomes an objective examination of the US corporate tax system, which has not kept pace with the advent of the digital age and the rapidly changing global economy.”

Tax reform will be useful only if it is “transparent” tax reform.

Transparent tax reform means every provision with more than a $100,000 impact on any taxpayer names all the taxpayers impacted. Whether more or less taxes.

We have the data; we need the will to apply the analysis.

A tax-impact topic map anyone?

March 31, 2013

The new analytic stack…

Filed under: Analytics,Transparency — Patrick Durusau @ 5:00 am

The new analytic stack is all about management, transparency and users by George Mathew.

On transparency:

Predictive analytics are essential for data-driven leaders to craft their next best decision. There are a variety of techniques across the predictive and statistical spectrums that help businesses better understand the not too distant future. Today’s biggest challenge for predictive analytics is that it is delivered in a very black-box fashion. As business leaders rely more on predictive techniques to make great data-driven decisions, there needs to be much more of a clear-box approach.

Analytics need to be packaged with self-description of data lineage, derivation of how calculations were made and an explanation of the underlying math behind any embedded algorithms. This is where I think analytics need to shift in the coming years; quickly moving away from black-box capabilities, while deliberately putting decision makers back in the driver’s seat. That’s not just about analytic output, but how it was designed, its underlying fidelity and its inherent lineage — so that trusting in analytics isn’t an act of faith.

Now there’s an opportunity for topic maps.

Data lineage, derivations, math, etc. all have their own “logics” and the “logic” of how they are assembled for a particular use.

We could debate how to formalize those logics and might eventually reach agreement, years after the need has passed.

Or, you could use a topic map to declare the subjects and relationships important for your analytics today.

And merge them with the logics you devise for tomorrow’s analytics.
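To make the “clear-box” idea concrete, here is a toy sketch of a value that carries its own derivation history, so trusting the number isn’t an act of faith. This is the shape of the idea, not any vendor’s API:

```python
from dataclasses import dataclass, field

@dataclass
class TracedValue:
    """A computed value that carries its own derivation history."""
    value: float
    lineage: list = field(default_factory=list)

    def apply(self, label, fn):
        # Record what was done, not just the result.
        out = fn(self.value)
        return TracedValue(out, self.lineage + [f"{label}: {self.value} -> {out}"])

score = (TracedValue(120_000.0, ["source: Q3 sales extract"])
         .apply("seasonal adjustment", lambda v: v * 0.92)
         .apply("rescale to unit range", lambda v: v / 120_000.0))

print(score.value)
for step in score.lineage:
    print(step)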

March 24, 2013

Lobbyists 2012: Out of the Game or Under the Radar?

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 10:35 am

Lobbyists 2012: Out of the Game or Under the Radar?

Executive Summary:

Over the past several years, both spending on lobbying and the number of active lobbyists has declined. A number of factors may be responsible, including the lackluster economy, a gridlocked Congress and changes in lobbying rules.

CRP finds that the biggest players in the influence game — lobbying clients across nearly all sectors — increased spending over the last five years. The top 100 lobbying firms’ income declined only 6 percent between 2007 and 2012, but the number of registered lobbyists dropped by 25 percent.

The more precipitous drop in the number of lobbyists is likely due to changes in the rules. More than 46 percent of lobbyists who were active in 2011 but not in 2012 continue to work for the same employers, suggesting that many have simply avoided the reporting limits while still contributing to lobbying efforts.

Whatever the cause, it is important to understand whether the same activity continues apace with less disclosure and to strengthen the disclosure regimen to ensure that it is clear, enforceable — and enforced. If there is a general sense that the rules don’t matter, there could be erosion to disclosure and a sense that this is an “honor system” that isn’t being honored any longer. This is important because, if people who are in fact lobbying do not register, citizens will be unable to understand the forces at work in shaping federal policy, and therefore can’t effectively participate in policy debates and counter proposals that are not in their interest. At a minimum, the Center for Responsive Politics will continue to aggregate, publish and scrutinize the data that is being reported, in order to explain trends in disclosure — or its omission.

A caution on relying on public records/disclosure for topic maps of political influence.

You can see the full report here.

My surprise was the discovery that:

[the] “honor system” that isn’t being honored any longer.

Lobbying for private advantage at public expense is contrary to any notion of “honor.”

Why the surprise that lobbyists are dishonorable? (However faithful they may be to their employers. Once bought, they stay bought.)

I first saw this at Full Text Reports.

March 21, 2013

Should Business Data Have An Audit Trail?

Filed under: Auditing,Business Intelligence,Datomic,Transparency — Patrick Durusau @ 11:19 am

The “second slide” I would lead with from Stuart Halloway’s Datomic, and How We Built It would be:

Should Business Data Have An Audit Trail?

Actually Stuart’s slide #65 but who’s counting? 😉

Stuart points out the irony of git, saying:

developer data is important enough to have an audit trail, but business data is not

Whether business data should always have an audit trail would attract shouts of yes and no, depending on the audience.

Regulators, prosecutors, good government types, etc., mostly shouting yes.

Regulated businesses, security brokers, elected officials, etc., mostly shouting no.

Some in between.

Datomic, which has some common characteristics with topic maps, gives you the ability to answer these questions:

  • Do you want auditable business data or not?
  • If yes to auditable business data, to what degree?

Rather different than just assuming it isn’t possible.
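The underlying idea, facts accumulated with transaction identifiers rather than overwritten in place, is easy to sketch. A toy illustration in Python (not Datomic’s actual API):

```python
import itertools

class FactStore:
    """Append-only (entity, attribute, value, tx) facts; nothing is overwritten."""
    def __init__(self):
        self.facts = []
        self._tx = itertools.count(1)

    def assert_fact(self, entity, attribute, value):
        self.facts.append((entity, attribute, value, next(self._tx)))

    def as_of(self, entity, attribute, tx):
        # Latest value at or before transaction tx -- the audit trail query.
        hits = [f for f in self.facts
                if f[0] == entity and f[1] == attribute and f[3] <= tx]
        return hits[-1][2] if hits else None

db = FactStore()
db.assert_fact("invoice-42", "amount", 1000)   # tx 1
db.assert_fact("invoice-42", "amount", 1250)   # tx 2: a "correction"

print(db.as_of("invoice-42", "amount", 1))  # 1000 -- the history survives
print(db.as_of("invoice-42", "amount", 2))  # 1250
```

Because assertions accumulate instead of replacing each other, the audit trail is a query, not an afterthought.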

Abstract:

Datomic is a database of flexible, time-based facts, supporting queries and joins, with elastic scalability and ACID transactions. Datomic queries run in your application process, giving you both declarative and navigational access to your data. Datomic facts (“datoms”) are time-aware and distributed to all system peers, enabling OLTP, analytics, and detailed auditing in real time from a single system.

In this talk, I will begin with an overview of Datomic, covering the problems that it is intended to solve and how its data model, transaction model, query model, and deployment model work together to solve those problems. I will then use Datomic to illustrate more general points about designing and implementing production software, and where I believe our industry is headed. Key points include:

  • the pragmatic adoption of functional programming
  • how dynamic languages fare in mission- and performance-critical settings
  • the importance of data, and the perils of OO
  • the irony of git, or why developers give themselves better databases than they give their customers
  • perception, coordination, and reducing the barriers to scale

Resources

  • Video from CME Group Technology Conference 2012
  • Slides from CME Group Technology Conference 2012

March 10, 2013

Using and abusing evidence

Filed under: Government,Medical Informatics,Transparency — Patrick Durusau @ 3:14 pm

New thematic series: Using and abusing evidence by Adrian Aldcroft.

From the post:

Scientific evidence plays an important role in guiding medical laws and policies, but how evidence is represented, and often misrepresented, warrants careful consideration. A new cross-journal thematic series headed by Genome Medicine, Using and abusing evidence in science and health policy, explores the application of evidence in healthcare law and policy in an attempt to uncover how evidence from research is translated into the public sphere. Other journals involved in the series include BMC Medical Ethics, BMC Public Health, BMC Medical Genomics, BMC Psychiatry, and BMC Medicine.

Articles already published include an argument for reframing the obesity epidemic through the use of the term caloric overconsumption, an examination of bioethics in popular science literature, and a look at the gap between reality and public perception when discussing the potential of stem cell therapies. Other published articles look at the quality of informed consent in pediatric research and evidence for genetic discrimination in the life insurance industry. More articles will be added to the series as they are published.

Articles published in this series were invited from delegates at the meeting “Using and Abusing Evidence in Science and Health Policy” held in Banff, Alberta, on May 30th-June 1st, 2012. We hope the publication of the article collection will contribute to the understanding of the ethical and political implications associated with the application of evidence in research and politics.

A useful series but I wonder how effective the identification of “abuse” of evidence will be without identifying its abusers?

And making the case for “abuse” of evidence in a compelling manner?

For example, changing “obesity” to “caloric overconsumption” (Addressing the policy cacophony does not require more evidence: an argument for reframing obesity as caloric overconsumption), carries the day if and only if one presumes a regulatory environment with the goal of improving public health.

The near toxic levels of high fructose corn syrup in the average American diet demonstrate the goals of food regulation in the United States have little to do with public health and welfare.

Identification of who makes such policies, who benefits and who is harmed, in the obesity case and in others, could go a long way towards creating a different regulatory environment.

March 6, 2013

State Sequester Numbers [Is This Transparency?]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 7:22 pm

A great visualization of the impact of sequestration state by state.

And, a post on the process followed to produce the visualization.

The only caveat being that one person read the numbers from PDF files supplied by the White House and another person typed them into a spreadsheet.

Doable with a small data set such as this one, but why was it necessary at all?
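For what it’s worth, the retyping step can usually be automated. A sketch using the pdfplumber library, assuming the White House PDFs contain selectable text tables rather than scanned images (file names hypothetical):

```python
import csv
import pdfplumber

# Assumes the PDF has selectable text tables, not scanned images.
with pdfplumber.open("sequester_state_impacts.pdf") as pdf, \
        open("sequester.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for page in pdf.pages:
        for table in page.extract_tables():
            writer.writerows(table)
```

Releasing the numbers as machine-readable data in the first place would, of course, make even this unnecessary.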

Once you have the data in a machine readable form, putting faces in the local community to the abstract categories should be the next step.

Topic maps anyone?

March 5, 2013

Transparency and the Digital Oil Drop

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 3:52 pm

I left off yesterday pointing out three critical failures in the Digital Accountability and Transparency Act (DATA Act).

Those failures were:

  • Undefined goals with unrealistic deadlines.
  • Lack of incentives for performance.
  • Lack of funding for assigned duties.

Digital Accountability and Transparency Act (DATA Act) [DOA]

Make no mistake, I think transparency, particularly in government spending, is very important.

Important enough that proposals for transparency should take it seriously.

In broad strokes, here is my alternative to the Digital Accountability and Transparency Act (DATA Act) proposal:

  • Ask the GAO, the federal agency with the most experience auditing other federal agencies, to prepare an estimate for:
    Cost/Time for preparing a program internal to the GAO to produce mappings of agency financial records to a common report form (see the sketch after this list).
    • Cost/Time to train GAO personnel on the mapping protocol.
    • Cost/Time for additional GAO staff for the creation of the mapping protocol and permanent GAO staff as liaisons with particular agencies.
    • Recommendations for incentives to promote assistance from agencies.
  • Upon approval and funding of the GAO proposal, which should include at least two federal agencies as test cases, that:
    • Test case agencies are granted additional funding for training and staff to cooperate with the GAO mapping team.
    • Test case agencies are granted additional funding for training and staff to produce reports as specified by the GAO.
    • Staff in test case agencies are granted incentives to assist in the initial mapping effort and maintenance of the same. (Positive incentives.)
  • The program of mapping accounts expands no more often than every two to three years, and only if prior agencies have achieved and remain in conformance.
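To make “mappings of agency financial records to a common report form” concrete, here is a toy sketch in which each agency declares a field mapping into one shared schema. Every agency layout and field name below is hypothetical:

```python
# Toy illustration: per-agency field mappings into one common report form.
# Agency record layouts and field names are hypothetical.
COMMON_FIELDS = ("agency", "program", "fiscal_year", "obligated_usd")

MAPPINGS = {
    "treasury": {"prog_name": "program", "fy": "fiscal_year", "oblig": "obligated_usd"},
    "interior": {"activity": "program", "year": "fiscal_year", "amount": "obligated_usd"},
}

def to_common_form(agency, record):
    mapped = {common: record[src] for src, common in MAPPINGS[agency].items()}
    mapped["agency"] = agency
    return {k: mapped.get(k) for k in COMMON_FIELDS}

print(to_common_form("treasury", {"prog_name": "Debt Mgmt", "fy": 2013, "oblig": 1_200_000}))
print(to_common_form("interior", {"activity": "Land Survey", "year": 2013, "amount": 80_000}))
```

The point of the sketch is that the mapping, not the agencies’ internal systems, is what the GAO team would build and maintain.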

Some critical differences between my sketch of a proposal and the Digital Accountability and Transparency Act (DATA Act):

  1. Additional responsibilities and requirements will be funded for agencies, including additional training and personnel.
  2. Agency staff will have incentives to learn the new skills and procedures necessary for exporting their data as required by the GAO.
  3. Instead of trying to swallow the Federal whale, the project proceeds incrementally and with demonstrable results.

Topic maps can play an important role in such a project but we should be mindful that projects rarely succeed or fail because of technology.

Projects fail because, like the DATA Act, they ignore basic human needs, experience in similar situations (9/11), and substitute abuse for legitimate incentives.

March 4, 2013

Digital Accountability and Transparency Act (DATA Act) [DOA]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 5:44 pm

I started this series of posts in: Digital Accountability and Transparency Act (DATA Act) [The Details], where I concluded the DATA Act had the following characteristics:

  • Secretary of the Treasury has one (1) year to design a common data format for unknown financial data in Federal agencies.
  • Federal agencies have one (1) year to comply with the common data format from the Secretary of the Treasury.
  • No penalties or bonuses for the Secretary of the Treasury.
  • No penalties or bonuses for Federal agencies failing to comply.
  • No funding for the Secretary of the Treasury to carry out the assigned duties.
  • No funding for Federal agencies to carry out the assigned duties.

As written, the Digital Accountability and Transparency Act (DATA Act) will be DOA (Dead On Arrival) in the current or any future session of Congress.

There are three (3) main reasons why that is the case.

A Common Data Format

Let me ask a dumb question: Do you remember 9/11?

Of course you do. And the United States has been in a state of war on terrorism ever since.

I point that out because intelligence sharing (read common data format) was identified as a reason why the 9/11 attacks weren’t stopped and has been a high priority to solve since then.

Think about that: Reason why the attacks weren’t stopped and a high priority to correct.

This next September 11th will be the twelfth anniversary of those attacks.

Progress on intelligence sharing: Progress Made and Challenges Remaining in Sharing Terrorism-Related Information which I gloss in Read’em and Weep, along with numerous other GAO reports on intelligence sharing.

The good news is that we are less than five (5) years away from some unknown level of intelligence sharing.

The bad news is that puts us sixteen (16) years after 9/11 with some unknown level of intelligence sharing.

And that is for a subset of the entire Federal government.

A smaller set than will be addressed by the Secretary of the Treasury.

Common data format in a year? Really?

To say nothing of the likelihood of agencies changing the multitude of systems they have in place in a year.

No penalties or bonuses

You can think of this as the proverbial carrot and stick if you like.

What incentive does either the Secretary of the Treasury and/or Federal agencies have to engage in this fool’s errand pursuing a common data format?

In case you have forgotten, both the Secretary of the Treasury and Federal agencies have obligations under their existing missions.

Missions which they are designed by legislation and habit to discharge before they turn to additional reporting duties.

And what happens if they discharge their primary mission but don’t do the reporting?

Oh, they get reported to Congress. And ranked in public.

As Ben Stein would say, “Wow.”

No Funding

To add insult to injury, there is no additional funding for either the Secretary of the Treasury or Federal agencies to engage in any of the activities specified by the Digital Accountability and Transparency Act (DATA Act).

As I noted above, the Secretary of the Treasury and Federal agencies already have full plates with their current missions.

Now they are to be asked to undertake unfamiliar tasks, creation of a chimerical “common data format” and submitting reports based upon it.

Without any additional staff, training, or other resources.

Directives without resources to fulfill them are directives that are going to fail. (full stop)

Tentative Conclusion

If you are asking yourself, “Why would anyone advocate the Digital Accountability and Transparency Act (DATA Act)?”, five points for your house!

I don’t know of anyone who understands:

  1. the complexity of Federal data,
  2. the need for incentives,
  3. the need for resources to perform required tasks,

who thinks the Digital Accountability and Transparency Act (DATA Act) is viable.

Why advocate non-viable legislation?

Its non-viability makes it an attractive fundraising mechanism.

Advocates can email, fund raise, telethon, rant, etc., to their heart’s content.

Advocating non-viable transparency lines an organization’s pocket at no risk of losing its rationale for existence.


The third post in this series, suggesting a viable way forward, will appear tomorrow under: Transparency and the Digital Oil Drop.

Digital Accountability and Transparency Act (DATA Act) [The Details]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 5:42 pm

The Data Transparency Coalition, the Sunlight Foundation and others are calling for reintroduction of the Digital Accountability and Transparency Act (DATA Act) in order to make U.S. government spending more transparent.

Transparency in government spending is essential for an informed electorate. An electorate that can call attention to spending that is inconsistent with policies voted for by the electorate. Accountability as it were.

But saying “transparency” is easy. Achieving transparency, not so easy.

Let’s look at some of the details in the DATA Act.

(2) DATA STANDARDS-

    ‘(A) IN GENERAL- The Secretary of the Treasury, in consultation with the Director of the Office of Management and Budget, the General Services Administration, and the heads of Federal agencies, shall establish Government-wide financial data standards for Federal funds, which may–

      ‘(i) include common data elements, such as codes, unique award identifiers, and fields, for financial and payment information required to be reported by Federal agencies;

      ‘(ii) to the extent reasonable and practicable, ensure interoperability and incorporate–

        ‘(I) common data elements developed and maintained by an international voluntary consensus standards body, as defined by the Office of Management and Budget, such as the International Organization for Standardization;

        ‘(II) common data elements developed and maintained by Federal agencies with authority over contracting and financial assistance, such as the Federal Acquisition Regulatory Council; and

        ‘(III) common data elements developed and maintained by accounting standards organizations; and

      ‘(iii) include data reporting standards that, to the extent reasonable and practicable–

        ‘(I) incorporate a widely accepted, nonproprietary, searchable, platform-independent computer-readable format;

        ‘(II) be consistent with and implement applicable accounting principles;

        ‘(III) be capable of being continually upgraded as necessary; and

        ‘(IV) incorporate nonproprietary standards in effect on the date of enactment of the Digital Accountability and Transparency Act of 2012.

    ‘(B) DEADLINES-

      ‘(i) GUIDANCE- The Secretary of the Treasury, in consultation with the Director of the Office of Management and Budget, shall issue guidance on the data standards established under subparagraph (A) to Federal agencies not later than 1 year after the date of enactment of the Digital Accountability and Transparency Act of 2012.

      ‘(ii) AGENCIES- Not later than 1 year after the date on which the guidance under clause (i) is issued, each Federal agency shall collect, report, and maintain data in accordance with the data standards established under subparagraph (A).

OK, I have a confession to make: I was a lawyer for ten years and reading this sort of thing is second nature to me. Haven’t practiced law in decades but I still read legal stuff for entertainment. 😉

First, read section A and write down the types of data you would have to collect for each of those items.

Don’t list the agencies/organizations you would have to contact, you probably don’t have enough paper in your office for that task.

Second, read section B and notice that the Secretary of the Treasury has one (1) year to issue guidance for all the data you listed under Section A.

That means gathering, analyzing, testing and designing a standard for all that data, most of which is unknown. Even to the GAO.

And, if they meet that one (1) year deadline, the various agencies have only one (1) year to comply with the guidance from the Secretary of the Treasury.

Do I need to comment on the likelihood of success?

As far as the Secretary of the Treasury, what happens if they don’t meet the one year deadline? Do you see any penalties?

Assuming some guidance emerges, what happens to any Federal agency that does not comply? Any penalties for failure? Any incentives to comply?

My reading is:

  • Secretary of the Treasury has one (1) year to design a common data format for unknown financial data in Federal agencies.
  • Federal agencies have one (1) year to comply with the common data format from the Secretary of the Treasury.
  • No penalties or bonuses for the Secretary of the Treasury.
  • No penalties or bonuses for Federal agencies failing to comply.
  • No funding for the Secretary of the Treasury to carry out the assigned duties.
  • No funding for Federal agencies to carry out the assigned duties.

Do you disagree with that reading of the Digital Accountability and Transparency Act (DATA Act)?

My analysis of that starting point appears in Digital Accountability and Transparency Act (DATA Act) [DOA].

February 28, 2013

From President Obama, The Opaque

Filed under: Government,Government Data,Open Data,Open Government,Transparency — Patrick Durusau @ 5:26 pm

Leaked BLM Draft May Hinder Public Access to Chemical Information

From the post:

On Feb. 8, EnergyWire released a leaked draft proposal from the U.S. Department of the Interior’s Bureau of Land Management on natural gas drilling and extraction on federal public lands. If finalized, the proposal could greatly reduce the public’s ability to protect our resources and communities. The new draft indicates a disappointing capitulation to industry recommendations.

The draft rule affects oil and natural gas drilling operations on the 700 million acres of public land administered by BLM, plus 56 million acres of Indian lands. This includes national forests, which are the sources of drinking water for tens of millions of Americans, national wildlife refuges, and national parks, which are widely used for recreation.

The Department of the Interior estimates that 90 percent of the 3,400 wells drilled each year on public and Indian lands use natural gas fracking, a process that pumps large amounts of water, sand, and toxic chemicals into gas wells at very high pressure to cause fissures in shale rock that contains methane gas. Fracking fluid is known to contain benzene (which causes cancer), toluene, and other harmful chemicals. Studies link fracking-related activities to contaminated groundwater, air pollution, and health problems in animals and humans.

If the leaked draft is finalized, the changes in chemical disclosure requirements would represent a major concession to the oil and gas industry. The rule would allow drilling companies to report the chemicals used in fracking to an industry-funded website, called FracFocus.org. Though the move by the federal government to require online disclosure is encouraging, the choice of FracFocus as the vehicle is problematic for many reasons.

First, the site is not subject to federal laws or oversight. The site is managed by the Ground Water Protection Council (GWPC) and the Interstate Oil and Gas Compact Commission (IOGCC), nonprofit intergovernmental organizations comprised of state agencies that promote oil and gas development. However, the site is paid for by the American Petroleum Institute and America’s Natural Gas Alliance, industry associations that represent the interests of member companies.

BLM would have little to no authority to ensure the quality and accuracy of the data reported directly to such a third-party website. Additionally, the data will not be accessible through the Freedom of Information Act since BLM is not collecting the information. The IOGCC has already declared that it is not subject to federal or state open records laws, despite its role in collecting government-mandated data.

Second, FracFocus.org makes it difficult for the public to use the data on wells and chemicals. The leaked BLM proposal fails to include any provisions to ensure minimum functionality on searching, sorting, downloading, or other mechanisms to make complex data more usable. Currently, the site only allows users to download PDF files of reports on fracked wells, which makes it very difficult to analyze data in a region or track chemical use. Despite some plans to improve searching on FracFocus.org, the oil and gas industry opposes making chemical data easier to download or evaluate for fear that the public “might misinterpret it or use it for political purposes.”

Don’t you feel safer? Knowing the oil and gas industry is working so hard to protect you from misinterpreting data?

Why the government is helping the oil and gas industry protect us from data I cannot say.

I mention this as an example of testing for “transparency.”

Anything the government freely makes available with spreadsheet capabilities isn’t transparency. It’s distraction.

Any data that the government tries to hide has potential value.

The Center for Effective Government points out these are draft rules and when published, you need to comment.

Not a bad plan but not very reassuring given the current record of President Obama, the Opaque.

Alternatives? Suggestions for how data mining could expose those who own floors of the BLM, who drill the wells, etc?

February 26, 2013

EU Commission – Open Data Portal Open

Filed under: EU,Government,Government Data,Transparency — Patrick Durusau @ 1:53 pm

EU Commission – Open Data Portal Open

From the post:

The European Union Commission has unveiled a new Open Data Portal, with over 5,580 data sets – the majority of which comes from the Eurostat (the statistical office of the European Union). The portal is the result of the Commission’s ‘Open Data Strategy for Europe’, and will publish data from the European Commission and other bodies of the European Union; it already holds data from the European Environment Agency.

The portal has a SPARQL endpoint to provide linked data, and will also feature applications that use this data. The published data can be downloaded by everyone interested to facilitate reuse, linking and the creation of innovative services. This shows the commitment of the Commission to the principles of openness and transparency.

For more information https://ec.europa.eu/digital-agenda/en/blog/eu-open-data-portal-here.
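A SPARQL endpoint means the catalog itself can be queried programmatically. A minimal sketch follows; the endpoint URL and the vocabulary are assumptions to check against the portal’s documentation:

```python
import requests

# Endpoint URL and query vocabulary are assumptions -- check the portal docs.
ENDPOINT = "http://open-data.europa.eu/sparql"

QUERY = """
SELECT ?dataset ?title WHERE {
  ?dataset a <http://www.w3.org/ns/dcat#Dataset> ;
           <http://purl.org/dc/terms/title> ?title .
} LIMIT 10
"""

resp = requests.get(ENDPOINT, params={"query": QUERY,
                                      "format": "application/sparql-results+json"})
for row in resp.json()["results"]["bindings"]:
    print(row["title"]["value"])
```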

If the Commission is committed to “principles of openness and transparency,” when can we expect to see:

  1. Rosters of the institutions and individual participants in EU funded research from 1980 to present?
  2. Economic analysis of the results of EU funded projects, on a project by project basis, from 1980 to present?

Noting that from 1984 to 2013, total research funding exceeded EUR 118 billion.

To be fair, CORDIS: Community Research and Development Information Service has report summaries and project reports for FP5, FP6 and FP7. And CORDIS Search Service provides coverage back to the early 1980’s.

About Projects on Cordis has a wealth of information to guide searching into EU funded research.

While a valuable resource, CORDIS requires the extraction of detailed information on a project by project basis, making large scale analysis difficult if not prohibitively expensive.

PS: Of the 5,855 datasets, some 5,680 were previously published by Eurostat and 106 by the European Environment Agency. Perhaps a net increase of 59 datasets over those previously available.

February 15, 2013

DataDive to Fight Poverty and Corruption with the World Bank!

Filed under: Data Mining,Government,Transparency — Patrick Durusau @ 5:44 am

DataDive to Fight Poverty and Corruption with the World Bank!

From the post:

We’re thrilled to announce a huge new DataKind DataDive coming to DC the weekend of 3/15! We’re teaming up with the World Bank to put a dent in some of the most serious problems in poverty and corruption through the use of data. Low bar, right?

We’re calling on all socially conscious analysts, statisticians, data scientists, coders, hackers, designers, or eager-to-learn do-gooders to come out with us on the weekend of 3/15 to work with data to improve the world. You’ll be working alongside experts in the field to analyze, visualize, and mashup the most cutting-edge data from the World Bank, UN, and other sources to improve poverty monitoring and root out corruption. We’ve started digging into the data a little ourselves and we’re already so excited for how cool this event is going to be. “Oh, what’d you do this weekend? I reduced global poverty and rooted out corruption. No big deal.”

BTW, there is an Open Data Day on 2/23 to prepare for the DataDive on 3/15.

What isn’t clear from the announcement(s) is what data is to be mined to fight poverty and corruption?

Or what is meant by “corruption?”

Graph solutions, for example, would be better at tracking American-style corruption that shuns quid pro quo in favor of a community of interest of the wealthy and well-connected.

Such communities aren’t any less corrupt than members of government with cash in their freezers, just less obvious.

February 14, 2013

Intellectual Property Rights: Fiscal Year 2012 Seizure Statistics

Filed under: Government,Intellectual Property (IP),Transparency — Patrick Durusau @ 7:51 pm

Intellectual Property Rights: Fiscal Year 2012 Seizure Statistics

Fulltextreports.com quotes this report as saying:

In Fiscal Year (FY) 2012, DHS and its agencies, CBP and ICE, remained vigilant in their commitment to protect American consumers from intellectual property theft as well as enforce the rights of intellectual property rights holders by expanding their efforts to seize infringing goods, leading to 691 arrests, 423 indictments and 334 prosecutions. Counterfeit and pirated goods pose a serious threat to America’s economic vitality, the health and safety of American consumers, and our critical infrastructure and national security. Through coordinated efforts to interdict infringing merchandise, including joint operations, DHS enforced intellectual property rights while facilitating the secure flow of legitimate trade and travel.

I just feel so…. underwhelmed.

When was the last time you felt frightened by a fake French handbag? Or imitation Italian shoes?

I mean, they may be ugly but so were the originals.

I mention this because tracking data across the various intellectual property enforcement agencies isn’t straightforward.

I found that out while looking into some historical data on copyright enforcement. After the Aaron Swartz tragedy.

The question I want to pursue with topic maps is: Who benefits from these government enforcement efforts?

As far as I can tell now, today, I never have. I bet the same is true for you.

More on gathering the information to make that case anon.

February 10, 2013

Mapping the census…

Filed under: Census Data,Mapping,Maps,R,Transparency — Patrick Durusau @ 4:20 pm

Mapping the census: how one man produced a library for all by Simon Rogers.

From the post:

The census is an amazing resource – so full of data it’s hard to know where to begin. And increasingly where to begin is by putting together web-based interactives – like this one on language and this on transport patterns that we produced this month.

But one academic is taking everything back to basics – using some pretty sophisticated techniques. Alex Singleton, a lecturer in geographic information science (GIS) at Liverpool University has used R to create the open atlas project.

Singleton has basically produced a detailed mapping report – as a PDF and vectored images – on every one of the local authorities of England & Wales. He automated the process and has provided the code for readers to correct and do something with. In each report there are 391 pages, each with a map. That means, for the 354 local authorities in England & Wales, he has produced 127,466 maps.
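The automation is the interesting part: one report template looped over every authority. Singleton worked in R; here is a much-simplified sketch of the same loop in Python with geopandas (the boundary file and column names are hypothetical):

```python
import geopandas as gpd
import matplotlib.pyplot as plt

# Hypothetical boundary file; column names are assumptions.
authorities = gpd.read_file("local_authorities.shp")

for name, area in authorities.groupby("la_name"):
    ax = area.plot(column="population_density", legend=True)
    ax.set_title(name)
    ax.set_axis_off()
    plt.savefig(f"atlas_{name}.pdf")
    plt.close()
```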

Check out Simon’s post to see why Singleton has undertaken such a task.

Question: Was the 2011 census more “transparent” or “useful” after Singleton’s work or before?

I would say more “transparent” after Singleton’s work.

You?

February 1, 2013

Sunlight Congress API [Shifting the Work for Transparency?]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 8:10 pm

Sunlight Congress API

From the webpage:

A live JSON API for the people and work of Congress, provided by the Sunlight Foundation.

Features

Lots of features and data for members of Congress:

  • Look up legislators by location or by zip code.
  • Official Twitter, YouTube, and Facebook accounts.
  • Committees and subcommittees in Congress, including memberships and rankings.

We also provide Congress' daily work:

  • All introduced bills in the House and Senate, and what occurs to them (updated daily).
  • Full text search over bills, with powerful Lucene-based query syntax.
  • Real time notice of votes, floor activity, and committee hearings, and when bills are scheduled for debate.

All data is served in JSON, and requires a Sunlight API key. An API key is free to register and has no usage limits.

We have an API mailing list, and can be found on Twitter at @sunlightlabs. Bugs and feature requests can be made on Github Issues.
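
To see what the shifted work looks like in practice, here is a minimal sketch of a lookup in Python, standard library only. The endpoint path, parameter names, and result fields ("chamber," "last_name," "twitter_id") are assumptions based on the API documentation, not a verified client:

    import json
    import urllib.parse
    import urllib.request

    API_KEY = "YOUR_SUNLIGHT_API_KEY"  # free to register, per the docs
    BASE = "http://congress.api.sunlightfoundation.com"

    def legislators_by_zip(zip_code):
        # Build /legislators/locate?zip=...&apikey=... and parse the JSON reply.
        query = urllib.parse.urlencode({"zip": zip_code, "apikey": API_KEY})
        with urllib.request.urlopen(BASE + "/legislators/locate?" + query) as resp:
            return json.load(resp)["results"]

    for member in legislators_by_zip("30601"):
        print(member["chamber"], member["last_name"], member.get("twitter_id"))

Convenient, yes. But note who still has to decide what to look up, and what to do with the answer.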

Important not to confuse this effort with transparency.

As the late Aaron Swartz remarked in the O’Reilly “Open Government” text:

…When you create a regulatory agency, you put together a group of people whose job is to solve some problem. They’re given the power to investigate who’s breaking the law and the authority to punish them. Transparency, on the other hand, simply shifts the work from the government to the average citizen, who has neither the time nor the ability to investigate these questions in any detail, let alone do anything about it. It’s a farce: a way for Congress to look like it has done something on some pressing issue without actually endangering its corporate sponsors.

Here is an interface that:

…shifts the work from the [Sunlight Foundation] to the average citizen, who has neither the time nor the ability to investigate these questions in any detail, let alone do anything about it. It’s a farce: a way for [Sunlight Foundation] to look like it has done something on some pressing issue without actually endangering its corporate sponsors. (O’Reilly’s Open Government book [“…more equal than others” pigs])

Suggestions for ending the farce?

I first saw this at the Legal Informatics Blog, Mill: Sunlight Foundation releases Congress API.

Docket Wrench: Exposing Trends in Regulatory Comments [Apparent Transparency]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 8:10 pm

Docket Wrench: Exposing Trends in Regulatory Comments by Nicko Margolies.

From the post:

Today the Sunlight Foundation unveils Docket Wrench, an online research tool to dig into regulatory comments and uncover patterns among millions of documents. Docket Wrench offers a window into the rulemaking process where special interests and individuals can wield their influence without the level of scrutiny traditional lobbying activities receive.

Before an agency finalizes a proposed rule that Congress and the president have mandated that they enforce, there is a period of public commenting where the agency solicits feedback from those affected by the rule. The commenters can vary from company or industry representatives to citizens concerned about laws that impact their environment, schools, finances and much more. These comments and related documents are grouped into “dockets” where you can follow the actions related to each rule. Every rulemaking docket has its own page on Docket Wrench where you can get a graphical overview of the docket, drill down into the rules and notices it contains and read the comments on those rules. We’ve pulled all this information together into one spot so you can more easily research trends and extract interesting stories from the data. Sunlight’s Reporting Group has done just that, looking into regulatory comment trends and specific comments by the Chamber of Commerce and the NRA.
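
The pattern-finding itself is not magic. A rough sketch of one piece of it, near-duplicate (form letter) detection over comment text, using only the Python standard library and invented sample comments:

    from difflib import SequenceMatcher

    comments = [
        "I oppose this rule because it burdens small businesses.",
        "I oppose this rule because it burdens small business owners.",
        "The proposed rule is a sensible consumer protection.",
    ]

    def similarity(a, b):
        # Ratio in [0, 1]; 1.0 means the two texts are identical.
        return SequenceMatcher(None, a, b).ratio()

    THRESHOLD = 0.8  # arbitrary cutoff for "probably the same form letter"
    for i in range(len(comments)):
        for j in range(i + 1, len(comments)):
            score = similarity(comments[i], comments[j])
            if score >= THRESHOLD:
                print("Possible form letter: #%d vs #%d (%.2f)" % (i, j, score))

At docket scale you would need something faster than pairwise comparison, but the idea is the same: cluster the near-identical texts and see who submitted them.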

An “apparent” transparency offering from the Sunlight Foundation.

Imagine that you follow their advice and do discover “form letters” (the horror!) submitted in a rulemaking process.

What are you going to do? Whistle up the agency’s former assistant director who is on your staff to call his buds at the agency to complain?

Get yourself a cardboard sign and march around your town square? Start a letter writing campaign of your own?

Rules are drafted, debated, and approved in the dark recesses of agencies, by former agency staff, lobbyists, and law firms.

Want transparency? Real transparency?

That would require experts in law and policy who have the same access to the agency as its insiders, and an obligation to report to the public who wins and who loses from particular rules.

An office like the public editor of the New York Times.

Might offend donors if you did that.

Best just to expose the public to a tiny part of the quagmire so you can claim people had an opportunity to participate.

Not a meaningful one, but an opportunity nonetheless.

I first saw this at the Legal Informatics Blog, Sunlight Foundation Releases Docket Wrench: Tool for Analyzing Comments to Proposed Regulations.

January 21, 2013

No Joy in Vindication

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 7:31 pm

You may have seen the news about the latest GAO report on auditing the U.S. government: U.S. Government’s Fiscal Years 2012 and 2011 Consolidated Financial Statements, GAO-13-271R, Jan 17, 2013, http://www.gao.gov/products/GAO-13-271R.

The reasons why the GAO can’t audit the U.S. government:

(1) serious financial management problems at DOD that have prevented its financial statements from being auditable,

(2) the federal government’s inability to adequately account for and reconcile intragovernmental activity and balances between federal agencies, and

(3) the federal government’s ineffective process for preparing the consolidated financial statements.

Number 2 reminds me of The 560+ $Billion Shell Game, where I provided data files based on the OMB Sequestration report, showing that over $560 billion in agency transfers could not be tracked.

That problem has now been confirmed by the GAO.

I am sure my analysis was not original; the problem has been known to insiders at the GAO and others for years.

But did you know that I mailed that analysis to both of my U.S. Senators and got no response?

I did get a “bug letter” from my representative, Austin Scott:

Washington continues to spend at unsustainable levels. That is why I voted against H.R. 8, the American Taxpayer Relief Act when it passed Congress on January 1, 2013. This plan does not address the real driver of our debt – spending. President Obama’s unwillingness to address this continues to cripple our efforts to find a long-term solution. We cannot tax our way out of this fiscal situation.

The President himself has said on multiple occasions that spending cuts must be part of the solution. In fact, on April 13, 2011 he remarked, “So any serious plan to tackle our deficit will require us to put everything on the table, and take on excess spending wherever it exists in the budget.” However, his words have seldom matched his actions.

We owe it to our children and grandchildren to make the tough choices and devise a long-term solution that gets our economy back on track and reduces our deficits. I remain hopeful that the President will join us in this effort. Thank you for contacting me. It’s an honor to represent the Eighth Congressional District of Georgia.

Non-responsive would be a polite word for it.

My original point has been vindicated by the GAO but that brings no joy.

My request to the officials I have contacted was simple:

All released government financial data must be available in standard spreadsheet formats (Excel, CSV, ODF).

There are a whole host of other issues that will arise from such data, but the first step is to get it into a crunchable format.

O’Reilly’s Open Government book [“…more equal than others” pigs]

Filed under: Government,Government Data,Open Data,Open Government,Transparency — Patrick Durusau @ 7:30 pm

We’re releasing the files for O’Reilly’s Open Government book by Laurel Ruma.

From the post:

I’ve read many eloquent eulogies from people who knew Aaron Swartz better than I did, but he was also a Foo and contributor to Open Government. So, we’re doing our part at O’Reilly Media to honor Aaron by posting the Open Government book files for free for anyone to download, read and share.

The files are posted on the O’Reilly Media GitHub account as PDF, Mobi, and EPUB files for now. There is a movement on the Internet (#PDFtribute) to memorialize Aaron by posting research and other material for the world to access, and we’re glad to be able to do this.

You can find the book here: github.com/oreillymedia/open_government

Daniel Lathrop, my co-editor on Open Government, says “I think this is an important way to remember Aaron and everything he has done for the world.” We at O’Reilly echo Daniel’s sentiment.

Be sure to read Chapter 25, “When Is Transparency Useful?”, by the late Aaron Swartz.

It includes this passage:

…When you create a regulatory agency, you put together a group of people whose job is to solve some problem. They’re given the power to investigate who’s breaking the law and the authority to punish them. Transparency, on the other hand, simply shifts the work from the government to the average citizen, who has neither the time nor the ability to investigate these questions in any detail, let alone do anything about it. It’s a farce: a way for Congress to look like it has done something on some pressing issue without actually endangering its corporate sponsors.

As a tribute to Aaron, are you going to dump data on the WWW, or enable calling the “more equal than others” pigs to account?

January 18, 2013

Freeing the Plum Book

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 7:15 pm

Freeing the Plum Book by Derek Willis.

From the post:

The federal government produces reams of publications, ranging from the useful to the esoteric. Pick a topic, and in most cases you’ll find a relevant government publication: for example, recent Times articles about presidential appointments draw on the Plum Book. Published annually by either the House or the Senate (the task alternates between committees), the Plum Book is a snapshot of appointments throughout the federal government.

The Plum Book is clearly a useful resource for reporters. But like many products of the Government Printing Office, its two main publication formats are print and PDF. That means the digital version isn’t particularly searchable, unless you count Ctrl-F as a legitimate search mechanism. And that’s a shame, because the Plum Book is basically a long list of names, positions and salary information. It’s data.

Derek describes freeing the Plum Book from its less-than-useful formats.

It is now available in JSON and YAML formats on GitHub, and in Excel.
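
Once the data is JSON, filtering it takes a few lines. A minimal sketch; the file name and field names ("agency," "position_title," "pay_plan") are assumptions, so check the repository’s actual schema:

    import json

    with open("plum_book.json") as f:
        appointments = json.load(f)

    # Pull every listed appointment for one agency.
    epa = [a for a in appointments
           if a.get("agency") == "ENVIRONMENTAL PROTECTION AGENCY"]

    for appt in epa:
        print(appt.get("position_title"), "-", appt.get("pay_plan"))

Try doing that with a PDF and Ctrl-F.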

Curious, what other public datasets would you want to match up to the Plum Book?

January 9, 2013

Center for Effective Government Announces Launch [Name Change]

Filed under: Government,Government Data,Transparency — Patrick Durusau @ 12:00 pm

Center for Effective Government Announces Launch

The former OMB Watch is now the Center for Effective Government (www.foreffectivegov.org).

The change reflects a broader focus on government effectiveness in general.

From the post:

The Center for Effective Government will continue to offer expert analysis, in-depth reports, and news updates on the issues it has been known for in the past. Specifically, the organization will:

  • Analyze federal tax and spending choices and advocate for progressive revenue options and transparency in federal spending;
  • Defend and improve national standards and safeguards and the regulatory systems that produce and enforce them;
  • Expose undue special interest influence in federal policymaking and advocate for open government reforms that ensure public officials put the public interest first; and
  • Encourage more active citizen engagement in our democracy by ensuring people have access to easy-to-understand, contextualized, meaningful public information and understand how they can participate in public policy decision making processes.

If you have been running a topic map in this area, reflect the name change in your OMB Watch topic, as in the sketch below.
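
A toy sketch of one way to record the change: one topic keeps its subject identifier, gains the new name, and retains “OMB Watch” as a scoped former name, so old references still resolve to the same subject. The dictionaries are illustrative, not any particular topic map engine’s API:

    # One topic, identified by its subject identifier, with scoped names.
    topic = {
        "subject_identifiers": ["http://www.foreffectivegov.org"],
        "names": [{"value": "OMB Watch", "scope": []}],
    }

    def rename(topic, new_name):
        # Demote existing unscoped names to "former-name", then add the new one.
        for name in topic["names"]:
            if not name["scope"]:
                name["scope"].append("former-name")
        topic["names"].append({"value": new_name, "scope": []})

    rename(topic, "Center for Effective Government")
    for name in topic["names"]:
        print(name["value"], name["scope"])

The important design choice is that nothing is deleted: the old name stays attached to the topic, just marked as no longer current.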

Beyond simple semantic impedance, which is always present, government is replete with examples of intentional impedance if not outright deception.

A fertile field for topic map practitioners!
