Why the US Government is Desperately Seeking Data Integration by David Linthicum.
From the post:
“When it comes to data, the U.S. federal government is a bit of a glutton. Federal agencies manage on average 209 million records, or approximately 8.4 billion records for the entire federal government, according to Steve O’Keeffe, founder of the government IT network site, MeriTalk.”
Check out these stats from a December 2013 MeriTalk survey of 100 federal records and information management professionals. Among the findings:
- Only 18 percent said their agency had made significant progress toward managing records and email in electronic format and is ready to report.
- One in five federal records management professionals say they are “completely prepared” to handle the growing volume of government records.
- 92 percent say their agency “has a lot of work to do to meet the direction.”
- 46 percent say they do not believe or are unsure about whether the deadlines are realistic and obtainable.
- Three out of four say the Presidential Directive on Managing Government Records will enable “modern, high-quality records and information management.”
I’ve been working with the US government for years, and I can tell you these figures are pretty accurate. Indeed, the paper glut is killing productivity, and even the way agencies manage digital data needs a great deal of improvement.
I don’t doubt a word of David’s post. Do you?
What I do doubt is the ability of the government to integrate its data. At least, not unless and until it makes some fundamental choices about the route it will take to data integration.
First, replacement of existing information systems is a non-goal. Unless that is an a priori assumption, the politics, both on Capitol Hill and internal to any agency, program, etc., will doom a data integration effort before it begins.
The first non-goal means that the ROI of data integration must be high enough to be evident even with current systems in place.
Second, integration of the most difficult cases is not the initial target for any data integration project. It would be offensive to cite all the “boil the ocean” projects that have failed in Washington, D.C. Let’s just agree that judicious picking of high-value, reasonable-effort integration cases is a good proving ground.
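To make the second point concrete, picking proving-ground cases can be as simple as ranking candidates by expected value per unit of effort. The sketch below is purely illustrative: the candidate names, value scores, and effort scores are invented assumptions, not real agency data.

```python
# Hypothetical candidate integration cases. The names and the value/effort
# scores (say, on a 1-10 scale from stakeholder interviews) are invented
# for illustration only.
candidates = [
    {"name": "benefits-eligibility records", "value": 9.0, "effort": 3.0},
    {"name": "legacy payroll archives", "value": 4.0, "effort": 8.0},
    {"name": "grant-application metadata", "value": 7.0, "effort": 2.0},
]

def score(case):
    """Expected value per unit of effort -- a rough ROI proxy."""
    return case["value"] / case["effort"]

# Highest value-for-effort first: the top of this list is the proving ground.
ranked = sorted(candidates, key=score, reverse=True)
for case in ranked:
    print(f'{case["name"]}: {score(case):.2f}')
```

The point of the exercise is not the arithmetic but the discipline: every candidate gets an explicit value and effort estimate before anyone writes an integration plan, so the "boil the ocean" cases sort themselves to the bottom.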
Third, the targets of data integration, the costs for meeting those targets, and the expected ROI will be agreed upon by all parties before any work starts. Avoidance of mission creep is essential to success. Not to mention that public goals and metrics will enable everyone to decide whether the goals have been met.
Fourth, employment of traditional vendors, unemployed programmers, geographically dispersed staff, etc. is also a non-goal of the project. With the money that can be saved by robust data integration, departments can feather their staffs as much as they like.
If you need proof of the fourth requirement, consider the various Apache projects that are now the underpinnings for “big data” in its many forms.
It is possible to solve the government’s data integration issues. But not without some hard choices being made up front about the project.
Sorry, forgot one:
Fifth, the project leader should seek a consensus among the relevant parties but ultimately has the authority to make decisions for the project. If every dispute can have one or more parties running to their supervisor or congressional backer, the project is doomed before it starts. The buck stops with the project manager and nowhere else.