Applying “Lateral Thinking” to Data Quality by Ken O’Connor.
From the post:
I am a fan of Edward De Bono, the originator of the concept of Lateral Thinking. One of my favourite examples of De Bono’s brilliance relates to dealing with the worldwide problem of river pollution.
De Bono suggested “each factory must be downstream of itself” – i.e., require factories’ water inflow pipes to be just downstream of their outflow pipes.
Suddenly, the water quality in the outflow pipe becomes a lot more important to the factory. Apparently several countries have implemented this idea as law.
What has this got to do with data quality?
By applying the same principle to data entry, all downstream data users will benefit, and information quality will improve.
How could this be done?
So how do you move the data input pipe just downstream of the data outflow pipe?
Before you take a look at Ken’s solution, take a few minutes to brainstorm about how you would do it.
This is important for semantic technologies because there aren’t enough experts to go around, meaning non-expert users will do a large portion of the work.
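As one possible brainstorm (not Ken’s solution), the “downstream of itself” idea can be sketched in code: run each entered record through the same checks downstream consumers rely on, so the person entering the data is the first to drink from their own outflow pipe. All names here (`validate_record`, `enter_record`) are hypothetical illustrations.

```python
def validate_record(record):
    """Apply the same checks a downstream data consumer would apply."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is missing")
    email = record.get("email", "")
    if "@" not in email:
        errors.append(f"email {email!r} does not look valid")
    return errors

def enter_record(record, store):
    """Accept a record only if it survives downstream-style validation."""
    errors = validate_record(record)
    if errors:
        # The "inflow pipe": the enterer sees the polluted output
        # before anyone downstream does.
        raise ValueError("; ".join(errors))
    store.append(record)
    return record

store = []
enter_record({"name": "Ada Lovelace", "email": "ada@example.org"}, store)
try:
    enter_record({"name": "", "email": "no-at-sign"}, store)
except ValueError as e:
    print("rejected:", e)
```

The design point is simply that entry and consumption share one validation path, so bad data is rejected at the source rather than discovered downstream.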
Comments/suggestions?