Bob Carpenter writes in: Mavandadi et al. (2012) Distributed Medical Image Analysis and Diagnosis through Crowd-Sourced Games: A Malaria Case Study:
I found a link from Slashdot, of all places, to this forthcoming paper:
- Mavandadi, Sam, Stoyan Dimitrov, Steve Feng, Frank Yu, Uzair Sikora, Oguzhan Yaglidere, Swati Padmanabhan, Karin Nielsen, and Aydogan Ozcan. (2012) Distributed Medical Image Analysis and Diagnosis through Crowd-Sourced Games: A Malaria Case Study. PLoS ONE.
None of the nine authors, the reviewer(s), or editor(s) knew that their basic technique has been around for over 30 years; it goes back at least to Dawid and Skene's 1979 model of annotator error rates, fit with EM. (I’m talking about the statistical technique here, not the application to distributed diagnosis of diseases, which I don’t know anything about.)
Of course, many of us reinvented this particular wheel over the past three decades, and the lack of any coherent terminology for the body of work across computer science, statistics, and epidemiology is part of the problem. But now, in 2012, a simple web search for crowdsourcing should turn up the existing literature, because enough of us have found and cited it.
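For concreteness, here is a minimal sketch of the kind of model in question: EM estimation of hidden true labels and per-annotator error rates, in the style of Dawid and Skene. Everything here (the function name `dawid_skene`, the `(item, annotator, label)` triple format, the smoothing constant) is an illustrative assumption, not code from the paper.

```python
import numpy as np

def dawid_skene(labels, n_classes, n_iters=50, smoothing=0.01):
    """EM for a Dawid & Skene (1979)-style annotation model.

    labels: list of (item, annotator, label) triples, labels in 0..n_classes-1.
    Returns (posterior over true classes, class prevalence, confusion matrices).
    """
    items = sorted({i for i, _, _ in labels})
    annotators = sorted({a for _, a, _ in labels})
    I, J, K = len(items), len(annotators), n_classes
    item_ix = {i: n for n, i in enumerate(items)}
    ann_ix = {a: n for n, a in enumerate(annotators)}

    # counts[i, j, l] = number of times annotator j labeled item i as l
    counts = np.zeros((I, J, K))
    for i, a, l in labels:
        counts[item_ix[i], ann_ix[a], l] += 1

    # Initialize class posteriors from a (smoothed) majority vote.
    post = counts.sum(axis=1) + smoothing
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iters):
        # M-step: class prevalence and per-annotator confusion matrices,
        # confusion[j, k, l] = P(annotator j says l | true class is k).
        prevalence = post.mean(axis=0)
        confusion = np.einsum('ik,ijl->jkl', post, counts) + smoothing
        confusion /= confusion.sum(axis=2, keepdims=True)

        # E-step: posterior over each item's true class.
        log_post = np.log(prevalence) + np.einsum(
            'ijl,jkl->ik', counts, np.log(confusion))
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)

    return post, prevalence, confusion

# Toy usage: three annotators, two items, binary labels.
votes = [(0, 'a', 1), (0, 'b', 1), (0, 'c', 0),
         (1, 'a', 0), (1, 'b', 0), (1, 'c', 0)]
post, prevalence, confusion = dawid_skene(votes, n_classes=2)
```

The point is just that the machinery, alternately re-estimating the hidden true labels and each annotator's accuracy, fits in a few dozen lines and has been in the literature since the late 1970s.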
Have you ever wondered how much reinvention costs your company or organization every year?
Or how little benefit is derived from funded research that reinvents a wheel?
It is popular to talk about more cost-effective and efficient government, but shouldn’t being cost-effective and efficient be goals of private organizations as well?
How would you go about detecting reinvention of the wheel in your company or organization? (Leaving to one side how you would preserve that discovery once made.)