From the website:
The main objective of DBpedia is to extract structured information from Wikipedia, convert it into RDF, and make it freely available on the Web. In a nutshell, DBpedia is the Semantic Web mirror of Wikipedia.
Wikipedia users constantly revise Wikipedia articles, with updates happening almost every second. Hence, data stored in the official DBpedia endpoint can quickly become outdated, and Wikipedia articles need to be re-extracted. DBpedia-Live enables continuous synchronization between DBpedia and Wikipedia.
Important Links:
- SPARQL-endpoint: http://live.dbpedia.org/sparql
- DBpedia-Live Statistics: http://live.dbpedia.org/livestats
- Changesets: http://live.dbpedia.org/liveupdates
- Source code: http://dbpedia.hg.sourceforge.net/hgweb/dbpedia/extraction_framework
- Synchronization Tool: http://sourceforge.net/projects/dbpintegrator/files/
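For anyone wanting to try the endpoint listed above, here is a minimal sketch of building a SPARQL GET request against it. The query, resource URI, and result format are illustrative assumptions; endpoint availability is not guaranteed, so the actual fetch is left as a comment.

```python
from urllib.parse import urlencode

# The DBpedia-Live SPARQL endpoint from the link list above.
ENDPOINT = "http://live.dbpedia.org/sparql"

def sparql_url(query, fmt="application/sparql-results+json"):
    """Return a GET URL with the query and result format URL-encoded."""
    params = urlencode({"query": query, "format": fmt})
    return ENDPOINT + "?" + params

# Hypothetical query: fetch a few triples about one resource.
query = """
SELECT ?p ?o WHERE {
  <http://dbpedia.org/resource/Berlin> ?p ?o .
} LIMIT 10
"""
url = sparql_url(query)
# To actually run it: urllib.request.urlopen(url).read()
```

Polling such a query periodically (or consuming the changesets feed) is one way to observe how quickly the live data shifts.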
OK, so you have a live feed. Now how do you judge the importance of updates? Which ones should trigger alerts to the user, and which are important enough to trigger merges? (Assuming not all possible merges are worth the expense.)