Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

April 19, 2012

Knoema Launches the World’s First Knowledge Platform Leveraging Data

Filed under: Data,Data Analysis,Data as Service (DaaS),Data Mining,Knoema,Statistics — Patrick Durusau @ 7:13 pm

Knoema Launches the World’s First Knowledge Platform Leveraging Data

From the post:

DEMO Spring 2012 conference — Today at DEMO Spring 2012, Knoema publicly launched the world’s first knowledge platform that leverages data and offers its users tools to harness the knowledge hidden within that data. Search, exploration, visualization and analysis of public data have never been easier. With more than 500 datasets on various topics, a gallery of interactive, ready-to-use dashboards and user-friendly analysis and visualization tools, Knoema does for data what YouTube did for video.

Millions of users interested in data — analysts, students, researchers and journalists — struggle to satisfy their data needs. At the same time, many organizations, companies and government agencies around the world collect and publish data on various topics. Yet getting access to relevant data for analysis or research can take hours, and the results arrive in so many formats and standards that reshaping them into usable form can take even longer. This is a problem even search engines like Google or Bing face, after indexing the entire Internet, due to the nature of statistical data and the diversity and complexity of its sources.

One-stop shop for data. Knoema, with its state-of-the-art search engine, makes finding statistical data on almost any topic, in easy-to-ingest formats, a matter of minutes if not seconds. Knoema’s search instantly returns highly relevant results with chart previews and actual numbers. Results can be explored further with the Dataset Browser tool, which gives users full access to the entire public data collection to explore it, visualize it in tables and charts, and download it as Excel/CSV files.

Numbers made easier to understand and use. Knoema offers data users an end-to-end experience, allowing creation of highly visual, interactive dashboards that combine text, tables, charts and maps. Dashboards built by users can be shared with other people or on social media, exported to Excel or PowerPoint and embedded in blogs or any other website. All public dashboards made by users are available in the dashboard gallery on the home page, and people can collaborate on data-related issues by participating in discussions and exchanging data and content.

Excellent!!!

When “other” data becomes available, users will want to integrate it with their data.

But “other” data will have different or incompatible semantics.

So much for attempts to wrestle semantics to the ground (W3C) or build semantic prisons (unnamed vendors).
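To make the point concrete, here is a minimal sketch of the kind of mapping that integrating “other” data implies. All field names, identifiers and figures are invented for illustration; the point is only that two sources can name the same subject and property differently, and a mapping layer (the topic map move) reconciles them rather than forcing one vocabulary on everyone:

```python
# Hypothetical sketch: field names and figures are invented, not from the post.

# Source A uses ISO country codes and billions of USD.
source_a = [{"country": "DE", "gdp_usd_bn": 4000}]

# Source B names the same subject differently: full country names
# and millions of USD.
source_b = [{"nation": "Germany", "gdp_millions_usd": 3900000}]

# The mapping layer records that different identifiers refer to the
# same subject, and how the properties line up.
COUNTRY_ALIASES = {"Germany": "DE"}

def normalize_b(record):
    """Translate a source-B record into source-A's vocabulary."""
    return {
        "country": COUNTRY_ALIASES[record["nation"]],
        "gdp_usd_bn": record["gdp_millions_usd"] / 1000,
    }

# Index source A by country, then fold in any source-B records that
# describe subjects A does not already cover.
merged = {r["country"]: r for r in source_a}
for record in source_b:
    merged.setdefault(normalize_b(record)["country"], normalize_b(record))
```

Neither source had to change its semantics; the mapping carries the reconciliation.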

What semantics are useful to you today? (patrick@durusau.net)

September 9, 2011

Kasabi

Filed under: Data,Data as Service (DaaS),Data Source,RDF — Patrick Durusau @ 7:16 pm

Kasabi

A data-as-service site that offers access to data (no downloads) via API keys. It provides help for authors preparing their data, APIs for data access, etc. Currently in beta.
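Kasabi’s actual API is not described in this post, so here is only a hedged sketch of the general data-as-service access pattern: a keyed query URL, with the data staying on the service side. The base URL, key name and parameters are all invented for illustration:

```python
from urllib.parse import urlencode

API_KEY = "YOUR_API_KEY"           # placeholder, not a real key
BASE = "https://api.example.org"   # invented endpoint, not Kasabi's

def build_query_url(dataset, **params):
    """Build a request URL for a hypothetical data-as-service API.

    The service keeps the data; clients only ever see responses to
    keyed API calls, never a bulk download.
    """
    params["apikey"] = API_KEY
    # Sort for a stable, cache-friendly URL.
    return f"{BASE}/{dataset}?{urlencode(sorted(params.items()))}"

url = build_query_url("world-gdp", country="DE", year=2010)
# Fetching would then be e.g. urllib.request.urlopen(url).read()
```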

I mention it because data as service is one model for delivering topic map content, so the successes, problems and usage of Kasabi may be important milestones to watch.

True, Lexis/Nexis, WestLaw, and any number of other commercial vendors have sold access to data in the past, but it was mostly dumb data. That is, you had to contribute something to it to make it meaningful. We are in the early stages, but I think a market for data that works with my data is developing.

The options to download citations in formats that fit particular bibliographic programs are an impoverished example of delivered data working with local data.

Not quite the vision for the Semantic Web but it isn’t hard to imagine your calendaring program showing links to current news stories about your appointments. You have to supply the reasoning to cancel the appointment with the bank president just arrested for securities fraud and to increase your airline reservations to two (2).
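The calendar scenario above reduces to a join plus human judgment. A toy sketch, with invented names and headlines: the program can mechanically attach news stories to appointments by person, but the decision about what the match *means* is still yours:

```python
# Toy data, invented for illustration; names and headlines are not real.
appointments = [
    {"with": "Jane Smith", "time": "10:00", "subject": "loan terms"},
    {"with": "Bob Jones", "time": "14:00", "subject": "audit"},
]
news = [
    {"person": "Jane Smith",
     "headline": "Bank president arrested for securities fraud"},
]

def annotate(appointments, news):
    """Attach matching news stories to each appointment by person name.

    The join is the mechanical part; deciding what to do about a
    match (cancel? keep?) is the reasoning the user still supplies.
    """
    by_person = {}
    for story in news:
        by_person.setdefault(story["person"], []).append(story)
    return [dict(appt, news=by_person.get(appt["with"], []))
            for appt in appointments]

annotated = annotate(appointments, news)
```

Matching by bare name is, of course, exactly the sort of fragile identification a topic map would do better.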

August 18, 2011

Integration Imperatives Around Complex Big Data

Filed under: BigData,Data as Service (DaaS),Data Integration,Marketing — Patrick Durusau @ 6:52 pm

Integration Imperatives Around Complex Big Data

  • Informatica Corporation (NASDAQ: INFA), the world’s number one independent provider of data integration software, today announced the availability of a new research report from the Aberdeen Group that shows how organizations can get the most from their data integration assets in the face of rapidly growing data volumes and increasing data complexity.
  • Entitled Future Integration Needs: Embracing Complex Data, the Aberdeen report reveals that:
    • Big Data is the new reality – In 2010, organizations experienced a staggering average data volume growth of 40 percent.
    • XML adoption has increased dramatically – XML is the most common semi-structured data source that organizations integrate. 74 percent of organizations are integrating XML from external sources. 66 percent of organizations are integrating XML from internal sources.
    • Data complexity is skyrocketing – In the next 12 months enterprises plan to introduce more complex unstructured data sources – including office productivity documents, email, web content and social media data – than any other data type.
    • External data sources are proliferating – On average, organizations are integrating 14 external data sources, up from 11 a year ago.
    • Integration costs are rising – As integration of external data rises, it continues to be a labor- and cost-intensive task, with organizations integrating external sources spending 25 percent of their total integration budget in this area.
  • For example, according to Aberdeen, organizations that have effectively integrated complex data are able to:
    • Use up to 50 percent larger data sets for business intelligence and analytics.
    • Integrate external unstructured data into business processes twice as successfully (40 percent vs. 19 percent).
    • Deliver critical information in the required time window 2.5 times more often via automated data refresh.
    • Slash the incidence of errors in their data almost in half compared to organizations relying on manual intervention when performing data updates and refreshes.
    • Spend an average of 43 percent less on integration software (based on 2010 spend).
    • Develop integration competence more quickly, with significantly lower services and support expenditures, delivering business results at lower cost.

I like the 25% of data integration budgets being spent on integrating external data. Imagine making that easier for enterprises with a topic map based service.

Maybe “Data as service (DaaS)” will evolve from simple data delivery to dynamic integration of data from multiple sources, where currency, reliability, composition, and other features of the data sit on a sliding scale of value.
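That sliding scale can be sketched as a weighted score over candidate sources. Everything here is invented for illustration — source names, figures, and the particular freshness decay — but it shows how a buyer’s preference for currency versus reliability picks different sources for the same statistic:

```python
# Invented sketch: weighting candidate sources for the same statistic
# by currency and reliability. Numbers and source names are made up.
sources = [
    {"name": "fast_feed",   "value": 3.1, "age_days": 2,   "reliability": 0.70},
    {"name": "slow_agency", "value": 3.4, "age_days": 400, "reliability": 0.95},
]

def score(src, currency_weight=0.5):
    """Blend freshness and reliability into a single score in (0, 1].

    currency_weight slides the scale: 1.0 buys the freshest data,
    0.0 buys the most reliable, regardless of age.
    """
    freshness = 1.0 / (1.0 + src["age_days"] / 30)  # decays over months
    return currency_weight * freshness + (1 - currency_weight) * src["reliability"]

# A buyer who values currency picks one source...
freshest_wins = max(sources, key=score)
# ...while a buyer who values only reliability picks another.
reliable_wins = max(sources, key=lambda s: score(s, currency_weight=0.0))
```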
