Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

August 18, 2011

Integration Imperatives Around Complex Big Data

Filed under: BigData,Data as Service (DaaS),Data Integration,Marketing — Patrick Durusau @ 6:52 pm

  • Informatica Corporation (NASDAQ: INFA), the world’s number one independent provider of data integration software, today announced the availability of a new research report from the Aberdeen Group that shows how organizations can get the most from their data integration assets in the face of rapidly growing data volumes and increasing data complexity.
  • Entitled Future Integration Needs: Embracing Complex Data, the Aberdeen report reveals that:
    • Big Data is the new reality – In 2010, organizations experienced a staggering average data volume growth of 40 percent.
    • XML adoption has increased dramatically – XML is the most common semi-structured data source that organizations integrate. 74 percent of organizations are integrating XML from external sources. 66 percent of organizations are integrating XML from internal sources.
    • Data complexity is skyrocketing – In the next 12 months enterprises plan to introduce more complex unstructured data sources – including office productivity documents, email, web content and social media data – than any other data type.
    • External data sources are proliferating – On average, organizations are integrating 14 external data sources, up from 11 a year ago.
    • Integration costs are rising – As integration of external data rises, it continues to be a labor- and cost-intensive task, with organizations integrating external sources spending 25 percent of their total integration budget in this area.
  • For example, according to Aberdeen, organizations that have effectively integrated complex data are able to:
    • Use up to 50 percent larger data sets for business intelligence and analytics.
    • Integrate external unstructured data into business processes twice as successfully (40 percent vs. 19 percent).
    • Deliver critical information in the required time window 2.5 times more often via automated data refresh.
    • Slash the incidence of errors in their data almost in half compared to organizations relying on manual intervention when performing data updates and refreshes.
    • Spend an average of 43 percent less on integration software (based on 2010 spend).
    • Develop integration competence more quickly, with significantly lower services and support expenditures, delivering business results at lower cost.

I like that 25% of data integration budgets is being spent on integrating external data. Imagine making that easier for enterprises with a topic map based service.

Maybe “Data as Service (DaaS)” will evolve from simple data delivery to dynamic integration of data from multiple sources, where currency, reliability, composition, and other features of the data are on a sliding scale of value.
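To make the idea concrete, here is a minimal sketch of what topic-map-style dynamic integration might look like: records from multiple external sources are merged when they share a subject identifier, and conflicting values are collected rather than silently overwritten. All names here (`merge_by_subject`, the sample sources) are hypothetical illustrations, not any existing product's API.

```python
def merge_by_subject(*sources):
    """Merge records (dicts) from several sources, keyed on 'subject_id'.

    Later sources fill in properties the merged record lacks; conflicting
    values are collected into a list so no source's data is discarded.
    """
    merged = {}
    for source in sources:
        for record in source:
            subject = record["subject_id"]
            target = merged.setdefault(subject, {"subject_id": subject})
            for key, value in record.items():
                if key == "subject_id":
                    continue
                if key not in target:
                    # New property: take it as-is.
                    target[key] = value
                elif target[key] != value:
                    # Conflict: keep both values instead of overwriting.
                    if not isinstance(target[key], list):
                        target[key] = [target[key]]
                    if value not in target[key]:
                        target[key].append(value)
    return merged

# Two hypothetical external sources describing the same subject.
crm = [{"subject_id": "acme", "name": "Acme Corp", "country": "US"}]
feed = [{"subject_id": "acme", "name": "ACME Corporation",
         "sector": "Manufacturing"}]

integrated = merge_by_subject(crm, feed)
```

A real service would also have to weigh the currency and reliability of each source when deciding which value wins, which is exactly where the sliding scale of value comes in.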
