Big data in minutes with the ELK Stack by Philippe Creux.
From the post:
We’ve built a data analysis and dashboarding infrastructure for one of our clients over the past few weeks. They collect about 10 million data points a day. Yes, that’s big data.
My highest priority was to allow them to browse the data they collect so that they can ensure that the data points are consistent and contain all the attributes required to generate the reports and dashboards they need.
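That attribute check can be sketched in a few lines. The field names below (`timestamp`, `event_type`, `account_id`) are hypothetical; the post does not list the client's actual attributes, so this is only a minimal illustration of validating data points before they reach reports and dashboards:

```python
# Hypothetical required attributes -- the post does not name the real ones.
REQUIRED_FIELDS = {"timestamp", "event_type", "account_id"}

def missing_fields(point: dict) -> set:
    """Return the required fields absent from a single data point."""
    return REQUIRED_FIELDS - point.keys()

# Two sample data points, the second deliberately incomplete.
points = [
    {"timestamp": "2014-02-20T12:00:00Z", "event_type": "click", "account_id": 7},
    {"timestamp": "2014-02-20T12:00:01Z", "event_type": "click"},
]

# Collect (index, missing-fields) pairs for every inconsistent point.
bad = [(i, missing_fields(p)) for i, p in enumerate(points) if missing_fields(p)]
print(bad)  # → [(1, {'account_id'})]
```

At 10 million points a day, a check like this would typically run inside the ingestion pipeline (e.g. as a filter stage before indexing) rather than as a separate pass.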
Is it just me or does processing “big data” seem to have gotten easier over the past several years?
But however easy or hard the processing, the value-add question remains: what do we know after processing the data that we didn't know before?