Hadoop: The Lay of the Land by Tom White.
From the post:
The core map-reduce framework for big data consists of several interlocking technologies. This first installment of our tutorial explains what Hadoop does and how the pieces fit together.
Big Data is in the news these days, and Apache Hadoop is one of the most popular platforms for working with Big Data. Hadoop itself is undergoing tremendous growth as new features and components are added, and for this reason alone, it can be difficult to know how to start working with it. In this three-part series, I explain what Hadoop is and how to use it, presenting simple, hands-on examples that you can try yourself. First, though, let’s look at the problem that Hadoop was designed to solve.
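To give a flavor of the kind of hands-on example the series works toward, here is a minimal sketch of the classic Hadoop WordCount job written against the standard org.apache.hadoop.mapreduce API. This is not taken from White's tutorial; it is the canonical example, and details such as Job.getInstance vary slightly across Hadoop versions.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in each input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // optional local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}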
Much later:
Tom White has been an Apache Hadoop committer since February 2007, and is a member of the Apache Software Foundation. He is an engineer at Cloudera, a company set up to offer Hadoop tools, support, and training. He is the author of the best-selling O’Reilly book, Hadoop: The Definitive Guide.
If you are getting started with Hadoop or need a good explanation to share with others, start here.
I first saw this at: Learn How To Hadoop from Tom White in Dr. Dobb’s by Justin Kestelyn.