You Don’t Have to Be Google to Build an Artificial Brain by Cade Metz.
From the post:
When Google used 16,000 machines to build a simulated brain that could correctly identify cats in YouTube videos, it signaled a turning point in the art of artificial intelligence.
Applying its massive cluster of computers to an emerging breed of AI algorithm known as “deep learning,” the so-called Google brain was twice as accurate as any previous system in recognizing objects pictured in digital images, and it was hailed as another triumph for the mega data centers erected by the kings of the web.
…
But in the middle of this revolution, a researcher named Alex Krizhevsky showed that you don’t need a massive computer cluster to benefit from this technology’s unique ability to “train itself” as it analyzes digital data. As described in a paper published later that same year, he outperformed Google’s 16,000-machine cluster with a single computer—at least on one particular image recognition test.
This was a rather expensive computer, equipped with large amounts of memory and two top-of-the-line cards packed with myriad GPUs, a specialized breed of computer chip that allows the machine to behave like many. But it was a single machine nonetheless, and it showed that you didn’t need a Google-like computing cluster to exploit the power of deep learning.
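The "train itself" ability the excerpt describes is just iterative weight adjustment: a network makes a prediction, measures its error, and nudges its own parameters to shrink that error. The sketch below is an illustration of that idea only, not Krizhevsky's actual system (which used custom GPU code at vastly larger scale); it trains a tiny two-layer network on the XOR problem with plain NumPy, and the layer sizes and learning rate are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a problem a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two weight matrices: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(2, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(5000):
    # Forward pass: compute the network's current prediction.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: propagate the error back through both layers
    # and adjust the weights by gradient descent -- this is the
    # "training itself" step.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 1.0 * (h.T @ d_out)
    W1 -= 1.0 * (X.T @ d_h)

print(f"mean squared error: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The same matrix multiplications dominate real deep learning workloads, which is exactly why GPU cards help: they execute those multiplications across many cores at once.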
…
Cade’s article should encourage you to do two things:
- Learn GPUs cold
- Ditto on Deep Learning
Google and others will always have more raw processing power than any system you are likely to afford. However, while a steam shovel can move a lot of clay, it takes a real expert to make a vase, particularly a very good one.
Do you want to pine for a steam shovel or work towards creating a fine vase?
PS: Google isn’t building “an artificial brain,” not anywhere close. That’s why all their designers, programmers and engineers are wetware.