Streaming Analytics with Sparse Distributed Representations, by Jeff Hawkins.
Sparse distributed representations appear to be the means by which brains encode information. They have several advantageous properties, including the ability to encode semantic meaning. We have created a distributed memory system for learning sequences of sparse distributed representations. In addition, we have created a means of encoding structured and unstructured data into sparse distributed representations. The resulting memory system learns in an on-line fashion, making it suitable for high-velocity data streams. We are currently applying it to commercially valuable data streams for prediction, classification, and anomaly detection. In this talk I will describe this distributed memory system and illustrate how it can be used to build models and make predictions from data streams.
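To make the abstract concrete: here is a toy scalar encoder in the spirit of what Jeff describes, mapping a number onto a sparse set of active bits. This is a simplified illustration, not Numenta's actual encoder; all names and parameters are invented.

```python
# Toy scalar encoder: maps a value in [lo, hi) to a sparse binary
# representation by activating a contiguous run of bits.
# Simplified illustration only -- not Numenta's real encoder.

N_BITS = 64   # total width of the representation
ACTIVE = 8    # number of active bits (sparsity = 8/64 = 12.5%)

def encode_scalar(value, lo=0.0, hi=100.0):
    """Return the set of active bit indices for `value`."""
    span = N_BITS - ACTIVE
    start = int((value - lo) / (hi - lo) * span)
    return set(range(start, start + ACTIVE))

# Nearby values share many active bits; distant values share few.
a = encode_scalar(20.0)
b = encode_scalar(22.0)
c = encode_scalar(80.0)
print(len(a & b))  # large overlap: similar values
print(len(a & c))  # little or no overlap: dissimilar values
```

The key property: similarity in the input space becomes bit overlap in the representation, which is what makes SDR matching meaningful at all.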
Looking forward to learning more about “sparse distributed representation (SDR).”
Not certain about Jeff’s claim that matching across SDRs equals semantic similarity. The design of the SDR determines the meaning of each bit, and consequently the meaning of a match, which in turn feeds back into the design of the encoders that produce the SDRs.
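A minimal sketch of that point, assuming SDRs are represented as Python sets of active bit indices. The bit assignments here are invented for illustration: overlap only means "semantically similar" if the encoder assigned those bits a shared meaning in the first place.

```python
# Overlap -- the number of shared active bits -- is the usual SDR
# similarity measure. Whether that overlap *is* semantic similarity
# depends entirely on what meaning the encoder gave each bit.

def overlap(sdr_a, sdr_b):
    """Number of active bits two SDRs share."""
    return len(sdr_a & sdr_b)

# Hypothetical encoder: bits 0-9 carry "animal" features,
# bits 10-19 carry "utensil" features (an invented assignment).
cat   = {0, 1, 2, 3, 4}
dog   = {0, 1, 2, 5, 6}       # shares "animal" bits with cat
spoon = {10, 11, 12, 13, 14}  # shares nothing with cat

print(overlap(cat, dog))    # 3 -- related concepts
print(overlap(cat, spoon))  # 0 -- unrelated concepts
```

If a different encoder had scattered "cat" and "dog" onto disjoint bits, the same overlap test would report them as unrelated, which is exactly why the encoder design determines what a match means.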
The core paper: Hierarchical Temporal Memory including HTM Cortical Learning Algorithms. Check the FAQ link if you need the paper in Chinese, Japanese, Korean, Portuguese, Russian, or Spanish (unverified translations).
A very good FAQ that goes a long way toward explaining the capabilities and current limitations of Grok. “Unstructured text,” for example, isn’t appropriate input for Grok.
Jeff Hawkins and Sandra Blakeslee co-authored On Intelligence in 2004. The FAQ describes the current work as an extension of “On Intelligence.”
BTW, if you think you have heard the name Jeff Hawkins before, you have: he is the inventor of the Palm Pilot, among other things.