Convolutional Neural Nets in Net# by Alexey Kamenev.
From the post:
After introducing Net# in the previous post, we continue with our overview of the language and examples of convolutional neural nets or convnets.
Convnets have become very popular in recent years as they consistently produce great results on hard problems in computer vision, automatic speech recognition and various natural language processing tasks. In most such problems, the features have some geometric relationship, like pixels in an image or samples in an audio stream. An excellent introduction to convnets can be found here:
https://www.coursera.org/course/neuralnets (Lecture 5)
http://deeplearning.net/tutorial/lenet.html

Before we start discussing convnets, let’s introduce one definition that is important to understand when working with Net#. In a neural net structure, each trainable layer (a hidden or an output layer) has one or more connection bundles. A connection bundle consists of a source layer and a specification of the connections from that source layer. All the connections in a given bundle share the same source layer and the same destination layer. In Net#, a connection bundle is considered as belonging to the bundle’s destination layer. Net# supports various kinds of bundles like fully connected, convolutional, pooling and so on. A layer might have multiple bundles which connect it to different source layers.
…
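To make the bundle idea a bit more concrete, here is a minimal Net# sketch of my own (the layer names and shapes are illustrative, loosely patterned on the examples in the Net# guide, not taken from the post): each trainable layer declares its bundle with a from clause naming the source layer.

// Input image, 28 x 28 pixels (illustrative shape).
input Picture [28, 28];

// Convolutional bundle: the hidden layer C1 owns this bundle,
// and Picture is its source layer.
hidden C1 [5, 12, 12]
  from Picture convolve {
    InputShape  = [28, 28];
    KernelShape = [ 5,  5];
    Stride      = [ 2,  2];
    MapCount = 5;
  }

// Fully connected bundle: the output layer owns it, C1 is the source.
output Result [10] softmax
  from C1 all;

The all keyword gives a fully connected bundle and convolve a convolutional one; the guide linked below also covers pooling bundles and layers that combine several bundles from different source layers.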
BTW, the previous post was: Neural Nets in Azure ML – Introduction to Net#. Not exactly what I was expecting from the Net# reference.
Machine Learning Blog needs to be added to your RSS feed.
If you need more information: Guide to Net# neural network specification language.
Enjoy!
I first saw this in a tweet by Michael Cavaretta.