Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

September 26, 2011

The HasGP user manual

Filed under: Functional Programming,Gaussian Processes,Haskell — Patrick Durusau @ 6:58 pm

The HasGP user manual (pdf)

Abstract:

HasGP is an experimental library implementing methods for supervised learning using Gaussian process (GP) inference, in both the regression and classification settings. It has been developed in the functional language Haskell as an investigation into whether the well-known advantages of the functional paradigm can be exploited in the field of machine learning, which traditionally has been dominated by the procedural/object-oriented approach, particularly involving C/C++ and Matlab. HasGP is open-source software released under the GPL3 license. This manual provides a short introduction on how to install the library and how to apply it to supervised learning problems. It also provides some more in-depth information on the implementation of the library, which is aimed at developers. In the latter, we also show how some of the specific functional features of Haskell, in particular the ability to treat functions as first-class objects, and the use of typeclasses and monads, have informed the design of the library. This manual applies to HasGP version 0.1, which is the initial release of the library.

HasGP website

What a nice surprise for a Monday morning, something new and different (not the same thing). Just scanning the pages before a conference call, I would say you need to both read this and forward it to your Haskell/Gaussian friends.

It comes with demo programs. This is release 0.1, so it will be interesting to see what the future holds.

The project does need a mailing list so users can easily discuss their experiences, suggestions, etc. (One may already exist but isn’t apparent from the project webpage. If so, apologies.)
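
The abstract's point about first-class functions is worth seeing in miniature. Here is a small, self-contained sketch (my own illustration, not HasGP's actual API) of treating covariance functions as ordinary Haskell values that can be built, combined, and handed to generic code:

-- A covariance function is just a value of type (Input -> Input -> Double),
-- so kernels can be constructed, combined, and passed around like any
-- other value. (Illustration only; HasGP's own types will differ.)

type Input = [Double]
type Kernel = Input -> Input -> Double

-- Squared-exponential (RBF) kernel with signal variance s2 and length scale l.
sqExp :: Double -> Double -> Kernel
sqExp s2 l x x' = s2 * exp (negate (r2 / (2 * l * l)))
  where r2 = sum [ (a - b) ^ (2 :: Int) | (a, b) <- zip x x' ]

-- Sums and products of kernels are again kernels.
addK, mulK :: Kernel -> Kernel -> Kernel
addK k1 k2 x x' = k1 x x' + k2 x x'
mulK k1 k2 x x' = k1 x x' * k2 x x'

-- Gram matrix of a kernel over a list of inputs, as nested lists.
gram :: Kernel -> [Input] -> [[Double]]
gram k xs = [ [ k xi xj | xj <- xs ] | xi <- xs ]

main :: IO ()
main = print (gram (addK (sqExp 1.0 0.5) (sqExp 0.1 2.0)) [[0], [1], [2]])

HasGP itself of course has its own kernel representations and numerics; the sketch is only meant to show why the functional paradigm is a natural fit for kernel machines.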

March 6, 2011

Gaussian Processes for Machine Learning

Filed under: Algorithms,Gaussian Processes,Machine Learning — Patrick Durusau @ 3:31 pm

Gaussian Processes for Machine Learning

Complete text of:

Gaussian Processes for Machine Learning, Carl Edward Rasmussen and Christopher K. I. Williams, MIT Press, 2006. ISBN-10 0-262-18253-X, ISBN-13 978-0-262-18253-9.

I like the quote from James Clerk Maxwell that goes:

The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man’s mind.

Interesting. Is our identification of subjects probabilistic or is our identification of what we thought others meant probabilistic?

Or both? Neither?

From the preface:

Over the last decade there has been an explosion of work in the “kernel machines” area of machine learning. Probably the best known example of this is work on support vector machines, but during this period there has also been much activity concerning the application of Gaussian process models to machine learning tasks. The goal of this book is to provide a systematic and unified treatment of this area. Gaussian processes provide a principled, practical, probabilistic approach to learning in kernel machines. This gives advantages with respect to the interpretation of model predictions and provides a well-founded framework for learning and model selection. Theoretical and practical developments over the last decade have made Gaussian processes a serious competitor for real supervised learning applications.
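
For readers new to the area, the two predictive equations that anchor the book's regression chapter are worth having in front of you (my transcription of the standard results; see Chapter 2 for the book's exact notation and derivation):

% Notation: K is the n-by-n Gram matrix with K_ij = k(x_i, x_j),
% k_* is the vector of covariances k(x_*, x_i) between the test point
% and the training inputs, y the training targets, and sigma_n^2 the
% observation-noise variance.
\[
\begin{aligned}
\bar{f}_*       &= \mathbf{k}_*^{\top}\,(K + \sigma_n^{2} I)^{-1}\,\mathbf{y},\\
\mathbb{V}[f_*] &= k(\mathbf{x}_*,\mathbf{x}_*) - \mathbf{k}_*^{\top}\,(K + \sigma_n^{2} I)^{-1}\,\mathbf{k}_*.
\end{aligned}
\]

Much of the rest of the book elaborates on the choices hidden in these two lines: which kernel k to use, how to set its hyperparameters, and what to do when the likelihood is no longer Gaussian (as in classification).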

I am downloading the PDF version but have just ordered a copy from Amazon.

If you want to encourage MIT Press and other publishers to put materials online as well as in print, order a copy of this and other online materials.

Saying online copies don’t hurt print sales isn’t as convincing as hearing the cash register go “cha-ching!”

(I would also drop a note to the press saying you bought a print copy of a book they put online.)

December 21, 2010

Bayesian inference and Gaussian processes – In six (6) parts

Filed under: Bayesian Models,Gaussian Processes — Patrick Durusau @ 4:45 pm

Bayesian inference and Gaussian processes

Author: Carl Edward Rasmussen

Quite useful, as the presenter concludes by disambiguating terminology that is used differently across the field. The same terms are used to mean different things, and different terms to mean the same thing. Hmmm, that sounds really familiar. 😉

Start with this lecture before the Dirichlet Processes: Tutorial and Practical Course

BTW, if this seems a bit AI-ish, consider it the reverse of supervised classification (person helps machine): here the machine helps the person, but the person should say when the answer is correct.
