Archive for the ‘Julia’ Category

The Matrix Cheatsheet

Tuesday, March 3rd, 2015

The Matrix Cheatsheet by Sebastian Raschka.

Sebastian has created a spreadsheet of thirty matrix tasks and compares the code for each in MATLAB/Octave, Python NumPy, R, and Julia.
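To give the flavor of the kind of one-liners such a cheatsheet covers, here is a short Julia sketch. These lines are illustrative, not copied from the spreadsheet itself; the spreadsheet gives the MATLAB/Octave, NumPy, and R equivalents side by side.

```julia
# A handful of common matrix tasks in Julia.
using LinearAlgebra

A = [1 2; 3 4]      # 2x2 matrix literal (MATLAB-like syntax)
B = A'              # transpose
C = A * B           # matrix multiplication
D = A .* A          # element-wise multiplication (note the dot)
s = size(A)         # dimensions, like MATLAB's size / R's dim
Ainv = inv(A)       # matrix inverse
```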

Given the prevalence of matrices in so many data science tasks, this can’t help but be useful.

A bit longer treatment can be found at: The Matrix Cookbook.

I first saw this in a tweet by Yhat, Inc.

Beginning deep learning with 500 lines of Julia

Monday, March 2nd, 2015

Beginning deep learning with 500 lines of Julia by Deniz Yuret.

From the post:

There are a number of deep learning packages out there. However most sacrifice readability for efficiency. This has two disadvantages: (1) It is difficult for a beginner student to understand what the code is doing, which is a shame because sometimes the code can be a lot simpler than the underlying math. (2) Every other day new ideas come out for optimization, regularization, etc. If the package used already has the trick implemented, great. But if not, it is difficult for a researcher to test the new idea using impenetrable code with a steep learning curve. So I started writing KUnet.jl which currently implements backprop with basic units like relu, standard loss functions like softmax, dropout for generalization, L1-L2 regularization, and optimization using SGD, momentum, ADAGRAD, Nesterov’s accelerated gradient etc. in less than 500 lines of Julia code. Its speed is competitive with the fastest GPU packages (here is a benchmark). For installation and usage information, please refer to the GitHub repo. The remainder of this post will present (a slightly cleaned up version of) the code as a beginner’s neural network tutorial (modeled after Honnibal’s excellent parsing example).
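The spirit of the package can be conveyed in a few lines. The sketch below is a toy dense layer with a relu activation and a forward pass; the names (`Layer`, `forw`, `relu`) are illustrative assumptions here, not KUnet.jl's actual code.

```julia
# A toy version of the kind of layer a minimal deep learning
# package implements: y = relu(W*x + b).

relu(x) = max(x, zero(x))

struct Layer
    w::Matrix{Float64}
    b::Vector{Float64}
end

# Forward pass: affine transform followed by element-wise relu.
forw(l::Layer, x::Vector{Float64}) = relu.(l.w * x .+ l.b)

l = Layer([1.0 -1.0; 0.5 0.5], [0.0, -1.0])
y = forw(l, [2.0, 1.0])
```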

This tutorial “begins” with you coding deep learning. If you need a bit more explanation on deep learning, you could do far worse than consulting Deep Learning: Methods and Applications or Deep Learning in Neural Networks: An Overview.

If you are already at the programming stage of deep learning, enjoy!

For Julia itself, the Julia homepage, the online manual, and the Julia blog aggregator should be enough to get you started.

I first saw this in a tweet by Andre Pemmelaar.

Wintel and Open Source

Thursday, November 13th, 2014

The software world is reverberating with the news that Microsoft is in the process of making .NET completely open source.

On the same day, Intel announced that it had released “Julia2C, a source-to-source translator from Julia to C.”

Hmmm, is this evidence that open source is a viable path for commercial vendors? 😉

Next Question: How long before non-open-source code becomes a liability? As in a nesting place for government surveillance/malware.

Speculation: Not as long as it took Wintel to move towards open source.

Consumers should demand open source code as a condition for purchase. All software, all the time.


Juno

Saturday, October 4th, 2014

Juno: Juno is a powerful, free environment for the Julia language.

From the about page:

Juno began as an attempt to provide basic support for Julia in Light Table. I’ve been working on it over the summer as part of Google Summer of Code, and as the project has evolved it’s come closer to providing a full IDE for Julia, with a particular focus on providing a good experience for beginners.

The Juno plugin itself is essentially a thin wrapper which provides nice defaults; the core functionality is provided in a bunch of packages and plugins:

  • Julia-LT – which provides the basic language support for Julia in Light Table
  • Jewel.jl – A Julia source code analysis and manipulation library for Julia
  • June – Nicer themes and font defaults for LT
  • Reminisce – Sublime-style saving of files and content for LT

In case you have forgotten about Julia:

Julia is a high-level, high-performance dynamic programming language for technical computing, with syntax that is familiar to users of other technical computing environments. It provides a sophisticated compiler, distributed parallel execution, numerical accuracy, and an extensive mathematical function library. The library, largely written in Julia itself, also integrates mature, best-of-breed C and Fortran libraries for linear algebra, random number generation, signal processing, and string processing. In addition, the Julia developer community is contributing a number of external packages through Julia’s built-in package manager at a rapid pace. IJulia, a collaboration between the IPython and Julia communities, provides a powerful browser-based graphical notebook interface to Julia.

Julia programs are organized around multiple dispatch; by defining functions and overloading them for different combinations of argument types, which can also be user-defined. For a more in-depth discussion of the rationale and advantages of Julia over other systems, see the following highlights or read the introduction in the online manual.
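A minimal illustration of multiple dispatch, with a user-defined type in the mix. The `Point` type and `combine` function below are my own example, not from the Julia manual; the point is that the method chosen depends on the types of all arguments.

```julia
# Multiple dispatch: define one generic function, overload it
# for different combinations of argument types.

struct Point
    x::Float64
    y::Float64
end

combine(a::Number, b::Number) = a + b
combine(a::Point, b::Point)   = Point(a.x + b.x, a.y + b.y)
combine(a::Point, b::Number)  = Point(a.x + b, a.y + b)

combine(1, 2)                          # dispatches to the Number/Number method
combine(Point(1, 2), Point(3, 4))      # dispatches to the Point/Point method
```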

Curious to see if this project will follow Light Table onto the next IDE project, Eve.

Exploring Calculus with Julia

Wednesday, August 27th, 2014

Exploring Calculus with Julia

From the post:

This is a collection of notes for exploring calculus concepts with the Julia programming language. Such an approach is used in MTH 229 at the College of Staten Island.

These notes are broken into different sections, where most all sections have some self-grading questions at the end that allow you to test your knowledge of that material. The code should be copy-and-pasteable into a julia session. The code output is similar to what would be shown if evaluated in an IJulia cell, our recommended interface while learning julia.

The notes mostly follow topics of a standard first-semester calculus course after some background material is presented for learning julia within a mathematical framework.
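To give the flavor of that approach, here is a short, copy-and-pasteable sketch of checking a first-semester calculus fact numerically with a central-difference derivative. This is my own illustration, not taken from the notes themselves.

```julia
# Approximate f'(x) with a central difference and compare it
# against the known derivative: the slope of sin at 0 is cos(0) = 1.

f(x) = sin(x)
fprime(f, x; h = 1e-6) = (f(x + h) - f(x - h)) / (2h)

fprime(f, 0.0)    # ≈ 1.0
```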

Another example of pedagogical technique.

Semantic disconnects are legion and not hard to find. However, what criteria would you use to select a set to be solved using topic maps?

Or perhaps better, before mentioning topic maps, how would you solve them so that the solution works up to being a topic map?

Either digitally or even with pencil and paper?

Thinking that getting people to internalize the value-add of topic maps before investing effort into syntax, etc. could be a successful way to promote them.

Julia: a new language for technical computing

Friday, April 13th, 2012

Julia: a new language for technical computing

From the post:

Julia is a new open-source language for high-performance technical computing, created by Jeff Bezanson, Stefan Karpinski, Viral Shah and Alan Edelman and first announced in February. Their motivation for creating a new language was, they say, “greed”:

We are power Matlab users. Some of us are Lisp hackers. Some are Pythonistas, others Rubyists, still others Perl hackers. There are those of us who used Mathematica before we could grow facial hair. There are those who still can’t grow facial hair. We’ve generated more R plots than any sane person should. C is our desert island programming language.

We love all of these languages; they are wonderful and powerful. For the work we do — scientific computing, machine learning, data mining, large-scale linear algebra, distributed and parallel computing — each one is perfect for some aspects of the work and terrible for others. Each one is a trade-off.

We are greedy: we want more.

Pointers to articles and a vocabulary comparison of Julia and R. This recalls the recent complaint that a user might know an operation in R but not in Julia, and my suggestion that a “lite” topic map application might be useful in that context.

An R programmer looks at Julia

Sunday, April 8th, 2012

An R programmer looks at Julia by Douglas Bates.

Douglas writes:

In January of this year I first saw mention of the Julia language in the release notes for LLVM. I mentioned this to Dirk Eddelbuettel and later we got in contact with Viral Shah regarding a Debian package for Julia.

There are many aspects of Julia that are quite intriguing to an R programmer. I am interested in programming languages for “Computing with Data”, in John Chambers’ term, or “Technical Computing”, as the authors of Julia classify it. I believe that learning a programming language is somewhat like learning a natural language in that you need to live with it and use it for a while before you feel comfortable with it and with the culture surrounding it.

A common complaint for those learning R is finding the name of the function to perform a particular task. In writing a bit of Julia code for fitting generalized linear models, as described below, I found myself in exactly the same position of having to search through documentation to find how to do something that I felt should be simple. The experience is frustrating but I don’t know of a way of avoiding it. One word of advice for R programmers looking at Julia, the names of most functions correspond to the Matlab/octave names, not the R names. One exception is the d-p-q-r functions for distributions, as I described in an earlier posting. [bold emphasis added in last paragraph]

Problem: Programming languages with different names for the same operation.

Suggestions anyone?

Do topic maps spring to mind?

Perhaps with select match language, select target language and auto-completion capabilities?

Unobtrusive window or pop-up for text entry of the name (or signature) in the match language, displaying the equivalent name/signature in the target language (would Hamming distance work here, or would names of differing lengths need Levenshtein distance?). Using XTM/CTM as the format would enable distributed (and yet interchangeable) construction of editorial artifacts for various programming languages.
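A minimal sketch of that lookup idea, assuming a tiny hand-built R-to-Julia table rather than real topic map data, and using Levenshtein distance (since function names differ in length, plain Hamming distance would not apply). Every name in the table is illustrative.

```julia
# Rank known R function names by edit distance to a partially
# typed name, then show the Julia equivalent of the best match.

const r_to_julia = Dict("dim" => "size", "nrow" => "size(A, 1)",
                        "t" => "transpose", "solve" => "inv")

function levenshtein(a::AbstractString, b::AbstractString)
    m, n = length(a), length(b)
    d = zeros(Int, m + 1, n + 1)
    d[:, 1] = 0:m                      # cost of deleting all of a
    d[1, :] = 0:n                      # cost of inserting all of b
    for j in 1:n, i in 1:m
        cost = a[i] == b[j] ? 0 : 1
        d[i+1, j+1] = min(d[i, j+1] + 1, d[i+1, j] + 1, d[i, j] + cost)
    end
    d[m+1, n+1]
end

# Candidates sorted by distance; the head of the list would drive
# an auto-completion display.
nearest(name) = sort(collect(keys(r_to_julia)); by = k -> levenshtein(name, k))

r_to_julia[nearest("nro")[1]]    # the Julia equivalent of the closest R name
```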

Not the path to world domination or peace, but on the other hand, it would be useful.