Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

April 9, 2014

Learning Lisp With C

Filed under: C/C++,Functional Programming,Lisp,Programming — Patrick Durusau @ 12:53 pm

Build Your Own Lisp by Daniel Holden.

From the webpage:

If you’re looking to learn C, or you’ve ever wondered how to build your own programming language, this is the book for you.

In just a few lines of code, I’ll teach you how to effectively use C, and what it takes to start building your own language.

Along the way we’ll learn about the weird and wonderful nature of Lisps, and what really makes a programming language. By building a real world C program we’ll learn implicit things that conventional books cannot teach. How to develop a project, how to make life easy for your users, and how to write beautiful code.

This book is free to read online. Get started now!

Read Online!

This looks interesting and useful.

Enjoy!


February 4, 2014

Data Structures in Clojure:…

Filed under: Clojure,Data Structures,Lisp,Programming — Patrick Durusau @ 9:23 pm

Data Structures in Clojure: Singly-Linked Lists by Max Countryman.

From the post:

This series is about the implementation of common data structures. Throughout the series we will be implementing these data structures ourselves, exploring how they work. Our implementations will be done in Clojure. Consequently this tutorial is also about Lisp and Clojure. It is important to note that, unlike the standard collection of data structures found in Clojure, which are persistent, ours will be mutable.

To start with, we will explore a linked list implementation using deftype (more on this later) to define a JVM class from Clojure-land. This implementation will be expanded to include in-place reversal. Finally we will utilize Clojure’s built-in interfaces to give our linked list access to some of the methods Clojure provides.
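The post builds its mutable list in Clojure with deftype; as a rough sketch of the same idea (a node-based singly-linked list with in-place reversal), here is a Python analogue. The class and function names are my own, not Countryman's:

```python
# A mutable singly-linked list with in-place reversal -- a Python
# analogue of the deftype-based Clojure implementation in the post.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def from_list(items):
    """Build a linked list from a Python list; return the head node."""
    head = None
    for item in reversed(items):
        head = Node(item, head)
    return head

def reverse_in_place(head):
    """Reverse the list by re-pointing each node's next link."""
    prev = None
    while head is not None:
        head.next, prev, head = prev, head, head.next
    return prev

def to_list(head):
    """Walk the nodes back into a Python list."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

print(to_list(reverse_in_place(from_list([1, 2, 3, 4]))))  # [4, 3, 2, 1]
```

The reversal mutates the existing nodes rather than allocating new ones, which is exactly the departure from Clojure's persistent collections that the excerpt flags.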

If you aren’t going to invent your own computer language, why not learn an existing one better?

The next post is on hash tables.

Enjoy!

November 14, 2013

Something Very Big Is Coming… [Wolfram Language]

Filed under: Homoiconic,Lisp,Mathematica,Wolfram Language — Patrick Durusau @ 10:51 am

Something Very Big Is Coming: Our Most Important Technology Project Yet by Stephen Wolfram.

From the post:

Computational knowledge. Symbolic programming. Algorithm automation. Dynamic interactivity. Natural language. Computable documents. The cloud. Connected devices. Symbolic ontology. Algorithm discovery. These are all things we’ve been energetically working on—mostly for years—in the context of Wolfram|Alpha, Mathematica, CDF and so on.

But recently something amazing has happened. We’ve figured out how to take all these threads, and all the technology we’ve built, to create something at a whole different level. The power of what is emerging continues to surprise me. But already I think it’s clear that it’s going to be profoundly important in the technological world, and beyond.

At some level it’s a vast unified web of technology that builds on what we’ve created over the past quarter century. At some level it’s an intellectual structure that actualizes a new computational view of the world. And at some level it’s a practical system and framework that’s going to be a fount of incredibly useful new services and products.

A crucial building block of all this is what we’re calling the Wolfram Language.

In a sense, the Wolfram Language has been incubating inside Mathematica for more than 25 years. It’s the language of Mathematica, and CDF—and the language used to implement Wolfram|Alpha. But now—considerably extended, and unified with the knowledgebase of Wolfram|Alpha—it’s about to emerge on its own, ready to be at the center of a remarkable constellation of new developments.

We call it the Wolfram Language because it is a language. But it’s a new and different kind of language. It’s a general-purpose knowledge-based language. That covers all forms of computing, in a new way.

There are plenty of existing general-purpose computer languages. But their vision is very different—and in a sense much more modest—than the Wolfram Language. They concentrate on managing the structure of programs, keeping the language itself small in scope, and relying on a web of external libraries for additional functionality. In the Wolfram Language my concept from the very beginning has been to create a single tightly integrated system in which as much as possible is included right in the language itself.

And so in the Wolfram Language, built right into the language, are capabilities for laying out graphs or doing image processing or creating user interfaces or whatever. Inside there’s a giant web of algorithms—by far the largest ever assembled, and many invented by us. And there are then thousands of carefully designed functions set up to use these algorithms to perform operations as automatically as possible.

It’s not possible to evaluate the claims that Stephen makes in this post without access to the Wolfram Language.

But, given his track record, I do think it is important that people across CS begin to prepare to evaluate it upon release.

For example, Stephen says:

In most languages there’s a sharp distinction between programs, and data, and the output of programs. Not so in the Wolfram Language. It’s all completely fluid. Data becomes algorithmic. Algorithms become data. There’s no distinction needed between code and data. And everything becomes both intrinsically scriptable, and intrinsically interactive. And there’s both a new level of interoperability, and a new level of modularity.

Languages that don’t distinguish between programs and data are called homoiconic languages.

One example of a homoiconic language is Lisp, first specified in 1958.

I would not call homoiconicity a “new” development, particularly with a homoiconic language from 1958.
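As a toy illustration of what homoiconicity buys you (my own sketch, in Python rather than any of the languages discussed): if programs are stored as ordinary nested lists, the same list operations that transform data can rewrite programs before they are evaluated.

```python
# Programs as nested lists, Lisp-style: ["+", 1, ["*", 2, 3]] is both
# a data structure we can transform and a program we can evaluate.
OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def evaluate(expr):
    """Evaluate a nested-list expression; atoms evaluate to themselves."""
    if not isinstance(expr, list):
        return expr
    op, *args = expr
    return OPS[op](*[evaluate(a) for a in args])

def swap_ops(expr):
    """Rewrite the program with plain list processing: + becomes *, * becomes +."""
    if not isinstance(expr, list):
        return expr
    op, *args = expr
    return [{"+": "*", "*": "+"}[op]] + [swap_ops(a) for a in args]

program = ["+", 1, ["*", 2, 3]]
print(evaluate(program))            # 7
print(evaluate(swap_ops(program)))  # 1 * (2 + 3) = 5
```

`swap_ops` never knows it is handling "code"; it is just recursing over lists. That is the fluidity between data and algorithms the Wolfram quote describes, and it has been a Lisp staple since 1958.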

Still, I have signed up for early notice of the Wolfram Language release and suggest you do the same.

October 15, 2013

Diving into Clojure

Filed under: Clojure,Lisp,Programming — Patrick Durusau @ 7:44 pm

Diving into Clojure

A collection of Clojure resources focused on “people who want to start learning Clojure.”

There are a number of such collections on the Net.

It occurs to me that it would be interesting to mine a data set like Common Crawl for Clojure resources.

Deduping the results but retaining the number of references to each resource for ranking purposes.

That could be a useful resource.

Particularly if all the cited resources were retrieved, indexed, mapped and burned to a DVD as conference swag.
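The dedupe-and-rank step is simple to sketch. The URLs below are placeholder data standing in for links extracted from crawled pages, not actual crawl results:

```python
# Dedupe extracted resource links while keeping a reference count
# for ranking -- the Common Crawl mining idea sketched above.
from collections import Counter

# Hypothetical links pulled out of crawled pages.
links = [
    "http://clojure.org/",
    "http://clojuredocs.org/",
    "http://clojure.org/",
    "http://4clojure.com/",
    "http://clojure.org/",
]

counts = Counter(links)
ranked = counts.most_common()  # deduped, most-cited resources first
for url, n in ranked:
    print(n, url)
```

`Counter` does the deduplication and the reference counting in one pass, so ranking falls out for free.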

August 28, 2013

Casting SPELs In LISP

Filed under: Clojure,Lisp — Patrick Durusau @ 2:40 pm

Casting SPELs In LISP by Conrad Barski, M.D.

From the homepage:

Anyone who has ever learned to program in Lisp will tell you it is very different from any other programming language. It is different in lots of surprising ways- This comic book will let you find out how Lisp’s unique design makes it so powerful!

There are other versions: Emacs Lisp, Clojure, and a Turkish translation.

Understand I am just taking Dr. Barski’s word for the Turkish version being the same as the original text. I don’t read Turkish.

If you prefer playful ways to learn a computer language, this should be a winner for you!

June 1, 2013

An introduction to Emacs Lisp

Filed under: Authoring Topic Maps,Editor,Lisp — Patrick Durusau @ 10:30 am

An introduction to Emacs Lisp by Christian Johansen.

From the webpage:

As a long-time passionate Emacs user, I’ve been curious about Lisp in general and Emacs Lisp in particular for quite some time. Until recently I had not written any Lisp apart from my .emacs.d setup, despite having read both An introduction to programming in Emacs Lisp and The Little Schemer last summer. A year later, I have finally written some Lisp, and I thought I’d share the code as an introduction to others out there curious about Lisp and extending Emacs.

(…)

The Task

The task I set out to solve was to make Emacs slightly more intelligent when working with tests written in Buster.JS, which is a test framework for JavaScript I’m working on with August Lilleaas. In particular I wanted Emacs to help me with Buster’s concept of deferred tests.

Yesterday a graph programmer suggested to me that some people program in Lisp but the whole world uses Java.

Of course, most of the world is functionally illiterate too but I don’t take that as an argument for illiteracy.

Not to cast aspersions on Java, a great deal of excellent work is done in Java. (See the many Apache projects that use Java.)

But counting noses is a lemming measure, which is not related to the pros or cons of any particular language.

What topic map authoring tasks would you extend Emacs to facilitate?

I first saw this in Christophe Lalanne’s A bag of tweets / May 2013.

April 12, 2013

NLTK 2.3 – Working with Wordnet

Filed under: Lisp,Natural Language Processing,NLTK,WordNet — Patrick Durusau @ 3:38 pm

NLTK 2.3 – Working with Wordnet by Vsevolod Dyomkin.

From the post:

I’m a little bit behind my schedule of implementing NLTK examples in Lisp with no posts on topic in March. It doesn’t mean that work on CL-NLP has stopped – I’ve just had an unexpected vacation and also worked on parts, related to writing programs for the excellent Natural Language Processing by Michael Collins Coursera course.

Today we’ll start looking at Chapter 2, but we’ll do it from the end, first exploring the topic of Wordnet.

Vsevolod more than makes up for his absence with his post on Wordnet.

As a sample, consider this graphic of the potential of Wordnet:

[Image: WordNet schema diagram]

Pay particular attention to the coverage of similarity measures.
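WordNet's path similarity scores two synsets as 1/(1 + length of the shortest path between them in the hypernym graph). The toy graph and function below are mine, not from the post; they only illustrate the shape of the measure:

```python
# WordNet-style path similarity over a toy hypernym graph.
from collections import deque

# Child -> parents; we treat the edges as undirected for path finding.
edges = {
    "dog": ["canine"], "cat": ["feline"],
    "canine": ["carnivore"], "feline": ["carnivore"],
    "carnivore": ["mammal"],
}
graph = {}
for child, parents in edges.items():
    for p in parents:
        graph.setdefault(child, set()).add(p)
        graph.setdefault(p, set()).add(child)

def path_similarity(a, b):
    """Return 1 / (1 + shortest path length), 0.0 if unreachable."""
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == b:
            return 1 / (1 + dist)
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append((nbr, dist + 1))
    return 0.0

print(path_similarity("dog", "cat"))  # dog-canine-carnivore-feline-cat: 1/5 = 0.2
```

Other WordNet measures (Wu-Palmer, Leacock-Chodorow, and the information-content family) refine this same idea of distance in the hypernym hierarchy.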

Enjoy!

March 19, 2013

AI Algorithms, Data Structures, and Idioms…

Filed under: Algorithms,Artificial Intelligence,Data Structures,Java,Lisp,Prolog — Patrick Durusau @ 10:51 am

AI Algorithms, Data Structures, and Idioms in Prolog, Lisp and Java by George F. Luger and William A. Stubblefield.

From the introduction:

Writing a book about designing and implementing representations and search algorithms in Prolog, Lisp, and Java presents the authors with a number of exciting opportunities.

The first opportunity is the chance to compare three languages that give very different expression to the many ideas that have shaped the evolution of programming languages as a whole. These core ideas, which also support modern AI technology, include functional programming, list processing, predicate logic, declarative representation, dynamic binding, meta-linguistic abstraction, strong-typing, meta-circular definition, and object-oriented design and programming. Lisp and Prolog are, of course, widely recognized for their contributions to the evolution, theory, and practice of programming language design. Java, the youngest of this trio, is both an example of how the ideas pioneered in these earlier languages have shaped modern applicative programming, as well as a powerful tool for delivering AI applications on personal computers, local networks, and the world wide web.

Where could you go wrong with comparing Prolog, Lisp and Java?

Either for the intellectual exercise or because you want a better understanding of AI, a resource to enjoy!

March 6, 2013

NLTK 1.3 – Computing with Language: Simple Statistics

Filed under: Lisp,Natural Language Processing,NLTK — Patrick Durusau @ 11:20 am

NLTK 1.3 – Computing with Language: Simple Statistics by Vsevolod Dyomkin.

From the post:

Most of the remaining parts of the first chapter of NLTK book serve as an introduction to Python in the context of text processing. I won’t translate that to Lisp, because there’re much better resources explaining how to use Lisp properly. First and foremost I’d refer anyone interested to the appropriate chapters of Practical Common Lisp:

List Processing
Collections
Variables
Macros: Standard Control Constructs

It’s only worth noting that Lisp has a different notion of lists, than Python. Lisp’s lists are linked lists, while Python’s are essentially vectors. Lisp also has vectors as a separate data-structure, and it also has multidimensional arrays (something Python mostly lacks). And the set of Lisp’s list operations is somewhat different from Python’s. List is the default sequence data-structure, but you should understand its limitations and know, when to switch to vectors (when you will have a lot of elements and often access them at random). Also Lisp doesn’t provide Python-style syntactic sugar for slicing and dicing lists, although all the operations are there in the form of functions. The only thing which isn’t easily reproducible in Lisp is assigning to a slice:
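The excerpt ends where a code sample appeared in the original post. As a stand-in (mine, not Dyomkin's), here is slice assignment in Python next to the rebuild-style alternative a Lisp would use:

```python
# Python lists are vectors, so a slice can be replaced in place --
# the operation the post says has no easy Lisp counterpart.
xs = [0, 1, 2, 3, 4, 5]

# Lisp-style: build a new sequence from pieces. In Common Lisp this
# would be functions like subseq and append, not syntactic sugar.
ys = xs[:1] + ["a", "b"] + xs[4:]

# Python-style: mutate the vector in place.
xs[1:4] = ["a", "b"]  # replace three elements with two

print(xs)  # [0, 'a', 'b', 4, 5]
print(xs == ys)  # True: same result, different mechanism
```

Both routes reach the same value; the difference is that the in-place version relies on the contiguous-vector representation Python lists have and Lisp's linked lists do not.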

Vsevolod continues his journey through chapter 1 of NLTK 1.3 focusing on the statistics (with examples).

March 4, 2013

NLTK 1.1 – Computing with Language: …

Filed under: Lisp,Natural Language Processing,NLTK — Patrick Durusau @ 3:56 pm

NLTK 1.1 – Computing with Language: Texts and Words by Vsevolod Dyomkin.

From the post:

OK, let’s get started with the NLTK book. Its first chapter tries to impress the reader with how simple it is to accomplish some neat things with texts using it. Actually, the underlying algorithms that allow to achieve these results are mostly quite basic. We’ll discuss them in this post and the code for the first part of the chapter can be found in nltk/ch1-1.lisp.

A continuation of Natural Language Meta Processing with Lisp.

Who knows? You might decide that Lisp is a natural language. 😉

February 24, 2013

The Rob Warnock Lisp Usenet Archive [Selling Topic Maps]

Filed under: Lisp,Programming — Patrick Durusau @ 7:50 pm

The Rob Warnock Lisp Usenet Archive by Zach Beane.

From the post:

I've been reading and enjoying comp.lang.lisp for over 10 years. I find it important to ignore the noise and seek out material from authors that clearly have something interesting and informative to say.

Rob Warnock has posted neat stuff for many years, both in comp.lang.lisp and comp.lang.scheme. After creating the Erik Naggum archive, Rob was next on my list of authors to archive. It took me a few years, but here it is: the Rob Warnock Lisp Usenet archive. It has 3,265 articles from comp.lang.lisp and comp.lang.scheme from 1995 to 2009, indexed and searchable. I hope it helps you find as many useful articles as I have over the years.

You can imagine my heartbreak when the Erik Naggum archive turned out to be for comp.lang.lisp. 😉

I think Zach’s point, that it is “important to ignore the noise and seek out material from authors that clearly have something interesting and informative to say,” is a clue to the difficulty of selling topic maps.

Who thinks that is important?

If I am being paid by the hour to sort through search engine results, what is my motivation to do it faster/better?

If I am managing hourly workers, who are doing the sorting of search engine results, won’t doing it faster reduce the payroll I manage?

If my department has the manager with hourly workers and the facilities to house them, what is my motivation for faster/better?

If my company/government agency has the department with the manager with hourly workers and facilities under contract, what is my motivation for faster/better?

If that helps identify who has no motivation for topic maps, who should be interested in topic maps?

I first saw this at Christophe Lalanne’s A bag of tweets / February 2013.

Natural Language Meta Processing with Lisp

Filed under: Lisp,Natural Language Processing — Patrick Durusau @ 5:51 pm

Natural Language Meta Processing with Lisp by Vsevolod Dyomkin.

From the post:

Recently I’ve started work on gathering and assembling a comprehensive suite of NLP tools for Lisp — CL-NLP. Something along the lines of OpenNLP or NLTK. There’s actually quite a lot of NLP tools in Lisp accumulated over the years, but they are scattered over various libraries, internet sites and books. I’d like to have them in one place with a clean and concise API which would provide easy startup point for anyone willing to do some NLP experiments or real work in Lisp. There’s already a couple of NLP libraries, most notably, langutils, but I don’t find them very well structured and also their development isn’t very active. So, I see real value in creating CL-NLP.

Besides, I’m currently reading the NLTK book. I thought that implementing the examples from the book in Lisp could be likewise a great introduction to NLP and to Lisp as it is an introduction to Python. So I’m going to work through them using CL-NLP toolset. I plan to cover 1 or 2 chapters per month. The goal is to implement pretty much everything meaningful, including the graphs — for them I’m going to use gnuplot driven by cgn of which I’ve learned answering questions on StackOverflow. 🙂 I’ll try to realize the examples just from the description — not looking at NLTK code — although, I reckon it will be necessary sometimes if the results won’t match. Also in the process I’m going to discuss different stuff re NLP, Lisp, Python, and NLTK — that’s why there’s “meta” in the title. 🙂

Just in case you haven’t found a self-improvement project for 2013! 😉

Seriously, this could be a real learning experience.

I first saw this at Christophe Lalanne’s A bag of tweets / February 2013.

February 17, 2013

Lisp lore : a guide to programming the Lisp machine (1986)

Filed under: Lisp,Programming — Patrick Durusau @ 4:09 pm

Lisp lore : a guide to programming the Lisp machine (1986) by Hank Bromley.

From the introduction:

The full 11-volume set of documentation that comes with a Symbolics lisp machine is understandably intimidating to the novice. “Where do I start?” is an oft-heard question, and one without a good answer. The eleven volumes provide an excellent reference medium, but are largely lacking in tutorial material suitable for a beginner. This book is intended to fill that gap. No claim is made for completeness of coverage — the eleven volumes fulfill that need. My goal is rather to present a readily grasped introduction to several representative areas of interest, including enough information to show how easy it is to build useful programs on the lisp machine. At the end of this course, the student should have a clear enough picture of what facilities exist on the machine to make effective use of the complete documentation, instead of being overwhelmed by it.

From the days when documentation was an expectation, not a luxury.

One starting place to decide if the ideas in a patent application are “new” or invented before a patent examiner went to college. 😉


I first saw this at Christophe Lalanne’s “A bag of tweets / January 2013.”

November 26, 2012

BigData using Erlang, C and Lisp to Fight the Tsunami of Mobile Data

Filed under: BigData,Erlang,Lisp — Patrick Durusau @ 7:23 pm

BigData using Erlang, C and Lisp to Fight the Tsunami of Mobile Data by Jon Vlachogiannis.

From the post:

BugSense, is an error-reporting and quality metrics service that tracks thousand of apps every day. When mobile apps crash, BugSense helps developers pinpoint and fix the problem. The startup delivers first-class service to its customers, which include VMWare, Samsung, Skype and thousands of independent app developers. Tracking more than 200M devices requires fast, fault tolerant and cheap infrastructure.

The last six months, we’ve decided to use our BigData infrastructure, to provide the users with metrics about their apps performance and stability and let them know how the errors affect their user base and revenues.

We knew that our solution should be scalable from day one, because more than 4% of the smartphones out there, will start DDOSing us with data.

A number of lessons to consider if you want a system that scales.

October 21, 2012

7 John McCarthy Papers in 7 weeks – Prologue

Filed under: Artificial Intelligence,CS Lectures,Lisp — Patrick Durusau @ 6:28 pm

7 John McCarthy Papers in 7 weeks – Prologue by Carin Meier.

From the post:

In the spirit of Seven Languages in Seven Weeks, I have decided to embark on a quest. But instead of focusing on expanding my mindset with different programming languages, I am focusing on trying to get into the mindset of John McCarthy, father of LISP and AI, by reading and thinking about seven of his papers.

See Carin’s blog for progress so far.

I first saw this at John D. Cook’s The Endeavour.

How would you react to something similar for topic maps?

April 14, 2012

Procedural Reflection in Programming Languages Volume 1

Filed under: Lisp,Reflection,Scala — Patrick Durusau @ 6:28 pm

Procedural Reflection in Programming Languages Volume 1

Brian Cantwell Smith’s dissertation that is the base document for reflection in programming languages.

Abstract:

We show how a computational system can be constructed to “reason”, effectively and consequentially, about its own inferential processes. The analysis proceeds in two parts. First, we consider the general question of computational semantics, rejecting traditional approaches, and arguing that the declarative and procedural aspects of computational symbols (what they stand for, and what behaviour they engender) should be analysed independently, in order that they may be coherently related. Second, we investigate self-referential behaviour in computational processes, and show how to embed an effective procedural model of a computational calculus within that calculus (a model not unlike a meta-circular interpreter, but connected to the fundamental operations of the machine in such a way as to provide, at any point in a computation, fully articulated descriptions of the state of that computation, for inspection and possible modification). In terms of the theories that result from these investigations, we present a general architecture for procedurally reflective processes, able to shift smoothly between dealing with a given subject domain, and dealing with their own reasoning processes over that domain.

An instance of the general solution is worked out in the context of an applicative language. Specifically, we present three successive dialects of LISP: 1-LISP, a distillation of current practice, for comparison purposes; 2-LISP, a dialect constructed in terms of our rationalised semantics, in which the concept of elevation is rejected in favour of independent notions of simplification and reference, and in which the respective categories of notation, structure, semantics, and behaviour are strictly aligned; and 3-LISP, an extension of 2-LISP endowed with reflective powers. (Warning: Hand copied from an image PDF. Typing errors may have occurred.)
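3-LISP's "fully articulated descriptions of the state of that computation" go far beyond anything mainstream languages offer, but a faint analogy (my own, and only an analogy) is a Python function inspecting its own frame mid-computation:

```python
# A weak echo of procedural reflection: a running function obtains
# a description of its own computation state via the frame API.
import inspect

def accumulate(items):
    total = 0
    for item in items:
        total += item
        # Reflect: snapshot this very computation's local bindings.
        frame = inspect.currentframe()
        snapshot = dict(frame.f_locals)
        print("mid-run state: total =", snapshot["total"],
              "item =", snapshot["item"])
        del frame  # avoid a frame/locals reference cycle
    return total

print(accumulate([1, 2, 3]))  # 6
```

The crucial difference is that Smith's reflective tower lets the program not just inspect but modify its own inferential process at any point; `f_locals` here is read-only introspection.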

I think reflection as it is described here is very close to Newcomb’s notion of composite subject identities, which are themselves composed of composite subject identities.

Has me wondering what a general purpose identification language with reflection would look like?

December 6, 2011

Common Lisp is the best language to learn programming

Filed under: Authoring Topic Maps,Lisp,Programming,Topic Maps — Patrick Durusau @ 8:06 pm

Common Lisp is the best language to learn programming

From the post:

Now that Conrad Barski’s Land of Lisp (see my review on Slashdot) has come out, I definitely think Common Lisp is the best language for kids (or anyone else) to start learning computer programming.

Not trying to start a language war but am curious about two resources cited in this post:

Common Lisp HyperSpec

and,

Common Lisp the language, 2nd edition

My curiosity?

How would you map these two resources into a single topic map on Lisp?

Is there any third resource, perhaps the “Land of Lisp” that you would like to add?

Any blogs, mailing list posts, etc.?

Would that topic map be any different if you decided to add Scheme or Haskell to your topic map?

If this were a “learning lisp” resource for beginning programmers, how would you limit the amount of information exposed?

September 24, 2011

newLISP® for Mac OS X, GNU Linux, Unix and Win32

Filed under: Lisp,MapReduce — Patrick Durusau @ 6:59 pm

newLISP® for Mac OS X, GNU Linux, Unix and Win32

From the website:

newLISP is a Lisp-like, general-purpose scripting language. It is especially well-suited for applications in AI, web search, natural language processing, and machine learning. Because of its small resource requirements, newLISP is also excellent for embedded systems applications. Most of the functions you will ever need are already built in. This includes networking functions, support for distributed and parallel processing, and Bayesian statistics.

At version 10.3.3, newLISP says that it has over 350 functions and is about 200K in size.

Interesting that one of the demo applications written in 2007 is MapReduce.

Some posts on its mailing lists but I would not call them high traffic. 😉

September 1, 2011

Common Lisp HyperSpec

Filed under: Lisp — Patrick Durusau @ 5:59 pm

Common Lisp HyperSpec

A hypertext version of the Common Lisp standard, along with issues resolved in its making.

Since functional programming discussions make reference to Lisp fairly often, you might want to bookmark this site.

July 8, 2011

Beating the Averages

Filed under: Lisp — Patrick Durusau @ 3:55 pm

Beating the Averages

Great summer reading for anyone who wants a successful startup or simply to improve an ongoing software company. (although the latter is probably the harder of the two tasks)

A couple of quotes to get you interested in reading more:

But I think I can give a kind of argument that might be convincing. The source code of the Viaweb editor was probably about 20-25% macros. Macros are harder to write than ordinary Lisp functions, and it’s considered to be bad style to use them when they’re not necessary. So every macro in that code is there because it has to be. What that means is that at least 20-25% of the code in this program is doing things that you can’t easily do in any other language. However skeptical the Blub programmer might be about my claims for the mysterious powers of Lisp, this ought to make him curious. We weren’t writing this code for our own amusement. We were a tiny startup, programming as hard as we could in order to put technical barriers between us and our competitors.

A suspicious person might begin to wonder if there was some correlation here. A big chunk of our code was doing things that are very hard to do in other languages. The resulting software did things our competitors’ software couldn’t do. Maybe there was some kind of connection. I encourage you to follow that thread. There may be more to that old man hobbling along on his crutches than meets the eye.

And consider his advice on evaluating competitors:

If you ever do find yourself working for a startup, here’s a handy tip for evaluating competitors. Read their job listings. Everything else on their site may be stock photos or the prose equivalent, but the job listings have to be specific about what they want, or they’ll get the wrong candidates.

During the years we worked on Viaweb I read a lot of job descriptions. A new competitor seemed to emerge out of the woodwork every month or so. The first thing I would do, after checking to see if they had a live online demo, was look at their job listings. After a couple years of this I could tell which companies to worry about and which not to. The more of an IT flavor the job descriptions had, the less dangerous the company was. The safest kind were the ones that wanted Oracle experience. You never had to worry about those. You were also safe if they said they wanted C++ or Java developers. If they wanted Perl or Python programmers, that would be a bit frightening– that’s starting to sound like a company where the technical side, at least, is run by real hackers. If I had ever seen a job posting looking for Lisp hackers, I would have been really worried.

This was written in 2003.

How would you update his advice on evaluating job descriptions at other startups?

April 23, 2011

Structure and Interpretation of Computer Programs

Filed under: Lisp — Patrick Durusau @ 8:21 pm

Structure and Interpretation of Computer Programs

From the website:

Structure and Interpretation of Computer Programs has been MIT’s introductory pre-professional computer science subject since 1981. It emphasizes the role of computer languages as vehicles for expressing knowledge and it presents basic principles of abstraction and modularity, together with essential techniques for designing and implementing computer languages. This course has had a worldwide impact on computer science curricula over the past two decades. The accompanying textbook by Hal Abelson, Gerald Jay Sussman, and Julie Sussman is available for purchase from the MIT Press, which also provides a freely available on-line version of the complete textbook.

These twenty video lectures by Hal Abelson and Gerald Jay Sussman are a complete presentation of the course, given in July 1986 for Hewlett-Packard employees, and professionally produced by Hewlett-Packard Television. The videos have been used extensively in corporate training at Hewlett-Packard and other companies, as well as at several universities and in MIT short courses for industry.

An introduction to computer programming, Lisp and the art of teaching the same.

April 19, 2011

The Lisp Curse

Filed under: Lisp,Marketing,Semantic Diversity — Patrick Durusau @ 9:48 am

The Lisp Curse by Rudolf Winestock begins:

This essay is yet another attempt to reconcile the power of the Lisp programming language with the inability of the Lisp community to reproduce their pre-AI Winter achievements. Without doubt, Lisp has been an influential source of ideas even during its time of retreat. That fact, plus the brilliance of the different Lisp Machine architectures, and the current Lisp renaissance after more than a decade in the wilderness demonstrate that Lisp partisans must have some justification for their smugness. Nevertheless, they have not been able to translate the power of Lisp into a movement with overpowering momentum.

In this essay, I argue that Lisp’s expressive power is actually a cause of its lack of momentum.

Read the essay, then come back here. I’ll wait.

… … … …

OK, good read, yes?

At first blush, I thought about HyTime and its expressiveness. Or of topic maps. Could there be a parallel?

But non-Lisp software projects proliferate.

Let’s use http://sourceforge.net for examples.

Total projects for the database category – 906.

How many were written using Lisp?

Lisp 1

Compared to:

Java 282
C++ 106
PHP 298
Total: 686

That may not be fair.

Databases may not attract AI/Lisp programmers.

What about artificial intelligence?

Lisp 8
Scheme 3
Total: 11

Compared to:

Java 115
C++ 111
C 42
Total: 268

Does that mean that Java, C++ and C are too expressive?

Or that their expressiveness has retarded their progress in some way?

Or is some other factor responsible for the proliferation of projects?

And a proliferation of semantics.

*****
Correction: I corrected sourceforge.org -> sourceforge.net and made it a hyperlink. Fortunately sourceforge silently redirects my mistake in entering the domain name in a browser.
