
A small, well-balanced set of carefully selected ideas.

Zen and the Art of Symbolics Common Lisp.

Let's say that Lisp is very different because it was based on "good ideas", well researched in "good places" like the MIT AI and CS labs (no punks could get there). It has something to do with standing on the shoulders of Titans. The set of selected "good ideas" is intentionally kept small and well-balanced. Let's take a walk.

The "layers of DSLs" metaphor mimics as close as possible the way our mind uses a language, which we call "verbal", or "logical" mind, as opposed to "non-verbal, ancient, emotional" mind, which is based on evolved reflexes, captured by genes (Sorry!).

We use symbols to refer to the inner representations (concepts) we have in our minds. So "everything is a symbol" (just a reference to a "storage") is a "good idea".
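
To make the "reference to a storage" point concrete, here is a minimal Common Lisp sketch (the names *answer* and answer are made up for illustration): a symbol is a first-class object with a value cell, a function cell and a property list.

    (defvar *answer* 42)                ; the symbol *ANSWER* now refers to a value
    (symbol-value '*answer*)            ; => 42
    (defun answer () *answer*)          ; the symbol ANSWER also has a function cell
    (symbol-function 'answer)           ; => #<FUNCTION ANSWER>
    (setf (get 'answer 'documented) t)  ; and a property list for arbitrary metadata
    (get 'answer 'documented)           ; => T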

When we are talking about some specific context (a situation, or "domain") we tend to use a reduced, appropriate set of words (symbols) related to it, along with the usual "glue". This is what we call a Domain Specific Language, or a slang, if you wish. This is also a "good idea": groups have slangs.

A layered structure is, in some sense, how Nature itself is organized. Atoms, molecules, proteins, tissues, organs, systems, brain, body, you see. So, layers of data is a good idea, but layers of DSLs is an even better one. It mimics not only how the world is structured, but also how we should structure our programs.

Neither the language nor its layers are set in stone; they could be transformed, extended, and adapted using the very same mechanism which underlies them. Iterative looping constructs, for example, were built out of macros, and the LOOP DSL is the most striking example of how a language could be extended with itself as a meta-language.
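
A hedged sketch of how an iteration construct can be grown out of macros: WHILE is not part of standard Common Lisp, but defining it on top of the standard DO takes three lines.

    (defmacro while (test &body body)
      "A non-standard iteration construct, built from the standard DO macro."
      `(do ()
           ((not ,test))
         ,@body))

    ;; Usage:
    (let ((i 0))
      (while (< i 3)
        (print i)       ; prints 0, 1, 2
        (incf i)))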

Some "good people", like R. Gabriel, have argued that we need more complex control constructs (special forms) as long as we are trying to write complex programs (language should be adequate to the domain), so, the ability do define new special forms as we go, without breaking everything, is a "good idea".

This is, btw, the idea behind the recursive, bottom-up process of software development popularized by SICP and championed by pg and rtm. A language should evolve together with our understanding of the problem domain.

Structures are also a DSL. They give us a way to structure our data (everything is an expression in a Lisp, everything could be evaluated, which is another very "good idea") to mimic, or represent more conveniently, the objects of the real world. Structures could be nested, getters and setters are created automatically but could be redefined, etc.
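
A minimal sketch of the DEFSTRUCT DSL: nested structures, with constructors and SETF-able accessors generated for free (point and circle are made-up examples).

    (defstruct point x y)
    (defstruct circle center radius)

    (let ((c (make-circle :center (make-point :x 0 :y 0) :radius 5)))
      (circle-radius c)                      ; => 5, accessor generated automatically
      (setf (point-x (circle-center c)) 10)  ; accessors are SETF-able, too
      c)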

So, by extending the language with new, appropriate constructs (by defining new special forms) we could improve our way of modeling reality and create better "inner representations" for the concepts we have captured? Seems like a "good idea".

But wait: because everything is an expression (a Lisp form) which could be evaluated, why not just put code blocks (expressions) into the same structures? Thus we have "data structures" which capture not just the characteristics (state) but also the behavior of real-world "objects".
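
Sketched concretely: functions are just values, so a structure slot can hold behavior next to state (the counter example below is made up).

    (defstruct counter
      (value 0)
      (step (lambda (v) (+ v 1))))   ; behavior stored in an ordinary slot

    (let ((c (make-counter)))
      (setf (counter-value c)
            (funcall (counter-step c) (counter-value c)))
      (counter-value c))             ; => 1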

To capture the behavior we need the notion of "protocols": a protocol is just a named set of generic functions. In Common Lisp this is defgeneric and defmethod (Clojure calls its variant defprotocol), which, essentially, bind names (symbols) to sets of procedures (expressions) consisting of Lisp forms, dispatched on the classes of their arguments. This is the machinery of CLOS, whose own implementation is opened up through the MOP.
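
A sketch of a tiny "protocol" using standard CLOS (the shape protocol and the rect class are invented for illustration):

    ;; The protocol: a named set of generic functions.
    (defgeneric area (shape))
    (defgeneric perimeter (shape))

    ;; One participant in the protocol.
    (defclass rect () ((w :initarg :w) (h :initarg :h)))

    (defmethod area ((r rect))
      (with-slots (w h) r (* w h)))
    (defmethod perimeter ((r rect))
      (with-slots (w h) r (* 2 (+ w h))))

    (area (make-instance 'rect :w 3 :h 4))   ; => 12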

I forgot to mention that, since everything is a symbol (a reference), we could combine "objects" into lists and other aggregates, map them, reduce them: the whole set of language layers "below" is still available.
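
Continuing the rect sketch above: CLOS instances are ordinary values, so they compose directly with lists, MAPCAR and REDUCE.

    (let ((shapes (list (make-instance 'rect :w 1 :h 2)
                        (make-instance 'rect :w 3 :h 4))))
      (reduce #'+ (mapcar #'area shapes)))   ; => 14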

This doesn't mean that this is the only way to program; it is just one of the possible programming paradigms. What really matters is that we could, by using the very same means of combination and abstraction, "import" any other paradigm we wish.

And everything is so uniform and concise that it could be easily traced back to conses.
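
Literally so: underneath, it is conses all the way down, and even code is just lists of conses and symbols.

    (cons 1 2)                 ; => (1 . 2), a single cell
    (list 1 2 3)               ; => (1 2 3), built from three conses
    (equal '(1 2 3)
           (cons 1 (cons 2 (cons 3 nil))))   ; => T
    '(defun f (x) (* x x))     ; code, read back as a plain list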

Btw, this is not a "strict", "rigid", "set in stone" language (it is nonsense to try to "fix" a language; that contradicts its nature). We have reasonable, "evolved" defaults, such as lexical scoping for variables, but we could have dynamically scoped ones if we wish. Thus it is possible to extend the language with "non-local exit" control structures, which are your fucking exceptions.
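
A sketch of both facilities side by side: DEFVAR gives a dynamically scoped ("special") variable, while CATCH and THROW give a non-local exit (*level* and the FOUND tag are made-up names).

    (defvar *level* 0)                    ; dynamically scoped ("special") variable

    (defun report () (print *level*))

    (let ((*level* 1))                    ; the rebinding is visible down the call chain
      (report))                           ; prints 1

    (catch 'found                         ; non-local exit, the ancestor of exceptions
      (dolist (x '(1 2 3 4))
        (when (> x 2)
          (throw 'found x))))             ; => 3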

Immutability also has reasonable defaults. List and mapping functions produce a new list, leaving the original ones unaltered, while their "destructive" equivalents are segregated by an explicit naming convention (Scheme is famous for this).
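
A small sketch of the convention: REMOVE returns a fresh list, while DELETE and NREVERSE are the explicitly named "destructive" counterparts (Scheme marks them with a ! suffix).

    (let ((xs (list 1 2 3)))
      (remove 2 xs)            ; => (1 3), a new list
      xs)                      ; => (1 2 3), untouched

    (let ((ys (list 1 2 3)))
      (setf ys (nreverse ys))  ; NREVERSE may recycle its argument
      ys)                      ; => (3 2 1)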

Evaluation strategies could also be explicitly selected, so lazy lists, or streams, could be defined using the very same conses, macros, and list notation.
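
A hedged sketch in the SICP style; none of these operators (delay, force, cons-stream) are standard Common Lisp, they are built right here out of lambdas, conses and one macro.

    (defmacro delay (expr) `(lambda () ,expr))
    (defun force (thunk) (funcall thunk))

    (defmacro cons-stream (head tail) `(cons ,head (delay ,tail)))
    (defun stream-car (s) (car s))
    (defun stream-cdr (s) (force (cdr s)))

    (defun integers-from (n)
      (cons-stream n (integers-from (+ n 1))))   ; an infinite, lazy list

    (stream-car (stream-cdr (stream-cdr (integers-from 1))))   ; => 3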

Being a small language (once all the transformations have been done: macro-expansion, rewriting rules, inlining), it could be efficiently compiled, using a compiler written in itself, directly into machine code, which runs on more platforms than the fucking JVM.
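
In a native-code implementation such as SBCL (that choice is an assumption here), the compiler is an ordinary Lisp function available in the running image:

    (defun square (x)
      (declare (type fixnum x))
      (* x x))

    (compile 'square)       ; compile SQUARE to machine code on the spot
    (disassemble 'square)   ; print the generated native assembly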

This is what some people consider a work of fine art and call a "programmable programming language".

But this is only half of the story. There were machines (hardware FSMs, if you wish) which were able to run Lisp code efficiently. But that is another story.