There are foundations, literally.
Everything that is good and true.
- Code is data (from the designers of early CPUs)
- List Processing (from John McCarthy and the MIT (then Stanford) AI labs)
- Text streams, pipelines (from Bell Labs - UNIX, Plan9, etc.)
- Layers of Protocols (how communication works)
- The Actor Model and Message Passing (a model of distributed systems)
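The last item above can be sketched in a few lines. This is a minimal, illustrative model of actors with asynchronous message passing, using only the Python standard library; the `Actor` class and its names are assumptions for the sketch, not any particular framework.

```python
import queue
import threading

class Actor:
    """Each actor owns a private mailbox and shares nothing else."""
    def __init__(self, handler):
        self.mailbox = queue.Queue()
        self.handler = handler
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)  # asynchronous: the sender does not wait

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:    # conventional stop signal for this sketch
                break
            self.handler(msg)

# An actor that echoes messages back, upper-cased, through another queue.
results = queue.Queue()
echo = Actor(lambda msg: results.put(msg.upper()))
echo.send("hello")
echo.send(None)
echo.thread.join()
result = results.get()
print(result)  # HELLO
```

Note that the only shared state is the mailbox itself; everything else travels inside messages, which is exactly the share-nothing discipline listed below.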
These are fundamental ideas. They describe how real-world systems were designed and why; how computers were designed, and why one way rather than another; what high-level abstractions and programming languages are (high-level means Lisp, not OO or VMs); what inter-process communication is and why its content must be human-readable; how complicated heterogeneous networks work and why; how large distributed systems were built and what their underlying design is.
After studying those concepts, dealing with particular technologies, such as XML, becomes easy. All their shortcomings and ugliness can be clearly seen.
- No side effects (procedures act like functions: same input, same output)
- Function composition (Mostly-functional / mostly-stateless approach)
- Interfaces (a set of function type-signatures)
- Protocols (a set of explicit, formally defined rules)
- Share nothing (Mother Nature knows)
- Asynchronous Message passing (event-driven world)
- Procedures are first-class (values)
- Procedures are functions (closures)
- Referential transparency (values and bindings are immutable)
- Everything is a pointer (uniformity)
- Every value has a type (tag)
- Code is data (list structure)
- A value might have its own structure (slots)
- Strong typing (no implicit coercions)
- Type safety (types are checked at runtime, errors are trapped)
- Type annotations (checked at compile-time)
- Macros (transformation)