Posts by author schiptsov

A polite fuck-you review

There is something deeply cultural in how Russians fall in love with abstract bullshit, which they call theories to signal that they are smarter than they really are. It must be related to almost a century of brainwashing with Marxism-Leninism and Freudian and Hegelian bullshit.

Like any consumers of abstract systems, they love to "stretch" the terminology a bit, forgetting that words are supposed to have precise meanings (being associated with actual aspects of reality) and that terminology should be defined unambiguously before it is used.

Of course, this is not necessarily required if you just want to get some hype by shitposting about some trendy abstract bullshit, which, like any esoteric and theosophical subject, never gets critically reviewed by the adepts of the sect.

By the way, so-called Category Theory is the biggest virtue-signaling opportunity, second only to religion, and, ironically enough, like Haskell, it is a trap for pseudo-intellectual narcissistic idiots.

Well, I will break the rule and do a partial peer review of this piece:

https://boris-marinov.github.io/category-theory-illustrated/05_logic/

I know I will be down-voted into oblivion and banned for this post on HN (which has evolved into a virtue-signaling platform), so I will publish my review on my own website.

So, let's begin.

Logic is the science of the possible.

he said

As such, it is at the root of all other sciences, all of which are sciences of the actual, i.e. that which really exists.

The very first passage raises some eyebrows, to say the least. What does "the science of the possible" even mean? As far as I know, science is a methodology, evolved from mere speculation, for establishing the truth about certain aspects of reality.

It consists of a systematic and well-structured process, which requires careful observation of the phenomenon in question, formulation of a hypothesis regarding some well-defined and carefully measured aspect of it, design of a replicable (reproducible) experiment to test the hypothesis, and then refining or reformulating the hypothesis depending on the results of the experiments.

Science, therefore, implies only something possible - a science of the impossible is plain nonsense. The whole methodology (which is what the word "science" stands for) is to be applied to actual possibilities, and an absence of a result is a signal that the theorizing has, perhaps, gone too far.

Logic studies the rules by which knowing one thing leads you to conclude (or prove) that some other thing is also true, regardless of the things’ domain (e.g. scientific discipline) and by only referring to their form.

Well, logic is not the study of the rules of inference. It is the study of sound argumentation (including formal reasoning), of which the rules of inference (or of simplification of proofs) are just one technical aspect.

Being sloppy in terminology leads to this "to conclude (or prove) that some other thing is also true" and even to the more horrifying "regardless of the things’ domain". This sloppiness is the sign of a conceptual mess.

First of all, a proof is not in the same category as a conclusion. Applying an operator (OR in this case) to both of them is a type error, plain and simple.

A proof is a reproducible (like an experiment) and verifiable sequence of terms, from a set of premises to a conclusion, which can be verified within the whole formalism - the particular system in question.

The use of logic to verify its own terms is its fundamental, defining quality, and it is reflected in the recursive nature of any general (or concrete) simplifier (or evaluator).

Recursive, convergent processes are fundamental to any system of logic, of which mathematical induction is a specialization. Such a view of a process (recursive and convergent, i.e. spiral-shaped) is fundamental to the Universe itself.
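A minimal Haskell sketch of such a recursive, convergent simplifier - a toy expression evaluator, illustrative only, not any particular system:

data Expr = Lit Int | Add Expr Expr | Mul Expr Expr

-- Structural recursion: the evaluator calls itself on strictly
-- smaller sub-terms, so the process always converges.
eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b

main :: IO ()
main = print (eval (Mul (Lit 2) (Add (Lit 3) (Lit 4))))  -- prints 14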

Second, there is no such thing as logic regardless of the domain. This statement should be re-read a few times.

There is nothing ephemeral or god-given out there which is valid (or even exists) regardless of the domain. Literally nothing.

Although there are rules of inference based on the form of terms (compound terms with connectives), such terms cannot be formed abstractly, without domain knowledge. Unless you are Hegel, of course.

The principle here is that your premises have been proven valid (by the very same recursive process, which relies on what is already known, including the rules of inference) before they are used as terms in new compound propositions.

That "Socrates is a Man" deduction reflects a few deep facts about biological evolution (that what we classify as species have common traits or qualities (attributes) based on which the mind classify them as "an external observer").

There is no justification to assume the validity or existence of Modus Ponens as an entity disconnected from a domain.

So what is Modus Ponens? Well, it is an expression in a meta-language, a captured generalization, which is observed again and again in domain-specific languages.

It captures the law of causation - that everything has its causes - and it also generalizes to categorical thinking, which, in turn, is possible only because there are physical laws and the law of causation in the first place (everything the mind categorizes exists because of causation).

Again, disconnected from the domain it loses meaning and ceases to exist.
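For what it is worth, this meta-language reading has a precise rendering in a typed language: under Curry-Howard, Modus Ponens is just function application, and each domain supplies its own instantiation. A minimal Haskell sketch (the names are illustrative):

-- Modus Ponens as typed function application: a term of type a -> b
-- plus a term of type a yields a term of type b.
modusPonens :: (a -> b) -> a -> b
modusPonens f x = f x

-- A domain-specific instantiation of the same form.
newtype Man    = Man String
newtype Mortal = Mortal String

allMenAreMortal :: Man -> Mortal
allMenAreMortal (Man name) = Mortal name

socratesIsMortal :: Mortal
socratesIsMortal = modusPonens allMenAreMortal (Man "Socrates")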

And yes, there is a fundamental problem with the classic truth tables, in the part where falsehood implies anything. They cannot be universal because they contradict the [Multiple] Causality Principle. One of them must be wrong, and it is not causality, of course.

instead of the word “formal” we used another similar word, namely “abstract”, and instead of “logical system” we said “theory”.

This is bullshit in its essence. Formal and abstract are unrelated, unless they are "purely formal" (which is nonsense) and "purely abstract" (which under-educated virtue-signalers love so much). And, of course, a logical system is not a theory.

today most people agree that every mathematical theory is actually logic plus some additional definitions added to it.

No, it is not. Formal logic is a sub-discipline (actually, a meta-language, the one above domain-specific languages), which includes a methodology for formal reasoning (reasoning based on the form of compound expressions, which involve certain logical connectives). Again, the premises must be valid a priori (or form a valid, non-contradictory set of axioms), which connects and grounds it in reality.

Logical connectives and predicates operate at the level of the meta-language (or of a particular logical system). It has been discovered that this meta-language must be typed (the conceptual space must be partitioned).

A mathematical theory, by definition, uses mathematical logic (which is what makes mathematics what it is), but using mathematical formalism and notation does not promote bullshit into the realm of scientific disciplines.

rules of inference are almost the same thing except they allow us to actually distill the conclusion from the premises.

What the fuck is "distill"? Is this some vodka-induced logic, Ivan? Similar to Hegelian, but from Russia with love?

I am sorry, I cannot continue to read this bullshit. As one may imagine, it does not get any better further down.

What is important is to show what fucking bullshit this virtue-signalling writing really is. Remember that a single logical flaw is enough to discard the whole line of argument.

And no, this "theory" is nothing but a graphical representation of the structures of some similar concepts, which, in turn, are generalizations, sometimes too abstract to remain real. Monoid, perhaps, is where to stop.

Is π the Same in Every Universe?

https://news.ycombinator.com/item?id=27818986

This is a double-barrel idiotic question. First of all, there are no other universes. More precisely, there is no way to know or even assume the existence of others.

Second, pi is a notion (a generalisation) of the human mind, and does not exist anywhere outside of shared culture. It is just a ratio of two abstract notions.

Yes, other observers could come to the same notions and conclusions, but it is only a theoretical possibility.

Idiots Against SQL

https://news.ycombinator.com/item?id=27791539

Against the fundamental notions of product types, records, unions and intersections, binary relations?

Idiots, idiots everywhere.

What Are the Odds We Are Living in a Computer Simulation

https://news.ycombinator.com/item?id=27737069

This is a meaningless question. One cannot calculate any such odds in principle. It is an application of a wrong concept. The real answer is that it cannot be known, again, in principle.

The first such answer was given by the Upanishadic seers back then, based on the principle that the intellect, conditioned by perceptions, cannot know. Observation of effects is not enough to know the causes and the "mechanics".

The modern answer is that abstraction barriers are impenetrable (yes, in principle). There is absolutely no way to even guess the actual wiring of a processor from the level of code (separated by a few layers of abstraction barriers).

The Limits to Blockchain Scalability

Why do we even take this punk seriously? I would like to read something from really bright and qualified people like Lamport, who have studied distributed systems for decades.

It's time for us in the tech world to speak out ab...

What should we talk about? Tokens are merely chips in the global network of online casinos.

Ethereum is amateur crap, technically like early PHP webshit. BTC is better, but it cannot scale in principle, so it will remain a mere technological curiosity.

Cardano is a corporation-like swamp, but yes, it has been affiliated with the best minds in PL world, at least on paper.

Everything else is either scams, "swaps" or memes. What should we speak about?

Another thread to remember

Bitcoin Blockchain Visualization

https://news.ycombinator.com/item?id=26859895

It should be included in textbooks how IOHK is failing to deliver by turning their Haskell into enterprise Java with monads (instead of keeping it as logic and math with actions/effects). Fpcomplete got it all wrong. (Don't read their tutorials, which are obsessed with strictness - they are crap.)

Haskell is a language to formalize problems and define declarative solutions using equational reasoning.

It is pure logic, not some funny ML with laziness by default.
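A minimal sketch of what equational reasoning means here: definitions are equations, and one calculates with them, by induction, the way one calculates in algebra. For example, the map fusion law follows from the textbook definition of map:

-- Definition (equations):
--   map f []     = []
--   map f (x:xs) = f x : map f xs
--
-- Claim: map f (map g xs) = map (f . g) xs
--
-- Base case:
--   map f (map g []) = map f [] = [] = map (f . g) []
-- Inductive step:
--   map f (map g (x:xs))
--     = map f (g x : map g xs)
--     = f (g x) : map f (map g xs)
--     = (f . g) x : map (f . g) xs    -- induction hypothesis
--     = map (f . g) (x:xs)

main :: IO ()
main = print (map (+1) (map (*2) [1,2,3]) == map ((+1) . (*2)) [1,2,3])  -- True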

No one gives a shit what programming language you use

https://news.ycombinator.com/item?id=26621344

Only if you do webshit or CRUD. Formal verification people, however, do care a lot.

Haskell is logic, so if you have managed to express your solution in Haskell and it typechecks and compiles, you suddenly have more than just code.

But, of course, Javascript...

Nix is the ultimate DevOps toolkit

https://news.ycombinator.com/item?id=26748696

No. Unnecessary, redundant abstractions and wrapping are never the answer. Standardized interfaces and protocols are.

Erlang and Go got it right. We should learn from hardware people, not from virtue signalling narcissistic assholes.

Nix is a cancer. Stable/standardized and versioned interfaces are the remedy.

The most important statistical ideas of the past 50 years

https://news.ycombinator.com/item?id=26799702

Statistics (mere observation and counting) cannot establish or even discover causation in principle. Period. No matter how many peer-reviewed gibberish papers are published, the philosophical principle will stand.

What you observe are effects. Causes are not there.

Only discrete, fully observable, simple systems, like dice or a deck of cards, could be modelled adequately.

Most real-world complex systems with multiple causation cannot.

Advanced statistics is the very definition of a socially constructed sectarian movement.

Nassim Taleb: Bitcoin failed as a currency and because...

https://news.ycombinator.com/item?id=26840006

Good morning lmao.

Anyone with a tech background or a few functioning neurones could have seen this back in 2018.

55% of all Tether, $25B, were created in 2021

https://news.ycombinator.com/item?id=26842986

Supply and demand, why? Some demand comes from retail, who want to convert their crypto or USD into it; other demand comes from exchanges, of course; and yet more from whoever it was, for whatever they want. Of course, on the retail side the meme "1 USDT = 1 USD" will "hold", because it is a meme, a "social contract" in which normies believe. However, on the exchange and crook side there could be almost arbitrary arrangements, including bulk discounts, long-term loans, etc.

This is what lack of transparency is for.

We could easily argue that a comparable amount has been used by retail, because the whole point of the Ponzi is to sell to them. They always pay the full price plus fees.

Broken logic.

I finally got what is so wrong with these idiotic English examples of silly "logical" implications.

Unrelated expressions cannot constitute a valid logical implication.

Just like that. There must be a relation. Preferably causal.

And when we consider an implication as establishing a necessary and sufficient condition, necessity should be related to causation.

2 + 2 = 4 does NOT imply 1 - 1 = 0.

These two propositions (both True) are unrelated.

And, of course, False implies nothing. "False implies anything" is only valid in mathematical logic. So is the inclusive OR.

Mathematical logic is special due to the referential transparency of valid expressions and, as a consequence, equational reasoning.

Truth tables, not causality or any natural laws, control mathematical logic. This is why False "implies" True and why the OR of two truths holds.
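For reference, here is the material implication in question, reduced to the Boolean function the truth tables define - a minimal Haskell sketch of exactly the behavior criticized above:

-- Material implication, straight from the truth tables.
implies :: Bool -> Bool -> Bool
implies p q = not p || q

main :: IO ()
main = do
  print (implies False True)                   -- True: falsehood "implies" anything
  print (implies False False)                  -- True: anything at all
  print ((2 + 2 == 4) `implies` (1 - 1 == 0))  -- True, no relation required
  print (True || True)                         -- True: the OR is inclusive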

Reality has only causal implications and exclusive ORs.

Haskell Requires Perfection "When There is Nothing More To Take Away"

With Haskell, perfection is not optional but required. Otherwise one ends up deep in redundant abstract bullshit, an unnecessary wrapping mess, which would be even worse than the J2EE bullshit.

Another great example is the brevity of speech of smart autistic people compared to the verbal diarrhea of impostors cosplaying intelligence.

The Dalai Lama speaks a few sentences at a time, but they are well thought out, with no redundancy, no decoration, and no long words to impress idiots.

This is precisely how Haskell code must be written - Just Right (the Buddha's principle), or the principle of Antoine de Saint-Exupéry: perfection is achieved when there is nothing more to take away (which is implied in the Buddha's Just Right).

The modern mantra for that is Data Dominates, which means that after finding the most appropriate (Just Right) data structures, the algorithms and the code (implementation) just follow.

For Haskell the mantra is Just Right Types, and everything else follows. It must be explicitly said: the most straightforward, down-to-earth types, such as Sequences, Trees, Tables (Traversable, Foldable, etc.), NOT Free Monads and similar stuff.
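A minimal sketch of such a down-to-earth type: a plain tree whose Foldable and Traversable instances GHC derives for free, after which the algorithms just follow from the type.

{-# LANGUAGE DeriveFunctor, DeriveFoldable, DeriveTraversable #-}

-- The Just Right data structure comes first; sum, maximum,
-- toList, traverse, etc. follow from the derived instances.
data Tree a = Leaf | Node (Tree a) a (Tree a)
  deriving (Functor, Foldable, Traversable)

main :: IO ()
main = do
  let t = Node (Node Leaf 1 Leaf) 2 (Node Leaf 3 Leaf) :: Tree Int
  print (sum t)      -- 6, via Foldable
  print (maximum t)  -- 3, via Foldable
  mapM_ print t      -- 1, 2, 3, via Foldable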

Monads are just a convenient formal conceptual framework to enforce an abstraction barrier in a declarative (pure functional) language. No more, no less. Explicit order of evaluation is enforced by function call nesting, which is at the core of the implementation of Monads (and Arrows).
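A sketch of that nesting: do-notation desugars into nested calls of (>>=), where each subsequent action lives inside the continuation passed to the previous one - which is precisely what fixes the order of evaluation.

greet :: IO ()
greet = do
  name <- getLine
  putStrLn ("hello, " ++ name)

-- The same thing desugared: the second action is nested inside
-- the continuation of the first, so it cannot run first.
greet' :: IO ()
greet' = getLine >>= \name -> putStrLn ("hello, " ++ name)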

Kleisli categories and the rest are just an abstract framework which provided an insight into how such a barrier could be generalized.

Just this.

What Is Wrong With GPT3 and related models.

The linguistic researchers of the past were much more systematic than the modern ML impostors.

Most notably, the fathers of NLP (Neuro-Linguistic Programming, a pseudo-science) realized that humans have at least two representations, in principle. One, the so-called Deep Structure, is how our abstractions (maps of the world) are stored in the brain; the other, the Surface Structure, is used for verbal communication, after verbalization (literally, encoding) for transmission.

What they did not realize is that this Deep Structure is not arbitrary (by no means) but reflects the constraints of the environment, of which everything, including the brain, is a product.

The genetically transmitted structure of the brain encodes environmental constraints.

This is true not only for the visual and motor cortices, but for the speech areas too. It reflects, for example, that there are things, processes, attributes, and events. Deep structure is not arbitrary, as they are trying to make it with NNs; it is the opposite - the structure is highly optimized, and it mimics (maps) reality (the environment).

This is precisely why (and how) meaningful speech can be produced - it is just a verbalization of inner conceptual "maps" (represented as brain structures), which reflect what is real.

This is why children produce meaningful phrases instead of infinite patterns of arbitrary noise, for example.

So, any model based merely on weights will never produce anything meaningful - only something almost indistinguishable from the meaningful, which is even more dangerous.

What Is Wrong With HN

Censorship is bad for many reasons; most notably, it discourages freedom of expression, which is an absolutely crucial part of communication.

All the harsh words, name-calling, etc., create the required, necessary tension, which is fruitful in the long run, because it encourages people to become stronger. The way Navy SEALs get trained, the way fraternities go on in any college, etc., etc.

HN became a walled garden, a safe space for *mediocrity*, with some "sheriffs" patrolling it. The community has quickly degenerated into some sort of LinkedIn, full of impostors, intelligence cosplay and commonplaceness.

Below are examples of comments for which I got banned so many times, and yet I refuse to follow any CoC or frame my emotional responses differently, because the emotions are a crucial part of the message.

So, fuck off.

Is Philosophy an Art?

https://news.ycombinator.com/item?id=26456223

Philosophy used to be a systematic attempt to answer just one question - What Is? (or what is real?) Science emerged as a standard methodology much later.

Abstract bullshitting, which is mistakenly called philosophy too, may be considered as an art, like storytelling.

To clarify - fancy philosophical systems have nothing to do with philosophy. They are just piles of abstractions.

Is autism the legacy of humans evolving the ability to innovate?

https://news.ycombinator.com/item?id=26449890

Fuck, no.

Autism is an inability to deal with one's own emotions due to some genetic mutations.

It is clearly and verifiably inherited, and it is carried by pretty women who are less affected, their social impairments being compensated by motherly instincts.

Women are traditionally more emotional and rely on feelings, so society (families) readily accepts autistic women as long as they do their duties.

Physical beauty (mostly the face) guarantees a marriage. This is how autistic traits avoid being washed out.

Each autistic person develops his own behavioural patterns to compensate for being overwhelmed by his own emotions. (Depending on severity, these patterns vary from Turing to Rain Man, so to speak.) This is the main principle.

Everything else is just bullshit.

https://news.ycombinator.com/item?id=26325425

Oh, really. So applying probabilities to partially observable systems yields bullshit, and estimated probabilities based on observations of past events do not predict anything for evolving systems? I got banned here for such assertions lmao.

What else is new?

https://news.ycombinator.com/item?id=26314610

Lol, it won't fly. First of all, science is just a methodology for establishing truth about some aspects of what we call reality. Everything which cannot be verified by a reproducible experiment is not a science. All such theories, therefore, are mere theories - analogues of religious sects. This also automatically disqualifies all the humanities, and especially socially constructed bullshit like race theories, etc.

That, in turn, would strip the high social status of academics and "people doing science" from way too many people (which would be absolutely good, so it will never happen).

Science as a social construction took (along with big government) the place of religion as a prestigious occupation and a way to gain high social status.

https://boards.4channel.org/biz/thread/29944428

A guy FUDed Cardano singlehandedly

https://news.ycombinator.com/item?id=26300741

This is pretty bad writing compared to the classic writers (Hudak, Hutton, Bird or Thompson). It begins with nix, which is a cancer (a solution to a non-existent problem by attention-seeking narcissists - stable interfaces and semantic versioning will do).

Then it proceeds with subpar code, which mixes styles without justification and is cryptic and unreadable. You don't have to zip and then unwords, the case is cryptic and redundant, etc.

And it is not some rant, it is an adequate peer review, as it should be.

Simplification (reduction) to perfection (when there is nothing more to remove), both in abstraction and in code, is not optional with Haskell.

Piling up of esoteric bullshit such as lenses and monad transformers is a non-goal.

Xmonad, cabal and, of course, GHC itself are still the gold standard. No bullshit dependencies.

What "exist" and "real" mean.

There are two different kinds of existence - one of the mind and one of the universe (here "the" is redundant).

There are processes (and only processes), or, to be precise, sub-processes, in the universe, which itself is one single unfolding process. No more, no less.

Processes do exist and are real. Nothing else is. What we call atoms are processes too.

For the mind, its own abstractions and generalisations are real and seem to exist the way Nature exists. The mind sees its own creations as if they were real in the universe.

Strangely enough, most of these abstractions and generalisations (however valid and useful) do not exist outside one's own head (and the common culture).

The veil of these concepts, which distorts our perception of what is real, was recognised by the Upanishadic seers and the early Buddhists, who called it the veil of Maya. The term Maya is even more general and signifies a source of delusion or misperception.

Mother Nature, for example, does not possess any numbers and does not count. There is no notion of counting at the level of molecular biology (no counters, no overflows). Information is structural, not digital. There is only structural pattern-matching in DNA/RNA/protein machinery.

This is an implicit proof that lots of concepts do not exist universally. Molecular biology uses literally everything that is real and available, and does not use what does not exist.

There is no notion of time, no notion of simultaneity (yes, everything is built as if nothing else exists - just triggers and message-passing of "concrete" molecular structures).

The fundamental principle is that what seems real and existing to the mind does not necessarily (and usually does not) exist outside of it.

Mixing and matching concepts of the mind with real processes or their attributes is the cause of the fucking mess which is called modern science - which is actually modern astrology and alchemy, based, like any religious sect, on sectarian consensus, abstract models and simulations.

There are two concepts which do not exist. The first one is simple - a number. There are literally no numbers anywhere of any kind. Just none. The other, less obvious but closely related, is so-called space-time (or absolute, "physical" time). It is nowhere to be found and cannot be experimentally demonstrated.

No, human-made clocks do not measure anything. They are man-made devices which try to count equal intervals of whatever it happens to be. Of course, these devices are subject to gravitational influences, so they become distorted (as processes) and their readings vary depending on physical aspects of reality.

Have you seen any other devices made to support a human-made abstraction? Visit any church or monastery, visit the pyramids or the Maya temples - there is no shortage of such devices in there.

The famous thought experiment by Einstein is logically unsound, or, to be precise, has type errors, mixing concepts of different kinds - existing ones and nonexistent concepts of the mind.

Again, clocks measure nothing. Time is a derived notion. So clocks are completely unrelated to one another, unlike, say, thermometers or other devices built to measure what is real.

Knowledge is power

Knowledge is power.

But knowing is not merely a process of collecting and memorising facts here and there. It is precisely like solving a jigsaw puzzle, where each new piece (of information) must match (not contradict) the ones that have been matched before.

Not just that, but once in a while there is a fact which contradicts (disproves) the previous results - it literally doesn't fit. In that case the results must be thrown away and the whole inductive process must backtrack to the last non-contradictory set of matching pieces. Demolition of socially constructed bullshit is a recursive process.

Neti neti - not this, not that.

But once the puzzle is completed (at least in principle), one would see things as they really are.

The Buddha discovered (popularized) this path.

Solved

So, basically I have it solved analytically. Ancient eastern philosophy still holds.

Any simulation of a multiple-causation stochastic phenomenon is as representative of reality as a cartoon animation movie. This is almost a definition.

Imperial College London released its covid19 simulation on GitHub. I am still in doubt about what the purpose of that endeavour was and how much money was wasted on it. I am also sure that the authors think of themselves as advanced scientists and that they are entitled to the upper-middle-class life of a scientific researcher.

That model, however, was naive bullshit, disconnected from reality. They modelled neighborhoods of various density, full of people of a few kinds, and a function which emulates contagion, parameterized by a few naive assumptions. They ran it a few times and released their recommendations to the governments.

A success story by no means. Heroes on the bleeding edge.

There are, however, a few principal objections. First, this naive model is literally as oversimplified as a cartoon. Tom and Jerry, if you wish.

A model which tries to represent the whole outbreak as a single chemical reaction - based on the concentration of virus particles, with temperature as the intensity of social interactions, etc. - would be a way better cartoon, but a cartoon nevertheless.

Abstract concepts of pure mathematics do not exist in what we call reality. They exist only in what we call Maya - a veil made by the mind out of socially constructed concepts, which hides reality from our primordial awareness (the one newborns and animals possess).

There is no such thing as randomness (it is a vague mathematical concept), there is no such thing as perfect bell-shaped curves, there is no such thing as estimated probabilities. These are all just socially constructed notions. Memes.

There is no way to model a stochastic process which evolves as you are trying to capture its dynamics. This is a philosophical principle; it cannot be short-circuited. All the models capture the past, the results. The past is made out of effects, not causes.

The only way to deal with multiple causation is to know all the relevant factors, which is possible only with simple, discrete, fully deterministic systems such as coins, dice or decks of cards. These methods cannot, again, in principle, be used against stochastic processes. They are simply not applicable - it is just a type error, like multiplying trees by birds.

Modeling of stochastic systems is the new astrology. No matter how sophisticated (one might try to train a neural network to learn the parameters of a model - no difference, it still uses the past as its source), a model will always be as disconnected from what is as a movie or a story. For the same reason it will never predict anything valuable. It would be just "like what happened", a mere bunch of ideas.

That model is useless. The actual covid19 numbers in the UK just showed no correlation with any naive model. But successful astrology will definitely make you rich and famous, just like a few millennia ago.

The Big Picture

Now I have got it as a big picture. The 20th century was the century of abstract bullshit, which took root in the second half of the 19th. Hegelian "philosophy", Freudian and Jungian "psychology", Marxist "economics", and even theoretical "physics" have their roots way back in the Platonic memes of pure abstract entities and the imaginary escape of pure reason into the realm of pure ideas - an escape from the mundane biological reality of which it is merely a by-product.

No wonder abstractionism and related art forms peaked at the same time. Abstract bullshit as an art form, the real apogee.

Now, in the age of molecular biology and computers, we begin to realise that abstract bullshit, like the Freudian abstract entities of the mind, was not even close to what is; that Marxism, which goes against the major social instincts of any biological organism, had not a single chance; and that about ninety percent of what we call modern science is socially constructed bullshit maintained by sectarian consensus.

Wrong abstractions are the root of all evil. Reality is very "concrete" and "mechanical", so to speak. Mother Nature does not even count (but has the notion of information - structural, not digital), let alone things like potential or purpose.

Everything is just sub-processes in one single unfolding process which we call Universe (notice the lack of "the") or Reality. The Indian Upanishadic seers and the Buddha were right. Everything that followed was mostly a regression, at least all the western abstract bullshit.

Modern Programming

Now I understand what exactly is wrong with f*cking node_modules and Rust's crates -- amateur developers use redundant, useless abstractions a lot, and you get all the fancy shit in your hundreds (thousands, in the case of the node_modules abomination) of dependencies.

Everything is going to regress back to PHP -- a pile of narcissistic amateur crap, a fractal of bad design.

It is almost as bad as in Haskell, where every single idiot is trying to use all the esoteric stuff he read about in this or that blog post.

C++ is more sane in this regard. And Haskell is bliss if you are using just the few modules which Cabal or GHC itself uses.

Avoid Javascript at all costs. Rust is still an ambitious, amateur, Ruby-ish thing.

Deep Crisis

BTW, the covid19 outbreak made evident and obvious the deep crisis of so-called modern "science", which is based on sectarian consensus, peer-reviewed bullshit, and compilations of sources instead of tedious and costly experimental research.

Basically, the situation is exactly the same as in obscure branches of "research" like Tibetology and Tantric "Buddhism", which are exactly a compilation of anecdotes and citations by unqualified amateurs. Famous bullshitters like Tucci invented whole civilizations and are most widely cited nevertheless.

So-called scientific consensus is, obviously, never a criterion for the Truth. Only series of replicable experiments are. Sectarian consensus is what proper science emerged to fight against. Now pseudo-intellectual posers have ruined it back to the level of astrology and alchemy, with probabilities and simulations and what not.

The argument about the complexity of phenomena is misused. When faced with overwhelming complexity, one has to slow down and backtrack to study the underlying principles instead of modeling and simulating poorly understood nonsense. There is no other way.

Half of ML in one post

You don't even know what you're talking about and throw random buzzwords

lol, so you didn't get the argument. ok. I will try to spoonfeed

classification algorithms are applicable only to discrete, deterministic environments, which are stable and finite. language is such an environment; pictures too, obviously, because cats have stable traits which do not change at all, etc. I hope you get the idea.

markets have no such stable traits, so each and every classifier will be plain wrong. it will capture noise and will produce noise, no matter how much data you feed to it. no stable traits - no ML classification possible.

it is that simple lmao. to have pattern-recognition you have to have patterns. real, not abstract and imaginary.

your code will work, but your results will be noisy and inconsistent.

Fuck political correctness and fuck censorship

https://news.ycombinator.com/item?id=23022864

This is just attention whoring, bragging and self-praise, lots of ambition with little or no real education - the canonical definition of a Russian. Any attempt to apply mathematical methods meant for modeling fully observable, deterministic, stable systems - like probability theory or neural networks (which assume, by definition, stable patterns arising from underlying laws, both at the training stage and in the feedback loops) - will inevitably lead to failure. Always. In principle.

Gamification of markets is an even more stupid idea, since, again, an adequate game model requires a well-defined set of rules which do not change in the middle of the game, and, like any other model or simulation, requires that all (every one of the) relevant factors be adequately represented, which will never be the case.

Last but not least, one never trades in the middle of the chart, on past data. If a model fits the past data, it only means that it has been trained on the past data, which is already "stable". It will be unable to deal with new data, because the new data will be nothing like the training and testing sets, in principle.

Even the classic methods from signal processing will not do, because a signal implies regularity and predictability (a predetermined set of probabilities), while markets do not possess such properties.

This is just a misapplication of methods without any understanding of the underlying phenomena, the methods being used, their applicability, or their limitations.

Read some Hamming lmao.

SJW censorship ruined the HN community.

I have got banned on 5 different accounts for comments like this:

https://news.ycombinator.com/item?id=22382317

The classic case, the gold standard - the math is correct, the programmer is one of the best in the whole world (I have read PAIP), but the simulation is completely meaningless. First of all, almost no (zero) transactions occur across different social strata (or castes), at least in the real world, like here, in India (where I am living for now).

Second, randomness does not exist. Each social dynamic has its causes, too complex to capture, so it is much easier (and more profitable) to assume randomness. However, randomness implies loss of any meaning. A blur which ruins the original image.

A naive model superimposed on a too-complex-to-comprehend reality is neither explanatory nor even meaningful. This is just philosophy 101. The map is not the territory; a simulation (a cartoon) is merely, well, a cartoon.

Third, this kind of modeling bullshit is nowadays literally everywhere, and it is called "science".

https://news.ycombinator.com/item?id=22207961

Education (and inflated government) is a new way of signaling a higher social status, similar to the scholastic priesthood of the past. It is entirely a social construct (aside from practical, concrete-math-based STEM and engineering).

Another real-life aspect is that passing through a really decent school like MIT or Caltech (or Yale, as an exception) is a proof (by example) that one is capable of self-discipline, concentration and self-improvement, able to do research and to learn by doing.

Aside from that, a degree is just and merely a social status certificate, a certificate of belonging to a higher social class (no matter the actual skills and abilities).

I myself am from a third-world social shithole and never went to a high school. I could, however, beat the vast majority of so-called liberal arts majors and even some Stanford grads, which I regularly did on this very site before the CoCs and the bans for use of "inappropriate" language.

So, there is nothing much to talk about. A crappy degree is a social status certificate, and obsession with language usage, long words, political correctness and fancy abstract terminology is merely signaling of assumed, self-proclaimed (and almost always absent) virtue.

On the crisis of Go

We should save Go from the current crisis of overconfident narcissistic idiocy.

In the good old times only very few people, who had passed through a harsh selection (basically, you had to be a Math or EE major), were allowed into PL design. Out of that we got APL, Smalltalk, Common Lisp, Scheme, Miranda, Haskell, Standard ML and Scala - all the nice things.

When demagogues and narcissistic bullshitters were allowed to design a language, we got C++ and Java, with all the inconsistencies and sloppy kitchen-sink thinking which are so characteristic of Liberal Arts majors.

When complete degenerates were allowed to design a language, abominations like PHP or Javascript were born.

So, Go has been famous for keeping the list of features small, orthogonal (on the implementation side) and complementary (on the semantic side). It is precisely this that makes it such a success.

It has multiple return values (not a tuple), similar to Common Lisp, and this is the best we could have. Literally.

Introducing full-blown type classes (or better to call them type traits, suggesting that they are composable, or flavors, as Symbolics used to name them) is too much effort; basically, a design from scratch would be required.

Just adding a Maybe monad, as the idiots are suggesting, is a bullshit "solution", since it requires type classes in the first place.

If you are that fucking smart (hint: you are not - thousands of exponentially brighter people have pushed the field since the 60s, especially in the Common Lisp and ML communities) - just fork the language and make it your way, but, please, leave us alone and stop spamming us with your naive bullshit.

Some pastas

some gems from /g/

Ocaml is a product of brain-dead people, like Java. It is really that bad. No proper numeric tower in a functional language is a disaster.

Haskell is a miracle, but it is plagued by idiots and narcissistic degenerates who abuse the type system and monadic IO, exercising their idiocy by producing too-abstract, too-general and useless data types where mere function composition and sum types will do.

So, learn to reduce everything to the smallest possible sets of appropriate idioms.

The current public-domain Haskell code from fpcomplete and other narcissistic idiots is utter bullshit. Everything could be done with much more ease and simplicity by merely porting stuff from the SML, Erlang (with some message-passing lib) or Common Lisp worlds. Laziness has been solved by monadic IO.

Beauty is very very relevant, because it is a sign of approaching perfection (which is a state when there is nothing more left to take away - a local optimum). In nature it is a product of countless trials and errors, while in human crafts it is a product of quality.

Beauty arises out of very good quality, you pleb.

Take a look at the evolution of Haskell's Prelude, let's say from ghc-6.x up to now.

Contrast it with the fucking abomination called foundation-0.x, which is a product of enterprise degenerates.

and

Your proposition was that code cannot be elegant, beautiful and maintainable at the same time.

I have shown you a concrete example which refutes your idiotic assumption. Just this.

good thread

Tribute

%% Joe Armstrong's universal server: a process that waits
%% to be told what to become.
universal_server() ->
    receive
       {become, F} ->
           F()
    end.

%% A concrete server the universal one can become.
factorial_server() ->
    receive
       {From, N} ->
           From ! factorial(N),
           factorial_server()
    end.

factorial(0) -> 1;
factorial(N) -> N * factorial(N-1).

%% Spawn the universal server, turn it into a factorial
%% server, then ask it for 50! and await the reply.
test() ->
    Pid = spawn(fun universal_server/0),
    Pid ! {become, fun factorial_server/0},
    Pid ! {self(), 50},
    receive
        X -> X
    end.

Joe Armstrong is gone

A very sad day. It is a real loss. He was a great wizard, principle-guided.

On second thought, I would say that I always felt happiness and gratitude while reading his books or watching his talks - a lone voice of sanity and principle-guided reason in a sea of screaming bullshit, a lone figure in a crowd of bearded narcissistic clowns, like Wadler or whoever it might be.

This gratitude for showing the power of clear, disciplined reasoning guided by the right principles I will carry with me and try to pass on. Thank you, Joe! You were a great teacher, a guru, and you literally have moved the earth!

I think I should have a drink.

Snapd

snapd cannot resume a download after a network error (network change)

fucking degenerates

Oh god

lngnmn1@ideapad:~$ node
> "" == '0'
false
> 0 == ''
true
> 0 == '0'
true
> false == 'false'
false
> false == '0'
true

Wow

This is what we call inconsistent behavior. So much for the principle of least astonishment. (Two strings compare as strings, but mix in a number or a boolean and everything gets coerced through ToNumber - which is exactly where the non-transitive nonsense above comes from.)

lngnmn1@ideapad:~$ python3
Python 3.6.7 (default, Oct 22 2018, 11:32:17) 
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> round(1.5)
2
>>> round(2.5)
2
>>> round(3.5)
4 
>>> round(4.5)
4
>>> round(5.5)
6
>>> round(6.5)
6
>>> 

Fuck off HN

Look, I am really tired of all this. It is very emotionally taxing, for nothing. It is not easy to argue in a foreign language about subtle topics. All I have tried to say is that there are language features based on extensive research. Erlang is one example, Go is another. There are obvious counter-examples everyone knows and loves.

My first comment was about the fact that by making an IO selector a language construct, the need to call Jetty, which calls Netty, which calls Shitty, is eliminated, and the code could be reduced to a single function with a few helpers. This has been demonstrated by Rob Pike in every single fucking video on Youtube.

Creating another general-purpose event-driven framework, Tokio or whatever, is, of course, very rewarding, especially for those impostors who call themselves "engineers", but it seems that engineering as a discipline is about reducing to an optimum, instead of piling up fancy bullshit. Every single video of Joe Armstrong on Youtube is about this very difference. We already have way too many Java frameworks.

We could actually compare the assembly into which Go's code with a select statement compiles versus Rust's code, including dependencies, but it is so obvious that it is almost frustrating to spell it out explicitly. That is why I mentioned LOCs.

It is really frustrating to argue when some unknown idiots are flagging your posts without offering a single reason. BTW, if you want to know how shootings originate - this is exactly the way. But don't worry, I am very far away, in some third-world shithole, and I don't care that much.

In the good old times, on LKML and everywhere else, smart people would focus on the meaning and ignore the style, which is one's own choice, while nowadays they attack your style, ignoring the meaning. So be it.

Just fuck this modern HN with all that SJW idiocy.

I really can't stand it, so fuck you, HN.

Here is the discussion.

https://news.ycombinator.com/item?id=18636125

The fucking idiots are claiming that some gut bacteria have cured autism in mice!

What the fuck is autism in mice? A cross-species disorder in a species with an unrelated brain structure? Really?

I have struggled with autism my whole life, and some clowns are getting paid for "modeling" autism in an unrelated species? Fuck you! Just fuck you, fucking degenerates.

No wonder I have got banned. It is okay to be banned by idiots. Here are my posts which have been flagged by the fucking SJW activists.

What kind of fucking bullshit is this? Autism is a spectrum of compensating behavioral patterns that emerge as a way to diminish unusually high emotional stress (which is also a spectrum) due to genetically transmitted "imbalances" in the brain's structure (and/or biochemistry). So, nowadays every kind of fucking bullshit deserves the label "working hypothesis" instead of "gross incompetence"? BTW, autistic spectrum disorders have "hijacked" the most stable evolutionary strategy of "good looks" - quiet village beauties are carrying these alleles (presumably a product of inbreeding).

It is no coincidence that ASDs affect mostly boys. Motherly instincts, it seems, diminish (or compensate for) the behavioral deviations.

If one could come down to earth from the Ivory towers of abstract bullshit and look at the real world, the answers are out there.

and

How is this related to humans?

What is autism in mice in the first place?

What kind of fucking degenerates are "studying" or "modeling" a cross-species "mental" disorder???

On a species with an almost unrelated brain structure as a "model"?? What kind of bullshit is this?

and

In the not-so-distant past, many weak-minded but talkative, ambitious and narcissistic people could have found their niche in an organized religion. Almost every piece of fancy bullshit, as long as it fit the canon, could be accepted, praised and even rewarded. There was never a shortage of fancy bullshit. In the current age religions have been obsoleted, but the weak-minded "creative" people are still here. So science became a new religion, especially when it hit the wall of empiricism - a limitation which was realized by the ancient eastern philosophers (Brahman is unattainable to the conditioned intellect, which can see nothing but its own conditioning). The modern notion of the impossibility of breaking an abstraction barrier (seeing the wiring of a processor from the level of code) is the very same notion reformulated.

Every bizarre piece of bullshit can be framed as a hypothesis and published, giving the high social status of "theoretical researcher" to its authors (instead of the much more appropriate status of talkative idiots). It is all about social status, similar to that of a monk in the medieval ages, which one's parents could buy for their children by paying their way through a costly elite religious school. Nothing new under the moon.

I personally would prefer to see those disconnected-from-reality academics who gave advice to Macron (based on disconnected-from-reality abstract notions) to tax the population to combat climate change be held accountable for all the damage caused by the resulting riots and be forced to pay for their ignorant arrogance, but this will never happen, because academics are allowed to produce bullshit labeled as working hypotheses. The rest of us aren't.

Fundamentally wrong ;)

The very first page shows an utter lack of conceptual discipline (sloppy thinking).

For instance, the following set of commands runs the hello_world example:

git clone https://github.com/SergioBenitez/Rocket
cd Rocket
git checkout v0.4.0-rc.2
cd examples/hello_world
cargo run

A set implies no notion of ordering whatsoever, while a sequence of commands implies a particular, definite order.

It is a list of commands, not a set. Don't use fancy words you don't fully understand.

https://news.ycombinator.com/item?id=18515678

Epic fight (or fail?)

https://news.ycombinator.com/item?id=18445609

A decent CS degree, like that of MIT, matters a lot. I can give an example. Once upon a time I stumbled upon this classic article (which I re-read sometimes to feel better):

https://eev.ee/blog/2012/04/09/php-a-fractal-of-bad-design/

When I read this for the first time, I was literally shocked by the sudden realization that I had coded some stuff in PHP but never knew it was such crap.

It also taught me that I literally knew nothing, that I was merely a stupid coder, not a programmer, so I undertook a serious study of CS fundamentals in order to be able to write an article like this in the future.

Now, after a few years of studying, I know half a dozen languages and I know crap when I see it. Ironically, what is going on in the Node ecosystem puts PHP3 to shame. But it is natural, because most JS coders have no CS background. People with a CS background sometimes produce gems like Go or Erlang, while "mere coders" always produce things like PHP, J2EE and npm.

There is only one thing that I regret - it could have saved me so much time not reading bullshit on the web, if only I had found the right books (like Programming Erlang, Haskell School of Expression or On Lisp) and the right courses (like Dan Grossman's, and 6.001 2004 - the last course in Scheme) first. There is a huge, qualitative difference between well-written gems like these and the fucking crap some narcissistic idiots post in their blogs.

I found the right books and the right courses by literally swimming through the sewers for years. A decent school will teach you the right principles, instead of irrelevant details, in ML or Scheme, right from the beginning.

Cathedral of packer's stupidity.

I have managed to stay away from Java crap as much as I could, but I never suspected that it is really as bad as this. I think crappy design should be a felony, like cheating in finance.

I will cite it verbatim as a real piece from the Cathedral of stupidity.

public class ListViewLoader extends ListActivity
        implements LoaderManager.LoaderCallbacks<Cursor> {

    // This is the Adapter being used to display the list's data
    SimpleCursorAdapter mAdapter;

    // These are the Contacts rows that we will retrieve
    static final String[] PROJECTION = new String[] {ContactsContract.Data._ID,
            ContactsContract.Data.DISPLAY_NAME};

    // This is the select criteria
    static final String SELECTION = "((" +
            ContactsContract.Data.DISPLAY_NAME + " NOTNULL) AND (" +
            ContactsContract.Data.DISPLAY_NAME + " != '' ))";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // For the cursor adapter, specify which columns go into which views
        String[] fromColumns = {ContactsContract.Data.DISPLAY_NAME};
        int[] toViews = {android.R.id.text1}; // The TextView in simple_list_item_1

        // Create an empty adapter we will use to display the loaded data.
        // We pass null for the cursor, then update it in onLoadFinished()
        mAdapter = new SimpleCursorAdapter(this,
                android.R.layout.simple_list_item_1, null,
                fromColumns, toViews, 0);
        setListAdapter(mAdapter);

        // Prepare the loader.  Either re-connect with an existing one,
        // or start a new one.
        getLoaderManager().initLoader(0, null, this);
    }

    // Called when a new Loader needs to be created
    public Loader<Cursor> onCreateLoader(int id, Bundle args) {
        // Now create and return a CursorLoader that will take care of
        // creating a Cursor for the data being displayed.
        return new CursorLoader(this, ContactsContract.Data.CONTENT_URI,
                PROJECTION, SELECTION, null, null);
    }

    // Called when a previously created loader has finished loading
    public void onLoadFinished(Loader<Cursor> loader, Cursor data) {
        // Swap the new cursor in.  (The framework will take care of closing the
        // old cursor once we return.)
        mAdapter.swapCursor(data);
    }

    // Called when a previously created loader is reset, making the data unavailable
    public void onLoaderReset(Loader<Cursor> loader) {
        // This is called when the last Cursor provided to onLoadFinished()
        // above is about to be closed.  We need to make sure we are no
        // longer using it.
        mAdapter.swapCursor(null);
    }

    @Override
    public void onListItemClick(ListView l, View v, int position, long id) {
        // Do something when a list item is clicked
    }
}

https://developer.android.com/guide/topics/ui/layout/listview.html

Bitcoin $1,500

Time to say something on the occasion.

There is no way to get out of this while keeping such an inflated valuation. No institution would buy bitcoin in case of a selloff, which will happen one day or another.

Another notion is that there is just not enough free USD to convert any substantial amount of BTC - an operation which by itself would trigger a selloff and a burst.

These millionaires exist only on disk. There is no way to cash out at such a valuation.

It is a trap. The classic case of inflated asset bubble, given that asset is virtual and the Ponzi scheme is global.

Let's see.

Gold Standard

Peter Norvig's code is a gold standard.

http://nbviewer.jupyter.org/url/norvig.com/ipython/Probability.ipynb

Habitual clarity, precise use of language and attention to detail.

This is what programming is all about.

Notice also how expressive and concise Python3 is.

Perfectly sums up what PHP is

This is the quote from https://phacility.com/phabricator/

  • Written in PHP so literally anyone can contribute, even if they have no idea how to program.

This is a precise description of what PHP is.

The sad truth

Programming is mostly about what you are doing and why, not about how and with what tools.

Data structures and the related algorithms are central to programming, not languages. A programming language should be programmable (so it can be adapted and evolved into a set of layered DSLs for the problem domain), like Common Lisp.

Programming languages are the easy part...

3rd account has been banned on HN

I have reached yet another milestone - my 3rd account has been banned on HN. That is something.

Is it a hint that I am a complete idiot, or is something wrong with the "social media" world? Both, perhaps. Something is definitely wrong with a world in which Java and Javascript are the most popular languages and statistics is used instead of experimentally proven causality.

But there is no other choice but to continue to do what you like, the way you like. Everything else is even more meaningless.

Oops, I monad again

This boils down to an ugly hack which is what a monad is.

The principle is that a function is, by definition, a mapping, which maps a set of values into another set of values. The crucial difference between a function and a procedure is that a function must, by definition, produce the same output for the same input. Always.

So, when a supposed function sometimes produces values and sometimes errors, everything is broken. It is not a function anymore. The ugly hack is to wrap the different values into a compound structure of a certain type and reformulate the law of functions into the law of procedures - the same type of output for the same values. This is what the Maybe monad and the Either monad are. Mere ADTs, mere wrappers. As long as input and output values are of the same type (as long as they conform to the same protocol/interface, they are considered equivalent, though, of course, not equal), procedures which take such a type as input and output values can be composed and form pipelines (like any other procedures with consistent types).
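A minimal sketch of that wrapping and pipelining with Maybe (the helpers are illustrative, not anyone's API):

import Text.Read (readMaybe)

-- "Sometimes a value, sometimes a failure", wrapped into one type.
parseAge :: String -> Maybe Int
parseAge = readMaybe

checkAdult :: Int -> Maybe Int
checkAdult n = if n >= 18 then Just n else Nothing

-- The wrapper types are consistent, so the steps compose
-- into a pipeline again, via (>>=).
pipeline :: String -> Maybe Int
pipeline s = parseAge s >>= checkAdult

main :: IO ()
main = do
  print (pipeline "42")    -- Just 42
  print (pipeline "7")     -- Nothing
  print (pipeline "oops")  -- Nothing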

There is nothing more to it than that. And, strictly speaking, it is not pure functional programming anymore. Just procedural with static typing (hello, Haskell purists!).

'Breakthrough' Prizes

While multidimensional geometry could be regarded as a separate discipline, dimensions themselves do not exist. They are mere abstractions created by an observer.

A sphere makes perfect sense in this Universe, while a 4D hyper-sphere (and the so-called hyper-cube) is utter nonsense. Of course, one could describe such an abstraction as a mathematical object, but it will be a mere abstraction, like the Hegelian ones. It does not exist outside people's heads.

All the billions of years of this planet never revealed a single dimension. All the geometry we have in proteins is based on the notion of 3D spheres - a notion of the same distance in all possible directions (given that there is no such thing as a dimension).

The same logic could be applied to refute any string-theory sectarian nonsense, given that just one single contradiction is enough. The non-existence of time as a phenomenon is enough to destroy all the space-time-curvature mathematical, or rather Hegelian, crap.

But who cares.

Inapplicable

Not every abstract notion is applicable. Much less applicable to itself.

Russell's paradox should be resolved with an "inapplicable" type error. A category tag is inapplicable to itself.

The same goes for probabilistic inference - it is inapplicable to partially observable phenomena. It should be a type error.
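A minimal sketch of what an "inapplicable" type error looks like in practice: in Haskell, Russell-style self-application simply does not typecheck, because the argument's type would have to contain itself.

-- This definition is rejected by GHC with an occurs-check error
-- ("cannot construct the infinite type"), i.e. self-application
-- is inapplicable by construction:
--
--   bad = \x -> x x
--
main :: IO ()
main = putStrLn "self-application is a type error"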

Yet another ban.

For the love of god, why?

MIT Scheme:

1 ]=> (expt -2 2)
;Value: 4

1 ]=> (* -2 -2)
;Value: 4

Common Lisp:

* (expt -2 2)
4

* (* -2 -2)
4

Python:

>>> -2**2
-4
>>> -2 * -2
4
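The reason, for what it is worth: in Python the ** operator binds tighter than unary minus, so -2**2 parses as -(2**2). Parenthesizing gives the expected result:

>>> (-2)**2
4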

Elections 2016

The majority of the "less educated" has been primitively manipulated, using naive memes and modern propaganda technologies developed by the well-educated, to vote against the well-educated, framed as "the corrupt establishment".

The less-educated majority are so sure about being the majority (statistically they obviously are) that they take the "stealing the elections from them" meme for granted.

No one even tries to evaluate memes anymore, like "returning back all these manufacturing jobs" - would you really like to compete with Chinese wage laborers by working long hours for less than $15 per day? But this is how the global economy works. No amount of legislation could make goods that are much more expensive to produce competitive.

The same thing happened with Brexit - the uneducated majority has been manipulated by populists into voting for unrealistic meme-like prospects.

The problem is simple - too many uneducated, and memes, memes everywhere.

Cognitive decline..

It seems that I used to be in much better shape 2 years ago - Folds

The fuck?!

schiptsov@Ideapad-300-15ISK:~$ ps -ax |grep systemd
  282 ?        Ss     0:00 /lib/systemd/systemd-journald
  329 ?        Ss     0:00 /lib/systemd/systemd-udevd
  814 ?        Ssl    0:00 /lib/systemd/systemd-timesyncd
  903 ?        Ss     0:00 /lib/systemd/systemd-logind
  910 ?        Ss     0:01 /usr/bin/dbus-daemon --system --address=systemd: --nofork --nopidfile --systemd-activation
 1092 ?        Ss     0:00 /lib/systemd/systemd-resolved
 1159 ?        Ss     0:00 /lib/systemd/systemd --user
 1675 ?        Ss     0:00 /lib/systemd/systemd --user
 1800 ?        Ss     0:00 /lib/systemd/systemd-hostnamed
 1816 ?        Ss     0:00 /lib/systemd/systemd-localed
 1919 pts/0    S+     0:00 grep --color=auto systemd
schiptsov@Ideapad-300-15ISK:~$ 

systemd-localed, Karl!

Essay on Non-Self (Anattman)

The concept of the self is closely related to the archaic notion of the soul, which is an important vehicle of socially constructed organized religions (ancient social institutions designed to accumulate wealth and power by exploiting the ignorance, superstitions and fear of death of their subjects and followers).

The ultimate non-existence of the self could be proved by the same chain of argumentation with which educated people dismiss the notion of a soul today as a mere social creation - a meme. The last 300 years of philosophy and science were, to a large extent, an effort to discard religious dogmas and replace them with more adequate approximations to the truth.

At the time of the Buddha the only reliable method of discovering the nature of reality was to trust our senses more than dogmas and to be very precise with the use of language. This is the essence of the method of introspection, which, arguably, has been used by the Buddha to gain his insights about the nature of the mind.

Turning his attention inward (to put it in modern language - letting his intellect observe its own working) he observed and classified the mental phenomena (processes of the mind) which arise and fade in his own mind. After analyzing the nature of these processes he concluded that each one of them is impermanent (transitory) and specialized, which means not general enough to govern the whole of observable human behavior.

This result of so-called mindfulness or self-awareness - when one aspect of the mind tries (to a very limited extent) to observe the other aspects within the whole of the brain's activity - is, arguably, the best "philosophy of mind" done so far, and predates the modern science of psychology.

One of the implications of the observations made by the Buddha, transformed by him into profound insights, is the illusory nature of what we call "our self". He taught that this self is merely an appearance, a set of aggregates, like cooking spices wrapped in a banana leaf. Another famous metaphor is the process of peeling an onion: by removing, layer by layer, what we nowadays call habits, memories, personal experiences, social and cultural conditioning, no substantial self, no permanent core could be found.

This notion of the illusory nature of what we call the self is the most fundamental concept of the Buddha's philosophy, and it relates to the concept of Maya, which predates him. Maya, according to some Hindu mystics (which is another name for the seekers after the absolute truth, or god), is a veil which obstructs our view of what is - of reality as it is. Arguably, it is due to the inevitable interference of some parts of the brain with others, especially the so-called language area - a seat of linguistic abstractions - which produces its own constructs, which, it seems, look to the other areas of the brain like valid and accurate perceptions. A glimpse of such possible dynamics has been demonstrated in the set of experiments with hemisphere-split patients.

The mark of true genius is the use of intuition - the so-called non-verbal knowledge of the body - to guide one's search for the closest approximation of the truth. Modern scientists will tell us that the body has information about the nature of the environment it has been shaped by, encoded in the physical structure of its specialized sensors and brain areas. The Buddha had no such understanding, but he paid attention to the intuitions from his own brain. Nowadays we would formalize this process as a heuristic-guided search.

Thus he, presumably, got the insight that the brain is an aggregate of highly specialized areas, and that the phenomena of the mind accessible to introspection by so-called primordial awareness turned inward are merely a bunch of parallel processes which arise and fade on demand to serve a particular low-level or high-level function as a part of observable behavior.

It is quite easy to mess everything up with modern terminology and modern views based on decades of research in the field of cognitive psychology; nevertheless it is easy to observe that the insights about the fundamental principles of the nature of the mind, described by the Buddha, still hold. His method of direct observation instead of mere abstract speculation, and of practice as the way of testing his hypotheses (insights), is a precursor of the scientific method we use today, given that each one of his disciples is free to replicate, test and validate his insights and conclusions.

Got flagged on HN again

This is the discussion https://news.ycombinator.com/item?id=12616426

This is my comment:


Nothing to see here. In the age of hipsters, cosplay of intelligence and "social" role-governed behavior in general, there is no surprise that we could read all that nonsense in the New Yorker, which is a hipster magazine.

No sane person who has enough intelligence to realize that music, sports, arts and craftsmanship are 95% based on practice would doubt it. Any good music teacher or Olympic coach will tell you this. The 5% of talent determines where one will most likely end up on the spectrum, but talent without practice is nothing, like knowledge unapplied.

Even in programming, actually writing code on a daily basis is what distinguishes top performers from mere mediocrity and from the talking heads and bloggers at the bottom.

For the hipster narcissistic sub-culture it is perfectly OK to "challenge" (they think it is a challenge, not a display of stupidity and lack of even basic understanding) the nature of reality with sophisticated (which does not imply intelligent) nonsensical blah-blah-blah in order to gain public attention for acting according to the role.

Hipsterism as a social phenomenon is an insult to intelligence the way the commercialized, empty yoga of cosplay and asanas is an insult to spirituality (which is the seeking of the truth).

Kant was wrong..

It is not that math was prior to the mind. It is exactly the other way around.

The sequential, serialized nature of perception gave rise to the notion of ordering - this then that, this before that. The mind, its introspection and attention turned to the order of events within a perception, gave us math. It is not that math, existing somewhere, gave us the mind. The brain is a mechanical machine. As mechanical as a Lego set.

No numbers exist outside the mind. There are only forms of energy - photons, the fields we call atoms, and everything else built out of them.

Numbers, time and space are concepts of an observer. But there is no observer at the level of atoms, or at the higher level of molecular biology. There is no math there either. Only "quantum" mechanics. No numbers, but physical shapes due to the electro-chemical properties of proteins, which define their shapes.

The Universe does not contain math. There is no one there to observe that particles might be added or multiplied. Each one of them is independent and a part of the whole.

Math requires an observer.

And he was right in one thing - so-called a priori knowledge does exist, but it is not math; it is environmental conditioning encoded in the structure of specialized brain areas - it reflects a so-called a priori reality.

Philosophy and physics

When people study philosophy but not physics they end up in the realm of pure ideas - chimeras of abstract nonsense disconnected from reality.

When physicists do not study philosophy they end up in the same Hegelian abstract nonsense of pure mathematical abstractions disconnected from reality, such as higher dimensions, space-time, and other god-like fancies.

Yet another HN rant..

This is exactly how you get professionally marketed crap like Mongo, Node, Hadoop or Java itself - you name it.

It is a fast-food way of building software - the cheapest processed shit wrapped in good SEO and a naive user experience (a quick and rewarding installation and the effortless bootstrapping of a meaningless hello-world - the "I am so clever" meme).

What matters in software in the long run, be it an application or even a programming language, is appropriate design decisions and a strong emphasis on design around protocols (duck-typed interfaces), using a declarative (instead of imperative) mostly-functional paradigm, a bottom-up process with focus on modularity and code reuse, while sticking to standard idioms and the principle of least astonishment. This is what is still being taught at MIT, and this is how the internet works as a whole.

99% of Node or PHP code is toxic amateurish crap, precisely due to the ignorant over-confidence of the coders and the lack of a proper CS education about the right principles and programming paradigms, and the importance of loose coupling and proper encapsulation, which lead to modular, share-nothing, mostly functional designs with emphasis on immutability and persistence (that's why Clojure is so good - it has been, like Erlang, well-researched, with tons of good stuff borrowed from Common Lisp).

Erlang or Haskell or Scala or Swift or Golang are on the other side of the spectrum, which could be characterized by the discipline of sticking to the right principles and rigorous attention to details (of which Haskell might be a bit too strict, but Erlang or Scala just right).

BTW, these observations about the impact of a proper design based on the right principles, made almost 20 years ago, still hold - http://norvig.com/java-lisp.html. Today we could state the same for Python 3.5+, which has finally evolved to be as carefully refined as the old-school Lisps, with the caveat that Python 3 is a prototyping language, while CL is both a prototyping and an implementation language.

No sane person should even touch PHP or Javascript or any other crap with implicit coercions, error suppression, etc., designed by amateurs (hello, MySQL, Mongo, Hadoop!), the way one avoids cheap processed junk food or drugs.

I am stupid..

I've completed only one task in a little more than 1.5 hours in a Toptal test. There was not a single chance for me to crack all 3 in time.

So I've quit. I am stupid.

Do not copy-paste code

Do not copy-paste shit without understanding.

This comes straight from Rosetta Code

def binary_search(l, value, low = 0, high = -1):
    if not l: return -1
    if(high == -1): high = len(l)-1
    if low >= high:
        if l[low] == value: return low
        else: return -1
    mid = (low+high)//2
    if l[mid] > value: return binary_search(l, value, low, mid-1)
    elif l[mid] < value: return binary_search(l, value, mid+1, high)
    else: return mid

apart from the retarded formatting it seems OK

but if we test the code

binary_search([2, 3], 1)

we will get

RecursionError: maximum recursion depth exceeded in comparison

surprise!

all we did was search for an element which is less than the smallest one in the list.

The bug is due to the arrogance of some C or C++ coder

>>> len([]) - 1
-1

which means that -1 is a legit value for high

sane people are using None

One more thing. If you do this

binary_search([3, 3, 3], 3)

the code will return 1 instead of 0

There is a more correct version:

def binary_search(xs, x, lo=0, hi=None):
    if hi is None:
        hi = len(xs) - 1
    if lo > hi:
        return None
    if xs[lo] == x:
        return lo
    mi = lo + (hi - lo) // 2
    if xs[mi] > x:
        return binary_search(xs, x, lo, mi-1)
    elif xs[mi] < x:
        return binary_search(xs, x, mi+1, hi)
    else:
        return mi
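For the record, the Python standard library already solves this problem; a sketch using the real bisect module (bisect_left also returns the leftmost index for duplicates):

import bisect

def binary_search(xs, x):
    i = bisect.bisect_left(xs, x)   # leftmost insertion point for x
    return i if i != len(xs) and xs[i] == x else None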

What the fucking fuck?!

What is this? For the love of God, what the fuck is this?!

schiptsov@Ideapad-300-15ISK:~$ sudo reboot
Failed to start reboot.target: Transaction is destructive.
See system logs and 'systemctl status reboot.target' for details.
schiptsov@Ideapad-300-15ISK:~$ systemctl status reboot.target 
● reboot.target - Reboot
   Loaded: loaded (/lib/systemd/system/reboot.target; disabled; vendor preset: d
   Active: inactive (dead)
     Docs: man:systemd.special(7)

The world has been taken over by fucking idiots.

Major Cleanup

It seems that I am finally ready for a major cleanup of these pages in order to make them more comprehensible.

I hope I will be able to emphasize (there is never enough emphasis) the importance of attention to detail, of a deep understanding of the whys, not just the hows, and of some passion and zeal.

It is all about a subtle balance which, it seems, underlies everything in the Universe. Balance applies to everything, including passion and zeal. Lack of it leads to ruin, the way all these Zen zealots have ruined the very idea of Zen (which is a practice of not misusing the mind - of using it the way it was evolved: as a tool, not as a cause of suffering).

I would like to show the important difference between what might be called a crafted masterpiece and a dump. The best example of a dump I can think of is those bookshelves in guest houses where they keep the books abandoned by tourists - a collection of crap never to be looked at again (this is what 98% of Java code is).

Emacs, for example, is a masterpiece, while Eclipse is a garbage dump. Scheme, Haskell or Erlang are masterpieces, while Common Lisp, Java or C++ are dumps (granting that Common Lisp is a true masterpiece compared to C++).

I would like to talk about the artistic sense and deep personal involvement of an artist, which make the code of guys like bbatsov, kennethreitz, sysoev, antirez, to name a few, so beautiful.

I hope I will be able to convey the importance of big ideas and the vanity of popular buzzwords. That the table-driven approach, as in pandas, is a big old idea, while, for example, "reactive" is just nonsense.

So, let's try.

Proposal

Is there already a proposal to add that wonderful Type? syntax from Swift instead of the ugly Maybes, at least as standard syntactic sugar?

It is so beautiful because it reflects the good tradition of naming predicates in Scheme and a few other languages.

Destructuring is by pattern matching, as usual.

Docker, Kubernetes, Mesos, etc.

There is nothing extraordinary, profound or even innovative in so-called Containerization, Orchestration and other silly memes.

It is a way to pack applications tightly into a data-center to extract more "value" from the hardware, or, simply put, to make more money by selling slices of servers on a per-hour basis.

All the other blah-blah about the benefits of containers is nonsense. The more complex a system you make, the more inefficient and fault-prone it becomes, compared to running the same service on dedicated hardware.

The reasoning is quite simple. Suppose you have some Java pile of crap which is, basically, a java.exe process that uses sockets for communication and, probably, some remote storage, which is also communicated with via the IP protocol. This kind of crappy app could be run in a chroot-based environment under some hypervisor. So, one isolates it into a "container", which could be thought of as an FS snapshot. You could isolate a Rails app, for sure, or some REST-ful service, at the cost of losing I/O efficiency - basically, you are running in an emulator with para-virtualization (Xen, KVM+qemu).

This is exactly what FreeBSD Jails were designed for. The ideas came from IBM mainframes.

Docker gives you a way to describe your images declaratively (which is good). Kubernetes is cluster management. It is all clever and saturated with lots of sophisticated blah-blah full of long words, so the crowd is very excited.

But this does not solve any fundamental problems; it just squeezes a few more profit-driven middle-men between your code and the hardware. The claim that it eliminates the costs of system administration is nonsense. Unless you are running a guest-book Rails example with a few html forms, everything will become even more messy. Think about what happens when some replication (Redis or MySQL or whatever) fails. Or, my favorite example, your java.exe crashes, leaving your data in an inconsistent state and causing data loss.

But, of course, all this is very cool. Especially to isolate those Apache Spark nodes, which utilize literally hundreds of thousands of lines of code and consume gigabytes of memory to hold mutable, locked data in order to perform map-reduce operations on read-only, pre-sorted, partitioned data - which could be done in a few thousand lines of Erlang or Common Lisp or even Scheme.

Blah blah

Yeah, it is such a fine set of ideas, or rather intuitions, about the strict similarity between high-level programming and what is going on in what we call living beings.

There are some big molecules, made out of atoms in an almost uniform way (I am oversimplifying, of course) - chains of amino acids, in the case of proteins. These molecules have a structure which determines their physical shape and hence their electrical and chemical properties. Some of these we call proteins, some enzymes; some act as data, some as high-order procedures - they perform the transformation of one physical form (structure) into another.

In some sense, the ingenious intuition behind the original Lisp was that the same uniform chains of data (list structures) could represent both the code and the data, and that there is no fundamental difference between them. Moreover, the internal representation and the human-readable notation could also be uniform, reflecting and expressing this uniformity in the syntax - that's why we have all these parentheses and write in ASTs.

The structure is what lifts "dumb" matter onto a higher level. This is Yin. The procedures - the algorithms - are the second half. This is Yang. The list structure is what binds them together.

Here is captured not just the essence of programming as a data-processing discipline, but also something of a higher level. An intuition which, I believe, illuminated the famous minds behind many early Lisps.

Programming in proteins

Recently I have noticed the striking similarity between the basics of molecular biology and programming. I am not the only one, of course.

After the discovery of the structure of DNA (and the related RNA), molecular biology was a hot topic in the 60s and 70s. So were AI and Lisp.

Grossly oversimplifying: proteins are data structures, while enzymes are procedures (free of side effects), both made out of the same base amino acids, chained together by various bonds. They form sequential and even look-up-table-like structures (the DNA code-sequence to amino-acid look-up). And that is enough to sustain all of life.

The classic Lisp has been grounded on almost the same insights. The same list structures are data representations, and some of these structures are procedures, but there is not much difference. Code could be treated as data, even modified or generated. And everything is type-tagged (has a recognizable marker), everything is "first class", and there is a notion of many distinct environments (closures) - and that was good enough.

I think it is not a random coincidence; I think it is a play of intuition that there is some small, good-enough set of features with which to program everything - a biological equivalent of the Turing Machine.

Well, proteins *are* data structures and enzymes *are* procedures (without side effects).

What runs them? Well, Brahman.)

Ounce Of Mathematics

Prof. Brian Harvey used to say that "an ounce of mathematics is worth a pound of computer science".

Here is how. In Peter Norvig's famous course (on Udacity) there is a small sub-task: write a tiny predicate procedure which returns True if all cards in a hand have the same suit.

Like a good student of CS61A I wrote this pile of functional-style code:

from functools import reduce   # reduce is not a builtin in Python 3

def flush(hand):
    suits = [s for r, s in hand]
    return reduce(lambda x, y: x and y, map(lambda a: a[0] == a[1],
                                            zip(suits, suits[1:])))

What could possibly go wrong? You could visualize how beautifully a list zips with itself (strictly via references, no data copied!) and how a new list of Booleans is going to be produced, to be folded then into a single value. You could see the flow and all the wonders.

Here is the Norvig's snippet

def flush(hand):
    suits = [s for r,s in hand]
    return len(set(suits)) == 1

What the ...? Well, the set-from-a-list constructor removes all duplicates (due to the nature of a mathematical set), so if the length of the resulting set is 1, then all elements of the list were the same value.

But, but we know how to make a set ADT out of conses and write a short-circuiting and or every? for sequences.))
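In Python, the short-circuiting version that last remark hints at is spelled all(); a minimal sketch:

def flush(hand):
    suits = [s for r, s in hand]
    # all() stops at the first mismatch, like a hand-rolled every?
    return all(s == suits[0] for s in suits)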

How the mind works.

If we think of "storage" or "representation" in the context of our minds or AI, we think it wrong, because we are trying to use the concepts from CS of memory "cells" or "slots". There is nothing like that in a brain, of course.

The proper way of thinking is that our "memory" is "structured" instead of "stored".

This explains how each access "restructures" (re-builds) our "understanding", and why after a very short time we are unable to "recall" any details of our "previous understanding".

A good metaphor from computers is that our "knowledge" has to be "periodically refreshed" the way a charge in RAM chips must be. Once we stop re-generating it, it is gone forever.

So, refreshing, re-constructing, "re-building", re-writing. And pattern-matching.

What I am trying to say is that there is no "universal encoding", like on a hard drive; there is not even permanent storage. We do not store anything "verbatim".

It is rather like this: we have seen some face a few times. There is no detailed "picture" of the face stored in our mind. When we see the same or a similar face, we perform some sort of pattern-matching on key features - eyes, mouth, nose, shape - and then we "know" that we have seen it before, and then "prime" all the associations. But there are no "images from the retina" stored in "files". No hash-tags.

It is like "which structure of neurons lights up when I get this sensory input" pattern-matching. "Analog", not "digital".

Julia is awesome!

One could tell it comes from MIT (square here is assumed to be defined earlier in the session as square(x) = x*x):

julia> function sum(f, x, y)
           if x > y
               0
           else
               f(x) + sum(f, (x+1), y)
           end
       end
sum (generic function with 1 method)

julia> sum(x->x, 1, 10)
55

julia> sum(x->x*x, 1, 10)
385

julia> map(x->x*x, [1,2,3,4,5])
5-element Array{Int64,1}:
  1
  4
  9
 16
 25

julia> filter(x -> mod(x,2) == 0, [1,2,3,4,5])
2-element Array{Int64,1}:
 2
 4

julia> reduce(+, map(square, [1,2,4,5]))
46

julia> compose(f, g) = x -> f(g(x))
compose (generic function with 1 method)

julia> twice(f) = compose(f,f)
twice (generic function with 1 method)

julia> twice(square)(2)
16

julia> make_adder(n) = x -> x+n
make_adder (generic function with 1 method)

julia> add1 = make_adder(1)
(anonymous function)

julia> add1(5)
6

Humiliation

Just watched C9 Lectures: Dr. Erik Meijer - Functional Programming, where he, perhaps unintentionally, compares Haskell to C#.

What a humiliation (of C#, with its meaningless verbosity and syntactic noise - not of Dr. Meijer), especially when he compares such a gem of declarative programming

not False = True
not True  = False

to some class-based dynamic dispatch in C# in episode 4, around 0:13.

Watch it.

btw, to fully appreciate the beauty we should look at the type-declaration first:

data Bool = False | True

and understand how not is defined using pattern-matching on data-constructors (True and False) of an algebraic data-type (Bool).

This means that when Haskell evaluates the expression True it produces a value of type Bool which corresponds to the truth. This is why such an expression is called a data-constructor - it produces a value of an either-of algebraic data type. Likewise for the False expression.

So-called reality checks.

Ericsson has about half of the 3G and 60% of the 4G base-station hardware market. Highly reliable, scalable, soft-realtime systems written in Erlang. It just works, because it is based on sound, well-researched principles.

Just watch it: https://www.youtube.com/watch?v=rQIE22e0cW8

It is a real success, even a triumph, while the piles of Java crap just crash and crash and crash (with increasing load and data flow), no matter how much RAM they add to the servers.

What attitude!

pg 64 days ago | link

I know almost nothing about Clojure. I saw some example code around 7 years ago, but I don't remember it well.

https://news.ycombinator.com/item?id=7493993

Why, Google, why?

Some things I cannot understand, no matter how hard I try.

Why on Earth, in Chrome, when a download of a 50+ Mb file over http[s] (a standard, explicitly restartable protocol) fails at 98% (it knows the size and type of the content from the headers), does the damned thing remove file.crdownload? What prevents them from adding a few lines like "if more than 50% completed, then keep the file and add an option to resume"?

What kind of stupidity is this? Following "Expires" headers which say that the content must be reloaded each time? Trying to minimize an idiot's confusion that the file is smaller than it is supposed to be? There is an explicit .crdownload extension to indicate that it is still in progress.

The only rational explanation is that such a "feature" is sponsored by telecom operators, which is quite possible in modern business..

OK, the more probable explanation is that nowadays it is OK to reset a connection if the server side considers it "too slow" - it is just an optimization technique: serve the fast [premium] connections fast, at the "compromise" of dropping the "losers". So they simply don't restart the connection, in order not to interfere with the site's "policy". But these are crappy defaults. Why do they take the site's side, not the user's?

Some narcissistic UX star might say "let's not clutter the interface, to keep the illusion of simplicity". OK, fine: do restarts automatically for certain content types, when you know the size of the payload from the headers and the connection is closed before the file is retrieved.

The only answer, it seems, is that everyone benefits if the user restarts the download from the very beginning, because he will consume twice as much traffic, which everyone monetizes. Otherwise I cannot understand why (wise and never evil =) Google cannot add a few lines of code to the download manager.)
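Just to show how cheap the fix is at the protocol level, here is a hedged sketch with Python's requests library (resume_download is a made-up name; it assumes the server honors the Range header):

import os
import requests

def resume_download(url, path):
    # bytes already on disk determine where to resume from
    have = os.path.getsize(path) if os.path.exists(path) else 0
    headers = {"Range": "bytes=%d-" % have} if have else {}
    r = requests.get(url, headers=headers, stream=True)
    # 206 Partial Content means the server resumed; otherwise start over
    mode = "ab" if r.status_code == 206 else "wb"
    with open(path, mode) as f:
        for chunk in r.iter_content(chunk_size=8192):
            f.write(chunk)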

Practice makes perfect.)

A small, well-balanced set of very few selected ideas.

Zen and Art of Symbolics Common Lisp.

Let's say that Lisp is very different because it was based on "good ideas", well researched in "good places" like the MIT AI and CS labs (no punks could get there). It has something to do with the shoulders of Titans. The set of selected "good ideas" is intentionally kept small and well-balanced. Let's take a walk.

The "layers of DSLs" metaphor mimics as close as possible the way our mind uses a language, which we call "verbal", or "logical" mind, as opposed to "non-verbal, ancient, emotional" mind, which is based on evolved reflexes, captured by genes (Sorry!).

We are using symbols to refer to the inner representations (concepts) we have in our minds. So "everything is a symbol" (just a reference to a "storage") is a "good idea".

When we're talking about some specific context (situation, or "domain") we tend to use a reduced, appropriate set of words (symbols) related to it, along with the usual "glue". This is what we call a Domain Specific Language, or a slang, if you wish. This is also a "good idea" - groups have slangs.

A layered structure is, in some sense, what Nature is. Atoms, molecules, proteins, tissues, organs, systems, brain, body - you see. So, layers of data is a good idea, but layers of DSLs is an even better one. It not only mimics how the world is, but how we should structure our programs.

Neither the language nor its layers are set in stone; they could be transformed, extended and adapted using the very same mechanism which underlies them. Iterative looping constructs, for example, were built out of macros, and the Loop DSL is the most striking example of how a language could be extended with itself as a meta-language.

Some "good people", like R. Gabriel, have argued that we need more complex control constructs (special forms) as long as we are trying to write complex programs (language should be adequate to the domain), so, the ability do define new special forms as we go, without breaking everything, is a "good idea".

This is, btw, the idea behind the recursive bottom-up process of software development, popularized by SICP and championed by pg and rtm. A language should evolve together with our understanding of the problem domain.

Structures are also a DSL. They give us a way to structure our data (everything is an expression in a Lisp, everything could be evaluated - this is another very "good idea") to mimic or represent more conveniently the objects of the real world. Structures could be nested; getters and setters are created automatically, but could be redefined, etc.

So, by extending the language with new, appropriate constructs (by defining new special forms) we could improve our way of modeling reality and create better "inner representations" for the concepts we made (captured)? Seems like a "good idea".

But wait - because everything is an expression (a Lisp form) which could be evaluated, why not just put code blocks (expressions) into the same structures? Thus we have "data structures" which capture not just the characteristics (state) but also the behavior of real-world "objects".

To capture the behavior we need the notion of "protocols", which is just a named set of generic functions. So we have defprotocol which, essentially, creates a structure and binds names (symbols) to procedures (expressions) which consist of Lisp forms. Thus we get the MOP implemented, which is the basis of CLOS.

I forgot to mention that, since everything is a symbol (a reference), we could combine "objects" into lists and other aggregates, map them, reduce them - the whole set of language layers "below" is available.

This doesn't mean that this is the only way to program; this is just one of the possible programming paradigms. What really matters is that we could, using the very same means of combination and abstraction, "import" any other paradigm we wish.

And everything is so uniform and concise that it could be easily traced back to "conses".

btw, this is not a "strict", "rigid", "set in stone" language (it is nonsense to try to "fix" a language - it contradicts its very nature). We have reasonable, "evolved" defaults, such as lexical scoping for variables, but we could have dynamically scoped ones if we wish. Thus it is possible to extend the language with "non-local exit" control structures, which are your fucking Exceptions.

Immutability also has reasonable defaults. List and mapping functions always produce a new copy of a list, leaving the original unaltered, while their "destructive" equivalents are segregated by an explicit naming convention (Scheme is famous for this).

Evaluation strategies could also be explicitly selected, so lazy lists or streams could be defined using the very same conses, macros and list notation.

Being a small language (after all the transformations - macro-expansions, rewriting rules, inlining - have been done) it could be efficiently compiled (using a compiler written in itself) directly into machine code, which runs on more platforms than the fucking JVM.

This is what some people consider a work of fine art, and call a "programmable programming language".

But this is only half of the story. There were machines (hardware FSMs, if you wish) which were able to run Lisp code efficiently. But this is another story.

OO and type "safety" memes.

When intelligent people talk about different "kinds of objects", they tend to describe objects in terms of "flavors", "behaviors" and "traits", while idiots tend to talk about "classes".

There are abstract, ephemeral creations of a mind which are almost always in contradiction with the so-called real world. Ideas, concepts, even logic - they are usually utopias, or oversimplifications disconnected from reality.

Let's be very careful here. Numbers in Math are a very good example. A number can be of either this or that "type". And it could *sometimes* be "coerced" into another "class" without "losing precision".

This is the most sacred notion of the proponents of strictness and rigidness - of so-called "strong typing", "strict languages", fixed routines.

Let's call this an "OR-mind". It is the rather naive notion that things are either this or that: right or wrong, black or white, integer or rational (yeah, numbers are a special case).

This is the "naive logic" everyone begins with. Later they are trying to use "strict", "rigid categories" of the same kind. This is how wast (but meaningless) class hierarchies were created. And, indeed, in case of numbers they are good. As long as there is nothing but numbers to "model" Java is a great, the most strict language. Packers love it.

In so-called objective reality, however, their "strict" and "rigid" hierarchical classifications are failing. Packers are trying to classify the phenomena of Nature in terms of "strong is-a", "strictly this OR that", and they fail.

They talk about birds as something that "has wings", but there are bats and fish. They say a bird "has wings" and "lays eggs", but there are reptiles. (They usually don't know that birds *were* reptiles, but that's OK - there is no notion of "were" in the packer's world.)

First, packers tried to remove the contradictions with so-called "multiple inheritance". They could say "this IS this AND that", which is also naive. An Ostrich "is-a" Bird, but while the method "HasWings()" returns "True", the call to the method "flyThere()" produces a very long stack trace.

Then "generics" and "interfaces" were added much later to these "packer's" languages (to mess everything up even more) so they could create "compound objects" as a "collection of traits".

Less "rigid" people, however, had the notion of so-called "duck-typing" from the very beginning. Smalltalk and modeled after it the Flavors DLS (the mother of CLOS) has exactly this approach. If it can do this-AND-that (follows the protocol) then it could be viewed (considered) as "one-of" this kind. This way of thinking we could describe as the "AND-mind". Less restricted, "light", context-aware.

The notion of "pattern matching" which is based exactly on the notion of "catching" different nuances (particulars, subtleties) of reality is closely related to a such mind-set.

Now about "type safety" meme. Again, if the world consist only of numbers and abstract ideas, then, perhaps, it could make some sense. However, processes are not "linear", classifications are not "three-like" when everything is just this OR that. Reality is much more complex than that.

Such naivety manifests itself when packers try to "model" moderately complex systems. "There is a Stream, so we could define a class with methods. While reading the stream we will get Either data OR Nothing", they say. "So, as our code covers both cases, and the compiler checks the types of the variables, we are safe!" OK, what they get instead is a SIGPIPE signal, or an EAGAIN error code, or a "connection reset by peer" state, which are of a very different "type".

Well, they say, these are not our problems; they must be handled by the runtime. We want Either data or Nothing; we don't want any signals, or states, or conditions.

In other words, when learning OO, follow the best minds - those behind Smalltalk or CLOS - which had a "correct" notion of what OOP is and what it is for. It is a way to structure code for reuse and to avoid duplication. It is a way to model a "complex" representation of real-world "objects" by composing distinct, usually unrelated, "behaviors" or "traits". It is about communication between these "objects", strong isolation of each particular "instance", and encapsulation of its "internal state".

All this was implemented using only "conses", "closures" and "message-passing".

There is a beautiful and telling similarity between the defstruct and defclass "special forms" in Common Lisp. Classes are viewed just as "structured expressions", while expressions themselves are "data to be evaluated". Code is data, so it could be "structured" this way or that.

Thus, logically, OO "adds a second (and third, etc) dimension" to the code, the same way "structures" do for the data. As long as one treats code as data, everything is perfectly natural and "logical".

The most important real-world idea behind OO is, perhaps, that something could exhibit more than one behavior, have more than one trait - the way people do. Food has more than one flavor too.

So, OO programming is not only about packaging code into methods of classes and enforcing the restriction that "every object must be a member of a class" (for most Java or Ruby coders OO ends here), but rather a way to model real-world objects by structuring the data not just as "2D trees of variables of this OR that type" but as "nD graphs of expressions AND actors/agents".

The people behind the Lisps, Smalltalk and MOP/CLOS were much brighter, more broad-minded, better educated and more enthusiastic than most of the modern "punks".

Two ways of cooking.

Sometimes, to understand one complex system or process better, we apply knowledge from a different field, because there are some subtle ideas and general principles which seem to be valid across many domains and even cultures.

Let's consider cooking, the process of preparing food. Leaving innumerable subtle nuances aside, there are two common approaches to cooking.

The first one goes like this. One goes to the shop and buys a lot of very expensive, branded goods - overpriced branded "organic farming meat", a huge piece of fillet in a vacuum box, ten boxes of different premium "oriental" spices, the most expensive Italian olive oil, and then a full cart of different kinds of "organic" vegetables and bunches of fresh coriander, rosemary, dill, green onion, etc.

Then he comes home and starts cooking. Usually such cooking is a process of "frying everything together", or sometimes of "making a village-style curry". He just puts everything in a big pan and heats it up for a while. Because there are lots of expensive ingredients, the whole dish is usually edible, so he considers himself a good cook, gets his gratification, reinforces his self-esteem, and becomes even more over-confident and proud of his accomplishments.

There is no wonder, because, at least for me, almost any freshly cooked, still hot, non-synthetic meal is quite edible. The real question is: does this amount to good cooking at all?

The above scenario is very over-simplified and over-optimistic. First of all, if he puts on too much heat, as they usually do, he will burn everything; or, if he notices the burning and turns the heat off at the right moment, he will end up with burned-outside-raw-inside pieces. Or he might easily over-boil everything with too much heat, causing all the complex molecules to be broken, which results in a thick, almost tasteless stew.

Heating food is a very subtle process, which requires literally tons of empirical knowledge that cannot be bought in an expensive grocery store or copied from a recipe book. One must know how to vary the temperature depending on which ingredients are being processed (it is OK to heat oil before putting spices in it, but if one puts in onions first they will burn). One is also supposed to know that different vegetables require very different amounts of exposure to heat in order to be properly cooked, not dissolved into a soft mush, and so on.

So, roughly speaking, there are notions of varying the amount of heat, notions of order - when to put in what - and, most importantly, notions of how the whole process changes with each new ingredient and/or change in processing.

This brings us to the second approach to cooking. We could call it professional, or, ironically, poor-man's cooking. It comes mostly from Asian countries, where people still aren't spoiled by over-consumption and do not engage in the practice of spoiling huge quantities of expensive food.

In Asia people noticed millennia ago that some ingredients form a very good match and that this is just good enough. With time they simply refine the recipes, and so traditional dishes emerge: Indian traditional roti-sabji, or alu-mottor, or Nepali daal-baat with tarkari, or all these amazing Chinese and Tibetan dishes.

The underlying principles here are, not surprisingly: less is more (because food is less abundant, especially in the Himalayas); capture a good match (I prefer to call it maintaining a balance); and, last but not least, heat it just enough. It could be a bit raw (heated to a temperature high enough to kill all the germs, but not for too long), but never spoiled by over-heating or burning. It also almost always consists of no more than 3 or 4 ingredients of different kinds with some hot spices (even fried vegetables are quite tasteless without proper spices).

This is by no means an accurate outline of the Asian approach to cooking, and the point is not to be very accurate. The point is to show that we have the same two approaches in software engineering.

The first approach corresponds to the very common and very popular "branded toolbox" approach, or, as I prefer to state it, "put all the crap inside". Java, C++11, Ruby, PHP - they are all about having hundreds of classes with tens of methods, because, you know, "the more the better". They also usually follow some over-simplified all-or-nothing principles, similar to religious beliefs, based on notions like "OOP is the best approach for all tasks and the culmination of human knowledge" and similar nonsense. For them, a soft mess of hundreds of classes (but everything must be an object! no exceptions!) is what makes them happy.

What is quite remarkable is that they really believe that by piling up even more Java crap, by adding even more expensive ingredients, putting in more stuff and overheating it, the result will be better. Well, it definitely will taste a bit different, but still like crap.

The second approach is that of a minority, of marginals, who, for some obscure and non-rational, not-customer-centric, non-efficient, not-getting-shit-done reasons, are trying to find a balance, a "perfect" match - of, say, FP and networking and concurrency primitives, as in Erlang - or to model a runtime system according to notions of how the really complex systems of Nature, such as our body and brain, work: loose coupling, share-nothing (the brain), hierarchies of receptors and actors, feedback loops (the nervous system), communicating by message-passing (blood vessels are communication channels), etc.

Sometimes it works. Not perfectly, of course (Erlang's syntax is.. but it is much better than Java's), but it works remarkably, incomparably better, and it tastes really good, with very few, carefully selected ingredients and appropriate processing.

Java or C++ taste like crap. No matter what. You look at the source and you feel sick. You look at the runtime and you get nauseous. You look at the documentation and you get a headache. You look at the forums and see over-confident idiots - oh, pardon me, amateur cooks.

Other examples of the finding-a-good-enough-balance approach are the Lisps, like R4RS Scheme or Arc (while CL and R5RS+ Schemes already suffer from putting everything in). Clojure, on the other hand, is a counter-example - a product of the "put it all inside" approach. Haskell (though some syntax constructs are there because "the more the better"), Smalltalk in its best time, probably Eiffel, etc.

What is wonderful is that there is unlimited space for mixing a few selected ingredients in new ways - an Arc-like Lisp which compiles directly into native code (an X86_64 or ARM CPU is a much better VM than the JVM), with networking and concurrency primitives from Erlang, some selected abstractions from Haskell, etc.

Reactive buzzword.

There is hardly a more over-hyped buzzword than "Reactive" (aggressively marketed with Typesafe's investor money). Well, only "Bitcoin" has more hype.)

At a quick glance everything looks very clever. They position themselves (in rather very politically correct wording) as, finally, a fix for that horrible mess called Java. Not only that - they, it seems, came up with "innovative" solutions based on proper concepts and paradigms.

They use the Scala language (ignore the "scalable" meme; think of it as a functional language with an advanced type system (a-la Haskell) which compiles into JVM bytecode and has easy Java interop, so you could call your crap). Being functional and well-designed by prominent academics, it really allows one to use an order of magnitude ;) fewer lines of code to express the same ideas.

Not only that, they also "re-implemented" the proper concepts (Actor Model, Message Passing, Fault Tolerance, Light-weight Isolated Processes, etc.) pioneered by Erlang, which was the first functional language "properly" designed for concurrency. They call it Akka.

And, finally, there is an MVC web-framework - Play!.

Well, I was hyped and decided to take a look. But there are no miracles in Java world.)

The file http://downloads.typesafe.com/typesafe-activator/1.0.10/typesafe-activator-1.0.10.zip is 238Mb.

Almost 250Mb of compressed crap (jars are zip archives too) which is supposed to be loaded into a JVM - which, remember, is just an ordinary, user-level multi-threaded process that tries to manage memory, threads and synchronization by itself (which is an OS job), and that means only problems.)

So, I am supposed to load about one-third of a gigabyte of compressed crap into a JVM instance, which will waste at least 2Gb of RAM, in order to have a few nice things?

No, thank you. I will use Erlang.

btw, I could easily understand all the excitement of those pointy-haired, bone-headed managers, the inhabitants of the Java world - for them, wasted machine resources and profound inefficiency in communicating with the outside world (JIT-ed native code runs quickly) are just a source of bigger budgets, staffing and, you see, a few more years of getting a salary for projects that will fail.

But this is the best thing we have today.)

Every self-respecting programmer..

..must do this at least once.)

Real programmers are poets, while common coders are graphomaniacs.

Programming is not about collecting code, it is an ability to write it down, spontaneously, as it "emerges" in your mind, like a poetry.

It is about harmony and balance, being as compact and meaningful as possible, without redundancy.


New year thoughts.

New Year eve is approaching, so it is a perfect time to pause and look back.)

As strange as it sounds, the only "big idea" I would like to continue to pursue in the next year is the one proven by Paul Graham with his Arc language and the real-world services built upon it, such as http://news.ycombinator.com/

The idea is: We don't need all these piles upon piles of Java* crap.

arc.arc is just 2K LOC, ac.scm (compiler to mzscheme) is 1.5K LOC, news.arc + all the libs is just about 1.2Mb in size.

This proves the thesis given in his On Lisp book about the bottom-up, adapt-as-you-go, design-for-change, layers-of-DSLs approach to real-world programming, and that code can (and must!) be compact, concise, readable and clean at the same time.

This implies that a thing like Hadoop (an unimaginably inefficient and bloated pile of Java crap, wasting more resources than it serves - but that's OK, hardware is cheap!) could be implemented with orders of magnitude less code, less waste, less confusion.

Another point is that we don't really need "megabytes of stuff" and "thousands of macros", or even CLOS. There is a good-enough set of special forms and primitive procedures (the very idea behind the original Scheme, before it turned, with R6RS, into yet another piece of bloatware) for any kind of programming. Everything necessary could be written in Lisp on the go.

The only annoying thing is the lack of a decent native compiler, like the ones we have for the CLs - especially since, to compile such a tiny language (very few special forms and primitive procedures), we don't need hundreds of megabytes of C++ code (clang).

We also want the quick and efficient FFI of the CLs. With native code and a decent FFI, a task like serving dynamic content would look attractive (imagine something like ring, but without the disaster called the Java runtime).

The direction to look at is Golang, which is right now at the stage of rewriting its compiler in itself (the very old tradition for decent languages).

So, there are still small wonders around, like Plan9 or Emacs or Arc or MIT Scheme or CLs or Golang. They show us what is possible when a mind is used instead of copy-pasting.

As Paul Graham suggested in his classic On Lisp, programming is a journey into the unknown, like life itself, without any plans or specifications set in stone, and one should change, adapt and evolve on the go, along with one's code.

More of my stuff.

I have no time to extract and put in one place the very few fine pieces I have written in the last two years, which are buried deep in HN, so here is just a link: https://news.ycombinator.com/threads?id=dschiptsov

Some day I might do it, but the point is not in collecting the pieces, but in the ability to write them spontaneously, on the spot, in the best Kerouac style.)

My Take 5

Here is another attempt at describing beauty using ugly words.)

Why I hate SAP.

Got hell-banned on HN again, this time due to some not politically correct remarks about an SAP product:

Oh my, that "Delphi mentality" in 2013..) ExtJS, if I remember correctly, is 5 or 6 years old already?) But we, no doubt, will read about ground-breaking innovations from their PR machine.)

Is there any info about which "startup" they have acquired, or to which third-world country it was outsourced?)

  1. My position on the practices SAP uses remains unchanged: https://news.ycombinator.com/item?id=5158953
  2. I do have reasons to suspect that SAPUI5 could be an outsourced third-party project, because this is the way SAP "cuts costs" - by outsourcing development to the so-called third world while asserting the "world-class quality" of its software.
  3. I do understand that my remarks, based on guesswork, could produce unpleasant feelings in honest developers, but who has even considered the feelings behind the thousands of careers, teams and even companies ruined because they messed with SAP?
  4. I do not follow the practice of so-called political correctness toward incompetent, impotent "management" or "IT specialists" who were brainwashed by SAP or Java propaganda, for the very same reasons these people would not tolerate a drunk or a dope addict.
  5. I am quite sure that in some corners of the world there are true and honest SAP consultants and certified professionals, who honestly try to help their customers and do not mislead and manipulate them, but, unfortunately, I have never seen one in my life.

Truth is out there..

http://www.reddit.com/r/programming/comments/1qw73v/til_oracle_changed_the_internal_string/

Hint: read carefully about the mess before and after the patch, then read the usual marketing BS about "the industry standard".

We are back!

We are back after resolving some payment issues with a hosting company.

Thanks to everyone who found our writings beneficial.)

There is no short-cut.

Learning how to program is about developing the appropriate habits of mind, so that it can perform basic tasks - such as problem-domain analysis, data definitions and some template and syntax transformations - very quickly and almost unconsciously.

The recursive process itself - the patterns of thinking while one writes a part of a program - is described in great detail in the HtDP book and prior works on so-called data-driven development. The book (and the on-line course based on it) shows how following simple patterns of thinking leads from problem analysis to data definitions, to describing concrete examples, and to checking expectations by writing test cases before the actual code.

The shape of the data structure influences the final shape of the corresponding function, so, based on the kind of data definitions the domain analysis has produced, an appropriate template could be selected. Then the template must be filled in and transformed into the actual code of a procedure.
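A minimal sketch of that recipe in Python (a toy data definition and a made-up count_suit example; HtDP itself uses Racket):

# Data definition: a Hand is a list of (rank, suit) tuples.

# Examples first, written as checkable expectations:
def test_count_suit():
    assert count_suit([], "H") == 0
    assert count_suit([(2, "H"), (5, "S")], "H") == 1

# The template follows the shape of the data: an empty case,
# then the first element plus the recursion on the rest.
def count_suit(hand, suit):
    if not hand:
        return 0
    first, rest = hand[0], hand[1:]
    return (1 if first[1] == suit else 0) + count_suit(rest, suit)

test_count_suit()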

Everything looks great, but there is one big problem. This habit cannot be developed quickly, just by reading the book or watching the on-line course - you will forget everything you have read within a week, and no skills will be gained, only some vague view of what is possible (which is already good, compared to the ignorance of a typical coder).

One must do all the exercises/homework, which means that instead of just reading a book in one week, the training process will expand into a few months. 3 months for HtDP alone.

There is also SICP, On Lisp, ANSI Common Lisp, a few Haskell books, K&R, Programming Erlang, some algorithms-and-data-structures book (from MIT Press) and AIMA, which means years of boring exercises. After that one might think of oneself as, or call oneself, well, a programmer.

Look how different this is from a typical Java or Javascript drone coder, who has acquired some skills of copy-pasting code from examples or gluing objects together using context-help windows, without even an attempt to understand how things really work, let alone any theory of why they are so.

The way to become a top performer is not hidden or secret - it simply goes through a few great books. The problem is that, unfortunately, just reading them is not enough, and one must start in one's twenties, because the journey takes time..

The most important thing to realize is that programming is a writing skill, which cannot be practiced by reading other people's texts/code. One must practice crafting one's own code.

This is why reading other people's code is so difficult - you are trying to read the final version of the code using, so to speak, a "sequential, top-down process", while the process of writing which produced it is a "bottom-up, recursive, spiral" one, and by its nature completely different.

So, do not waste your time on mere reading; practice the writing skills by doing the exercises from a few great books. After that everything else will be quite easy, no matter what crappy languages or products you encounter in the industry, dominated as it is by idiots.

As simple as that.

Too long. An idea could be stated in a few sentences.)

Java:

This is an instance of a mammal of an animal kingdom which doesn't include dolphins and whales, which has a..., placed within an instance of a class Plain of polymorphic shape which has some private attributes...

ML-family:

This is a member of a set of only mammals of the animal kingdom, excluding dolphins and whales, of small size, which has a..., located on a member of a set of geometric figures....

Lisp:

The cat sat on the mat.

Why Haskell?

It is tempting to say that Haskell is for hardcore Math nerds what perl was for hardcore UNIX nerds (better than shell, with better regexps), but the analogy doesn't hold.

It seems this is rather a cultural phenomenon, about being able to say or show off something cool and clever, something special, not for everyone.

There is no shortage of analogies. Yoga and Oriental philosophy in general - everyone starts to talk unimaginable nonsense using the words "chakra", "dharma", "atman", "brahman", etc. Sounds very distinguished.

A few decades before, everyone used to insert, in random places, the words "unconscious", "repressed", "libido", "alter-ego", then "stimulus", "reinforcement", "conditioning"..

Now it is fashionable to talk about functional programming, the same way it was fashionable to talk about the object-oriented kind in the 90s. A new pop-culture.

But it will not work unless you have the aura of being sophisticated. All those references to advanced branches of mathematics and fancy theories create an illusion of contact with real science - the same way they put block diagrams on toothpaste commercials.

So, when some young guy manages to express some idea using this special, sophisticated syntax, incomprehensible to mere mortals, he writes a long blog post praising his own cleverness.

Some people wrote compilers for brainfuck or rewrote qemu in Javascript, but that is nowhere near the cleverness of a Haskell guy. Not sophisticated enough. No references to Group or Category theory.

Now some bloggers argue that one should list Haskell as a must-have requirement for a job applicant, because, you see, they must be really smart if they can code in it.

Just imagine what a disaster a project could be where each coder tries to show how clever he is. The same phenomenon - look how clever I am - has been seen in the Perl and C++ worlds. Unreadable spaghetti, which even the author can't read two weeks later.

It is also possible that I'm just not sharp enough to "get it", to appreciate the esoteric qualities which make this language superior to Erlang or Lisp, which I really admire and love.

In that case it is a Perl for Math nerds, which means a disaster for the rest of us.

Idiots in charge.

There is a rare, delicious thread on StackOverflow, from which one can get a lot of insight into what happens in Java sweatshops run by ignorant "managers":

http://stackoverflow.com/questions/218123/what-was-the-strangest-coding-standard-rule-that-you-were-forced-to-follow

There are follow-up discussions on HN and Reddit.

There are a few patterns to recognize.

First, we can notice how incompetence and stupidity lead to the invention of meaningless rituals - performed to delude oneself, to gain unsubstantiated confidence, and to get rid of anxiety and doubt without understanding or explaining anything.

This is like naive religious belief: we do not understand the laws of nature and mind, so everything is created by god's will. Go back to shopping.

So, the idiots invented rules to follow, instead of trying to learn and improve their own understanding.

Paul Graham, in one of his essays, told a story about how to deal with bullies in childhood. The answer is: run upstairs. It does not just mean, literally, make him exhausted so he gives up; it has a more general, universal meaning - grow up, learn more, improve. Become a highly developed individual.

This is a universal principle. By knowing more you will be able to see more, to understand and recognize phenomena which idiots are unable to notice.

More importantly, greater understanding allows you to drop meaningless activities, to cease performing useless repeated rituals, to free your own resources, and, because of that, to improve your understanding even more, on the next recursive step.

For example, the whole idea of idiotic naming notations that encode a type into a name is just stupid and creates only more confusion.

It is like writing thatSmallAnimalWithFourLegsAndATail instead of using the word cat. The idea that there could be one hierarchical classification, or one type system, for everything is hopelessly naive.

Our natural language evolved to deal with real-world phenomena, and the underlying idea is that we can attach a symbol (a sound) to anything we can perceive, describe, visualize or point at with a hand. It is like attaching a tag, a label. It is not about maintaining a classification. There are many of them, and they are subtle and dynamic.

All those problems with idiotic naming schemes, rules and rituals come from ignorance - a lack of understanding and of domain-specific knowledge - as if a plumber were promoted to supervise a software project. This is the cause of all the pain and suffering we can see in the links above.

How to avoid crap.

The rule of thumb when dealing with Open Source solutions is this:

It must be an evolved solution, built in a recursive, bottom-up process, for the author's own needs, to deal with a real-world problem.

This is how masterpieces such as Scheme, Erlang, memcached, nginx, redis or riak were made.

The authors start from the foundations - design decisions, an appropriate level of abstraction and a clean, close-to-hardware implementation of the basic building blocks, such as buffers, hashes, RPC. Then they grow, evolve and adapt the code in step with the growth of their own understanding and experience.

What we have instead are tons of crap, piled up without any serious design work or proper study of the underlying ideas, by jumping right into an IDE to monkey-patch it all together and sell it to a bigger fool.

This is the standard approach in the Java world, which we are not even considering here. This is how we got crap like PHP or MongoDB.

MongoDB is supposed to be a fast and very popular document-oriented database, the default choice of ignorant amateurs. They even tried to exploit the idea of redefining the LAMP stack - "M now stands for MongoDB instead of MySQL". In other words, it is professionally marketed and pushed.

When you take a look at its website, it is all about the success of developers - how easily and quickly they can start coding, without any thinking. This is the basic selling strategy: it is all easy, and no thinking or even understanding is required.

Here is what they say:

MongoDB allows very fast writes and updates by default. The tradeoff is that you are not explicitly notified of failures. By default most drivers do asynchronous, ‘unsafe’ writes - this means that the driver does not return an error directly, similar to INSERT DELAYED with MySQL. If you want to know if something succeeded, you have to manually check for errors using getLastError.

Notice the language - fast writes and updates, and especially not explicitly notified of failures. What it actually means is that they are cheating you, providing unsafe storage with possible data loss in case of a failure (a segfault, a kernel trap, or a hardware or FS failure).

Notice also that getLastError - it is definitely from PHP.)

Would a sane developer create such a solution for himself?

There is more.

Older versions of MongoDB - pre 2.0 - had a global write lock. Meaning only one write could happen at once throughout the entire server. Wait, what?!

This needs to be re-read. It means that before 2.0 it had the same logic as a plain file with a lock: acquire the lock, append to the end of the file (no, they had no append-only journal until 2.0), release the lock. And they call it a database? Of course, you cannot find such details on their website. There is nothing but marketing-speak there.
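
As a minimal sketch of that logic (using SRFI-18 mutexes; the names are illustrative):

;; the analogy, sketched: one global lock, append, unlock -
;; no journal, no fsync, no error reporting
(define db-lock (make-mutex))          ; SRFI-18

(define (db-write! port record)
  (mutex-lock! db-lock)                ; only one write at a time, server-wide
  (write record port)
  (newline port)
  (mutex-unlock! db-lock))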

Would a sane developer create such a solution for himself?

There is one more thing:

MongoDB uses memory mapped files and flushes to disk are done every 60 seconds, which means you can lose a maximum of 60 seconds + the flush time worth of data.

I think comments aren't necessary.

So, there are two basic strategies - grow your own solution, or pile up some code for sale. The second one is the dominant one, and products built this way should be avoided. PHP, MySQL (with MyISAM), MongoDB, NodeJS and Clojure are among the most popular examples.

As in any other market, detecting and avoiding scams is a crucial skill. If something was created for sale, like Java, it is probably a well-financed scam.

Real-time!

Some pearls of conventional wisdom say that we cannot make things less crappy by adding more crap, or, say, solve a debt problem by adding more debt.

On the contrary, we can read, in marketing speak, that piling up more Java could solve its problems, and even provide real-time querying. Real-time! In Java!

Let's take a look at this miracle:

https://github.com/metamx/druid/tree/master/realtime/src/main/java/com/metamx/druid/realtime

package com.metamx.druid.realtime;

import com.metamx.common.lifecycle.Lifecycle;
import com.metamx.common.logger.Logger;
import com.metamx.druid.log.LogLevelAdjuster;

/**
 */
public class RealtimeMain
{
  private static final Logger log = new Logger(RealtimeMain.class);

  public static void main(String[] args) throws Exception
  {
    LogLevelAdjuster.register();

    Lifecycle lifecycle = new Lifecycle();

    lifecycle.addManagedInstance(
        RealtimeNode.builder().build()
    );

    try {
      lifecycle.start();
    }
    catch (Throwable t) {
      log.info(t, "Throwable caught at startup, committing seppuku");
      System.exit(2);
    }

    lifecycle.join();
  }
}
  • John, do you see any real-time?
  • Nope.
  • Oh, I found real-time! Look!
package com.metamx.druid.realtime;

import com.google.common.collect.Lists;
import com.google.common.collect.Maps;
import com.metamx.common.ISE;
import com.metamx.common.concurrent.ScheduledExecutorFactory;
import com.metamx.common.concurrent.ScheduledExecutors;
import com.metamx.common.config.Config;
import com.metamx.common.lifecycle.Lifecycle;
import com.metamx.common.lifecycle.LifecycleStart;
import com.metamx.common.lifecycle.LifecycleStop;
import com.metamx.common.logger.Logger;
import com.metamx.druid.client.ClientConfig;
import com.metamx.druid.client.ClientInventoryManager;
import com.metamx.druid.client.MutableServerView;
import com.metamx.druid.client.OnlyNewSegmentWatcherServerView;
import com.metamx.druid.client.ServerView;
import com.metamx.druid.collect.StupidPool;
import com.metamx.druid.db.DbConnector;
import com.metamx.druid.db.DbConnectorConfig;
import com.metamx.druid.http.QueryServlet;
import com.metamx.druid.http.RequestLogger;
import com.metamx.druid.http.StatusServlet;
import com.metamx.druid.initialization.Initialization;
import com.metamx.druid.initialization.ServerConfig;
import com.metamx.druid.initialization.ServerInit;
import com.metamx.druid.initialization.ZkClientConfig;
import com.metamx.druid.jackson.DefaultObjectMapper;
import com.metamx.druid.query.DefaultQueryRunnerFactoryConglomerate;
import com.metamx.druid.query.QueryRunnerFactoryConglomerate;
import com.metamx.druid.utils.PropUtils;
import com.metamx.emitter.EmittingLogger;
import com.metamx.emitter.core.Emitters;
import com.metamx.emitter.service.ServiceEmitter;
import com.metamx.http.client.HttpClient;
import com.metamx.http.client.HttpClientConfig;
import com.metamx.http.client.HttpClientInit;
import com.metamx.metrics.JvmMonitor;
import com.metamx.metrics.Monitor;
import com.metamx.metrics.MonitorScheduler;
import com.metamx.metrics.MonitorSchedulerConfig;
import com.metamx.metrics.SysMonitor;
import com.metamx.phonebook.PhoneBook;
import org.I0Itec.zkclient.ZkClient;
import org.codehaus.jackson.map.BeanProperty;
import org.codehaus.jackson.map.DeserializationContext;
import org.codehaus.jackson.map.InjectableValues;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.jsontype.NamedType;
import org.codehaus.jackson.smile.SmileFactory;
import org.codehaus.jackson.type.TypeReference;
import org.jets3t.service.S3ServiceException;
import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.security.AWSCredentials;
import org.mortbay.jetty.Server;
import org.mortbay.jetty.servlet.Context;
import org.mortbay.jetty.servlet.ServletHolder;
import org.skife.config.ConfigurationObjectFactory;

import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ScheduledExecutorService;
...

Yeah.. That's it..

Enterprise software Next.

Enterprise software, unlike the hot internet startups in the Valley (overpriced rapid Rails coding), is not sexy, but there are still tons of money in it.

There is a deep crisis in in-house development because of the Java fiasco. Instead of results on schedule and within budget, all they got was piles and piles of unmaintainable Java crap, which doesn't work as expected, is years late, still doesn't meet the requirements, consumes all available resources and crashes unexpectedly every week. What is even more important - severalfold overspending without even coming close to meeting the specs.

This is the reality of today's Enterprise Development with Java. No wonder the budgets were cut and the enthusiasm vanished.

But don't you see - no one in the Valley is using Java or PHP for their startups. A few, however, believe that if they pile up another layer on top of the JVM, it will magically become more efficient, stable, predictable and less resource-hungry. Call it Scala or Clojure - it doesn't matter. They want to believe.

Btw, have you ever asked yourself why there is an NDK in Android? Are you aware how many open-source shared libraries, written in C and C++, are included in an Android installation? Have you ever considered how Google Chrome was implemented?

The answer is: Chrome reuses tens of open-source libraries. No one usually rewrites complex, specialized code, such as cryptography functions or hardware-optimized math functions; they just dlopen() the library and call the code. This is why we have the NDK and JNI in Android.

Wait, there is another story. Openssl is now included in the NodeJS source tree, to be compiled and linked into Node's binary. Guess what? It is because calling openssl's functions from inside V8 is very inefficient - it is slow and has huge overhead - so it is reasonable to make a local wrapper and use it.

The bigger idea? VMs are bad when you need to interact with the underlying OS. Reusing the specially optimized functions provided by the OS or by open-source libraries is a much better idea.

There is a very short list of open-source packages engineers reuse very often - openssl, pcre, expat, sqlite, berkeley db, postgresql, gmp, curl, etc.

There are bindings for almost every scripting language to make dynamic calls into these libraries. Why? Because reusing the best-quality, community-tested code is much more efficient.
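
For example, here is what such a binding looks like in Gambit-C Scheme (a minimal sketch; cbrt() from libm is just an illustration):

;; a thin FFI binding: call libm's cbrt() directly, no rewrite
(c-declare "#include <math.h>")

(define cbrt                           ; a Scheme name for the C function
  (c-lambda (double) double "cbrt"))

(display (cbrt 27.))                   ; => 3.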

Hey, but what about Enterprise Software Development? Well, look at Android, Google Chrome or even MacOSX. When a company needs some software product, it does not go and buy bloatware from, say, SAP, IBM or Oracle. That is legacy. They bootstrap the product themselves, reusing open-source protocols and software packages.

People are building internet startups the very same way; the very same 80/20 rule applies - they reuse ideas, protocols and code (the 80%), and they write their own code (the 20%).

Those who are smart enough even make their own code reusable for the open source community, so that other people may test it, fix and even improve it.

Why does the open-source model work, why do people contribute, spending their time writing or fixing publicly available code? The answer is quite simple - because they find it useful for themselves. They themselves reuse the code. So, the secret is: make something useful for other people.

Now, if by some divine miracle we could eliminate all those bone-headed managers striving to keep their positions, there is a very clear way to make business software - the very same way the internet community does. Take the best ideas, take the best protocols, take the best tools and packages. This is the simple part.

The difficult part, which the bone-headed crowd is unable to grasp, is structuring your effort and code to make products that other people will reuse and enjoy. They will do it only when several criteria are met:

  • it does an important thing
  • it does it efficiently
  • it reuses familiar packages
  • it uses appropriate tools

This is how nginx and redis and postfix and hundreds of other software engineering masterpieces were made. This is how Google Chrome or Android were made.

Now look. Chrome is the No.1 browser. nginx is the No.1 web server. Android is the No.1 smartphone platform. Is there any SAP, Oracle or IBM in there? Nope. All those projects were bootstrapped and based on open source.

Are these ideas new or original? Nope. The only problem we have is the ignorant and stagnant minds of the Enterprise folks. Their main concern is not the innovation or efficiency which could be attained with a new software product; their main concern is their position and their money - their small, very limited and very fixed world of ass-covering and responsibility-avoiding.

So, putting them aside, there are no obstacles to making the next level of enterprise, or B2B, or whatever software, the very same way all those internet services we use every day were made.

Fix for Ubuntu 12.10

Here is a quick fix for Ubuntu 12.10 (for experts).

Switch to a text console (Ctrl+Alt+F1) and log in, then do this:

$ sudo apt-get remove unity* oneconf* lightdm nautilus
Y
$ sudo apt-get install gdm gnome-desktop-data gnome-panel

follow the dependencies.

Packers vs. Mappers

There was a thought-provoking essay called "The Programmers' Stone", written in the 90s. It was rather long, but its opening was outstanding. The author discussed two different mindsets - packers and mappers. Now we can call them coders and programmers.)

The vast majority belongs to the first category, so they set the rules and influence fashion. That is why almost everything eventually becomes Java - explicit typing, redundancy, forced description of every single little detail - everything the packer's mindset likes and welcomes.

Even the guidelines for Common Lisp are heavily influenced by packers (corporations hire them to code from 9 to 5).

All the small marvels which together create the pure artistic sense of using Lisp, packers want to throw away and replace with strict typing, ugly constructs, fixed rules and concrete recipes.

Don't use dynamic lists, they say, use fixed structures. Don't do recursion, use loops. Don't use nil, it is tricky. Do not return multiple values, it is confusing, and so on. Everything that is unique and liberating, that gives you more freedom of expression and leads to more compact, more natural idioms, is considered harmful.

They want it back to the realms of strictly controlled, imperative packing world, where they feel safe and confident, performing their restricted, repetitive tasks according to fixed rules.

On the other hand - going into the unknown, exploring, observing, figuring out, describing and prototyping on the go, this very moment - that attitude is alien to packers and hated by them.

The very essence of an art is intense awareness - attempting to catch this very moment and act in harmony with it, on the spot. It is about flowing, being flexible and spontaneous. It has nothing to do with the restricted, fixed, repetitive behavior which characterizes stupidity.

To learn the art of programming you should learn the artistic Lisp - its Scheme dialect, with its remarkable general ideas, wise conventions, and very few, carefully selected special forms.

After you *realize* the beauty of one general application rule *together* with prefix notation *and* the common underlying list structure; after you *internalize* that having *less* pre-defined special forms is much better than having lots of confusing special cases, and that you can extend this core language with any function you wish; after you *realize* that code is data represented as lists - which, together with the list-aware reader, gives you almost algebraic transformations for free - that the-empty-list is a nice thing, and that recursion *is* a much more natural way to think, then you will look at the world of packers very differently.
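
A minimal illustration of the code-is-data point (plain R5RS Scheme):

;; code is data: an expression is an ordinary list
(define expr '(+ 1 (* 2 3)))
(car expr)                              ; => + (just a symbol)
(cons '- (cdr expr))                    ; => (- 1 (* 2 3)) - an algebraic transformation
(eval expr (interaction-environment))   ; => 7 - the same list, treated as code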

The Scheme language is a natural programming language for artists - people with taste and drive, with a passion for exploration, for going into unknown, open-ended problems - for people with a different mindset.

For safe packing with primitive, fixed hierarchies there is imperative Java.

The Google Effect.

Everyone was told in elementary school that being a copycat is not good and will cause lots of problems in later life.

Why? Because by copying a ready solution, produced by someone else, we gain nothing, and even develop a wrong habit of thinking - as if all we need is the ability to find a ready solution and copy-paste it.

Now Google provides us a really easy way to find a solution to copy-paste, and this is what crowds do. Searching is the new normal.

Instead of trying to identify the underlying ideas, concepts and principles - to understand how it works, and why it is so - people search for a ready solution, produced by someone else, that matches their assumptions and fulfills their expectations.

Snap judgments, matching features against expectations without any attempt to understand, let alone analyze, what is inside - this is the cause of the later problems.

Professionally advertised features, a promise of an easy life and tons of ready solutions - this is the major selling strategy. This is why such crap as php, mysql (with myisam and table-level locking), mongodb, nodejs, clojure or the JVM gained such popularity. They promised an easy life without understanding of what is inside.

The bandwagon and peer effects, of course, are at play. Everyone is using php and mysql (now rails and nodejs), so it must be good. Everyone is using Java and Javascript, so they must be good.

But we know that exactly the opposite is the case. Everyone is eating cheap, processed fast food, so it must be good. Everyone is drinking booze, so it is OK. Everyone is taking credit with no means to pay it back - so shall we.

There is a different kind of solution - not the result of marketing efforts to sell crap to suckers, but the evolved result of a bottom-up process of people producing a solution by themselves, for themselves. There are the Linux kernel, FreeBSD, postgresql, MIT Scheme and SBCL, nginx and redis, Go and python3, even Rails at the time of the 1.x versions.

As long as people continue to search for a ready solution, instead of spending the time to master the concepts and underlying principles and then select the appropriate tools, we will have piles of lowest-quality crap for sale, and even open-source solutions will be bloated and ill-designed, put together in a hurry without any understanding of the underlying abstraction layers, guided only by a strategy of how to sell.

A Lisp-like syntax on top of the JVM - wow! Side effects, the underlying general list structure, tail-call-optimized recursion? Who cares?! Let's marry V8 with the event-driven concept - wow! Server-side Javascript. The underlying OS? The event notification model? Calling conventions? Who cares?!

This tendency is not limited to the Information Technology field; it is in finance and in investing, in every bubble produced by a wave of mass hysteria.

We could call it the bubble of the copy-pasters, the bubble of the mediocre, the age of the incompetent. The Google effect.

Just watch this..

What is programming, really? Watch this: http://youtu.be/efhh0Cf6sT8

"Make sure everyone can understand what you have done." Brilliant.

the New Normal

The only thing that is constant in the Universe is change itself. Lots of things have changed in the last year, particularly in the realm of education.

One year ago I started the very first online Machine Learning course by prof. Andrew Ng. At that time it was a pilot, an attempt to get a feeling for what it is like.

I've completed this course and got the most important realization - "I can do it". (given that I'm not a native English speaker, have no higher education, never studied English in a school and have Asperger's).

The second realization was that this will change everything - I mean the very way people can improve their lives. It is not about grades or diplomas; it is about the ability to learn online, without poring through tons of low-quality crap, without the need to filter out the 95% of nonsense and annoying narcissism.

These changes are so deep that they are even hard to see.

Now, everyone can train themselves very quickly to an advanced level (not an expert one - that requires 10000 hours of practice) without losing any time. Wanna find out what that 'Big data' buzz is all about? Close all the blogs and product placements, and learn what's going on at Stanford or MIT.

Of course, someone pays the bills. This time it is the corporations, who need a cheap but skilled workforce. With pay-walled education systems there is a huge shortage (not everyone can pay $50k/year or even get a US visa). Now they publish the lectures (the teachers get paid), and within a year or two all the major recruiting agencies will look at your list of accomplishments.

There is a much better way to use this rare opportunity. Forget the grades and certs; learn to understand, to become a master in several interconnected, closely related subjects, and instead of becoming an employee, try to apply all the gained knowledge in your own startup.

This is the way out, a nice hack of the system, a chance to lift off.

One more thing..

Time to finish my studies and return to Bydlostan. So, here is a small essay as a final exam at the end of the semester.

a few more pages..

Tata trucks and Royal Enfield cycles.

Let's add some Oriental flavor to this blog..

Everything in Nepal, a small but beautiful Himalayan country, is carried by Tata trucks. They are custom-built (the cab and the bed) and vividly painted, but their technology - the engine, transmission, brakes, etc. - seems like that of the 1970s.

It is not that there is any difficulty - financial or trade restrictions - in getting "modern" trucks; China is one day's ride away. The reason is that these trucks are good enough. They are cheap, reliable and, most important, easily maintainable.

In any village on the whole Indian subcontinent, including Sri Lanka, they can be serviced, disassembled and put back together in a day or two. It does not require any special computerized diagnostic equipment, unique tools, expensive machinery or specially trained personnel.

In other words, they are simple, predictable and familiar things. They are popular because they are good enough. People feel confident with them.

These criteria are exactly the same for good software. It should be simple, familiar and easily serviceable. This is what Scheme systems such as Gambit-C or MIT Scheme are like, as opposed to JVM-based technology, which requires a large set of complicated tools just to be able to do basic tasks.

Unfortunately, the analogy with BMW (brilliant engineering in a closed, sealed engine, non-serviceable by amateurs) does not hold in the case of the JVM. It is not brilliant at all; it is a complicated mess, based on wrong ideas and design decisions (isolation from the underlying OS is one of them).

So, instead of a BMW engine we got a sealed piece of crap, marketed to idiots as a solution to all their shortcomings - as a way of thinking (everything in the world is a hierarchy of objects - which is deeply wrong) and a way of writing (public static void - describing all the details instead of the meaning).

Why is it so? Because it is easy to sell - idiots know nothing better (they really believe nothing better exists) and they influence each other (think of the peer and bandwagon effects combined with confirmation bias and a few other cognitive biases).

Some day I will write down which cognitive biases and effects produced this reality distortion.)

Magic Scroll

Today we're publishing some runes from the ancient and sacred Magic Scroll, to meditate on and enjoy, as an answer to the forces of the dark side...)))

;; Definition of CONS/CAR/CDR in Lambda Calculus.

(define (cons a d)
  (lambda (m)
    (m a d)))             ; accidental emergence of Truth

(define (car p)           ; CAR - Content of the Address Part of the Register
  (p (lambda (a d) a)))   ; A - Address part

(define (cdr p)           ; CDR - Content of the Decrement Part of the Register
  (p (lambda (a d) d)))   ; D - Decrement part

;; the explanation of Magic:
;;
;; define a procedure of two arguments in the global environment, called CONS,
;; which returns a procedure that accepts a procedure-of-two-arguments as its
;; argument and applies that procedure to the values of its formal parameters
;; stored in a frame that extends the global environment.
;;
;; This means - give me a procedure which knows what to do with my parameters.
;;
;; So, CAR invokes the pair (the procedure returned by CONS) with an argument
;; which is a procedure that returns the first of the two given arguments.
;; CDR invokes the pair with an argument that is a procedure which returns
;; the second of the two given arguments.

;; a trace
(car (cons 35 47))
(car (lambda (m) (m 35 47)))                ; a cons object is a procedure
((lambda (m) (m 35 47)) (lambda (a d) a))   ; m is a procedure of two arguments
((lambda (a d) a) 35 47)                    ; application
35

;; encapsulating mutation

(define (cons a d)
  (lambda (m)
    (m a d
       (lambda (v) (set! a v))
       (lambda (v) (set! d v)))))

(define (car p)
  (p (lambda (a d sa sd) a)))

(define (cdr p)
  (p (lambda (a d sa sd) d)))

(define (set-car! p v)
  (p (lambda (a d sa sd) (sa v))))

(define (set-cdr! p v)
  (p (lambda (a d sa sd) (sd v))))

;; Happy Happy, Joy Joy!

the pure evil..)

Wanna see the pure evil? Here you go:

http://www.chris-granger.com/2012/06/24/its-playtime/

look what we have here:

(defn my-add[3 45]
    (+ 3 45))
  1. a lack of understanding of the difference between symbols and numbers
  2. the (+ 3 45) expression is an application of the procedure + and will always evaluate to 48
  3. why are square brackets here - is it an array or something? =) Oh, it is a different type of structure, but why expose it at this level? What about the principle of least astonishment and 'don't make me think/switch'?

So, what does it look like? It is the emergence of something a la the Node.js or Clojure community - a lot of buzz and excitement together with a lack of understanding of the basic principles.)

shall we continue?)

What next?

We have seen Groupon, now we have seen Facebook. And Yahoo, and Flickr, and MySpace. The Internet is nothing special, and rather boring - like a mobile phone or a microwave or a car. And the whole IT industry is going into stagnation, as the auto industry did.

So, what next? It is not yet another 'free' public service that collects and monetizes users' data. At least not a general-purpose one. We have enough facebooks.

It is not mobile. Yes, there are many more smartphones than computers, but they too are a merely boring thing. The first iPhone or Samsung Galaxy were something. Now, after almost 4 years, the curve is flattening.

So what, then? It isn't enterprise software - we have enough piles of java classes and man-hours - and it is not The Cloud; Amazon has already eaten that market.

What remains is specialized services, based on the same innovative ideas and technologies as Facebook or Amazon or whatever.

The strategy is very straightforward. First there must be some real problem to solve, and people who have a budget to solve it. Then, as has happened many times before, the solution is designed in a way which allows it to be reused in the future.

This is the way the engineers at Ericsson developed Erlang, and the ex-Akamai engineers at Basho developed Riak - they develop innovative solutions for their own problems and then open them up to the global community. Google's Go is another brilliant example.

The question is - where to find those people with budgets? The answer is - well, we should find the problem, invent the solution, and figure out a way of financing the development process. Difficult? Yes. Impossible? No!

The first step is to become an expert in some field. Then you might figure out what service would be useful and in demand. One cannot skip this step.

For example, if you're in the tourism business, you might figure out that there is demand for a 'departure display' - a scheduling and coordination service for managing groups of tourists headed to the same destination at the same time.

You may take a picture of a tourist's passport with a mobile phone's camera (to machine-read it on the server side - it is not difficult: there is a finite number of different passport layouts, so a neural network could be trained to recognize each type). Take a photo of the visa as well (to validate the OCR'ing) and you have a database of the tourists for free.

This technology could be reused in many other projects, including those of a government, which might be interested in a good-enough solution to begin a migration to a paperless system, at least on paper - they know very well how to benefit from such ideas.))

Everything else is technical details. The crucial idea is: we should do everything ourselves, and technology is the second part, after marketing.

But this is not a conquest for a single person. It should be a small group of young and passionate people. Like our own.

How it works 2

There is yet another success story - a guy got accepted by YCombinator, which by itself means success.

What he did is, no more, no less, a New IDE Concept, which is described at this link: http://www.chris-granger.com/2012/04/12/light-table---a-new-ide-concept/

First of all, one should notice that novel ideas come not from the sub-committees of Microsoft or IBM, but from the wandering mind of a student.)

OK, let's take a little closer look.

The idea that the process of programming is most efficient when the programming environment is interactive and implemented in the same language is not very new.

The first and the most recent Smalltalk environments were implemented in Smalltalk, and everything in the environment can be examined, modified and evaluated on the fly.

The same concept was implemented in Lisp as the Genera OS - take note: a whole operating system in the highest-level language, translated directly into the microcode of a Lisp Machine or, later, a DEC Alpha CPU.

Emacs is a modern example of the same concept - an editor for Lisp code with a Lisp system inside.

The second important idea - that code should be represented as something more than mere text - is also 50 years old. Lisp code is actually s-expressions made out of pairs, so it can be transformed, easily and with minimal overhead, into a tree, a graph, or whatever you wish.

The idea of using a browser window to display the code in various fancy forms, such as bubbles or squares, is also not so innovative; Smalltalk and Lisp systems can do so easily, because their graphics and display procedures are written in the same language and are immediately available for calling or extension.

So, what is cool about Light Table?

It is a web application built on Clojure and ClojureScript - the fattest buzzwords of the moment. So, everything seems to be utilized - a JVM with hundreds of jars of dependencies, which eats gigabytes of RAM and whole CPUs.

Here comes the important point to understand. This is a re-implementation of ideas which were successfully implemented decades ago, a hundred times more efficiently, on incomparably modest and very limited hardware, using much less code, which was cleaner and more readable.

This is the very definition of the crisis/bubble of the mediocre in IT. (Think of millions of Homer Simpsons with some Java/Javascript coding experience hunting for better jobs.)

The second thing to notice: all the most-hyped recent developments, such as Clojure and CoffeeScript, are attempts to somehow address the shortcomings of Java and Javascript or, to name things as they are, their deep flaws.

The idea of using a Lisp-like syntax and macros along with data structures and control abstractions based on java classes is, of course, a good one, but what does it tell you about the bloat and meaningless verbosity of the Java language? The causes of CoffeeScript are much the same.

Now here is the question: why does YC give money to this? Because they invest in people, not in software or even technology. They're trying to catch that special kind of person - producers - focused introverts, natural engineers, those who favor this activity over anything else in life (for the moment). This is the only way to make millions - using such people.

So, that is how it works.

Reducers? Oh, come on..

There is a disturbance in the force, caused by an unprecedented paradigm-shifting event - a blog post by Rich Hickey, the greatest JVM language developer:

http://clojure.com/blog/2012/05/08/reducers-a-library-and-model-for-collection-processing.html

Amazingly, reducers were not invented by this great mind; they have been around for quite a while. There is a paper, which won several awards, published in 2009, about how to use the concept of reducers to deal with idiots' code (code that uses global variables) and make it somehow run in parallel with minimal modifications (they wrap those variables into another ADT and implement their own green threads and scheduling; all together it is called Cilk, from MIT).

But it seems like an adaptation of an old idea from the functional programming world (in which MIT is years ahead of anyone else) to address an actual problem faced by industry - how to run idiots' code on multicore with minimal changes. Btw, Intel has already acquired Cilk and is now developing and pushing its own implementation (Cilk Plus). They even created a new branch of GCC 4.7.
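
The core of the borrowed idea is a plain fold (reduce), standard in functional programming for decades - a minimal Scheme sketch:

;; reduce: a left fold over a list
(define (reduce f init lst)
  (if (null? lst)
      init
      (reduce f (f init (car lst)) (cdr lst))))

(reduce + 0 '(1 2 3 4 5))   ; => 15
;; when f is associative, the work can be split and run in parallel -
;; which is all the "reducers" machinery really exploits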

So, now we can see some connections. Some smarties at MIT got grants, adopted some proven concepts, and re-implemented them "in a modern way" - using STL.))

And now the greatest inventor of programming languages has, after just 3 years, found a novel way to back-port the concept to.. what he calls his Lisp-1, and the crowd froze in admiration. Well, we are enjoying it too.

Innovators, Imitators, Idiots.

Here is the famous trinity, the triple I - Innovators, Imitators, Idiots. This is what happens to any market, any paradigm, any idea that becomes mainstream.

Long ago, some talented people invented object-oriented programming and implemented it in a language called Smalltalk. But this is not the whole story. The second part is that objects in Smalltalk communicate with each other using message passing, which means they are actors.

Then other engineers designed and implemented an object-oriented layer for Common Lisp. It is called the Common Lisp Object System, or CLOS. The second part is that objects communicate with each other following the Meta-Object Protocol, or MOP.

Then the idiots came and started to shout on every corner that everything must be an object and that the object-oriented paradigm is superior, simply the best. They gave primitive examples, which other idiots were able to grasp, and, due to the peer effect, it became common sense.

The rise of J2EE and the XML madness, the STL-everywhere nonsense (there is no .cpp file without the words template <class T>), 'my website must be a hierarchy of objects instead of mere text files', and so on - we have all seen it.

Now we are in the bubble, but it isn't a financial one like over-priced housing (well, all those framework copy-paste drones are grossly overpriced); it is a bubble of idiots, the third and final phase before a collapse.

The same thing is going on in finance right now (take a seat and watch the collapse) - some smarties innovated CDS and other instruments and products. Then the idiots flew in, and now we have what we have. Everyone has borrowed from everyone else, bought products they don't understand, and made wrong, unhedged bets whose logic they are unable to grasp. Quite standard behavior among idiots.

Now what? Well, back to the basics, as usual. Back to school. Algorithms and Data Structures - but without the words Java or C++ or, God forbid, Javascript - is what programming is all about. Algorithms and Data Structures are (surprise! surprise!) language-independent. (In some cases they are machine-dependent, but that isn't difficult at all.)

So, time to wipe the dust from HtDP, TAOCP, SICP, AIMA, PAIP and Introduction to Algorithms (very useful for keeping a window from closing) and learn what programming really is. Hint: good coders code, good engineers reduce complexity. It is like in high school - simplify.)

Innovators make everything simpler; idiots make it more complicated.

The foundation of Lisp.

Before Computer Science (which is not a science, and not about computers) became mainstream, crowded with punks and the mediocre (the bell curve, you know), the field consisted mostly of provenly brilliant people, so their ideas and decisions were based on deep reasoning rather than the snap judgments of a typical consumer. They were standing on the shoulders of Titans.

Let's consider a very few ideas which form the foundation of the Lisp language, as it was defined by sir John. Ideas from:

  • Set Theory
    • Sets are collections of objects. Any type of object can be collected into a set.
    • Ordered Pairs. The entries of an ordered pair can be other ordered pairs, enabling the recursive definition of ordered lists.
    • Binary Relations are defined as collections of ordered pairs - a relation is itself a set of pairs.
  • Graph Theory
    • Graphs are structures used to model pairwise relations between objects from a certain collection.
  • Lambda Calculus
    • anonymous functions (a way of defining a function without giving it a name)
    • higher-order functions (a function that takes a function as an argument and/or returns one - see the sketch after these lists)
    • currying (the transformation of a function of multiple arguments into a chain of function applications, each with a single argument)
    • variable binding and scope (a lambda expression binds its variables; a bound variable has the lambda expression as its scope)
    • the substitution model (recursively replacing all occurrences of a variable with its value)
    • combinators (an expression that contains no free variables is said to be closed; closed lambda expressions are also known as combinators)

Together, these give you the following conclusions:

  • Almost anything can be defined as (or reduced to) a structure made out of pairs. Cons is the basic building block of symbolic expressions.
  • Almost anything can be expressed as a list of procedure applications, using the substitution model.
  • Almost anything in the real world can be modeled as a graph.
  • For special cases (such as I/O events) we will use special forms.
  • In the Untyped Lambda Calculus function application has almost no restrictions. For those who can't live without restrictions there is Haskell; we are free to shoot ourselves in the foot.
  • Standard mathematical notation is not suited for a symbolic language. For those who got stuck with Math notation there is Python3.
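
A minimal Scheme sketch of a few of these lambda calculus ideas (the names are illustrative):

;; an anonymous function, a higher-order function, currying by hand
(define (adder n)              ; a higher-order function...
  (lambda (m) (+ n m)))        ; ...returning an anonymous function that binds n

(define add3 (adder 3))        ; partial application
(add3 4)                       ; => 7
((adder 3) 4)                  ; => 7 - a chain of single-argument applications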

Why Lisp.

Every self-respecting blogger should write about Lisp at least once. So, here is my take.

There is a nice quote, I don't remember by whom: all models are wrong, but some are useful. They are over-simplifications, but they help us think more clearly. Let's make one simple model.

What is an idea? The word "idea" is the name for some representation I hold in my mind. It is linked with a cloud of other words, with some visualizations, some memorized sentences and some non-verbal feelings (just different types of storage with different encodings). When I hear or see the word "idea", this is the meaning I have.

The meaning is an inner representation, encoded as a compound expression, a composite object of the mind. The word is just one of many symbolic pointers, a tag, a label, an anchor for this mental object.

There are several such meanings for each word. They can be fetched from storage depending on the context - the current state of the mind, the current mental environment. It is like using different dictionaries to look up the meaning of a word.

A word points to a slot in a look-up table - the context. The context provides the address in storage for each slot.

It is a huge over-simplification, but a very useful one.

For example, what is recursion? It is a word of the English language, which is the name for the idea of recursion. Each person on earth has his own inner representation of what it is for him. Well, some people have none. The word has no meaning for them; it is not associated with any inner representation. They have no idea.

Depending on the context, it could be a mathematical definition, or an idea from computer science, or even some feeling of being puzzled.

So, what is a word? It is a symbol which points to a chain of associations - its meaning. What is a context? It is the look-up table in between, filled with pointers to the meanings of symbols. The storage remains unchanged; it is the contexts that change.

So, the same construction of the language, describing a person, can have two completely different meanings, depending on the context currently present in my mind:

((The young girl) I have seen yesterday).

Depending on context, it could be:

  • a cute high-school gal I saw on the street yesterday morning.
  • my girlfriend, ten years younger, with whom I spent Saturday night.

Now, what is Lisp?

Lisp is a language for representing any knowledge stored in our mind according to this simple model - the chains (actually graphs) of named and unnamed inner representations, verbal and non-verbal.

It uses the list structure to encode graphs of associations as sequences. It is a way to represent multi-dimensional, hierarchical data as a one-dimensional ordered sequence.

Any kind of object can be made out of lists. In particular, there is a special kind of object, called a closure, which represents a piece of encoded knowledge - a procedure along with its data and its context: the nested environments where the meaning of all its symbols can be found.

Closures are a kind of inner representation of pieces of knowledge inside a Lisp process. They can be manipulated by procedures as data objects, or can be parts of any kind of Lisp expression.
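
A minimal sketch of such an encoded piece of knowledge - a procedure carrying its own private environment (the names are illustrative):

;; a closure: a procedure together with the environment it was created in
(define (make-counter)
  (let ((count 0))             ; private state, invisible from outside
    (lambda ()
      (set! count (+ count 1))
      count)))

(define tick (make-counter))   ; tick carries its own context
(tick)                         ; => 1
(tick)                         ; => 2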

This provides almost unlimited power to express any idea, comparable to a natural language. There is no distinction between code and data, between nouns and verbs. One can mix everything to produce any kind of expression one wishes.

How it works..)

some smart-ass (A) meets some bankster (B)..

A: Look, I've heard you are looking for an opportunity to invest..

B: Forget about it!

A: Listen, I have a very good idea - software for Cloud Computing! (Serious people cannot cope with this punk's Linux OS - all those text files and that bloody command line! In 2010! People need some serious enterprise tools with an industry-strength interface, message-passing middleware, a database back-end.) There is a huge market for it.

B: Um.. Cloud Computing (I recall John told me something about clouds at that cocktail party). But there are others doing it, aren't there? How could we sell it?

A: Easy, we will make it open source and suckers will download and install it themselves!

B: Well, do you know anything about making software?

A: Forget about it, we will use Java and a proven top-down process, like everyone else!

B: Java.. Sounds good. (How do I know? Everyone says so. It is the industry standard, you know; everyone is using it.)

A: So, we need a headquarters in the Valley, and a coding team in Eastern Europe, and a Q/A team in Asia, and Sales and Marketing here. (I will hire remote coders instead and keep all the money..)

B: Yeah, yeah. I know how to run a joint..

A: I will watch over all this as CEO (and offload the whole development process and all responsibility onto the VPs).

B: But what if something goes wrong?

A: C'mon, I will send you monthly reports with all the numbers and plots - team sizes, lines of code, man-hours, quality-control results. You know, the more lines of Java code you have, the more money it is worth; the more man-hours spent on it, the more valuable the product; the more coders you hire, the faster it grows. You will see our growth on a daily basis!

B: Um, well, $10 million for the first stage..

A: Yeah, 20 - we need to hire the teams and stuff.

B: Yeah, I know!

A: When shall we start?

Something is very wrong here.

Something is very wrong when someone spends time and earns money typing such piles of nonsense.. GetContext, GetInstance, Target, Injection? Why do we need all those words (and Java! OS management from the JVM! Btw, that is not cool anymore! What's really cool is OS management from NodeJS written in CoffeeScript! Look, you can run it on WebOS!) as an alternative to rm -f /etc/nginx/sites-enabled/default? I know the answer - because someone pays for it. And this is exactly what is very wrong.

This very idea of trying to manage a UNIX-like operating system - which is founded on the concepts of text streams and pipelines - from the JVM, using all that Enterprise stuff, is so deeply wrong that I cannot find an appropriate description for it. It isn't stupidity, it isn't ignorance; it is something like abandoning reason completely.

Well, some YAML-based configuration, simple templates and some lightweight, self-evident non-OO Python3 code, or even Scsh, would all be fine. In Scsh (the Scheme shell) the whole "bootstrap" class below would be a single self-evident line (a minimal sketch):
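
;; Scsh process notation: run the command, get back its exit status
(run (rm -f /etc/nginx/sites-enabled/default))

But this..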

services/service-nginx/src/main/java/org/openstack/service/nginx/ops/NginxServerBootstrap.java

package org.openstack.service.nginx.ops;

import java.io.File;

import org.platformlayer.ops.Handler;
import org.platformlayer.ops.Injection;
import org.platformlayer.ops.OpsContext;
import org.platformlayer.ops.OpsException;
import org.platformlayer.ops.OpsTarget;

public class NginxServerBootstrap {

    @Handler
    public void handler() throws OpsException {
        OpsTarget target = OpsContext.get().getInstance(OpsTarget.class);

        target.rm(new File("/etc/nginx/sites-enabled/default"));
    }

    public static NginxServerBootstrap build() {
        return Injection.getInstance(NginxServerBootstrap.class);
    }

}

Look what they do with hostname:

package org.openstack.service.nginx.model;

import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlRootElement;

import org.openstack.service.nginx.ops.NginxBackendController;
import org.platformlayer.core.model.ItemBase;
import org.platformlayer.core.model.PlatformLayerKey;
import org.platformlayer.xaas.Controller;

@XmlAccessorType(XmlAccessType.FIELD)
@XmlRootElement
@Controller(NginxBackendController.class)
public class NginxBackend extends ItemBase {
    public String hostname;
    public PlatformLayerKey backend;
}

This is what the crisis in IT looks like. The replacement of the classic SysV start-up scripts with Java-based crap in Solaris is the same madness.

What is Engineering

We are in the middle of yet another bubble in IT - the bubble of the mediocre coder.

We were told in high school that pointers are evil, that memory allocation is difficult, that resource management is boring, so we should buy all that JVM-based crap (now it is Javascript or Rails - it doesn't matter).

Coders talk about their tools and their wage labor. They neglect ideas, ignoring underlying principles and hardware platforms. They are sure the JVM (or V8) will do everything for them.

The art of crafting simple and efficient software was lost in layers upon layers of useless abstractions, which coders pile up without understanding or even thinking, to collect wages based on lines of code or hours of coding.

This is where we are: in a bubble of wrong tools, bureaucratic processes and mediocre coders.

On the other side, the essence of software engineering is to apply ideas and principles exactly at the intersection of hardware and software.

It is the same as body and mind - the body cannot function without the mind, and the mind cannot exist without the body. Neither can be neglected or ignored. Everybody knows what happens if one is.

Engineering is the art of managing complexity, and the joy of producing simple, efficient, good-enough solutions - the way Nature does.

Why not Clojure

#summary Clojure isn't a Lisp.

What's wrong with Clojure?

Clojure is not Lisp. It is a looking-like-Lisp language that compiles into Java bytecode, with some features of a Lisp, such as prefix notation, a reader and even macros.

Here is one of the very essential abilities of a Lisp as a language. This is a citation from the original John McCarthy paper:

d. Recursive Function Definitions. By using conditional expressions we can, without circularity, define functions by formulas in which the defined function occurs.

For example, we write n! = (n = 0 → 1, T → n · (n − 1)!)

which means exactly this:

(define (factorial n)
   (if (= n 0)
      1
      (* n (factorial (- n 1)))))

But what we got in idiomatic Clojure is this imperative nonsense:

(def factorial
  (fn [n]
    (loop [cnt n acc 1]
       (if (zero? cnt)
            acc
          (recur (dec cnt) (* acc cnt))))))

So, idiomatically, one does not define the function by straightforward self-reference; one uses imperative looping instead. That is not the Lisp way.

The lack of tail-call optimization itself is not the issue; the point is that using a loop macro borrowed from Common Lisp, with this recur hack, doesn't make it any more Lispish. We cannot express the idea clearly; we are forced into useless state variables.
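
For contrast, the same accumulator idea in Scheme remains an ordinary recursive definition - a named let - with no special recur form, because proper tail calls are guaranteed (a minimal sketch):

;; an iterative process, still written as recursion
(define (factorial n)
  (let loop ((cnt n) (acc 1))
    (if (zero? cnt)
        acc
        (loop (- cnt 1) (* acc cnt)))))

(factorial 5)   ; => 120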