
Ounce Of Mathematics

Prof. Brian Harvey used to say, "an ounce of mathematics is worth a pound of computer science".

Here is why. In Peter Norvig's famous course (on Udacity) there is a small sub-task: write a tiny predicate procedure which returns True if all cards in a hand are of the same suit.

Like a good CS61A student, I wrote this pile of functional-style code:

from functools import reduce   # a builtin in Python 2, needs this import in Python 3

def flush(hand):
    suits = [s for r, s in hand]
    # pair each suit with the next one, compare the pairs, then fold the Booleans with "and"
    return reduce(lambda x, y: x and y,
                  map(lambda a: a[0] == a[1],
                      zip(suits, suits[1:])))

What could possibly go wrong? You could visualize how beautifully the list zips with itself (strictly via references, no data copied!) and how a new list of Booleans is produced, then folded into a single value. You could see the flow and all the wonders.
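
For the curious, here is a rough trace of that flow; the sample hand and the two-character rank-plus-suit card strings are my own illustration, in the style the course uses:

from functools import reduce

# a hypothetical flush hand, each card a "rank plus suit" string
hand = ['6C', '7C', '8C', '9C', 'TC']
suits = [s for r, s in hand]                        # ['C', 'C', 'C', 'C', 'C']
pairs = list(zip(suits, suits[1:]))                 # [('C', 'C'), ('C', 'C'), ('C', 'C'), ('C', 'C')]
bools = list(map(lambda a: a[0] == a[1], pairs))    # [True, True, True, True]
print(reduce(lambda x, y: x and y, bools))          # True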

Here is Norvig's snippet:

def flush(hand):
    suits = [s for r,s in hand]
    return len(set(suits)) == 1

What the ...? Well, the set-from-a-list constructor removes all duplicates (that is the nature of a mathematical set), so if the length of the resulting set is 1, then all elements of the list were the same value.
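
A quick sanity check on two made-up hands (this snippet is mine, not Norvig's; it assumes his flush above and the same card encoding as before):

flush_hand = ['6C', '7C', '8C', '9C', 'TC']   # all clubs
mixed_hand = ['6C', '7D', '8C', '9C', 'TC']   # one diamond spoils it

print(flush(flush_hand))   # True  - the set of suits collapses to {'C'}
print(flush(mixed_hand))   # False - the set of suits is {'C', 'D'}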

But, but we know how to build a set ADT out of conses and to write short-circuiting and, or, and every? for sequences. :)
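
Incidentally, Python already ships the short-circuiting every? we would have hand-rolled: the built-in all(). A sketch of that variant (my addition, not from Norvig's course):

def flush(hand):
    suits = [s for r, s in hand]
    # all() stops at the first suit that differs from the first one - a short-circuiting "every?"
    return all(s == suits[0] for s in suits)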

How The Mind Works

If we think in terms of "storage" or "representation" in the context of our minds or AI, we think about it wrong, because we are trying to use concepts from CS - memory "cells" or "slots". There is nothing like that in a brain, of course.

The proper way of thinking is that our "memory" is "structured" instead of "stored".

This explains how each access "restructures" (re-builds) our "understanding", and why after a very short time we are unable to "recall" any details of our "previous understanding".

A good metaphor from computers is that our "knowledge" has to be "periodically refreshed" the way a charge in DRAM chips must be. Once we stop re-generating it, it is gone forever.

So, refreshing, re-constructing, "re-building", re-writing. And pattern-matching.

What I am trying to say is that there is no "universal encoding", like on a hard drive; there is not even permanent storage. We do not store anything "verbatim".

It is rather like this: we have seen some face a few times. There is no detailed "picture" of the face stored in our mind. When we see the same or a similar face, we perform some sort of pattern-matching on key features - eyes, mouth, nose, shape - and then we "know" that we have seen it before, and then "prime" all the associations. But there are no "images from the retina" stored in "files". No hash-tags.

It is like "which structure of neurons lights up when I get this sensory input" pattern-matching. "Analog", not "digital".