X-Message-Number: 8516
Date: Mon, 01 Sep 1997 11:20:43 -0400
From: "John P. Pietrzak" <>
Subject: Re: Digital Shakespeare
References: <>

>         >The _only_ thing a computer does is flip switches.
> 
> OK so it's digital, that doesn't mean something profound, beautiful,
> even poetic isn't happening.

Ah, but it does mean something profound, beautiful, even poetic is
happening.  It means that a device so simple that all it does is flip
the state of a series of switches, over and over and over again, is
nevertheless all you need to perform mathematics, display graphics,
and interact with other similar systems over the internet.  It's
wonderful that such a simple device can do so much.
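To make the switch-flipping point concrete, here's a small sketch of
mine (not something from the message above): even ordinary addition
reduces to nothing but AND, XOR, and shift operations on bits, i.e.
to flipping switches:

```python
def add(a, b):
    """Add two non-negative integers using only bitwise 'switch' operations."""
    while b:
        carry = a & b    # positions where both switches are on
        a = a ^ b        # sum of the bits, ignoring carries
        b = carry << 1   # carries move one position to the left
    return a

print(add(19, 23))  # 42
```

Every arithmetic operation a computer performs bottoms out in gates
like these.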

> English is digital too, base 26, to say that the _only_ thing
> Shakespeare did was put down letters in a particular sequence
> is true in a way but sort of misses the point.

Language is a fascinating subject in itself; the various sounds/letters
are certainly discrete, and the structures made out of those units
(words, phrases, sentences, etc.) are very regular and recognizable,
but finding the meaning associated with those structures is often
maddeningly hard.  (At least when you're trying to get a computer to
understand natural language.)  The point is, English is a (digital)
way of representing something which may or may not itself be digital.

> The Genetic Code is also digital, base 4, and it came up with some
> things that were rather interesting, things like me and you.

On DNA, I'm no biologist, but as I understand it, the DNA itself does
nothing; rather, it is "read" by other cellular structures, which use
the information stored there to decide what to do next.  To my mind,
this makes it the closest thing to a computer program in the natural
world.  In fact, "genetic algorithms" are a well-understood idea at
this point; the "virtual pets" springing up all over the place in the
last few years are one result.  Still, getting a genetic algorithm to
evolve the ability to interact with the real world runs into the same
problem that real-world genetics does: it takes lots and lots of time
to get anywhere.
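For anyone who hasn't seen one, here is a toy genetic algorithm (my
own illustrative sketch, with an arbitrary fitness function): a
population of bit strings evolves, by selection and mutation, toward
the all-ones string.  Even on this trivial problem it takes many
generations, which is the "lots of time" point in miniature:

```python
import random

random.seed(1)
TARGET_LEN = 32

def fitness(genome):
    # count of "correct" bits; the all-ones string is the optimum
    return sum(genome)

def mutate(genome, rate=0.02):
    # flip each bit with a small probability
    return [bit ^ (random.random() < rate) for bit in genome]

# random starting population of 20 genomes
population = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
              for _ in range(20)]
generations = 0
while max(fitness(g) for g in population) < TARGET_LEN:
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                    # keep the fittest half
    population = survivors + [mutate(g) for g in survivors]
    generations += 1

print(generations)  # many generations, even for this toy problem
```

Scale the genome up from 32 bits to something with real-world
behavior, and the time cost grows accordingly.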

>     >Given sufficient time, any human can perform exactly the same
>     >program that any computer does
> 
> And a monkey could write Shakespeare.

It's not exactly the same point.  The argument that a monkey randomly
punching keys on a typewriter, given infinite time, would eventually
(and, in fact, always) manage to come up with the complete works of
Shakespeare is more a comment on infinite time than on computability.
(If that is the example you were pointing to.)  However, there is a
large class of algorithms that can be proven to complete in finite
time.  So long as the human can stick around to perform as many steps
as the computer does, both will finish the program in finite time,
which is a much stronger argument.
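A classic member of that class is Euclid's gcd algorithm: it provably
terminates because the remainder strictly shrinks at every step, so a
patient human with pencil and paper could carry out exactly the same
steps the machine does.  A small sketch, with the steps counted:

```python
def gcd_with_steps(a, b):
    """Euclid's algorithm: terminates because the remainder strictly decreases."""
    steps = 0
    while b != 0:
        a, b = b, a % b   # one mechanical step a human could do by hand
        steps += 1
    return a, steps

print(gcd_with_steps(1071, 462))  # (21, 3): three steps, then done
```

Three divisions with remainder, something a person could do in under
a minute, and the human and the computer provably finish together.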

> Actually, a human could not execute any given computer program
> because he doesn't have enough memory, he'd need an external
> mechanism to store information in a digital mode, like a pencil and
> paper.

The processor in my computer (here at work) has an internal cache of
(I think) 256k.  When that is filled up, the computer resorts to an
external cache (L2 cache) of 256k.  When that is filled up, it stores
data in the main banks of RAM (64 megs in this case).  When that is
filled up, the hard drive is used (2 gigabytes here).  When that is
used up, things get dumped to tape.

Humans and computers are equivalent in their need for "external"
storage.  (Right now, I believe humans have a distinct advantage in
the storage capacity, power consumption, and portability of their
internal memory store. :) )

> All of computer science is AI in that if a machine was not performing
> the task a human intelligence would be, or be trying to.

It would be more appropriate to say that all of computer science is
the design of state machines; whether those machines actually capture
what we mean by "intelligence" remains to be determined.  Sure, they
can play chess, they can do some forms of math, they can even drive
cars (a little bit) nowadays.  There are still a lot of tasks humans
perform that computers have trouble with, though.  (I personally
believe that a state machine will eventually be human-equivalent, but
I can't prove it.)
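To show what "state machine" means in the small, here is a sketch of
mine: a two-state machine that reads a binary string and accepts it
exactly when it contains an even number of 1s.  All the "intelligence"
lives in a lookup table of transitions:

```python
def accepts_even_ones(bits):
    """A two-state machine: the state tracks the parity of 1s seen so far."""
    state = "even"                      # start state, also the accepting state
    transitions = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd",  "0"): "odd",  ("odd",  "1"): "even",
    }
    for symbol in bits:
        state = transitions[(state, symbol)]
    return state == "even"

print(accepts_even_ones("1001"))  # True  (two 1s)
print(accepts_even_ones("111"))   # False (three 1s)
```

A chess program or a car-driving system is, in principle, this same
construction with vastly more states and a vastly larger table.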

> Light moves at 300,000,000 meters a second, signals in the brain
> move at 100, lots of room for improvement.

Maybe.  Of course, you're still talking digital signal here.  What if
it's an analog computation?  (I.e., the length of time the signal
takes, or the resistance it encounters, or the effects it has on the
medium through which it passes affects the result of the computation.)
It could be that much more than a single bit of data is being passed
in each signal, or even that a computation is ongoing _during_ the
movement of the signal.  Just something to think about.


John
