X-Message-Number: 11718
From: Thomas Donaldson <>
Subject: to Daniel Crevier
Date: Mon, 10 May 1999 23:46:52 +1000 (EST)

For Daniel Crevier:

First, apologies for getting your name wrong in my subject line and not
correcting it.

As for consciousness, we are not doing philosophy, we are doing science.
Basically I agree with Ettinger here: it's quite clear that the ability
to reason about one's thoughts is not necessary for consciousness. It's
not even obvious that such an ability implies consciousness, unless we're
extremely careful about our definitions. After all, the PC on your desk
can be considered as reasoning about its thoughts, too --- and if you
think it is conscious, then your notion of consciousness is so broad
that discussion becomes impossible.

I also notice that your idea of consciousness consists only of the response
of a brain to events inside it. That idea suggests that with no sensory
input at all we might still be conscious so long as we knew what we
were thinking. That seems to be false, as a matter of fact: if we're
deprived of sensory stimulus, we go to sleep. Not only that, but it
looks to me like an attempt to assimilate our thinking to that of a
computer --- not something which is obviously worthwhile.

And if you believe that a sequential computer might emulate a human
brain (or even the brain of an octopus) then you need to think on that
problem a good deal more. We're going to need a lot of parallelism,
and no sequential computer, even one at the farthest reach of anyone's
imagination, could work fast enough to emulate 1 billion neurons. But
I may be attributing a belief to you which you do not have --- if so
I apologize.
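To give a rough sense of the scale involved, here is a back-of-envelope arithmetic sketch. The neuron count comes from the paragraph above; the connectivity and update-rate figures are assumptions chosen purely for illustration, not claims from this message.

```python
# Back-of-envelope estimate of the sequential workload of brain
# emulation. All parameter values below except the neuron count
# are illustrative assumptions.
neurons = 1_000_000_000        # 1 billion neurons, as in the text above
synapses_per_neuron = 1_000    # assumed average connectivity
updates_per_second = 1_000     # assumed ~1 kHz update rate per synapse

ops_per_second = neurons * synapses_per_neuron * updates_per_second
print(f"Synaptic operations required per second: {ops_per_second:.1e}")
```

Under these assumptions a single sequential processor would have to perform on the order of 10^15 synaptic updates every second, which is the kind of gap that massive parallelism is meant to close.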

			Best and long long life,

				Thomas Donaldson
