X-Message-Number: 11792
Date: Fri, 21 May 1999 10:53:10 -0400
From: Daniel Crevier <>
Subject: zombies and consciousness

To Robert Ettinger. You wrote:

>[Daniel Crevier] is confused by mixing up two separate 
>questions--consciousness in computers on the one hand, and on the 
>other hand comparisons of living organisms respectively with and
>without consciousness.

>As I have previously discussed, a computer (if fast enough) could (in
>principle) direct a robot to behave as if it were conscious, resulting
>in a Zombie; but the computer would still not be conscious. 

>A naturally evolved, living organism would not behave the same way 
>with or without consciousness.

In order to say this, you must already have resolved the very point
we are arguing: you have decided that consciousness in computers and
consciousness in humans are completely different matters. I am saying
that they are fundamentally the same, because there cannot be one kind
of mechanism (the A's) that makes physical systems merely behave as if
they were conscious, and another kind (the B's) that makes them really
conscious. If there were, the A's, which would do less, would
presumably be simpler. And since nature prefers simple solutions, we
would then be A's. But we are not, since we really are conscious.

I agree with the rest of your message, which describes the survival
advantages of consciousness. But this is orthogonal to my argument.

To Chris Fideli: Your explanation of consciousness as a mechanism
involving two monitoring brain organs makes sense. However, we seem
to have different definitions of zombies. Your zombie is a being with
only the first brain organ: it is not conscious, and its behavior
differs from that of a conscious being. My zombie has, by definition,
behavior indistinguishable from that of a truly conscious being. I am
arguing that such a zombie is impossible. In terms of your theory, I
am saying that a being that seems conscious in all respects must have
both brain organs, and therefore must be truly conscious.

Brook Norton writes:

>Biological brains require consciousness as part of the data processing
>engine.  An emulation of a brain uses various circuitry to achieve the
>same data processing without the need for consciousness.

My whole point is that you cannot think of consciousness as an extra
ingredient that could replace computational processes. Consciousness
is the result of computational processes. As Chris Fideli's posting
shows, we are starting to have an idea of what these processes might
be.

Daniel Crevier
