X-Message-Number: 11731
Date: Tue, 11 May 1999 18:41:58 -0400
From: Daniel Crevier <>
Subject: consciousness

To Robert Ettinger. You wrote:

>No, the essence of consciousness is not in representation or in 
>reasoning --it is in feeling, qualia, subjectivity. A dog cannot 
>reason about its mental states, but it is certainly conscious.

How do you know that? Did a dog ever tell you that it was conscious?
We need to make a distinction here between awareness and consciousness:
A dog is aware of its environment, but it cannot say "I think therefore
I am". Consciousness is awareness of self. Philosophers are not the only
ones to make this distinction. So do legislators, which is why dogs do 
not have the same civil rights as we do.

Here is another example: if hungry and presented with juicy dog food, a 
dog will just go ahead and eat it. We, on the other hand, if hungry and 
facing junk food, sometimes say "Hold on, is eating this good for me 
(ME!) in the long run?" It is this distancing from reality, and the 
referencing of it to an abstract representation called the self, that is 
the essence of consciousness (and of morality, I should add).

>Conversation is not the criterion, and passing the Turing Test is 
>neither necessary nor sufficient. 

Agreed: many programs that make conversation have no representation of 
self, and cannot be conscious. As an alternative to the Turing test, I 
would suggest examining the innards of a program for such a 
representation. I doubt, however, that a program could make 
*intelligent* conversation without a thorough understanding of the 
meaning of the pronouns "you" and "me".
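
To make that concrete, here is a minimal sketch in Python of what such 
an examination might look for. Everything here is hypothetical and 
named only for illustration; it is not a test for consciousness, just 
the difference between a program that merely reacts and one that 
carries an explicit representation of itself:

    class ReactiveAgent:
        """Eats whatever it is offered -- like the hungry dog."""
        def decide(self, food):
            return "eat"

    class SelfModelingAgent:
        """Consults an explicit model of itself before acting."""
        def __init__(self):
            # The self-model: facts the agent holds about *itself*.
            self.self_model = {"goal": "stay healthy", "is_hungry": True}

        def decide(self, food):
            # The stimulus alone says "eat"; the self-model can
            # overrule it: "is eating this good for ME in the long run?"
            if self.self_model["is_hungry"] and food["healthy"]:
                return "eat"
            return "skip"

    # An examiner inspecting the innards could look for such a structure:
    print(hasattr(ReactiveAgent(), "self_model"))       # False
    print(hasattr(SelfModelingAgent(), "self_model"))   # True
    print(SelfModelingAgent().decide({"healthy": False}))  # "skip"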

To Thomas Donaldson: you make a similar mistake when you say
>the PC on your 
>desk can be considered as reasoning about its thoughts, too --- and if 
>you think it is conscious, then your notion of consciousness is so 
>broad that discussion becomes impossible.

The PC does not have a representation of itself, nor does it have the 
broad common-sense knowledge base or the inference mechanisms needed to 
reason about itself and its position in the world. So it does not 
qualify as conscious.

>...your idea of consciousness consists only of response
>of a brain to events inside it. That idea suggests that with no sensory
>input at all we might still be conscious so long as we knew what we
>were thinking. That seems to be false, as a matter of fact. If we're
>deprived of sensory stimulus, we go to sleep.

Saying "I think therefore I am" does not require any sensory inputs.
I'll agree that the absence of such input becomes rather dull after a 
while, 
but this has to do with the way evolution wired us, not with the nature 
of consciousness.

>And if you believe that a sequential computer might emulate a human
>brain (or even the brain of an octopus) then you need to think on that
>problem a good deal more.

Frankly, I have no opinion on that.
A sequential computer can simulate any parallel machine, so there is no
fundamental distinction between the two, except for speed. But this is a 
relatively minor point. Why do you insist on it so much?
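
The standard argument, in sketch form: a single sequential processor 
can reproduce the behavior of N parallel units by interleaving their 
steps round-robin, at the cost of running roughly N times slower. A 
minimal Python illustration (the "unit" here is just a counter, and 
the step rule is a placeholder for any local update rule):

    def parallel_step(state):
        """One step of a single unit; stands in for any update rule."""
        return state + 1

    def simulate(num_units, num_steps):
        states = [0] * num_units
        for _ in range(num_steps):
            # What N processors would do at once, done one after another.
            # Snapshot first so every unit sees the same previous cycle,
            # as in a synchronous parallel machine.
            snapshot = list(states)
            states = [parallel_step(s) for s in snapshot]
        return states

    print(simulate(num_units=4, num_steps=3))  # [3, 3, 3, 3]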
