X-Message-Number: 8139
Date: Fri, 25 Apr 1997 12:15:07 -0700 (PDT)
From: Olaf Henny <>
Subject: CRYONICS Message #8133 from: Charles Platt; Consciousness

Re: Message #8133 from: Charles Platt
Subject: CRYONICS Consciousness
>
>On Thu, 24 Apr 1997, John K. Clark asked:
>
>> I've asked this question before but never got an answer, Do you think I'm
>> conscious?                                       
>
>Hey, I'm still waiting for an answer to the question I asked, weeks ago:
>Does my cat have a self-circuit?

While we are all pouting, let me register my gripe too: I never 
received any applause for my assertion that the ant I 
successfully chased down a few weeks ago had a consciousness. ;-)

Here are some of my thoughts on consciousness:

-  In order to be conscious, an entity must have a minimal 
data-processing capacity.

-  "Entity" here includes humans, animals, plants, and 
constructs, from sophisticated robots down to simple hand 
calculators.

-  I reject the idea that a thermostat has a consciousness, just 
as I reject the idea that a brick, which drops when I let go of 
it, has one.  Not because I can prove that these items lack 
consciousness, but simply to limit the parameters of this 
discussion.  Otherwise it could be expanded to include the clay 
and the sand in the brick, right down to a single atom.  I 
remember somebody in this forum theorizing earlier that an atom 
does not have a consciousness (but that consciousness evolved at 
some stage as atoms were assembled into a human being).  Since I 
did not see any objections, I assume we can agree on that.

-  I would accordingly submit that our discussion be limited to 
entities which are capable of accepting and *actively* reacting 
to data.  (A thermostat *is* a datum: when the little metal rod 
expands, or contracts, to a certain point, a switch is thrown.  
No other information, however limited, can be stored in it.)

That means that, for the purpose of this discussion, the entity 
must have at least a neuron or two, or otherwise the means to 
receive and *actively* react to outside information.  A toy 
sketch of how little a thermostat can do follows below.
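To make the point concrete, here is a minimal sketch, purely 
illustrative and in Python (my choice of notation, not anything 
from the original discussion), of a thermostat reduced to its 
essentials.  It embodies nothing but its set point; the 
temperature reading is the single datum, and the only reaction 
is a thrown switch:

    # Illustrative sketch only: a thermostat as a bare threshold switch.
    # It stores no information beyond its fixed set point and keeps no
    # memory of past readings -- the reading itself is the single datum.

    SET_POINT_C = 20.0  # the one value the device embodies

    def thermostat(temperature_c: float) -> bool:
        """True = switch thrown (heater on); False = switch open."""
        return temperature_c < SET_POINT_C

There is no state here to carry information from one moment to 
the next, which is exactly the sense in which no other 
information, however limited, can be stored in it.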

A brick would *passively* react to sun shining on it by getting 
warm.  A plant would *actively* react to sunshine by turning its 
leaves into a position where it receives either more or less of 
it, to satisfy its requirements.  Do plants have a 
consciousness?  Many of us think they do.

In my opinion, one of the most basic measures of self-
consciousness is an *inherent* desire for self-preservation.  I 
therefore submit: if an entity is not conscious of a *self*, it 
will not have the impetus necessary to protect that *self*.

To get back to CryoNet Message #8133: my legendary ant, which 
took evasive action when I tried to catch it; Charles Platt's 
cat, which I trust will head for a tree when chased by a pit 
bull; and even John K. Clark, provided he tries to get out of 
the way when a truck without brakes bears down on him [;)], can 
all be presumed to be conscious of a danger to their *self*, and 
must accordingly be conscious of that self, i.e., possess 
consciousness.

>How can I be sure that the ape is not just pretending to be
>conscious? And how can I be sure that the ape is not a robot cunningly
>built from materials identical to those used in a real ape? How do I know
>that I'm not such a robot? How do I know that Robert Ettinger is not such
>a robot? 

In the context of my above assertions, it does not matter 
whether Robert Ettinger is a robot.  If he [it ;)] has any 
desire for self-preservation (and from his postings I conclude 
that he has), then he has a consciousness.  Period.

>I submit there is no possible way to answer any of these questions, which
>makes this debate a REALLY big waste of time.

I disagree.  If you accept the above parameters, that self-
preservation is a function of self-consciousness, then 
consciousness is simple to prove; it is more difficult to 
disprove.  I would opine that a rocket or robot which is 
programmed by an outside intelligence to take evasive action 
under certain conditions does not necessarily have a 
consciousness.  Nobody had programmed my ant to take evasive 
action when a hand approaches; it had likely never seen a hand 
before.  It merely sensed (was *conscious* of) a threat and 
scooted.  That proves consciousness by my criteria.
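For contrast, here is another minimal sketch, again purely 
illustrative, of a robot whose evasive action is a rule 
installed by an outside intelligence (the programmer).  By the 
criterion above, executing such a rule proves nothing about 
consciousness:

    # Illustrative sketch only: the "evasive action" below is a rule
    # written by the programmer, an outside intelligence.  The robot
    # merely executes it; no inherent urge for self-preservation exists.

    def robot_step(threat_detected: bool) -> str:
        if threat_detected:  # condition chosen by the programmer
            return "take evasive action"
        return "continue normal task"

The ant differs in that nobody supplied it with a hand-equals-
threat rule; whatever it used to classify my hand as a threat, 
it arrived at itself.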

My computer, although it has, I assume, a much greater capacity 
for data processing than my ant, has not shown any urge for 
self-preservation recognizable to me (which will come in handy 
when I want to junk it).  Neither have I heard of any indication 
of an *inherent* desire for self-preservation (one not 
programmed in by an outside intelligence) in any other construct 
with digital data-processing capability.  Therefore I do not 
know at this time whether digital "intelligence" is capable of 
consciousness.

Olaf Henny
