X-Message-Number: 8051
Date:  Sat, 12 Apr 97 14:22:23 
From: Mike Perry <>
Subject: Re: CryoNet #8038, #8040

Bob Ettinger (#8038) wrote: 
 
>Mike Perry (#8020) conjectures (If I understand him 
>correctly) that the brain may contain two kinds of 
>conscious regions; kind A can communicate to the outside 
>world and can receive information (but not feelings) from 
>B; B is conscious, but cannot communicate feelings or 
>consciousness either to A or to the outside world. 
>Therefore, he concludes, consciousness may be, but is not 
>necessarily, associated with a "seat of consciousness."  
> 
>I see serious flaws in this idea. First of all, feeling and 
>consciousness are not just a matter of a locale or region 
>(whether distributed or not); they are also defined by 
>specific physiology (e.g. something like a standing wave 
>or reverberating feedback, the self circuit). Once we 
>understand the physiology of feeling in A, we can then 
>examine the physiology of B to look for similarities. If we 
>find none, then there is serious doubt that B has its own 
>subjectivity. Likewise, if an artifact lacks the physiology 
>of feeling, we are entitled to doubt that it has any.  
 
This somewhat misinterprets what I was trying to say.
I wanted to consider the case where B is not merely
"another region of the brain" but *is* physiologically
and functionally similar enough to A that, to all
appearances, it too has consciousness. (For simplicity
I'm assuming single regions A and B here.) Yes, B can
communicate information to A but cannot make A share its
feeling or consciousness directly, much as is true
between two separate individuals. A, on the other hand,
is the entity we communicate with when we talk to the
"person" whose brain both A and B inhabit.
 
Say we call that person Smith. I wouldn't say, in this case, 
that Smith lacks a seat of consciousness. When Smith is 
conscious, A must be active in some way. Smith's seat of 
consciousness, then, is somewhere inside A. It is not in B 
(we assume A can be conscious and communicating as 
Smith, even if B is switched off completely). However, 
even though Smith's seat of consciousness is in A, B too (it 
is reasonable to say) is capable of consciousness. And in 
fact there is experimental evidence that not all of the 
conscious areas of the brain are contained in the "seat of 
consciousness." (See Steve Harris's posting, #8029, for
instance.) B could thus have its own seat of
consciousness. (Still, what we usually refer to as "the" seat
of consciousness would be in A--this then would be *Smith's*
seat of consciousness.) So we need to consider, not merely 
whether consciousness is present, but "whose"
consciousness it is.
 
 
Olaf Henny (#8040) wrote: 
 
>Maybe I lack your perception, but we are talking about 
>self-consciousness/self-awareness, which requires at least 
>a minimal amount of constructive thinking, i.e. reaction to 
>threat or lure. Self preservation is probably the most basic 
>and primitive manifestation. I have to this date not 
>detected the slightest hint of that in any artificial data 
>processing device. 
 
You should look into artificial life. Here the computer
emulates a population of "lifelike" forms that do basic
things such as feeding, reproducing, and defending
against danger. They evolve over time and get better at
what they do. In this case it is not the machine as a
whole that exhibits intentionality (including
self-preservation behavior); rather, the little bugs it
is "running" can be said to have this trait, at least at
a primitive level.
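
For concreteness, here is a minimal sketch in Python of
the kind of simulation I mean. It is my own toy
construction (not Tierra, Avida, or any other particular
system): each bug carries one heritable trait, its
tendency to flee from danger, and selection plus
mutation pushes that tendency up over the generations.

import random

POP_SIZE = 100
GENERATIONS = 50

def make_bug():
    # The genome is a single trait: the probability of
    # fleeing when a predator attacks.
    return {"flee": random.random()}

def survives(bug):
    # One attack per generation; bugs that flee usually
    # escape, bugs that stand still rarely do.
    if random.random() < bug["flee"]:
        return random.random() < 0.9
    return random.random() < 0.2

def reproduce(parent):
    # Offspring inherit the trait with a small mutation.
    flee = parent["flee"] + random.gauss(0.0, 0.05)
    return {"flee": min(1.0, max(0.0, flee))}

population = [make_bug() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    survivors = [b for b in population if survives(b)]
    if not survivors:            # guard against extinction
        survivors = [make_bug()]
    population = [reproduce(random.choice(survivors))
                  for _ in range(POP_SIZE)]

avg = sum(b["flee"] for b in population) / POP_SIZE
print(f"mean flight tendency after {GENERATIONS} "
      f"generations: {avg:.2f}")

Nobody programmed these bugs to "want" to survive; the
disposition to avoid danger is simply selected for, and
the population's mean flight tendency should climb well
above its random starting value near 0.5.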
 
Mike Perry 
 
http://www.alcor.org 
