X-Message-Number: 7971
Date: Sat, 29 Mar 1997 21:30:34 -0800 (PST)
From: John K Clark <>
Subject: Uploading

-----BEGIN PGP SIGNED MESSAGE-----

In #7959 On Fri, 28 Mar 97 "Robert Ettinger" <> Wrote:
          

        >not all traits have been selected for by evolution [...]  Many
        >traits are just accidents
I agree.

        >sometimes long-persisting, neutral in survival or procreative value.  

                  
I disagree. All genes are subject to mutation, and the more important a
gene is, the less likely a mutation in it is to spread through the
population, because mutations are almost always harmful. All your
"self circuit" does is provide a subjective feeling of self to the
organism; that's why the Turing Test can not detect it, so from
evolution's point of view it is not important. A mutation that rendered
the "self circuit" inoperable could nevertheless spread through the
population, because selection would do nothing to check the spread of
such a neutral mutation. This is called "Genetic Drift".
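
The drift argument is easy to demonstrate with a toy Wright-Fisher
simulation (a standard population-genetics model; the code and its
parameters below are illustrative assumptions, not part of the original
exchange): an allele that selection cannot "see" performs a random walk,
and by pure chance it sometimes takes over the whole population.

```python
import random

def wright_fisher(pop_size, p0, generations, seed=0):
    """Track the frequency of a selectively neutral allele.

    Each generation, every one of the pop_size gene copies descends
    from a copy drawn at random from the previous generation, so the
    allele frequency performs a random walk until the allele is
    either lost (frequency 0.0) or fixed (frequency 1.0).
    """
    rng = random.Random(seed)
    p = p0
    history = [p]
    for _ in range(generations):
        # Binomial resampling: each new copy inherits the allele
        # with probability equal to its current frequency.
        copies = sum(1 for _ in range(pop_size) if rng.random() < p)
        p = copies / pop_size
        history.append(p)
        if p in (0.0, 1.0):  # absorbed: lost or fixed, drift is over
            break
    return history

# A neutral mutation present in 1% of a population of 100 usually
# vanishes, but by chance alone it sometimes spreads to everyone.
trials = 2000
fixed = sum(
    wright_fisher(pop_size=100, p0=0.01, generations=10_000, seed=t)[-1] == 1.0
    for t in range(trials)
)
print(f"neutral allele fixed in {fixed} of {trials} runs")
```

For a neutral allele the theoretical fixation probability is just its
starting frequency, so roughly 1% of these runs end with the mutation
carried by the entire population.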
           
        >Conceivably, then, feeling could have arisen by accident and         
        >persisted by being harmless 


Conceivably you could be the only one left with a working "Self Circuit".
                  


        >I think there is a positive possibility also. Feeling could have
        >survival value by making certain "computations" or decisions or
        >reactions more efficient


I think you're on the right track here; if so, it would be easier to
make a conscious intelligent computer than an unconscious one.
                                      

In #7962 On Thu, 27 Mar 1997 Thomas Donaldson Wrote:
           

        >One major problem with any computer program which might simulate me
        >is simply that it will necessarily have to omit information, and
        >therefore will have errors
 

That is not A problem, it's THE problem, and it's a problem we'll have
regardless of whether we use bits or biology.
           


        >if I were stored digitally in a computer and recreated as a real
        >person in the world at some later time. THEN the errors would be
        >damped out.


I am not aware of any error-correcting strategy that biology can
implement that bits can not.
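
For what it's worth, the simplest digital strategy is easy to show. The
sketch below (an illustrative toy of my own, not anything from the
original exchange) is a triple-repetition code: every bit is sent three
times and the receiver takes a majority vote, so a single flipped copy
in any triple is damped out.

```python
import random

def encode(bits):
    """Repetition code: transmit every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote on each triple; one bad copy per triple is fixed."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

def noisy(bits, flip_prob, rng):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(1997)
message = [rng.randint(0, 1) for _ in range(1000)]

# Send the same message through the same 1%-noise channel, once
# unprotected and once wrapped in the repetition code.
raw = noisy(message, 0.01, rng)
protected = decode(noisy(encode(message), 0.01, rng))

raw_errors = sum(a != b for a, b in zip(message, raw))
protected_errors = sum(a != b for a, b in zip(message, protected))
print(f"raw errors: {raw_errors}, protected errors: {protected_errors}")
```

The coded copy only fails where two of the three copies of the same bit
flip, which at 1% noise is about a hundred times rarer than a single
flip, so the errors are damped out rather than accumulating.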
           


        >As for the Turing Test, you seem to automatically assume that it
        >measures consciousness.


I assume that intelligent behavior implies consciousness, though I'll
never be able to prove it.

            

        >As an argument rather than a statement of faith, you would need to
        >do more than assert that.


I can not do more, so I must accept it as an axiom: I could not
function if I thought it was untrue, and I very much doubt that you
could either.
                  

        >our consciousness, quite unlike most of our brain, is SERIAL
        >rather than parallel. At some point these drives and our perceptions
        >must come together; I see no way to do that without a serial process
        >to guide both what we see and what we do.


I think there is a lot of truth in what you say, and I think an intelligent 
computer would need exactly the same thing.
                  


        >you raised Lewis Carroll's story as an argument AGAINST what I was
        >saying. I was saying that I thought it had caught exactly what I was
        >saying, that I agreed with the Tortoise.


Yes, I know you agree with the Tortoise. The conclusion the Tortoise
reached from his reasoning was that machines could not reason, that
people could not reason, and that Tortoises could not reason, and for
that reason I don't think the Tortoise's view is very reasonable.



        >there are two very different kinds of information which should not
        >be confused: our drives, wants, etc, and our perceptions.


There is a difference between wants and perceptions, just as there is a  
difference between a poem and a blueprint, but you can send them all over a 
telegraph line because they're all information.
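
That point can be made concrete (a toy of my own, using modern
serialization in place of an actual telegraph line): a poem and a
blueprint differ in meaning, but both reduce to the same medium, a
stream of bytes, and both come out the far end intact.

```python
import json

# Two very different kinds of information.
poem = "The time has come, the Walrus said, to talk of many things"
blueprint = {"part": "gear", "teeth": 24, "diameter_mm": 40.0}

# Both become the same kind of signal: bytes a wire can carry.
poem_signal = poem.encode("utf-8")
blueprint_signal = json.dumps(blueprint).encode("utf-8")

# At the far end of the line, each is recovered exactly.
assert poem_signal.decode("utf-8") == poem
assert json.loads(blueprint_signal.decode("utf-8")) == blueprint
print(len(poem_signal), "and", len(blueprint_signal), "bytes on the wire")
```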
                  


        >we should never confuse the simulation with the actual neurons or
        >brain we try to simulate.


Then there would be no reason to bother to simulate them.



        >I suspect that we will also find, if we try to do as accurate a
        >simulation as we can, that it will diverge over time


I'm sure that's true, but so what? Your future self will diverge from what it  
would be if you didn't read Cryonet, but you're still you, I think.


                                              John K Clark    

-----BEGIN PGP SIGNATURE-----
Version: 2.6.i

iQCzAgUBMz30z303wfSpid95AQHTEATwrK6Edm2YuFZV5FoHgj645oDw47wtJyA6
vMm3toasmqJjvwCsru6f5eVPpwUC9BV+dvARoh6OqaEQrD49kL/YADqKwUUlBazQ
ccywBsgCbOpBTn2bBf/BquXDGJRu0WGJeuXshGLkXWpGQtOqod3me1Ycq5lsOTFB
B8ZL180JQ84R473EFrZinlKJnOkBFzeBZUT24CdmYkKaLjxTKdQ=
=7abr
-----END PGP SIGNATURE-----

Rate This Message: http://www.cryonet.org/cgi-bin/rate.cgi?msg=7971