X-Message-Number: 7951
Date: Wed, 26 Mar 1997 11:12:29 -0800 (PST)
From: John K Clark <>
Subject: Uploading


In his last several messages Thomas Donaldson keeps referring to some very
interesting-sounding posts by a "Mr. Lynch". For some reason I have not seen
any of them. I sure wish I could, though, because that Mr. Lynch sounds like an
insightful fellow, an absolutely splendid human being, and his posts sound an
awful lot like the ones I write.
               


        >To Mr. Lynch: It really would help if you would read a bit more
        >about this consciousness question. It would also help if you were
        >not quite so full of yourself, but that's much harder to cure.

Wow, where did that come from, Thomas? I don't see why a nice philosophical
discussion must turn ugly, and philosophy is all it is; none of this will
have practical implications for years. If I were that Mr. Lynch fellow
I'd be insulted.
                   


        >When I make a decision that someone is aware, I do make a judgement,
        >but I hardly ignore obvious signs of illness etc.


Yes, you ask them questions. If they respond to them in a non-random manner
you figure they are aware; if they don't answer you or even move, then you
suspect that they are asleep, in a coma, or dead. In other words you give
them The Turing Test.
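
For what it's worth, here is a toy sketch in Python of that judgement; the
questions, the "ask" callback, and the crude relevance check are all my own
illustrative assumptions, not anyone's actual procedure:

    def seems_aware(ask):
        """ask(question) returns a reply string, or None for no response at all."""
        questions = ["What is your name?", "Where are you?", "Does anything hurt?"]
        replies = [ask(q) for q in questions]

        # No answers and no movement at all: suspect sleep, coma, or death.
        if all(r is None for r in replies):
            return False

        # Count replies that are non-random in the crudest possible sense:
        # they share at least one word with the question that prompted them.
        relevant = sum(
            1 for q, r in zip(questions, replies)
            if r and set(q.lower().split()) & set(r.lower().split())
        )
        return relevant >= 2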
                   

        >analyze the Chinese room situation    
                   

All Searle did was come up with a wildly impractical model (the Chinese
Room) of an intelligence in which a human being happens to play a trivial
part. Consider what's in Searle's model:
                   
1) An incredible book, larger than the observable universe even if the
   writing were microfilm-sized.
2) An equally large or larger book of blank paper.
3) A pen, several trillion galaxies of ink, and, oh yes, I almost forgot,
   your little man.

Searle claims to have proven something profound when he shows that a trivial
part does not have all the properties that the whole system does. In his
example the man could be replaced with a simple machine made with a few
vacuum tubes or even mechanical relays, and it would do a better job.
It's like saying that the synaptic transmitter dopamine does not understand
how to solve differential equations; dopamine is a small part of the human
brain; therefore the human brain does not understand how to solve
differential equations.
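
To make concrete how trivial the man's contribution is, here is a toy sketch
in Python; the rule-book entries are my own stand-ins, since the real book,
as noted above, would dwarf the observable universe:

    # The man's entire job in the Chinese Room: look up the incoming squiggles
    # in the rule book and copy out whatever the book dictates. No understanding
    # is needed; a relay or a vacuum tube could do the same lookup.
    RULE_BOOK = {
        "ni hao": "ni hao",
        "ni hui shuo zhongwen ma": "hui, dangran",
    }

    def man_in_the_room(symbols_under_door):
        return RULE_BOOK.get(symbols_under_door, "qing zai shuo yi bian")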

Yes, it does seem strange that consciousness is somehow hanging around the
room as a whole, even if slowed down by a factor of a billion trillion or so,
but it is no stranger than the fact that consciousness is hanging around the
four pounds of gray goo in our heads, and yet we know that it does. It's time
to just face the fact that consciousness is a property matter has when it is
organized in certain complex ways.
                   

        >If we set up a (slightly faulty) copy of you, not only will it
        >simply not behave like you, but it will go very badly awry, breaking
        >down after a short time.
              
              
Then Cryonics cannot work: no matter how well the people of the future
repair your frozen body, you can be certain the repair will not be perfect.
                   


        >Lewis Carroll's story catches very well what I was trying to say.
        >[...] if you really believe that Carroll's story represents my own
        >argument, something has gone badly wrong in our conversation.
                

I see, Carroll's story catches very well what you were trying to say, but it
does not represent your argument. Ah, could you run that past me again?


        >You might begin by telling me what you think my argument is.        



I wouldn't want to hazard a guess, but I hope you find it.
                   

        >I am surprised you did not notice.


I did notice that Carroll used your reasoning to conclude that not just  
machines, but humans too, could not reason. I am surprised you did not notice.
                   


        >We cannot use only symbols to deal with the world. At some point we
        >must stop that and deal directly with the world.


Yes, and that's just as true for a biological brain as for an electronic 
computer.

                                 John K Clark (not Lynch)

