X-Message-Number: 30216
From: 
Date: Wed, 26 Dec 2007 14:20:41 EST
Subject: Clark disappoints

John Clark is pretty smart, so I am chagrined at what I see as his errors,
since I must conclude that I have failed to explicate my points adequately.
(What else is new?) Try again.
 
>It is not only possible to write a program that experiences pain, it is easy
>to do so, far easier than writing a program with even rudimentary
>intelligence. Just write a program that tries to avoid having a certain
>number in one of its registers regardless of what sort of input the machine
>receives, and if that number does show up in that register it should stop
>whatever it's doing and immediately change it to another number.

He is defining pain in a computer as any state it has been programmed to
avoid. I am trying to think of something polite to say about those who accept
such a definition.
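Taken literally, the program Clark describes fits in a few lines. Here is a toy sketch of my own (the "pain" value 13 and the function names are arbitrary illustrations, not anything Clark wrote):

```python
# A toy rendering of Clark's described "pain" program: one value in a
# register is designated "pain," and the machine overwrites it whenever
# it appears, regardless of the input it receives.

PAIN_VALUE = 13  # arbitrary sentinel standing in for "pain"

def step(register: int, incoming: int) -> int:
    """Load the input into the register, then 'avoid' the pain value."""
    register = incoming
    if register == PAIN_VALUE:
        # "stop whatever it's doing and immediately change it to another number"
        register = 0
    return register

# After any step, the register never holds the pain value:
outputs = [step(0, x) for x in range(20)]
assert PAIN_VALUE not in outputs
```

That the whole mechanism is this small is exactly what is at issue: the question is whether such a state-avoidance loop can sensibly be called pain at all.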
 
Also:
 
>The quale people ask people like me how a computer can have feelings,
>that is they want a description of how it could happen, but when I give what
>they were asking for they say a description of a quale is not a quale.
>Because there is no conceivable answer that would satisfy you I conclude
>the question is meaningless.

He has NOT answered the question. He has not described a quale, but merely
asserted that e.g. a computer feels pain if it is in a state it has been
programmed to avoid. (And I suppose it experiences orgasm when in a state it
has been programmed to achieve? Must be a lot of very jolly computers around.)
 
A quale is a physical construct or system in the brain, possibly based on
some kind of standing wave, and some day its anatomy/physiology will be known.
At that point, we will know whether the same thing can be achieved in silicon
(or whatever). If it can, then in principle some type of computer could be
conscious. If it cannot, then the possibility of feeling in computers remains
in question.
 
Also:
 
>>if [for example] subjectivity depends on unique properties of carbon,
>>then it cannot be duplicated in silicon.

>So carbon atoms can be conscious but silicon atoms cannot, I'd say that's
>about as likely as white people are conscious but black people are not.

Sigh. It's not the atoms that are putatively conscious, but a system partly
made of those atoms. Not all atoms are created equal for all purposes.
 
 
Minor point:
 
>>a programmed requirement for human
>>review before any "execute" order.

>A computer like that would be of no danger to us, or be of any use to us;
>it couldn't even balance your checkbook.

Not true. A program to balance your checkbook doesn't require "execute"
orders. And a requirement to review execute orders prior to execution would
slow things down but not make the program useless.

R.E.


