X-Message-Number: 11529
From: 
Date: Thu, 8 Apr 1999 11:35:46 EDT
Subject: The Chinese Closet

Mike Perry writes:

>I think Searle makes a fundamental error with the Chinese room experiment,
>which is to confuse emulator and emulatee. To those on the outside, it
>appears that the person inside understands Chinese, yet the man inside
>understands no Chinese, but is just following elaborate rules. The man, in
>effect, becomes the emulator of the real Chinese conversationist. Under
>appropriate conditions it would, I think, become reasonable to say that (1)
>there is a conversationist in the room who does understand Chinese,
>(2) the man is a device that emulates this conversationist but does not
>understand what the conversationist understands, and thus (3) the man in the
>room is not the conversationist.

Instead of the Chinese Room--a bad metaphor--consider the Chinese Closet. 
This is a very small Chinese Room, with nothing inside except a primitive 
computer using one simple rule and a modest data store: "Answer string Qn 
with string An." In a very short conversation it might answer appropriately, 
and thus give the impression of intelligence; yet clearly, by any ordinary 
criterion, it understands nothing. Despite a plausible short conversation, it 
emulates nothing, not even an idiot. From this point of view, I believe, 
Searle was right.
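
To make the rule concrete, here is a minimal sketch of such a closet 
(illustrative Python, not part of the original argument; the example 
questions and answers are invented): a fixed table that answers string Qn 
with string An, with no parsing, no state, and no generalization.

  # A minimal "Chinese Closet": a fixed lookup table mapping question
  # strings (Qn) to canned answer strings (An). No parsing, no memory,
  # no generalization; only exact string matching.
  CANNED_ANSWERS = {
      "How are you?": "Fine, thank you.",
      "What is your name?": "I have no name.",
      "Do you understand me?": "Of course.",
  }

  def closet_reply(question: str) -> str:
      """Answer string Qn with string An; otherwise stay silent."""
      return CANNED_ANSWERS.get(question, "")

  if __name__ == "__main__":
      # A very short conversation can look appropriate...
      print(closet_reply("How are you?"))
      # ...but anything outside the table exposes the emptiness.
      print(closet_reply("What did I just ask you?") or "(silence)")

It handles whatever exchanges the table anticipates and nothing else, which 
is exactly why a plausible short conversation proves nothing about 
understanding.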

>As for the more general point that Bob makes, basically, that feeling is not
>necessarily captured in an emulation, I would again bring up the
>possibility, which we have no way of disproving, that we are right now in
>an emulation of some sort,

This "possibility" exists only if we concede ahead of time the very point at 
issue! Further, as I have noted at more length previously, there IS in 
principle the possibility of determining by experiment whether we are 
simulations. A simulation cannot anticipate new discoveries in science; the 
program can only include what was known when it was written, and 
deductions therefrom. Therefore there is an insuperable natural barrier for a 
simulation that does not exist for a "real" entity. This in itself does not 
preclude the possibility of feeling in computers, but it undercuts the 
erroneous notion that a simulation from the inside is in principle 
indistinguishable from "reality." 

>So I don't accept the argument that feeling is something
>specifically physical that can't be captured, equivalently, in a different
>substrate.

I think Mike has overlooked my point that real-time correlations may be 
essential to feeling, and these cannot exist in Turing computers. Feeling 
may depend on time-binding and space-binding constructs.


>I do think, once again, that there is
>no way to resolve this issue by scientific "proof" or "almost-proof" as we
>usually understand it.

And again I point out that, once we understand the anatomy and physiology of 
feeling in mammals, this will tell us for sure what is sufficient for 
feeling, and it may well also give us clues as to what is necessary.

Robert Ettinger
Cryonics Institute
Immortalist Society
http://www.cryonics.org
