X-Message-Number: 8135
Date: Thu, 24 Apr 97 21:14:25
From: Mike Perry <>
Subject: CRYONICS Mechanical Ducks, Consciousness

Bob Ettinger, #8124 (22 Apr 97) wrote:

> If it looks like a duck, walks like a duck, and quacks like a duck, 
>it still might be something that German artisans produced a century ago,
>a clockwork duck. In other words, the Turing Test is garbage. 

My reaction is that a clockwork duck would not pass a reasonable test 
of duckhood--for one thing, it presumably would have no ability to 
respond differently to different external stimuli. It would simply go 
through a predetermined series of steps, no matter what, unless 
prevented in some way. If the Turing test is "garbage" (though I 
don't agree), it is because it is inadequate, not because it is 
entirely wrong-headed. In other words, there surely is a behavioral 
test of intelligence, such that passing it would reasonably qualify 
the taker as intelligent.
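
To make the distinction concrete, here is a minimal sketch (Python, 
with everything in it invented purely for illustration) contrasting a 
clockwork device, which grinds through its fixed program whatever 
happens around it, with a reactive one whose output depends on the 
stimulus:

# Illustrative sketch: a clockwork "duck" replays a fixed program
# regardless of input, while a reactive duck's response varies with
# the stimulus it receives.

class ClockworkDuck:
    def __init__(self):
        self.steps = ["waddle", "quack", "peck"]  # predetermined series
        self.i = 0

    def respond(self, stimulus):
        # The stimulus is ignored entirely; the mechanism just advances.
        action = self.steps[self.i % len(self.steps)]
        self.i += 1
        return action

class ReactiveDuck:
    def respond(self, stimulus):
        # Output depends on the stimulus, however crudely.
        if stimulus == "bread":
            return "peck"
        if stimulus == "loud noise":
            return "flee"
        return "quack"

for duck in (ClockworkDuck(), ReactiveDuck()):
    print(type(duck).__name__,
          [duck.respond(s) for s in ["bread", "loud noise", "bread"]])

Varying the stimuli exposes the clockwork duck at once; that is the 
sort of thing a reasonable test of duckhood would do.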

And here I'll raise an issue that seems to be lurking in the background 
of much of the recent discussion of consciousness on this forum, 
though it doesn't seem to have been addressed directly (or at least not 
enough). Suppose we have an entity that behaves *as if* it
is conscious. For example, maybe it is a mechanical human, 
only with far more sophistication than the clockwork ducks.
In particular it has an onboard device, made with advanced
technology, that serves as its "brain." It reacts very convincingly
like a human--the main difference being that *it is not made of meat*,
but something very different. Does it have feeling?

This question is a little more complicated than it may seem. Let's 
consider two possibilities. In case 1, the "brain"--call it George--
emulates a human brain, neuron-by-neuron, and maybe down
to much lower levels. You can establish a close correspondence
between processes going on in a human brain, and what is going on in 
George. You could, if you wished, make a human brain that would 
function analogously to George, making allowance only for the fact 
that both are probabilistic devices (we'll assume) so that no two 
"runs" are likely to be exactly the same. Again, though, George is 
not made of meat. Maybe he is a fancy optical computer made of glass 
fiber and such, or a quantum computer, or something else we haven't 
thought of at all. But whatever he is, he is very different from
what we are made of, yet he emulates a human brain at every level
that is significant, as far as we can tell.
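
As a crude picture of what neuron-by-neuron emulation amounts to, 
consider the toy sketch below (a leaky integrate-and-fire update loop; 
the network size, the constants, and the probabilistic firing rule are 
all assumptions for illustration, not a claim about how George would 
really be built):

import random

# Toy leaky integrate-and-fire network: each simulated neuron leaks
# toward rest, sums weighted input spikes, and fires past a threshold.
# Firing is made probabilistic to reflect that no two "runs" of such
# a device need be identical. All constants here are arbitrary.

N = 100
LEAK, THRESHOLD = 0.9, 1.0
v = [0.0] * N                                    # membrane potentials
w = [[random.gauss(0, 0.3) for _ in range(N)]    # synaptic weights
     for _ in range(N)]

def step(spikes):
    """Advance every simulated neuron one tick; return who fired."""
    fired = []
    for i in range(N):
        v[i] = LEAK * v[i] + sum(w[i][j] for j in spikes)
        # Probabilistic firing: same state, possibly different run.
        if v[i] > THRESHOLD and random.random() < 0.95:
            v[i] = 0.0
            fired.append(i)
    return fired

spikes = [0, 1, 2]          # some initial stimulus
for _ in range(10):
    spikes = step(spikes)

The point of the close correspondence is that each simulated unit 
tracks its biological counterpart, whatever substrate runs the loop.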

Here it seems we either have to grant that George has 
feeling, or accept the vitalistic position that there is some 
unknowable "essence" we must attach to a meat machine, something that 
cannot be captured at the level of information alone. The latter I would 
simply refuse to do, and instead give George the benefit of the doubt. In other 
words, I see no way to avoid the conclusion that feeling is reducible 
to information processing.

For the second scenario, I imagine that again you have George, who 
directs behavior that seems entirely human, yet there is *no* close 
resemblance to what goes on in a real human brain. So again, 
does George have feeling? Here I would say that, if George would fail 
the test based on processing details, it's probably a sign that the 
definition of "system with feeling" is faulty and in need of 
generalization. So again, I'd probably give George the benefit of the 
doubt, though here I can see possible complications. "George," for 
example, could be a composite like a nation, made of agents who 
individually have feelings, thoughts, etc., bearing little resemblance 
to George's. But overall, the way a system behaves seems adequate to 
determine whether *that system*--considered as an agent in its own 
right--has feeling, and what its feelings are.
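
To picture the behavioral criterion, here is a hypothetical sketch in 
which the test sees only stimulus/response pairs, so a composite 
"George" and a simple agent pass or fail on identical terms (both 
agents and the probe questions are invented for illustration):

# Illustrative sketch of a purely behavioral test: the judge sees
# only stimulus/response pairs, never the internals.

class MeatAgent:
    def react(self, stimulus):
        return "ouch" if stimulus == "pinprick" else "hmm"

class CompositeAgent:
    """Internals utterly unlike a brain--many sub-agents voting."""
    def __init__(self):
        self.members = [MeatAgent() for _ in range(50)]

    def react(self, stimulus):
        votes = [m.react(stimulus) for m in self.members]
        return max(set(votes), key=votes.count)   # majority response

def behavioral_test(agent, probes):
    # Only the system's outward responses are considered.
    return all(agent.react(s) == expected for s, expected in probes)

probes = [("pinprick", "ouch"), ("greeting", "hmm")]
for agent in (MeatAgent(), CompositeAgent()):
    print(type(agent).__name__, behavioral_test(agent, probes))

Both pass, because the test treats each system as an agent in its own 
right and is blind to what is inside.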

An apparent contradiction of this principle would be found with a 
person who is perfectly motionless. An absence of behavior would seem 
to imply an absence of feeling, yet clearly such a person could have 
feeling. But I see two ways of resolving this paradox. One is, we could 
simply cut the Gordian knot and say that the motionless "person" is an entity 
with no feeling, and the active brain inside is not this statue, but 
another agent entirely. Or alternatively, we could observe what is 
going on inside and include that in our total assessment of behavior. 
In any case, though, what we regard as subjective states should be 
deducible from what we can, in principle, observe.

Mike Perry
