X-Message-Number: 8237
Subject: definitions of consciousness
Date: Fri, 23 May 1997 10:32:55 -0400
From: "Perry E. Metzger" <>

> From:  (Thomas Donaldson)
> 
> You claim, then, that "consciousness", the word, is nonsense. Well, if you
> have no idea what I mean, I'm sorry. 

Absent a definition, it's hard to even begin reasoning about the
property. Things which are so slippery that they have no good
definition should arouse our suspicion.

> The test I proposed was not a global, always working, test of consciousness,
> nor did I claim so. I just said that it was a way of testing the presence of
> consciousness in human beings.

Since I have no idea what you might mean by "consciousness", I have no
way of evaluating the quality of the stated test.

> Now as for definitions: in the very first place, I don't think that's the
> way to proceed.

Well, here we part company. If you wish to test an organism for the
ability to see, we can come up with a nice objective definition or
two: we can define crude vision as the ability to respond to a light
stimulus, or precise vision as the ability to distinguish differently
shaped or colored objects based on the light bounced off them alone.
We may then easily think of ways to test the "vision hypothesis".

If you want to test an organism for consciousness but cannot even
define what it is you are testing for, how can you expect anyone to
reason objectively about the question -- rather than in the emotional
"I'm conscious, Nya Nya Nya" way to which so many of the people here
have repeatedly descended?

The arguments sound very much like those given in past years by
people who claimed the existence of an extra-physical soul.

How can people start making claims about something they can't even
define or reason objectively about? How can you say "a robot can't be
conscious" if you can't even give the word a definition?

> We form a definition once we get some kind of empirical hold
> on what we're trying to find.

Fine. Go off -- get a nice, solid grasp on consciousness, and report
back with a good definition. Given that definition, we can start
reasoning on the subject. Until then, I prefer not to live in
squishy-feely land. Personal prejudice, you know.

> We can already see some good signs of whether or not someone is
> conscious or not, given that they are human beings. A conscious
> human being responds to events quite differently from an unconscious
> one. A conscious human being, for instance, will respond if you try to speak
> to him/her in a normal tone of voice. That is not true of an unconscious
> person. There are a variety of other behaviors which distinguish conscious
> people from unconscious ones.

I dunno. You are already assuming a lot about the definition
there. There is a claim that I could construct a robot which would
behave in every way like a human being -- including responding if you
speak to it in a normal tone of voice -- and yet would NOT be
conscious. Given this, I'm not sure that I can believe a human is
conscious simply because they respond to auditory stimuli.

Are you sure you aren't just thinking of what most people term
"awake"? Are you sure this is what "conscious" means? If you are happy
with this as a pure operational definition, then I suppose we can just
accept "acts like a human being" as our definition of "conscious",
accept the Turing Test, and go home.

And yet, we are told the Turing Test *isn't* a good test. Given this,
I insist on having a solid definition in hand before we proceed.

> Why don't I believe in starting off with definitions? Because we might
> discover that our notion of consciousness is somehow inadequate.

Isn't that precisely why we need a definition?

> Right now, consciousness is one of those words which has no
> "official" definition, even though many have ideas about what it
> might mean.

Makes it very hard to defend Joey the Robot's consciousness if you
won't even deign to give the word a definition, doesn't it?

> Most words don't really have noncircular definitions: you did not
> learn English from reading the dictionary alone. You are being quite
> disingenuous when you claim you don't know what I mean.

Not in the least. I can define most terms well enough that I can test
for them. "Sunny day", for example. I can go out and observe solar
visibility and use a light meter to test light levels. "Gasoline" can
be easily defined, examined, tested for.

What of this mysterious fluid you call "consciousness", which you
possess but which Joey the Robot, who behaves more or less like you
and me, somehow DOES NOT?

> Finally: you say that these ideas need to be falsifiable, and I set up a 
> scheme in which we might falsify one notion of consciousness. You then tell
> me that I'm foolish. Just who is foolish here?

I did not say you were foolish. I merely said you gave an operational
definition that depends on the subject having a human brain. Given
this, you have more or less ruled, ab initio, that aliens and robots
can't be conscious. That's fine if that's the way you want to define
it -- if consciousness requires a human brain, I can accept that
robots aren't conscious and we can all go home. Of course, if that's
the definition, I'm no longer particularly interested in the
property, nor do I find it interesting to say that a robot lacks
consciousness.

> Certainly, if we did the tests I describe on someone, and he gives
> every external appearance of being conscious but there is no area in
> his brain which is always active when he appears conscious, that
> would falsify this notion of consciousness.

Given that the original motivation for this was to show why the
Turing Test isn't a good measure of consciousness -- Searle's claim,
in other words -- a test that requires examining a biological brain
isn't very useful, as I've noted.

> One problem with unthinking reliance on Popper or any philosophical account
> of science is that no philosopher has actually done science.

Is this not ad hominem reasoning? Frankly, I don't give a flying
penguin whether Popper has or has not done science. My concern is
only whether his ideas hold up.

To tell the truth, I find it difficult to find fault with Popperian
epistemology.

> As for intelligence and machines, perhaps I misunderstood Searle, but I do
> not find him simply foolish. I have already given a test for consciousness
> and intelligence in a computer: I ask for more than just the ability to 
> play with words, wanting IN ADDITION an ability to deal with objects in
> the world.

You do misunderstand Searle. He explicitly states that a robot moving
around and behaving like a human is NOT CONSCIOUS. Read his essay
again. According to Searle, a robot that behaved EXACTLY LIKE A HUMAN
would not, could not be conscious.

And yes, I stand by my statement: Searle is a fool. He has produced
an utterly undefinable and unfalsifiable notion of consciousness.

> The Chinese Room is really suggesting that the ability to carry
> on a conversation is only a small part, and quite possibly one which might
> be imitated on a computer with no more awareness than any other computer
> (such as, for instance, Deep Blue).

I would be shocked to find that Deep Blue could be programmed to
carry on a conversation. However, ignoring that, I cannot escape the
fact that you have now sentenced any human being who is blind and
quadriplegic to unconsciousness. Obviously, since such a person
cannot deal with "objects in the world", they must not be conscious.

> For that matter, I'd probably accept the computer
> as conscious if I could pass it photographs of paintings --- a Gauguin,
> a Renoir, a Rubens, a Rembrandt, and it could tell me what was in them. 

Okay, so you would be happy with a test in which we have slots in
the fronts of black boxes, some of which contain humans and some of
which contain scanners hooked up to computers; we slide photographs
of paintings into the boxes, and the output of the boxes containing
humans is indistinguishable from the output of those containing
computers?
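
Just so there is no ambiguity about what "indistinguishable" would
mean here, a minimal sketch of how such a trial could be scored. The
describe() and guess() hooks, and the box.kind label, are
hypothetical stand-ins for the actual setup:

    import random

    # Minimal sketch of the black-box protocol: a judge reads the
    # description each unlabeled box produces for a photograph and
    # guesses whether a human or a computer is inside.  Guesses no
    # better than chance mean the outputs are, by this test,
    # indistinguishable.
    def run_trial(judge, boxes, photos):
        correct = 0
        for photo in photos:
            box = random.choice(boxes)          # hides a human or a computer
            description = box.describe(photo)   # slide the photo in, read output
            guess = judge.guess(description)    # "human" or "computer"
            if guess == box.kind:
                correct += 1
        return correct / len(photos)            # near 0.5 => indistinguishable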

> But note that passing it photos isn't part of the Turing Test as set up.

I'm happy with any objective test you care to produce.

Perry
