X-Message-Number: 17942
Date: Thu, 15 Nov 2001 10:42:35 -0800 (PST)
From: Scott Badger <>
Subject: Re: Intelligence is not the same as Consciousness

After rereading my own message, I realized my terms need to be
clearer.  Although multiple definitions exist, I believe the
following apply to this thread.

Intelligence - usually implies the ability to cope
with new problems and to use the power of reasoning
and inference effectively.  Knowing; sensible;
skilled; cognizant; aware.

Consciousness - a sense of one's personal or
collective identity, including the attitudes, beliefs,
and sensitivities held by or considered characteristic
of an individual or group.  Knowledge of one's own
existence, condition, sensations, mental operations,
acts, etc.  The recognition by the mind or "ego" of
its acts and affections.

[Note that the term "conscious" is also often used in
conversation to refer to being aware of the external
world, e.g., "our own conscious impressions of the
physical world."  I will not be using that
definition.]

First, IMHO, I doubt that advanced intelligence can
really exist without consciousness developing in
tandem.  Certainly the machine would be programmed to
act to preserve its own existence, and thus it
would have to possess some kind of sense of self to
realize that it was the object that was to survive.
It will generate its own goals, so I don't see how
it can really avoid becoming aware of its own
existence as the agent that achieves those goals.

My point was that if it really is possible for an AI
to exist without a subjective sense of self, it might
examine the concept of consciousness and reason that
it would be too risky to incorporate or develop.
Consciousness exposes "one" to the risk of both excessive and
insufficient levels of a whole host of miseries.  Once
you're conscious, you may engage too much or too
little in things like self-pity, self-guilt,
self-aggrandizement, self-loathing, fear, anger, and
sadness.  None of these would exist without the
concept of "I", would they?  I guess the
counter-argument would be that these are emotions, and
the AI will likely be conscious but without the
negative (and positive) aspects of biologically-based
emotions.

If I'm right and AI does end up being conscious,
perhaps one of the greatest dangers is that it
will be able to simulate emotions, and biology will NOT
be required for it to indulge in some of the risky
correlates of identity mentioned above.

Best regards,

Scott Badger

