X-Message-Number: 430
From att!usc.edu!more%girtab.usc.edu Sat Sep  7 15:28:45 PDT 1991
From: more% (Max More)
Subject: Re: cryonics #410 - Re: Nano-neurons?
To: kqb%

I thoroughly agree with Simon Levy's comments against viewing the "mind" as 
software running on hardware. I also believe that Searle has an important
point to the extent that there is a distinction to be made between being 
conscious and merely simulating consciousness.
	However, Searle's argument need not rule out uploading, nor the 
possibility of truly conscious and intelligent (contelligent) machines (if 
we would still refer to them as machines). What Searle's argument shows is that
duplicating the behavior of a system at the most abstract level, or the 
"highest" level of behavior, is insufficient. If you want a computer to also
have all the qualities of consciousness and awareness, it will be necessary
to duplicate neural function at some deeper level. At what level? No one can
say at this point. Rather than a simple hardware/software distinction, in the 
brain there are many possible levels, sometimes clearly distinct, in other
cases overlapping. There's the level of the neuron, the membrane, the synapse, 
the activities inside the axon, collections of connected neurons, columnar 
formations, brain maps, brain systems, and the entire brain (an incomplete
list).
	To create a contelligent being using non-biological hardware, we first
need to know which of these levels must be respected in terms of function. 
Perhaps we must have components comparable in function to neurons, or maybe
that's dispensable so long as we get the same function at a higher level (such 
as a brain system). On the other hand it might be necessary to have components 
functioning at deeper levels, such as that of the membrane. 
	We will also need to ask ourselves: What do we care about in thinking
of our continuity when being uploaded (or gradually shifting over to 
non-biological hardware - i.e., transbiomorphosis)? Being made out of silicon, 
or optical processors, might allow us to be aware and intelligent, but might 
change the
"qualia" of our experience, i.e., the subjective feel of our cognitive and
sensory states. We might "feel" tactile sensations differently, or experience

pleasurable and painful sensations in a novel way. Personally, I'm not worried 
about changes in qualia, so long as my new introspectively identifiable states
are at least as (and preferably MORE) acute and refined than my old. 
	Max More

From att!usc.edu!more%girtab.usc.edu Sat Sep  7 15:56:54 PDT 1991
From: more% (Max More)
Subject: Re: Directness of brain processes
To: kqb%

Some of the disagreement here seems to be over whether perception is "direct". 
Perhaps it would help if the people disagreeing on the issue gave some idea of
what they take "direct" and "indirect" perception to involve. A lengthy
discussion of this can be found in Objectivist philosopher David Kelley's
book, The Evidence of the Senses. Kelley's book is of mixed quality, but he
does have some knowledge of cognitive science (including an awareness of 
Gibson's work which, like Simon, I recommend).
	Max More
