X-Message-Number: 4127
From:  (Brian Wowk)
Newsgroups: sci.cryonics
Subject: Re: another (Re: uploading)
Date: 1 Apr 1995 02:48:55 GMT
Message-ID: <3lieun$>
References: <>

In <>  (Ian Taylor) writes:

>Brian Wowk () wrote on 27.03.95:

>>I am simply observing that my brain is a physical system, and as such is
>>amenable to at least some degree of modification and performance
>>enhancement by physical means.
>Fine. That happens all the time, as any of a vast number of compounds in the
>bloodstream interacts with it, such as alcohol :)

	You are beginning to get the idea. 

	This thread started because somebody claimed (among other things)
that humans would never be "uploaded into computers".  If you recall
the general theme of that post, it was *not* a philosophical challenge
of uploaders and Strong AI, it was a sweeping dismissal of life
extension, cryonics, and human improvement generally.  I thought my
reply was a reasonable response to that post, and I believe
my reply was essentially accurate in its final conclusions.
Apparently my reward was to get jumped on by other immortalists
like you over a minor quibble with terminology.  Have you really nothing
better to do with your time?

	Alright, forget I said we are "computers".  (Apparently that is
a loaded word in this newsgroup right now.)  Why don't I just say we
are *machines* (read: understandable, manipulable physical entities).
We can already improve the mental performance of this machine
with a variety of drugs (so-called smart drugs).  In the future we
will improve its performance even more with genetic engineering,
cybernetic implants (Library of Congress on a chip!), and other changes barely
conceivable today.  These changes, as dramatic as they are, can, if
necessary, happen *without any change in our basic neurobiology*.

	Look at it this way.  The *vast* majority of our memory and
cognitive processing takes place unconsciously.  We bring ideas
and memories into conscious perception selectively, in small
chunks at a time.  Suppose Ettinger is right, and our "feeling" and
"selfhood" do depend on a very specific kind of biochemistry,
perhaps even in a specific part of our brain.  Fine.  We'll keep
that essential part of ourselves.  This will not stop us from
someday using nanotechnology to build unconscious subprocessors 
of such power as to make us godlike in our mental abilities.

	The bottom line is that the uploaders are right.  They might
not be right in terms of the specific hardware implementations they
envision, but they are certainly right in that the future marriage
of nanocomputers with the human mind will bring vast enhancements
of human (or whatever we call ourselves then) experience.

	And you know what else?  The Strong AI people are right too.
They might not be right in saying that human brains can be fully
simulated by Turing machines, but they are certainly right in saying
that machines as intelligent as people can be built.  *We* are the
proof of that.  Whether we are Turing machines or not, the physics
of how our brains work will eventually be understood.  At that time
it will become possible to build other "machines" that are just
as smart as us, if not much smarter.  If Penrose is right, we may
find that such machines must be composed of quantum-computing
microtubules.  Fine.  We'll build them however we have to, but we
*will* build them.  Our own brains constitute the most irrefutable
proof that atoms can be organized into intelligent, thinking
entities.

	So where does this leave the recent sci.cryonics and CryoNet
uploading and Strong AI debates?  In my view it's all moot.  These
things (vast mental performance enhancements, and super-intelligent
machines) are all going to happen anyway.  It's merely a question
of implementation.

---Brian Wowk
