X-Message-Number: 3682
Date: Fri, 13 Jan 1995 18:13:43 -0500
From: "Bruce Zimov" <>
Subject: SCI.CRYONICS: Uploading


Bob Ettinger is correct in pointing out that:

a) It's not just memories, but the "subjective circuit" that
   needs to be uploaded; and

b) We do not know enough at our current state of science to
   be able to answer these questions conclusively.

This is my position as well. Regarding a), Locke in the
17th century argued, in effect, that uploading would only require
the transfer of memories.  By the way, every potential "uploader"
should probably bite the bullet, go to the library, and read
Personal Identity, ed. John Perry, which recaps the modern
discussion of the problem starting with Locke, and Derek Parfit's
Reasons and Persons (Part III, on personal identity, is enough).
These texts remain the most exhaustive analysis of the identity
problem in uploading to date.  Of course, Locke didn't know about
possible uploading technology, but his brilliant analysis of
the problem included the transfer of minds from host to host.

For a concrete theory of the "subjective circuit" (though by no
means the last word), see Edelman's The Remembered Present.
Edelman is a neurobiologist.  The emergent statistical model
is another candidate.  And, of course, the explain-it-all-away-
like-telepresence-we're-zombies model is a more radical
alternative.  On this last view, we would be created anew each
day, stored at night, and any continuity would be a psychological
illusion.

I appreciate very much the reference to Rogue Moon and would like
to throw in my own SF reference, if I may, viz. Rudy Rucker's
Software.  In this novel, the uploaded individual is inside a
computer with a bunch of others in a roving truck, tele-linked
to walking android terminals.  The telepresence makes them think
they are really in the android body.  Well, at the end of the
book, the truck gets attacked and is irreparably damaged. The
end is near:

"There's heat leaking in from where your friend rammed us.  The
temperature's up five degrees.  One more, and our circuits melt
down.  Thirty seconds, maybe."

"Am I on tape somewhere else?" Cobb asked. "Is there a copy on
the Moon?"
"I don't know," Mr. Frostee said. "What's the difference?"

I always wondered why he said "What's the difference?" The
character had to have solved the uploading problem to have
that technology.  My reading is that whether or not there was
a backup was irrelevant; the point was that THAT Cobb's
subjective circuit was about to terminate.

Notice that the problem really has nothing to do with whether or not
the subjective circuit can be transferred as easily as memory. Cobb's
problem is that his instance was kaput, and the stored copies were still
OK.  In other words, at that point, there was NO causal link between
Cobb and his copies. Think about that. NO causal link. None. Yes, the
information stored may be the same, but there is NO causal link. At least,
when you go to sleep and wake up again, your body and brain are the
causal link; in fact, they are the ideal storage medium!

I don't agree with those who say that behaviour is the only test
necessary to identify whether or not a "black box" has an operant
subjective circuit.  If it were, we might wrongly conclude that
tape recorders or films had subjective entities that experienced
things every time the recording was replayed, since they pass the
behaviour test!  Locke made the same point about parrots in the
17th century.

The only subjective experiences you remember are your wake states and
certain half-awake or dream states, and these are strongly correlated
with 30 Hz activity in the cortex. Until we completely unlock the
brain's mechanisms in these matters, it is silly to be fooled by
clever programming. I stated as much in my response to Coles in AI Expert
last year.

Bruce Zimov

