X-Message-Number: 33430
Date: Sun, 6 Mar 2011 11:49:13 -0800 (PST)
From: Luke Parrish <>
Subject: Re: CryoNet #33423 - #33427

Robert Ettinger claims that uploading to a digital substrate
does not produce a real human being, on the simple grounds that
a description is not the thing it describes.

The point Bob seems to be missing is that the value of this
distinction rests *completely* on the assumption that a
"description" does not follow the functional rules of the object
it describes. For example, a blueprint does not directly shelter
a person from the rain, which makes it useful to distinguish it
from a house. The distinction between a program and a copy of
that program is a useless one by comparison.

I suppose if Bob wants his own private definition of "human"
which is based on composition rather than function, he is as
entitled to it as he is to any other religious belief or
philosophical position. But if he wishes to force that upon the
rest of the cryonics community as though it were scientifically
factual in nature, he needs to explain why composition is
preferable to function, or else argue convincingly that some
function essential to human nature cannot be emulated on a
state machine.


Ron Havelock raises a different objection: a person who has
been scanned would still not wish to commit suicide afterward,
even once a duplicate has been made. This is a
completely separate objection from Robert's in that it would
apply equally to a situation where an organic copy has been
made, whereas Robert's objection is specific to the use of a
digital substrate. The logical error Ron makes, in my opinion, is
to assume that it is impossible for one person to anticipate
being two separate (non-telepathic) people simultaneously at
some point in the future.

My answer to his scenario is that the uploader should indeed
feel fear beforehand because they will be conscious of being
about to die in one of their future branches. However, this
would not be the case if, for example, the duplicate were
constructed instantaneously as the original was being
destroyed, e.g. in the case of a nanotech-based "portal" which
continuously disassembles the object in one location and
reassembles it in another, one two-dimensional plane at a time.
In this scenario one could stand halfway through such a portal,
straddling the two remote locations, and perceive no ill
effects or distinction between the two half-selves.

A more interesting question is whether creating more duplicates
lessens the importance of the original's fate to the pre-scanned
individual. An uploader given the choice might decide to create
99 copies rather than just one, in a situation where they knew the
original was about to be killed painfully, because their
anticipated chance of being the person who is subject to pain
would go from 50% to 1%. Equivalent alternative choices would be
to make the single duplicate 99 times as aware as the original
for the duration of the distress, or to diminish the awareness
of the original to around 1%, e.g. by using an anesthetic.
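
To put numbers on this, here is a minimal sketch in Python. It
assumes the pre-scan individual weights each of the resulting
minds equally; that equal weighting is my assumption for
illustration, not something argued for above:

    # Anticipated chance of waking up as the one who suffers,
    # assuming equal weight on each of the (num_copies + 1) minds.
    def chance_of_pain(num_copies):
        return 1 / (num_copies + 1)

    print(chance_of_pain(1))   # 0.5  -> 50% with a single duplicate
    print(chance_of_pain(99))  # 0.01 -> 1% with 99 duplicates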

In the case of a cryonics patient, it can be argued that if 99
copies are made of the patient, the 100 together are just as
humanly valuable as any one of them. It is no different from a
file copied to 100 locations: you still have a single coherent
data unit, i.e. exactly one person exists. We can examine this
with a thought
experiment:

Suppose an individual knows they will be copied in an
atomically precise fashion 99 times after being cryopreserved,
and that all 100 copies including the original will be shuffled
and one picked at random to be reanimated first. Furthermore,
the reanimated individual will be given the choice of their own
suicide followed by the reanimation of all 99 other individuals,
or their own survival at the cost of destroying the 99 frozen
copies.

If the individual is a rational uploader, they have no issue
with destroying the frozen copies, since they believe survival
depends only on at least one copy surviving. The fact that none
of the frozen copies are conscious or have separate memories
and experiences renders the redundancy of their material
existence irrelevant -- there is no additional data to be
considered. The conscious individual making the choice, on the
other hand, does have some additional experiences and thoughts,
so they take precedence over the 99 frozen copies.

If they are a rational anti-uploader, they would need to
pre-commit to suicide should they awaken as the first
individual, because there is a 99% chance the awakening
individual is just a copy remembering the pre-commitment, and a
99% chance of the original surviving is, to them, superior to a
1% chance.
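
A toy simulation makes the payoff of that pre-commitment
concrete. This is only a sketch, and it builds in the
anti-uploader's own assumption that survival counts only if the
original body persists:

    # Toy model of the 100-body reanimation lottery. One of the
    # 100 (index 0 = the original) is picked at random to awaken
    # first; the awakened one either suicides (reanimating the
    # other 99) or survives (destroying the other 99).
    import random

    def original_survival_rate(strategy, n_total=100, trials=100_000):
        survived = 0
        for _ in range(trials):
            awakened_is_original = random.randrange(n_total) == 0
            if strategy == "suicide":
                # Original survives unless it was the one awakened.
                survived += not awakened_is_original
            else:  # strategy == "survive"
                # Original survives only if it was the one awakened.
                survived += awakened_is_original
        return survived / trials

    print(original_survival_rate("suicide"))  # ~0.99
    print(original_survival_rate("survive"))  # ~0.01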

Thus the anti-uploader is in the same position as the uploader
in needing to pre-commit to suicide in some cases to maximize
personal chances of survival. That an uploader would balk at
suicide and have mixed feelings on the topic in some situations
cannot be used as evidence against their position -- it is just
another kind of trolley problem for which the correct answer in
an unusual situation runs counter to our natural intuition.
