X-Message-Number: 25255
Date: Thu,  9 Dec 2004 18:09:50 -0800
Subject: Reply to Scott's Questions on Identity
From: <>

Before I reply to Scott, a brief note to Robert:

>Mike Donahue mentions EMF (electromotive force?) freezing, but I
>must have missed that discussion.

Mike is referring to magnetic resonance freezing. Strong magnetic
fields at resonant frequency (with respect to H2O) can inhibit the
crystallization of water as the temperature is lowered, and can
therefore reduce or prevent cellular dehydration and other
freezing damage. I think it's an interesting line of research that
should definitely be pursued, possibly in concert with traditional
techniques.

Another possibility is 10 GHz microwave radiation.
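
(A rough sense of scale, assuming the resonance meant is proton
NMR, which is one reading of 'resonant frequency with respect to
H2O': the proton Larmor frequency is

    f = (gamma/2pi) * B  ~  42.58 MHz per tesla,

so a 1 T field gives a proton resonance of about 42.6 MHz, far
below the microwave regime. The 10 GHz figure would instead fall
in liquid water's rotational/dielectric absorption band, a
different mechanism; which of the two Mike has in mind, I can only
guess.)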

Now onto Scott.

You wrote:

>Perhaps my time-worm scenario example wasn't a good
>one. Apologies. But if you could go back in time and
>meet yourself, would it make sense to ask which one
>was you?

As Ettinger mentions, time travel (without branching) is a logical
impossibility. If you allow branching, then since there are two
universes, whenever you meet 'yourself', it is clear that it is
not you, but a copy of you existing in an alternate universe. Of
course, one of you existed prior to the branching, and the other
did not; the former is the 'original', while the latter is the
copy.

>Let's try something else. How about the
>classic thought experiment where the natural brain's
>neurons are replaced, one at a time, by artificial
>neurons which are precise duplicates until the normal
>brain was completely replaced by an artificial brain?
>Where is the line at which the QE is destroyed and how
>do you justify the existence of that line?

Assuming the scenario is possible, the QE is NOT destroyed.
Remember the criterion for survival: you survive from time T0 to
T1 if at all points T in [T0, T1], your physical system is capable
of experiencing qualia. This implies that replacing the neurons of
your brain one at a time with (as you say) precise functional
duplicates would not result in your personal destruction.
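
In symbols (my own shorthand, nothing more):

    Survive(T0, T1)  <=>  for all T in [T0, T1], Q(T)

where Q(T) means that the physical system which is you is capable,
at time T, of experiencing qualia. Gradual neuron-by-neuron
replacement keeps Q(T) true at every instant, so the criterion is
satisfied over the whole interval.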

Contrast this with near-instantaneous replacement of all your
neurons (e.g. mass disassembly and then recreation): there is a
time when your physical system lacks the ability to experience
qualia---when you exist as 'information' (which is another way of
saying, you don't exist at all, since nothing can exist as
information). Since the physical system that was your brain ceased
to have a QE (i.e. became incapable of experiencing qualia), you
did not survive; and assembling from your remains another QE, even
in the likeness of your own, is not a continuation of your inner
subjective life.

>The closest thing to what Richard is describing that
>I've seen is Michael Gazzaniga's 'Interpreter' theory.

If these quotes are representative, and I have interpreted them
correctly, then Gazzaniga is in gross error. Evolution does not
produce a little man in the brain whose purpose is to passively
observe the macroscopic operation of the system while effecting
no change.

Rather, our consciousness plays THE defining role in what makes us
so intelligent, and so unlike computer systems. It is a crowning
achievement of evolution. Yes, we are layered on top of systems of
which we have no knowledge and over which we have no control. But
that in no way diminishes the tremendous effect that consciousness
does have over our behavior.

>Below is an excerpt that helps explain this 'user
>illusion' perspective. I'm sure most cryonicists are
>hoping to rescue their 'self' (i.e. their identity)
>from death and extinction. But if Gazzaniga and others
>are right, it won't make a whole lot of sense to place
>so much value on the 'self' since it has such a minor
>role in our mental affairs.

This is a ridiculous statement. The value I place on the self is
not rooted in any supposed major (or minor) role in my mental
affairs. Rather, I place a lot of value on the self because it is
ME. My subjective inner life is all I know about, and all I care
about. I don't care what computer-like mechanics are operating
beneath the level of conscious experience. I have no interest in
such things.

>If a way can be found in
>the future to expand our consciousness to the 'entire
>mind', it will most likely mean shedding the thin
>veneer of consciousness Gazzaniga calls the
>'Interpreter' since it appears to be primarily a
>watcher, not a doer in the brain.

This is not true; see my comment above regarding evolution.

Further, don't expect people to expand their consciousness. There
is no reason for me to be aware of much of what's going on in my
brain; no more reason than a sentient AI program has to be aware
of its low-level I/O routines.

The only thing I care about is my happiness. I would not want to
expand my consciousness unless it directly increased my happiness
in a good way; but merely being aware of more of the (currently
unconscious) things going on in my brain would do no such thing.

I have a specific neural wiring which gives me happiness from
doing very specific things, such as eating, sleeping, mating, and
improving my skills. Doing random things to my brain, such as
increasing my awareness of brain functions, is not going to
increase my happiness. You would have to rewire my brain so that
the increased consciousness would bring me more happiness.

But why???

You could just as easily rewire my brain to get happiness from
petting rocks. Or from twisting my fingers. Or from going on
endless searches to find invisible pink unicorns.

I have no need of rewiring. I derive pleasure from a set of
completely arbitrary things (not arbitrary from an evolutionary
perspective, obviously, but arbitrary from an absolute
perspective). I don't need to change it to some other arbitrary
set. Yes, I COULD get pleasure from flapping my wings like a
chicken for 24 hours a day, or from watching red balls bounce off
of rocks, or a million other arbitrary things, but I don't. And I
don't want to. I am happy being me.

When brain modification becomes a possibility, you will see people
doing very weird things. Many (perhaps most) modified people will
end up dying: evolution has selected for a very long time to
produce something whose primary goal is survival, and tampering
with that circuitry will surely result in a design that is
suboptimal from the perspective of survival (at least for a long,
long time, far past the point when brain modification first
becomes possible). Others will have altered their pleasures in
such a way that they engage in behavior that might seem ridiculous
to you. But they don't get ahead, even when they survive. All
value systems are arbitrary; as long as one isn't any more likely
to get you killed than another, you have no reason to change.

So not only do I want to be me, I want to be me for all time. Any
enhancements I might make to myself would be done very cautiously,
and I would do them only if they were consistent with how I am.

[snip]

Best Regards,

Richard B. R.
