X-Message-Number: 6800
From: 
Date: Sun, 25 Aug 1996 04:26:39 -0400
Subject: SCI. CRYONICS ei fallacy

                      UNGER'S "EI" FALLACY

In a previous brief discussion of Peter Unger's IDENTITY, CONSCIOUSNESS &
VALUE, I neglected to provide an explicit analysis of the failure of his
thought experiment, the "Experience Inducer" (EI), as applied to altruism. I
merely pointed out that in effect he relies on fallible intuition, assuming
that if we believe something deeply enough, it must be true. But it will be
much more effective, and more interesting, to pinpoint the precise locus of
the error.

The EI is postulated to provide the subject with simulated experiences,
subjectively indistinguishable from reality. It is a kind of perfected
Virtual Reality.
The subject or victim or beneficiary may be lying on a table on life support,
but the EI makes him think he is living out various scenarios. Subjectively,
the scenarios are totally realistic.

Now, many philosophers (myself included) have said that all human motivation
is self-interest, and that nothing can ever matter to you--DIRECTLY--except
what happens in your own head, your own experiences or subjective states. (To
make this convincing, or even to make it really clear in meaning, requires
extended discussion which I omit here; we'll proceed anyway.)

Unger now asks us to imagine the experimenter making the potential subject an
offer he supposedly can't refuse--if indeed he believes nothing is important
except his brain states. Consider this dilemma:

a) If you choose (a), you will live a long and happy subjective life through
the EI. You will not know about the EI once you are in it, and will believe
you are interacting with the objective world in the usual manner. However,
your daughter will be made to endure long and bitter suffering (which of
course you will know nothing about in the EI).

b) If you choose (b), your daughter will be spared, but you will have a less
happy life in the EI (or out of it).

Unger thinks almost anyone would choose (b), and that this proves that your
future subjective states are not your only concern. This is a simple lapse in
logic, an oversight.

***The reason you choose (b) is precisely that you ARE greatly concerned
about a future unpleasant subjective state--not your potential states in the
EI, but the state you would be in while choosing (a), or immediately after
choosing it, BEFORE you ever enter the EI.***

If you were to choose (a), or to begin to make that choice, you would
IMMEDIATELY feel terrible, and it is this feeling you want to avoid. It
outweighs everything else, because of its nearness and intensity. 
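To make that weighting concrete, here is a minimal toy calculation in Python
(a sketch only; every number and the simple discounting scheme are my own
illustrative assumptions, not anything drawn from Unger). It treats the
anguish felt in the act of choosing (a) as an immediate, intense negative
state and discounts the remote EI pleasures for their distance, so option (b)
comes out ahead on strictly self-interested accounting.

    # Toy model of the accounting above. Utilities, delays, and the
    # discount rate are illustrative assumptions only.

    def present_weight(utility, delay_years, discount_rate=0.3):
        """Weight a subjective state by nearness: remote states count less."""
        return utility / ((1 + discount_rate) ** delay_years)

    # Option (a): immediate, intense anguish in the act of condemning the
    # daughter, followed (much later, inside the EI) by a long happy life.
    option_a = present_weight(-100, delay_years=0) + present_weight(200, delay_years=5)

    # Option (b): a merely less happy life, with no immediate anguish.
    option_b = present_weight(80, delay_years=5)

    print(option_a, option_b)  # (b) scores higher under these assumptions

The point is only structural: so long as the immediate state is weighted
heavily enough for its nearness and intensity, choosing (b) is exactly what a
purely self-interested calculator would do.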

Whether it "ought" to outweigh everything else is another long, complex and
subtle story. Philosophy is basically the logic and mathematics of choice,
and in a user-not-so-friendly universe the choices are often hard to
calculate and are bought without warranty.  

Robert Ettinger