X-Message-Number: 10140
Date: Thu, 30 Jul 1998 10:15:25 -0400
From: Thomas Donaldson <>
Subject: CryoNet #10135 - #10139

Hi everyone!

To Brook Norton: The problem with your criterion comes down to just
what creates happiness, and why. Why would it be wrong to take a 
(hypothetical) drug which makes you permanently happy at the cost
of removing your ability to think? Suppose you could even do
this in a situation (say, a bit in the future, with robots to care
for you) in which you would not be a burden on anyone.

It often happens, as you probably know yourself, that we quite
rightly WANT to be unhappy. That unhappiness comes from knowledge
and thinking. After all, think of all the animals that are unaware
of aging and death. I have a cat lying on my lap at this moment;
she is still young, but will never know all the turmoil of being
a cryonicist. I (and I'd bet you too) would like to be more
intelligent, and with that intelligence may easily come both more
frustration and more sadness. So should we want knowledge or
happiness?

			Best and long long life to all,

				Thomas Donaldson
