X-Message-Number: 5965
Date: Tue, 19 Mar 1996 20:28:19 -0500
From: <> (Jeffrey Soreff)
Subject: Will life be worthwhile if cryonics succeeds?

Mark Muhlestein wrote:
>So, if you are alive on March 16, 2096, it is likely that you will
>*want* to be alive.  Given that you would want to be alive then, it is
>not absurd to conclude that you would also want to be alive if you had
>just been revived.  It would take some time to catch up to where you
>would be had you not been frozen, but in the long term I see no good
>reason why you would be inherently at a disadvantage.  Unless those
>reviving you were malevolent, you could be given the opportunity to
>take advantage of all the fruits of the past 100 years' learning and
>technological development.  Your particular set of skills, experiences,
>knowledge, and so on will be one more valuable starting point in the
>sparse, virtually infinite multidimensional space of possible beings.

>To me, this answers Jeff Soreff's worry that AI's and uploads will make
>biological humans uninteresting.  Yes, it may well be that the
>limitations of our biology would be somewhat confining in such a world,
>but that same biological human makes a fine starting point for a more
>ambitious entity willing to undergo a (probably gradual)
>process of self-transformation.  We can only speculate about what life
>would (will?) be like in such a world, but since *we are the ones who
>will be creating it* we have a chance to steer things in the directions
>we want.  That is one reason why I have hope for us hominids.

I guess I expressed my original concern unclearly.  The scenario that concerns
me is one where cryonics does NOT succeed.  The problem isn't that biological
humans become uninteresting in the sense that the important action moves
elsewhere.  The problem is that AIs and uploads might compete the effective
economic value of what a human can do down to roughly the replication and
maintenance cost of an upload.  If I recall correctly, the raw power needed to
maintain computation equivalent to a human brain is around a milliwatt (using
Moravec's 10 teraops/brain, and Drexler's 100 nanowatts/gigaop).  Human
metabolism requires on the order of 100 watts, 5 orders of magnitude more.
This sort of competition could easily kill off even the humans who are alive,
and in a position to try to defend themselves, at the time it happens.  This
is basically a scenario where humans lose control of their machines (either
AIs, or uploads that have modified themselves and/or each other).
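
For concreteness, here is that back-of-the-envelope arithmetic spelled out as
a small Python sketch.  The two input figures are just the Moravec and Drexler
estimates cited above (not measurements), and the variable names are mine:

    import math

    # Moravec: ~10 teraops/sec for brain-equivalent computation (assumption from text)
    brain_ops_per_sec = 10e12
    # Drexler: ~100 nanowatts per gigaop/sec of computation (assumption from text)
    watts_per_gigaop = 100e-9

    # Power to run one brain-equivalent upload
    upload_watts = (brain_ops_per_sec / 1e9) * watts_per_gigaop
    # Rough human metabolic power
    human_watts = 100.0

    print("upload power: %.1f mW" % (upload_watts * 1e3))   # ~1.0 mW
    ratio = human_watts / upload_watts
    print("human/upload ratio: %.0f (~10^%d)"
          % (ratio, round(math.log10(ratio))))               # ~100000, i.e. 10^5

This reproduces the figures above: about a milliwatt per upload, against
roughly 100 watts of human metabolism, a gap of 5 orders of magnitude.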

                                                  -Jeffrey Soreff

standard disclaimer: I do not speak for my employer.

