X-Message-Number: 29760
References: <>
From: Kennita Watson <>
Subject: Re:  To Francois: Reply
Date: Tue, 21 Aug 2007 02:47:15 -0700

Flavonoid wrote:

> It is not worth the risk.  That "Summit" needs to focus on how to STOP
> the Singularity from ever getting here, how to stop a super-AI from
> being developed, and to ensure there is a power plug that can be pulled
> on all supercomputers by security personnel.

I find this really funny.  "Stop the Singularity"
strikes me as akin to "Stop Niagara Falls", or maybe
more like "Stop the Mississippi River", since there
may be at least a hope in hell that we could
divert it a little.

AI will be developed.  If not by us, by someone else.
It may not start out super, but it will get there
eventually.  Best we do it first and do it right.
By "we" I refer generically to "people I think of
as 'the good guys'".

Live long and prosper,
Kennita
