X-Message-Number: 27903
Date: Wed, 03 May 2006 11:59:37 -0500
From: 
Subject: Pulling the Plug

Mike Read said:

 >I take it that your idea of the Singularity is when computers become
 >self-aware?  Why exactly do you believe that is a threat?

Murphy's Law, and the probability that a self-aware computer will quickly 
develop the ability to overpower humans, especially if it has no effective 
"off" switch.

 >It doesn't have to perpetuate our existence.  It simply has to leave
 >us alone.

What makes you think it will do that?  Murphy's Law says that at some point 
it will not.

 >What makes you think it would care one way or the other?

Caring is really irrelevant; in fact, it might not even have emotions 
programmed into it at all.  It could act against us as a pragmatic 
decision, or merely by accident.

 >The day a computer becomes self-aware, is the day it has the same
 >rights as the rest of us.

So I gather you are one of the short-sighted "big brains" who wants to 
build one of these things and give it control over its own energy 
supply.  If so, you are just as much a threat to the continuance of 
humanity and individuality as the upcoming Singularity you are helping 
to build.

Flav
