X-Message-Number: 27978
Date: Tue, 23 May 2006 04:57:43 -0700 (PDT)
From: Scott Badger <>
Subject: Re: The Singularity's First Agenda Item

Flavonoid wrote:

> It occurred to me recently that shortly after the 
> Singularity becomes aware, it will want to know where 
> all its humans are at.

You presume to know the AI's agenda, huh? There's a reason they
use terms like singularity and event horizon. By definition, you
cannot predict what will happen. It'd be like a mouse trying to
anticipate a human's next move.

And why do you assume people are unconcerned? The primary goal
at the Singularity Institute is the development of a friendly
AI, isn't it? The threat of hostile AI is part of our culture
after movies like The Matrix, The Terminator, and a dozen
others. I think people get it.

Every effort should be made to increase the likelihood of FAI,
but our best efforts may not be enough, sad to say. Too many
uncertainties come into play when you try to build a god.

Although I have no way of knowing, I would think the first thing
I'd do if I were an AI would be to take control of the means of
creating any competition. IOW, I'd shut down the production of a
second AI, whatever that required. But it's still all wild
speculation.
 
Scott
