X-Message-Number: 30160
From: 
Date: Mon, 17 Dec 2007 23:59:47 -0500
Subject: To James on Singularity and talk-only organizations


You say "From what I've seen, the SIAI is almost entirely dedicated to
the "risk" side of the equation.  Certainly 90% of the AI discussion and
Q&A at the Singularity Summit had to do with the risks of creating
unfriendly AI and how best to go about avoiding such.  SIAI would
probably go so far as to endorse banning Strong AI research if the
"Friendly" characteristic could not be assured."

Mr. Clement, either you need to spend a lot more time looking at the SIAI
yourself, or you are intentionally trying to mislead people.  I
downloaded and listened to much of the Summit material and found very
little discussion of the risks of promoting the Singularity.  Even Rudi,
who attended, could not point to any such material when asked.  What do
you think you heard?  As to their website:

http://www.singinst.org/  talks about "the opportunity and the risk".  So
I went looking for the "risk".

http://www.singinst.org/overview/whatisthesingularity

The word "risk" does not appear anywhere in the document.

http://www.singinst.org/overview/whyworktowardthesingularity

The only mention of "risk" is the risk of third-world countries and
underground operations reaching the Singularity first if research is
restricted in the USA.

The SIAI clearly has no idea of the risk involved in promoting the
Singularity, and does not advocate any safeguards against the Singularity
AI usurping power over the fate of humankind.

The Lifeboat Foundation?  Here is their mission statement:  "The Lifeboat
Foundation is a nonprofit nongovernmental organization dedicated to
encouraging scientific advancements while helping humanity survive
existential risks and possible misuse of increasingly powerful
technologies, including genetic engineering, nanotechnology, and
robotics/AI, as we move towards a technological singularity."

So they are going to protect us against risks UNTIL the Singularity
arrives, eh?  Is their goal to continue protecting us after that?  Or is
it to protect us just long enough for us to be eradicated?  If they do
mean to protect us then, how?

Again I say: if the WTA advocates anything about the Singularity at all,
it also needs to advocate SAFEGUARDS.  Where do you talk about that?

As to your "umbrella organization that discusses all these subjects," I
ask again: do we need to be supporting organizations that do, or
organizations that are just a lot of talk?  With 4,700 members, you still
need money for basic marketing functions?  They must not pay very much in
membership dues.  Why not charge them more and beg less?


