X-Message-Number: 15885
Date: Sun, 18 Mar 2001 14:09:31 +1100
From: Damien Broderick <>
Subject: Re: [off topic] Singularity... Bah Humbug!

At 10:00 AM 17/03/01 -0000, James Swayze wrote:

>I don't buy the singularity, at least I'd rather not. I'll tell you why.
>One oft touted product of the singularity is super AI so powerful as to
>take on the role of a near god.

James, I think you're confusing two different issues (at least; maybe
three). Will it happen? Will you like it happening, if so? Is there only
one sort of It to happen?

My claim: I think a technological singularity is on the cards, because it's
a convergent outcome of a lot of different current inputs. But it's not
obvious to me what *kind* of spike it'll be, or how swiftly it will occur,
and I certainly don't consider it part of the hypothesis that a *cold
machine god* will be created. Well, except in the sense that by comparison
with Cretaceous dinosaurs Homo sapiens sapiens is a *sort* of [old-style
Greek] god.

Whether you care for this or not is probably beside the point. I really
dislike the fact that there are still 50,000 nuclear weapons in the world.
But closing my eyes won't make them go away.

>Does AI have to be machine? I don't think so. Why trust super intelligence
>to some inhuman machine.[...] Put the AI inside each of us. If we can
>indeed someday link human neurons usefully to digital data devices then do
>so and then link us all together. Each of us a single processor in a 6+
>billion strong and growing multiprocessing super computer.
[...]
>Nanotech should be able to reduce the size of a quantum laser electron
>hard drive to oh maybe the size of a dime or even the head of a pin.
>They'll be the rage! Everyone will want one.

You just described a couple of possible Singularity outcomes. The first,
intelligence amplification of existing humans and their coordination into a
kind of emergent group mind, is... let's see... Spike version B ii (THE
SPIKE, 2001, p. 327). The teeny nano (or peta) tech you project is almost
the very definition of a Singularity technology.

>Singularity? Who needs it? Or should I rather say, scary unknowable
>outcome Singularity who needs that? Not I! I believe we can be totally in
>control of THE singularity if we don't act stupidly.

We can't *help* acting stupidly, by comparison with the fast, commodious
intelligences that will be here in 50 or 100 years. But we can surely act
as sensibly and cautiously as possible, rather than in the reckless,
short-sighted way we usually act.

I think, though, that my earlier claim withstands this kind of objection.
Cryonics revival might well require a sufficient number of advanced
technologies that other side-consequences of those technologies will have
precipitated the world very fast up the exponential slope of a Spike.
Whether we, here and now, like it or not. And when making our plans we
ought perhaps to take this predictable discontinuity into account (in so
far as one can plan while moving ahead into pitch blackness--or blinding
light).

Damien Broderick
