X-Message-Number: 15981
Date: Sat, 31 Mar 2001 10:38:08 -0800
From: Lee Corbin <>
Subject: Re: Reliability of Friendly AI 

Eliezer Yudkowsky wrote:

[Lee wrote]
>> I really don't think that "rights" in the abstract mean 
>> anything. Usually when someone forms a sentence "X has 
>> the right to do Y", it really means nothing more than 
>> "I approve of X being able to do Y".

> "X has the right to do Y" means (1) "I approve of X
> being able to do Y"; plus either (2a) "I believe that
> society at large also approves of X being able to do Y",
> or (2b) "I believe that society at large believes in moral 
> principles whereby I should be able to convince them that
> they should approve of X being able to do Y, even if they
> do not presently believe as such". At least, that's the
> moral relativist's version.

Well said, but the term still isn't worth using, in my book.
I will continue to avoid it myself, and continue to be
suspicious and critical of sentences in which others employ
this tendentious word.

>> So far as I can see, the infinite nosiness of the Sysop...

> Could you *please* restrain your impulse to use terminology
> like this and just concentrate on debating whether or not
> the Sysop is, in fact, infinitely nosy?

Thanks for the good laugh.  Pardon me, but aren't we being
a bit sensitive?  :-)  (That's not a serious question.) Okay,
I'll try to remember.

>> I am opposed to legal rights for animals, and do not believe
>> that we actually need statutes on the books forbidding
>> cruelty to animals (if we ever did need them), even
>> though I base my beliefs on utilitarianism also.

> Forgive me, but aren't you now talking about rights? 

No.  Sorry, but I am sure that my statement is clear.  I
mean that I don't approve of having statutes like that on
the books.  Understandably, the double use of "need" may
have confused you, but most people would read my saying
"We don't need that" as "Lee doesn't like that", or "Lee
doesn't think it's a good idea".

> If this entire conversation is phrased in terms of simple
> cause and effect, then a Friendly AI comes into existence
> which respects your rights and also the rights of any
> citizens which you create. 

I understand that you wish to promulgate your ideas in
terms of these "rights", but I have an observation and
a suggestion.  In my experience, people who rely too
heavily on a single term or word aren't communicating
as well as they might, and if it turns out that the word
is indispensable---there are just no other ways that they
can find to express it---then their ideas are not
very clear.  As an exercise, you may find it instructive
to see (for a period of time) whether you can communicate
without using the term.  For one thing, at least when talking
to me, as I said, I just don't know exactly what you mean.  (Also, I
studied under the double-plus-good writer George Orwell... :-)

>> and how many would agree with me, that our 
>> government should exist only to enforce freely arrived-at 
>> contracts, and to protect private property?

> A rather prejudiced way of phrasing it, don't you think?

I'm honestly puzzled by some of your reactions.  Prejudiced?
Well, to be sure, whoever says that is probably a
libertarian, but the statement has no hidden tendentiousness
so far as I can see.  Many people clearly disagree with it;
many people understandably think that governments should
do a lot more, like offer free education or health coverage.

You ask what my answer would be to this scenario:

> Mr. Smith creates a simulation of someone he doesn't like; say,
> Mrs. Jones next door from Old Earth....Mr. Smith then
> works up a recreation of a medieval torture chamber -
> and keeps Mrs. Jones there for a thousand years.

My nearest answer would be:
> 3) "No, I hate it, and I'd [probably] do something about
>     it if I could get away with it, but I believe that
>     the morally optimal structure for society is such
>     that society would successfully prevent me from
>     interfering with Mr. Smith in any way."

I would replace the phrase "morally optimal structure" with
"best structure so far evolved".  I've read Hayek and Sowell,
and no longer believe we have the ability to design "morally
optimal" governments), and probably not morally optimal AIs
either.

Really, though, I might be closer to your option number 6,
wherein you provided for the possibility that I might find
the scenario unrealistic.  Indeed I do.  In order to emulate
someone (that is, to simulate them to the point where someone
is "really there"), you need to be vastly more
complex than they are.  For example, if I were Edward O. 
Wilson, it's conceivable that I could write a program that
emulated an ant.

I have thought for some time that you worry too much about
ordinary human-level folks running simulations wherein 
trillions of people get tortured.  What is more relevant
is to consider that our mind-children may indeed have that
power.  Yet again, I'd maintain that it just doesn't figure
that any but a tiny fraction of the entities living in the
solar system ten thousand years from now would waste their
time on something like that.  (How many people on Earth today
deliberately torture insects, and how much does it matter
anyway?)

Lee Corbin
