X-Message-Number: 12577
From: Thomas Donaldson <>
Subject: more on intelligence, values, and emotion
Date: Sat, 16 Oct 1999 22:03:39 +1000 (EST)

Hi everyone!

Yet more on the issue of intelligence, emotion, and values. This is not intended to be another discussion of Kurzweil's book, but rather a discussion of the questions it raised (and did not answer).

The first point I will make here is that yes, we can make a machine do what our values want it to do. That is one major use of machines, even now. So perhaps I should have said that our future hyperintelligent hyperslaves will not produce values or emotions OF THEIR OWN. We would make them programmable so that they did what we wanted; since they are very intelligent, they might well ask us to explain what we want when they see an ambiguity that we did not. But that does not remove their hyperslave character.

Again, some, such as Bob Ettinger, seem to believe that intelligence and values cannot be separated. Naturally this can depend on just how we define intelligence (and values, too, for that matter). We might say of our hyperslaves that they have values too: to do for us what we ask them to do, applying all their intelligence to the task. My problem with that reading is that we could use the same verbal trick to attribute values to ANYTHING: our cars right now are designed to behave (mostly!) as we tell them to behave (and moreover, if they do NOT, their behavior is considered the result of faults, which must be fixed).

I would say that for a machine to have values, it must produce those values for itself, in a way similar (though not necessarily identical) to the way we do. Its values come from inside it, not from outside (as in the case of cars or hyperslaves). Yes, we may adopt ideas about values from outside, but we still have values and emotions based on our own brains and glands. And there are limits to just how far outside influences can affect those values and emotions.
Best wishes and long long life for everyone,

Thomas Donaldson