X-Message-Number: 15033
From: "Eunice Corbin" <>
Subject: Weaker Conditions for Computer Emulation
Date: Sun, 26 Nov 2000 19:37:11 -0800

In #14987, Jeffrey Soreff wrote:


>I agree with the importance of causal connections, for two
>basic reasons:
>1) If you build a system with the causal connections between
>   successive states broken, it cannot respond to inputs from
>   the real world any more.  In a _very_ general sense, it
>   cannot possibly defend itself.


Although I agree with your conclusion, I don't quite see these
latter points as necessary.  Yes, causality is important,
exactly as you state.  But I think that the ability to respond
to inputs is not critical for life or consciousness.  To be
more precise, an isolated physical system (such as a person) or
(as we AI types claim) a computer program might indeed be
conscious even though unable to defend itself and unable to 
respond to inputs from the outside world.  More pointedly, 
consider a "recording" of the initial state of a conscious
interval.  Since a session calculated from this initial state,
when allowed to proceed to its end, is fully deterministic in
every way, it obviously cannot respond to a fairly conducted
Turing test, nor respond to new inputs, nor defend itself.
Yet if this process (say, an exact re-enactment of an earlier
run) were given run time again, there is no reason to believe
that it would be any less conscious the second time around.
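
To make this concrete, here is a minimal sketch in Python (the step
function and its constants are hypothetical stand-ins of my own, not
anything from the discussion above): a run computed purely from a
recorded initial state reads no input anywhere, so a second run is
identical to the first in every way.

    def step(state: int) -> int:
        """One deterministic transition: each state is caused by the
        last.  (The arithmetic is an arbitrary stand-in for whatever
        the real emulation step would be.)"""
        return (state * 6364136223846793005 + 1442695040888963407) % 2**64

    def run(initial_state: int, n_steps: int) -> list[int]:
        """Replay a conscious interval from its recorded initial state.
        Nothing here reads input from the outside world, so the run
        can neither answer a Turing test nor defend itself."""
        states = [initial_state]
        for _ in range(n_steps):
            states.append(step(states[-1]))
        return states

    # An exact re-enactment: the second run matches the first bit for bit.
    assert run(42, 1000) == run(42, 1000)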

>[One can] speed up or slow down clock rates.  One can
>change the implementation of many processes between
>physically realistic simulations and pre-computed table
>lookups.  One can change the granularity of simulation...

Although I agree with many of your points here, "lookups"
do not necessarily produce conscious runs in all cases.  It
is axiomatic in computer science that it doesn't matter how
you achieve a computation, whether by calculation or by table
lookup; nevertheless, the _experiences_ of the program will
be affected if the table lookup is too (unrealistically)
immense.  The problem is this.  Suppose
that we made a complete recording of a conscious interval
by simply recording each and every state that the program
went through.  Then for the "rerun", we just sequentially
load each state into, say, the CPU.  Can this produce any
sensation in the computer?  No, because there is no causality,
no flow of information from one state to another.  Each state
might as well have stayed in some other medium, with the
(comical) attention of someone directed to each in turn.
This brings us back to the "theory of dust" again.  In
other words, a real emulation must allow the causality
to operate just as it did in the first run.
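
As a hedged sketch of that contrast (again in Python, with all names
hypothetical and step standing in for the real transition rule): a
genuine emulation derives each state from its predecessor, while a
playback merely loads pre-recorded states, so no information flows
from one to the next.

    def step(state: int) -> int:
        """The same stand-in transition rule as in the sketch above."""
        return (state * 6364136223846793005 + 1442695040888963407) % 2**64

    def emulate(initial_state: int, n_steps: int) -> list[int]:
        """Causal re-run: state k+1 is computed *from* state k, so the
        causality operates just as it did in the first run."""
        states = [initial_state]
        for _ in range(n_steps):
            states.append(step(states[-1]))
        return states

    def playback(recording: list[int]) -> None:
        """Lookup-style rerun: each recorded state is loaded in turn.
        The same states appear, but none of them causes the next; on
        the argument above, this produces no sensation at all."""
        for recorded_state in recording:
            loaded = recorded_state  # loaded from the recording, not computed

The two runs visit identical states; the whole difference, and on
this argument the whole of the experience, lies in how each state
arises.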

Lee Corbin
