X-Message-Number: 14914
From: "Gary Tripp" <>
Subject: reality and abstraction
Date: Mon, 13 Nov 2000 21:33:23 -0500

Thomas raises an interesting point when he says:

"However there IS another way of looking at identity and its survival:
just what attributes of identity at time t0 must be preserved at
time t1 > t0 for us to say that the identity is the same? Yes, I am
different from the person I was yesterday, but are those differences
important or minor?

While I personally would say that in the sense of this previous
question we ourselves are abstract beings, it most certainly does not
follow that we can be treated as if we need not take any special 
form ie. biological, for instance. Even a little reading on just
how neurons work makes me wonder whether a computer in the present
sense could really imitate us at all well. Yes, they're nice analogies,
but the real question is whether or not they really match us closely
when we learn everything about how our brains work. If nothing else,
the assumption that we work like Turing machines looks very faulty..."


A digital computer may simulate an analogue process to any desired degree of
precision, but perhaps the original and the simulated version would eventually
diverge - or would they?
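
To make that concrete, here is a toy sketch in Python (the process, the step
counts and the time span are arbitrary choices of mine, nothing special): a
digital machine approximating an analogue process - exponential decay - as
closely as we care to pay for.

import math

def simulate_decay(x0, t_end, steps):
    """Forward-Euler simulation of the analogue process dx/dt = -x."""
    dt = t_end / steps
    x = x0
    for _ in range(steps):
        x += dt * (-x)   # each discrete step stands in for continuous change
    return x

exact = math.exp(-1.0)   # analytic value of x(1) when x0 = 1
for steps in (10, 100, 1000, 10000):
    approx = simulate_decay(1.0, 1.0, steps)
    print(f"steps={steps:<6} x(1)={approx:.6f}  error={abs(approx - exact):.2e}")

The error shrinks as the steps get finer, which is all I mean by "any desired
degree of precision".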

I believe that close scrutiny of the notion of identity would compel us to 
concede that we are more than a collection of bones and guts. As Thomas points 
out, the thought that we can strictly equate identity with our immediate 
physical state is untenable because we are constantly changing from one moment 
to the next. Yet there is a unifying thread of coherence over these changes and 
a sort of homeostasis about our purpose. So in this wider sense a digital
simulation may not perform identically at every nanosecond, but it would
eventually converge to the same outputs given the same inputs, in much the same
way that a damped iteration may find the solution to a system of equations
faster than a simple iteration, yet both converge to the same limit.
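
For what it's worth, here is a rough sketch of that analogy (x = cos(x) and
the damping factor 0.6 are just convenient choices of mine): a simple
iteration and a damped iteration take different routes, at different speeds,
to the very same limit.

import math

def iterate(x0, damping, tol=1e-12, max_steps=1000):
    """Fixed-point iteration for x = cos(x), blended with the old value."""
    x = x0
    for n in range(1, max_steps + 1):
        x_new = (1 - damping) * x + damping * math.cos(x)
        if abs(x_new - x) < tol:
            return x_new, n
        x = x_new
    return x, max_steps

simple_limit, simple_steps = iterate(0.0, damping=1.0)  # plain iteration
damped_limit, damped_steps = iterate(0.0, damping=0.6)  # damped iteration
print(f"simple: {simple_limit:.12f} in {simple_steps} steps")
print(f"damped: {damped_limit:.12f} in {damped_steps} steps")

Both print the same limit (about 0.739085); only the number of steps differs.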

On that note, the myriad little mental subprocesses that ultimately coalesce,
combine, and then percolate to the surface of our consciousness as a thought
might be streamlined so that they achieve their contributing effects in a more
efficient way. The end result would be the same, and perhaps the feeling would
be the same, but the precise details of these machinations would be different.
How far could we go with this? We could continue to make adjustments in subtle
ways until we had radically changed the underlying structures for optimal
efficiency yet preserved the outward behaviour of the system. I feel that we
must redefine identity in a way that accounts for these possibilities. We
cannot be mere guts and bones, nor, it seems, can we be a fixed algorithm;
rather, we appear to be an equivalence class of algorithms. I speculate that
digital computers would be entirely adequate for our simulation.
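
As a crude illustration of what I mean by an equivalence class (Fibonacci is
only a stand-in here, not a model of anything mental): two routines with
radically different internal machinations yet identical outward behaviour.

def fib_slow(n):
    """Naive recursion: a thicket of redundant sub-computations."""
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

def fib_fast(n):
    """Streamlined iteration: same outputs, very different internals."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert all(fib_slow(n) == fib_fast(n) for n in range(25))
print("outwardly indistinguishable on every input tested")

To an observer who sees only inputs and outputs, the two belong to the same
class, however different their inner workings.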

/gary


