X-Message-Number: 15360
Date: Tue, 16 Jan 2001 23:51:39 -0700
From: Mike Perry <>
Subject: Bounds on the number of memories

Thomas Donaldson, #15344, says

> the imitation of a human being would
>not last long at all before the real human being it imitates would
>acquire very different sets of connections. Just how he or she does 
>this does not matter here; the point is that the low figure he discusses
>is for an instantaneous static picture. If we want to discuss all the
>different memories such a person may have learned in the past or will
>learn in the future, we're immediately plunged into factorials such
>as N!. 

It's reasonable, I think, to assume that any device that would "imitate" a
human being would be probabilistic rather than deterministic in its
operation, to allow for the fact that different brain events are possible
and are unpredictable. (Thus we would not be duplicating some real,
observable brain in lock-step, but would instead exhibit "typical" brain
behavior, in a suitably isomorphic sense.) With that assumption, though,
there is no reason that the
complexity would grow faster than polynomially with time, and indeed, no way
it could, unless perhaps by some consequence of quantum computing. There, on
the other hand, you have the universal quantum simulator I've referred to
before, an analogue of a universal Turing machine, which again can duplicate
the computation in polynomial time. A classical Turing machine in turn can
always do what the quantum machine can, though it might require exponential
time.

Returning to the issue of neurons and their interconnections, though, it's
easy to see that the total number of configurations over time is also
polynomially bounded. Neurons must occupy space--a finite volume in 3-space.
So at time t the maximum number per brain is no more than ct^3 for some
constant c and t in convenient units. (We also assume the starting time t=0
is chosen to be before the brain first comes into existence). The number of
connections at any time t is no more than the square of the number of
neurons or c^2*t^6. Each "memory" corresponds at most to an entire array of
connections, i.e. it can't occupy more than the whole brain; it will be
convenient here to assume it does include the whole brain. Let's allow,
generously, that each ordered pair of neurons (A,B) defines a "connection
site" for which an event (either forming or breaking the 1-way connection
from A to B) could occur. To change from one memory to another requires a
change in one of these connections, i.e. an event, and there is a limit to
how fast events can proceed, at each connection site. Say there are at most
q events per time unit per connection site. Over a short interval dt
starting at time t, then, the number of events that complete is at most
q dt per connection site, or, for the whole brain, q*c^2*t^6 dt.
Integrating this from t=0 to t=T bounds the number of events, and hence
the number of memory changes that could have occurred up to time T, by
q*c^2*T^7/7, which is still polynomial in T.

No doubt the above analysis needs much refinement in view of quantum
effects, Bekenstein bounds, basic structural considerations, etc., but I'm
sure that some form of polynomial bound will still apply. I will not deny,
however, that the brain is very complex and imitating it via computer, if it
is to be done in realtime or faster, will be quite a challenge.

Best to all,
Mike Perry
