X-Message-Number: 26815
From: 
Date: Tue, 16 Aug 2005 03:39:39 EDT
Subject: Uploading technology (1.iv.0) Goldilocks' choice?

Uploading technology (1.iv.0) Goldilocks' choice?
 
Because traffic seems low on CryoNet these days, I am restarting the
brainstorming on uploading with current technology.
 
It seems there are three possible concepts for uploading a neural system
into electronic devices: the dumb, the median and the smart.

First, the dumb:
In 1943, McCulloch and Pitts (1) demonstrated with their threshold model
that the brain could work as a network of neurons, each with a number of
positive or negative inputs on the dendrites, an algebraic linear sum at the
cell body and a threshold-gated firing in the axon. The threshold could be
adjusted to different values for different neurons. This seems dumb, but
Kleene in 1956 (2) first hinted, and Palm in 1982 (3) fully demonstrated,
that such a network can solve any computable problem. It is a complete
calculator, similar in potential power to ordinary computers; it simply
seems more akin to the brain and best fitted to copy a neural system.
Most present-day neural networks are built on that concept, and networks of
a billion neurons are not far in the future. The only problem is that
biological research has advanced a lot between 1943 and now, and we know
that a biological neuron is far more complex than what was understood in
1943. That does not invalidate the first concept; simply, up to 100,000
elementary electronic neurons would be needed to simulate a single true,
biological neuron. Given that conversion factor, current neural nets are far
from the biological brain power of a small animal. On the other hand, many
neural nets are built from discrete components. If they were integrated at
the scale of a computer microprocessor, with 100 million transistors on each
chip, at least 100,000 electronic neurons could be built there. This would
be equivalent to a single biological neuron. Given that the processing speed
would be at least 10,000 times faster, time-sharing would allow the
simulation of 10,000 biological neurons per chip. A brain would need one
million such chips, each in the $1,000 range. Not impossible, simply
difficult and costly.
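
To make the threshold model concrete, here is a minimal Python sketch of
the unit described above: signed inputs summed linearly at the "cell body",
gated by an adjustable threshold at the "axon". The weights and threshold
are illustrative values, not figures from the 1943 paper:

  # A McCulloch-Pitts unit: algebraic linear sum of signed inputs,
  # with firing gated by a per-neuron adjustable threshold.
  def mp_neuron(inputs, weights, threshold):
      s = sum(x * w for x, w in zip(inputs, weights))
      return 1 if s >= threshold else 0

  # Example: one excitatory (+1) and one inhibitory (-1) synapse.
  print(mp_neuron([1, 1], [+1, -1], threshold=1))  # 0: inhibition blocks firing
  print(mp_neuron([1, 0], [+1, -1], threshold=1))  # 1: the unit fires

Networks of such units, with thresholds adjusted neuron by neuron, are what
Kleene and Palm showed to be computationally complete.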

The median solution looks at the current understanding of biological neurons
and aims to build a neuromorphic electronic circuit able to copy directly
all known biological properties. Neuromorphic circuits are a new concept
dating back five years or so, pioneered by K. Boahen, as said in the 1.iii.0
message (CryoNet #26325). I'll come back to that device in another message.
Suffice it to say here that it must take into account information processing
at the presynaptic terminal, the postsynaptic site, dendrite sections,
dendrite trees, the soma and the axon. This is not a simple bunch of
electronic gates; it is a full microprocessor. The largest present-day chips
could harbor maybe up to one thousand of them. Given a time-share factor
near 1,000, a single chip would process one million neurons. A brain would
need 10,000 chips, for a cost in the $10 million range.
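
A quick back-of-envelope check of these counts, as a Python sketch. All
figures are the assumptions quoted above, including the roughly 10^10
neurons implied by the message's own arithmetic, not measured data:

  # Dumb concept: 100,000 electronic neurons stand in for one biological
  # neuron; a 10,000x speed advantage, traded back through time-sharing,
  # gives 10,000 biological neurons per chip.
  brain_neurons = 1e10
  dumb_chips = brain_neurons / 10_000          # -> 1,000,000 chips
  dumb_cost = dumb_chips * 1_000               # -> ~$1 billion

  # Median concept: ~1,000 neuromorphic circuits per chip, time-shared
  # by a factor near 1,000, gives one million neurons per chip.
  median_chips = brain_neurons / 1_000_000     # -> 10,000 chips
  median_cost = median_chips * 1_000           # -> ~$10 million
  print(dumb_chips, dumb_cost, median_chips, median_cost)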

The smart way was suggested by M. Soloviev: it would simulate on each
processor a sub-network of biological neurons. The idea is that neurons may
not be optimised, and some economy can be made if we take a lot of them and
work out the package as a black box, knowing only the inputs and looking for
the outputs. How could we do that? Given that we can't test all possible
inputs, how can we guess all potential outputs? Worse: for cryonics, the
neurons are no longer active, so all the potential outputs must be deduced
from the scanned structure.

The solution can come from the precomputer era of classical physics. Assume
we have a small physical system with a small number of degrees of freedom,
or "dimensions". Everything can be computed from Newtonian dynamics resting
on the formula: force = mass x acceleration. Unfortunately, when the number
of degrees of freedom of the system goes up, even by a modest amount, the
computation becomes so complex that it is impossible for all practical
purposes. The solution today is to build a simulation on a computer. Well,
but what about the era before cheap electronic computing?

Many bright minds took up that problem and found new formulations of
dynamics, for example Lagrange, Hamilton and Jacobi. These analytic methods
traded computing complexity for analytic complexity. The concept culminated
in the calculus of variations, a very powerful analytic tool able, starting
from the Lagrangian formulation of dynamics, to reduce the computing load at
the price of advanced analysis. People who have studied basic logic circuits
may remember Karnaugh maps, a tool used to simplify a logic network. Here
the basic idea is the same, but the task is far more complex: the starting
element is not Boolean algebra, it is the full real world.
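
For readers who want that trade made explicit, a minimal sketch in LaTeX
notation (these are the standard textbook forms, not anything specific to
neural modelling):

  % Newtonian route: one vector equation per particle, integrated by brute force
  \mathbf{F}_i = m_i \, \ddot{\mathbf{x}}_i

  % Lagrangian route: a single scalar L = T - V yields every equation of motion
  \frac{d}{dt}\left( \frac{\partial L}{\partial \dot{q}_k} \right)
    - \frac{\partial L}{\partial q_k} = 0

The analytic effort goes into finding good generalized coordinates q_k; the
reward is far fewer quantities left to compute.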

Neurons can be seen as dynamical systems and so can be worked out as
variational systems of great complexity. Can we take a brain column with
maybe 10,000 to 100,000 neurons, translate it into a variational problem and
solve it, producing a small bundle of final computations? The analytic task
would be daunting. On the other hand, the benefit would be enormous: a
present-day computer on the order of 1,000 PCs in power could simulate a
full human brain. The key to this concept is that we have software such as
Mathematica able to do the analytic work for us.
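
As a small existence proof that software can carry the analytic load, here
is a Python sketch using SymPy, a computer-algebra system in the role the
text assigns to Mathematica. The pendulum is a stand-in toy system, not a
neuron model:

  import sympy as sp

  t = sp.symbols('t')
  m, l, g = sp.symbols('m l g', positive=True)
  theta = sp.Function('theta')(t)

  # Lagrangian L = T - V for a simple pendulum
  T = sp.Rational(1, 2) * m * (l * theta.diff(t))**2
  V = -m * g * l * sp.cos(theta)
  L = T - V

  # Euler-Lagrange equation, derived mechanically by the CAS
  eom = sp.diff(L.diff(theta.diff(t)), t) - L.diff(theta)
  print(sp.simplify(eom))   # -> m*l**2*theta'' + m*g*l*sin(theta)

The same mechanical derivation, scaled up enormously, is what a "variational
neuron" program would have to perform on the dynamics of a cortical column.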

So, which solution must we choose? The dumb one is the main track today and
will continue to be so in the academic environment. It seems the hard way,
but it may succeed given sufficient time and money. It would be
counterproductive to invest in it; this is a playground for state-financed
research institutes, big corporations or charity grants.
 
The median solution seems the most risk-free today, with the best prospect
for continuous growth. The smart one could follow on its heels, but it is a
totally new concept and we don't know what problems may be found on the
road. The best strategy could be to implement the median solution step by
step and, at each completed step, see if we can convert it to the smart one.
For example, the first step could be to define a complete electronic neuron
and implement it on an FPGA. At the same time, a "variational neuron" would
be studied. The next round would be to produce an ASIC chip able, with
time-sharing technology, to simulate 1,000 neurons. The smart variational
work would try to do the same job in a small software package. Maybe some
brain areas will be simpler to work out with one technology or the other.
For example, the sound-processing area seems very structured, with similar
circuits repeated; it would be a prime candidate for variational
compression. The olfactory domain contains randomly linked neurons; it seems
difficult to compress such a network, and the median solution may be the
best here. The visual area may be intermediate, with nevertheless a good
hope of variational compression.

(1) W. S. McCulloch and W. Pitts (1943) A logical calculus of the ideas
immanent in nervous activity. Bull. Math. Biophys. vol. 5, p. 115-133.
(2) S. C. Kleene (1956) Representation of events in nerve nets and finite
automata. In: Automata Studies, C. E. Shannon and J. McCarthy, eds.,
p. 3-41, Princeton Univ. Press.
(3) G. Palm (1982) Neural Assemblies, Springer, Berlin.

Yvan Bozzonetti.
 



