X-Message-Number: 25890
From: 
Date: Sun, 27 Mar 2005 02:19:02 EST
Subject: Uploading technology (2.ii.0).

Uploading technology (2.ii.0).
 
Memory formation requires proteins produced in the cell body of neurons, so a neuron 
model must take into account the gene regulation system responsible for that 
process. In the synaptic area, there are very complex biochemical reactions in 
a limited space with a limited number of molecules. Creating a model for 
these phenomena is somewhat tricky.
 
Not all synapses are active: a dendritic spine may have no active receptor for 
fast ionic signals. Even more, it can shift from active to inactive and back 
within a few hours, and it may be created or disappear in the same time frame. Many 
people reject the uploading concept because they want the brain rebuilt 
atom by atom. They don't realize that many synapses in their brain are 
created and destroyed every day.
 
When dealing with such a complex process with very many elements, there are three 
possible model classes: the Boolean, the differential, and the stochastic. Two more 
could be taken into account in the future: the positional and the quantum.
 
The Boolean model assumes that each process is either on or off, 0 or 1, and 
defines a binary variable. The full system is then a Boolean function readily 
implemented on a computer. This is useful for the researcher to get a first 
look at the interactions of very many variables. It is far too crude for any use in 
uploading.
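As a minimal sketch of the Boolean approach (the three node names and their update rules below are invented for illustration, not taken from any real pathway), each element is a 0/1 variable updated by a Boolean function of the others:

```python
# A minimal Boolean-network sketch: three regulatory elements updated
# synchronously. The node names and rules are invented for the example.

def step(state):
    """Advance the network one synchronous tick."""
    gene, kinase, receptor = state
    return (
        kinase and not receptor,   # gene turns on when kinase is active and receptor off
        gene or kinase,            # kinase stays on once gene or kinase is on
        not gene,                  # receptor shuts down when the gene is expressed
    )

def trajectory(state, ticks):
    """Record the state at each tick, starting from the given one."""
    states = [state]
    for _ in range(ticks):
        state = step(state)
        states.append(state)
    return states

# From (gene off, kinase on, receptor off) the network settles into
# a short cycle rather than a single fixed point.
print(trajectory((False, True, False), 4))
```

Even this toy shows why the tool is only a first look: every concentration has been collapsed to on/off, so no reaction speed survives.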
 
The next tool is the differential equation: each chemical reaction is described 
by the concentrations of its reactants and products. The time derivative of 
these concentrations defines the reaction speed; the space derivative gives the 
concentration gradient from place to place. Differential models are continuous 
and address the point made by some uploading critics that computer models are 
discrete and don't take into account the continuous nature of biochemical 
reality. Unfortunately, in the real world, continuity is not always a good 
thing.
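A minimal sketch of the differential approach: forward-Euler integration of the rate equation for a first-order reaction A -> B. The rate constant and step size are illustrative values, not measured ones:

```python
# Forward-Euler integration of d[A]/dt = -k[A] for a first-order
# reaction A -> B. k, dt, and the initial concentration are
# invented for the example.

def integrate_decay(a0, k, dt, steps):
    """Return the concentration of A after each time step."""
    a = a0
    history = [a]
    for _ in range(steps):
        a += dt * (-k * a)   # reaction speed = time derivative of [A]
        history.append(a)
    return history

# Integrate to t = 10; the result tracks the exact solution exp(-k*t).
conc = integrate_decay(a0=1.0, k=0.5, dt=0.01, steps=1000)
```

The variable here is a real-valued concentration: perfectly continuous, and that continuity is exactly what goes wrong next.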

Very often, a differential equation system predicts more than one stable state; 
if the system gets to one of them, it remains there. In the dendritic spine, however, 
the number of molecules of a given species is often very limited, in the hundreds. 
A simple random fluctuation of this number may then dislodge the system 
from one stable state and send it to another. Differential models thus display more 
stability than the real thing.
 
The third modeling tool is the stochastic one: each molecule of biochemical 
interest is counted, and a chemical reaction is watched molecule by molecule. 
This fully takes into account the random fluctuation of molecule numbers. 
The problem is that the reactant concentrations are assumed to be the 
same everywhere: there is no spatial gradient, and the variation of 
reaction speed with concentration across space is poorly accounted for.
 
The last two systems are a model of the position of each atom, and one 
computing from first quantum-mechanical principles. They are so computation-hungry 
that there is no practical interest in them now. This could change if there 
were very large quantum computers someday.
 
There are some intermediate models between the Boolean and differential ones, and 
between the differential and stochastic ones. For uploading purposes, the latter are of 
particular interest.
 
Starting from the differential model, the first idea is to add a random 
function so that there is no over-stability when the number of molecules is in 
the 10 - 1000 range. This is the so-called Langevin equation.
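A minimal sketch of such a Langevin term, using the standard Euler-Maruyama discretization. The square-root noise amplitude is a common choice for molecule-number fluctuations, but the constants here are invented for illustration:

```python
import math
import random

# Euler-Maruyama integration of a Langevin-type equation
#   dA = -k*A dt + sigma * sqrt(A) dW
# The sqrt(A) noise term mimics molecule-number fluctuations.
# k, sigma, and the step size are illustrative values only.

def langevin_decay(a0, k, sigma, dt, steps, seed=0):
    """Return the noisy molecule count of A after each time step."""
    rng = random.Random(seed)
    a, history = a0, [a0]
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))   # Wiener increment dW
        a += -k * a * dt + sigma * math.sqrt(max(a, 0.0)) * dw
        a = max(a, 0.0)                      # counts cannot go negative
        history.append(a)
    return history

path = langevin_decay(a0=100.0, k=0.5, sigma=1.0, dt=0.01, steps=1000)
```

Compared with the plain Euler integration above, the noise term is enough to kick a bistable system out of a basin that the deterministic equation would never leave.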
 
Starting from the stochastic side, some gradient terms can be added: this is 
the Fokker-Planck equation. It can be demonstrated that the Langevin and 
Fokker-Planck descriptions are equivalent in modeling power. Depending on the case, it is possible 
to use one or the other; the predictive power is the same.
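The equivalence is textbook material (see the van Kampen reference below): a Langevin equation with drift A(x) and diffusion B(x) corresponds to a Fokker-Planck equation for the probability density P(x,t). In standard notation:

```latex
% Langevin form:
\[
  dx = A(x)\,dt + \sqrt{B(x)}\;dW(t)
\]
% Equivalent Fokker-Planck form:
\[
  \frac{\partial P(x,t)}{\partial t}
    = -\frac{\partial}{\partial x}\!\left[A(x)\,P\right]
      + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\!\left[B(x)\,P\right]
\]
```

The Langevin form follows single noisy trajectories; the Fokker-Planck form evolves the whole probability distribution at once.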
 
The most advanced system is the 1/Ω expansion from van Kampen (*), also called 
the system-size expansion. Here, the size of random effects is controlled by a 
parameter Ω, the system size. The master equation is expanded in powers of 1/Ω, 
and all terms with the same power are collected; ordered by growing power, they 
form a power series that can be cut at any order in Ω. At the lowest order, the 
deterministic differential model is recovered; at the next order, the Langevin and 
Fokker-Planck equations. Higher powers give more precise models at the cost of more 
mathematical complexity and computing power.
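The core of the expansion is van Kampen's ansatz, which splits the molecule number n into a macroscopic part and a fluctuation part (standard notation from the cited book):

```latex
\[
  n = \Omega\,\phi(t) + \Omega^{1/2}\,\xi(t)
\]
% \phi(t): macroscopic concentration, obeying the deterministic rate equation
% \xi(t):  fluctuation variable; expanding the master equation in powers of
%          \Omega^{-1/2} yields, at leading order, a linear Fokker-Planck
%          equation for \xi (the linear noise approximation)
```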
 
It seems the general model framework must be cast in that kind of system. 
Depending on each case and the technology at hand, the expansion in Ω will be cut at 
one power or another. In a first-generation system, it will not be possible to 
go beyond Langevin and Fokker-Planck in most cases. Because there has been 
much work on the differential side and far less on the stochastic one, it seems 
best to start from the differential side. The Langevin equation is then 
better than the Fokker-Planck one for purely practical purposes.
 
(*) van Kampen, N.G. (1992). Stochastic Processes in Physics and Chemistry. 
North-Holland, Amsterdam.
 
Yvan Bozzonetti.

