X-Message-Number: 26096
From: 
Date: Wed, 27 Apr 2005 02:39:51 EDT
Subject: Uploading (3.ii.1) First bits.

 Uploading (3.ii.1) First bits.

Here is a first attempt to define the rough needs in terms of bit allocation 
inside an artificial neuron.
 
The action potential is a simple rectangular signal; it needs only one bit.
The presynaptic vesicle release needs at most 9 bits, split into 7 bits for 
one site and 2 bits for up to 4 sites.
The release probability P must drive a seven-level system; if it had far more 
than 7 levels itself there would be a waste of information, so P needs only 3 bits.
The postsynaptic dendrite spine head may carry up to some hundreds of receptors 
of a given kind, and maybe 3 or 4 kinds. It seems that 1024 levels would fit 
nearly all needs; that defines a 10-bit signal.
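A minimal sketch of that bookkeeping in Python, using only the level counts quoted above (the function and labels are mine, for illustration):

from math import ceil, log2

def bits_for(levels):
    # Smallest number of bits able to encode 'levels' distinct states.
    return max(1, ceil(log2(levels)))

# Level counts taken from the text above.
signals = {
    "action potential (on/off)": 2,            # 1 bit
    "vesicle release, one site": 128,          # 7 bits
    "vesicle release, number of sites": 4,     # 2 bits
    "release probability P": 8,                # 3 bits
    "receptors of one kind on a spine": 1024,  # 10 bits
}
for name, levels in signals.items():
    print(name, bits_for(levels), "bits")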
The dendrite spine head, for a given neuron kind, may be defined by 4 receptor 
species, each with 256 possibilities, from 0 receptors to 255. This would take 8 
bits per species, and for 4 species: 32 bits.
This gives about 4 billion possibilities for a spine, hardly a standardization. In 
fact, we may be unable to count the receptor number in a given spine head 
after some hours of ischemic decay. If we standardize the count on 16 levels, or 
4 bits, that would mean counting receptors as 0 to 15 with code 0000, 16 to 31 
with code 0001, and so on up to 240 to 255 receptors with code 1111. With 4 bits 
for each of the 4 species, there would be 16 bits used for the spine description, 
or 2^16 = 65,536 possibilities.
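A small sketch of that standardization, assuming the 16-level binning just described (the function name and the example counts are mine):

def spine_code(counts):
    # Pack four receptor counts (0-255 each) into a 16-bit spine code,
    # 4 bits per species: 0-15 -> 0000, 16-31 -> 0001, ..., 240-255 -> 1111.
    code = 0
    for n in counts:
        code = (code << 4) | (n // 16)
    return code

# Example: a spine with 120, 3, 255 and 40 receptors of the four species.
print(format(spine_code([120, 3, 255, 40]), "016b"))   # 0111000011110010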
 
The next element is a dendrite section, whose response is given by a Green's 
function solution of its differential equation. There may be from zero to 20 
amplifier spines; their number and their positions both matter here, which gives 
about one million possible configurations. More: one spine may have different 
gate levels than another for different signal intensities, because of different 
channel types. So we cannot assume a single probability table for all spines. 
Depending on the traffic density of excitatory or inhibitory signals, the 
probability table can be shifted to another one.
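As a rough check of the one-million figure, here is a sketch assuming each of 20 possible spine positions along the section is simply occupied or empty; that is only one possible reading of the geometry, for illustration:

positions = 20
configurations = 2 ** positions   # each position holds an amplifier spine or not
print(configurations)             # 1048576, about one million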
 
At the synaptic level, 65,536 spine head possibilities have been assumed; this 
translates here into 65,536 different probability tables.
The probability curve is defined by a number of points, each channel block 
defining one such point. There were at most 16 blocks for one channel species, 
and 4 species. This gives 64 points to adjust the probability curve, so each 
table has 64 entries.
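A sketch of one such table, where the layout below is only one possible choice, for illustration:

BLOCKS_PER_SPECIES = 16
SPECIES = 4
POINTS_PER_TABLE = BLOCKS_PER_SPECIES * SPECIES   # 64 adjustable points

def empty_table():
    # One probability table: 64 entries, each later held on 4 bits (0..15),
    # stored here simply as a list of small integers.
    return [0] * POINTS_PER_TABLE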
 
At a given potential intensity, it would be useless to define the probability 
on more bits than the gap to the next channel number. Because a block here has 
8 channels, this may be expressed with 3 bits. To take into account Shannon's 
sampling theorem on digitized signals, we may use twice as many states, or 16. 
A probability would then be expressed with 4 bits and would run from 0/15 to 
15/15.
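A sketch of that 4-bit quantization; the rounding rule is only one possible choice:

def quantize_p(p):
    # Map a probability 0.0..1.0 to the nearest 4-bit value, read as n/15.
    return round(p * 15)

print(quantize_p(0.0), quantize_p(0.5), quantize_p(1.0))   # 0 8 15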
 
So we have 65,536 probability tables, each with 64 entries and each entry 
defined on 4 bits. The memory to hold all of that comes to 2^24 bits, about 2 MB.
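The arithmetic behind that estimate, as a sketch:

TABLES = 2 ** 16          # one probability table per 16-bit spine code
ENTRIES_PER_TABLE = 64    # 16 channel blocks x 4 species
BITS_PER_ENTRY = 4        # probabilities 0/15 .. 15/15

total_bits = TABLES * ENTRIES_PER_TABLE * BITS_PER_ENTRY
print(total_bits)                          # 16777216 bits = 2**24
print(total_bits // (8 * 1024 * 1024))     # 2 (megabytes)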
 
I understand such computations are somewhat tedious. I think two lessons can 
be drawn from this:
1/ The computing requirement for uploading-quality neurons is well within the 
range of present-day technology.
2/ The serious thinking is no longer at the philosophical level, or even at the 
biological data-collecting level; it is at the electronic implementation stage 
of the model. It bites a lot!
 
Yvan Bozzonetti.

