X-Message-Number: 25618
References: <>
From: Peter Merel <>
Subject: Limpinwood X-Prize: Butterflies vs. K-Spaces
Date: Tue, 25 Jan 2005 00:03:00 +1100

Yvan Bozzonetti writes,

>  Butterflies are insects, and all insects have faceted eyes. These  
> work as four-wave interferometers and produce data in a so-called  
> K-space. One-dimensional K-spaces are line segments with a centered  
> point. Data near the center contain mostly contrast information, and  
> more distant points are more concerned with spatial resolution.  
> K-spaces are used, for example, in magnetic resonance scanners.

You may use K-spaces to represent spatial frequency data. There are  
other ways to do this; for example:
http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220040107059%22.PGNR.&OS=DN/20040107059&RS=DN/20040107059

Just how butterflies represent their visual field is not obvious.

> If the butterfly memorizes both the contrast and the spatial  
> complexity of its starting point, it has two points to recall in the  
> K-space. To get back, it has to find a picture whose point-pair  
> position matches what it has memorized. This task must take fewer  
> than 3,000 neurons, so it could indeed be implemented in an ANN.

You assume

i) that artificial neurons are computationally equivalent to biological  
neurons. This appears quite unlikely, at least in terms of current ANN  
technologies.

ii) that I was correct in the number of neurons I quoted. While the  
source I used seemed authoritative, most of the articles I've found  
online suggest 300,000 neurons is more like it. Mea culpa.

iii) that K-space is sufficient for spatial orientation. When a human  
looks at an MRI they do a lot of pattern-recognition work to orient  
themselves with respect to the image. K-space doesn't appear to provide  
any inherent pattern-recognition facilities. What it does provide is a  
very useful means of quantizing interferometry data into a Hausdorff  
space. But a method of identifying location in terms of structures of  
relationship on that space remains to be provided.

iv) that it is obvious how many levels of abstraction/classification  
are inherent in butterfly behavior - and that there is only one. Since  
ANNs provide only a single level of classification, it may be that no  
ANN is appropriate to the task, K-space pattern extraction  
notwithstanding.
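To make the single-level point concrete: the two-point match Yvan proposes
reduces to a one-shot nearest-point comparison on a pair of scalars. A
sketch of that reduction (the signature function, block size, and tolerance
are my own illustrative choices):

```python
import numpy as np

def kspace_signature(img):
    """Two scalars from the image's 2-D Fourier transform:
    central (contrast) energy and peripheral (detail) energy."""
    k = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    c = img.shape[0] // 2
    centre = k[c - 2:c + 2, c - 2:c + 2].sum()
    periphery = k.sum() - centre
    return np.array([centre, periphery])

def match(stored, current, rel_tol=1e-6):
    """Single-level classification: is the current view's
    signature within tolerance of the memorised one?"""
    return np.linalg.norm(stored - current) < rel_tol * np.linalg.norm(stored)

home = np.random.default_rng(0).random((32, 32))
sig = kspace_signature(home)
assert match(sig, kspace_signature(home))  # the memorised view matches itself
```

Note there is exactly one decision here and no hierarchy of features; any
behavior needing more than one level of abstraction falls outside it.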

>  The next question is: why and how does the butterfly get programmed  
> with K-space software? The answer comes from the way four-wave  
> interferometers work. In one "snapshot" the picture definition is  
> very bad, but it improves as more snapshots are added. When the  
> butterfly remains in the same place, the picture becomes more and  
> more finely defined: the resolving power gets larger. When it flies  
> and moves from instant to instant, it can't accumulate a large number  
> of snapshots and the world looks blurred. If it gets near its initial  
> resting place, the K-space data can be partly superimposed on what it  
> has memorized and the picture gets clearer. So it gets back to that  
> place because it is the only one outside the fog.

Butterflies obviously do not search for their roost by a random walk.  
If they did, they'd be highly unlikely to get back to the well-resolved  
part of the world. Whatever their representation of their bodies and  
their environment, it must account for more than a single POV or the  
fixed-point behavior of being stationary on a grass blade. This seems  
to me to scotch your approach - or at least to require a considerably  
more capable design within which to embed it. Still, I'd like to  
understand more and will cheerfully be proved wrong.

Peter Merel.

Rate This Message: http://www.cryonet.org/cgi-bin/rate.cgi?msg=25618