X-Message-Number: 11575
Date: Mon, 19 Apr 1999 01:34:01 -0700
From: Mike Perry <>
Subject: Quantum mechanics, many-worlds, etc.

Thomas Donaldson, in #11566, notes that relativity and quantum mechanics
haven't been harmonized yet, which is my understanding too. This could spell
trouble for any arguments or extrapolations that depend too heavily on one
or the other of these theories being correct or at least very nearly so. On
the other hand (I'm not an expert on this, but it seems a reliable
conclusion), there are efforts underway to reconcile the two, and it
appears that such a reconciliation is possible without bending either theory
too much. At least the outlook seems reasonably hopeful. A reference to this
is in Moravec's new book, *Robot: Mere Machine to Transcendent Mind*, p.
214, note 11. Interestingly enough, the "bending" in this case would appear
to affect relativity, not quantum mechanics, in that it calls for time and
space to be discretized on very small scales, rather than being the familiar
continua they have seemed to be for so long.

I also see a flaw in Thomas's argument:

>Furthermore, the claim that we can use a Cantor-like argument to show that
>no possible finite computer can produce all possible worlds does not show
>the nonexistence of SOME finite computer capable of producing any
>arbitrary possible world.

If we substitute "function" for "possible world," we see that there are
functions that no computer can compute. That is, we can't just
specially design one computer to compute such a function; no such computer
is possible, even in principle. And on the other hand, there *does* exist
one particular computer (a universal computer) that, with suitable
programming, can compute any function that any other computer can compute.
The same sort of argument is used by Deutsch in his claims about which
worlds are producible by a computer and which are not.
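
To make this concrete, here is a rough Python sketch of my own (the names
are just illustrative, and halts() below is the hypothetical decider, not a
real function): one fixed "universal" interpreter that runs any program
handed to it as data, plus the standard diagonal construction showing why no
halts() can exist.

# 1. Universality: one fixed "computer" can run any program given to it
#    as data, so a single machine suffices for every computable function.
def universal(program_text, arg):
    namespace = {}
    exec(program_text, namespace)      # load the program from its text
    return namespace["main"](arg)      # run its entry point on the argument

print(universal("def main(n): return n * n", 7))   # prints 49

# 2. Uncomputability: suppose halts(program_text, arg) always answered
#    whether a program halts on a given argument.  The program below then
#    contradicts that answer when asked about itself, so no such halts()
#    can exist -- a Cantor-style diagonal argument.
def troublemaker(program_text):
    if halts(program_text, program_text):   # hypothetical decider
        while True:
            pass                             # loop forever if told "it halts"
    return "done"                            # halt at once if told "it loops"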

Thomas also says

>... but whoever said that we must remain of any fixed size? 

With that I completely agree. Bekenstein bounds and discreteness would not
preclude the possibility of indefinite growth in an expanding universe, and
thereby, immortality.
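
To put a rough number on it, the Bekenstein bound caps the information in a
bounded region at about 2*pi*R*E/(hbar*c*ln 2) bits, so the cap itself grows
as the radius R available to a system grows. A quick back-of-the-envelope
sketch in Python (the sample radii and the one-joule energy are arbitrary
choices of mine):

import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C    = 2.99792458e8      # speed of light, m/s

def bekenstein_bits(radius_m, energy_j):
    """Upper bound on the information (in bits) that fits in a sphere of
    the given radius containing the given energy: 2*pi*R*E/(hbar*c*ln 2)."""
    return 2 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

# The bound grows linearly with radius at fixed energy, so a system that is
# free to keep expanding is not capped at any fixed amount of information.
for radius_m in (1.0, 1e3, 1e6):
    print(radius_m, bekenstein_bits(radius_m, 1.0))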

Bob Ettinger, in #11569, raises the possibility that the Shor factorization
algorithm might be explained as an analog effect of some kind, without
invoking the many-worlds idea of quantum mechanics. It is worth
pointing out that the many-worlds theory accounts *in detail* for how the
algorithm is supposed to work, in a reasonably straightforward (if
complicated) way. In particular it accounts for how a quantum device could
perform the factorization more efficiently than any classical computer, this
being where the parallel universes come into play. Any rival theory worth
its salt must do the same. So far, I haven't heard of any such rival. It's
true that there are single-world theories that make the same predictions.
But (and Deutsch discusses this in his book) they seem to be forced into the
position of asserting that things behave "as if" there were many universes,
etc., and thus to be contrived explanations.
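
For anyone curious about what the algorithm actually does: the quantum
hardware is only needed for one step, finding the period r of a^x mod N
efficiently; the rest is ordinary arithmetic. Here is a rough Python sketch
of my own, with the period found by slow brute force standing in for the
quantum step.

from math import gcd

def find_period(a, n):
    """Stand-in for the quantum step: find the order r of a mod n by brute
    force.  This is the part a quantum computer does efficiently; here it is
    done classically just to show the bookkeeping."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical post-processing of Shor's algorithm: given a coprime to n,
    use the period of a mod n to try to split n into nontrivial factors."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g                  # lucky guess already factors n
    r = find_period(a, n)
    if r % 2 == 1:
        return None                       # odd period: try another a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None                       # trivial root: try another a
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_factor(15, 7))   # prints (3, 5)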

Bob also says,

>I was going to stop here, but as long as I'm gnawing on Deutsch I may as
>well mention another defect in his book as it seems to me. He claims that
>interference effects prove the many-worlds hypothesis, and talks about
>photons interfering with each other, and discusses some well known
>experiments. But interference is most easily understood as a wave
>phenomenon, and nowhere does he so much as mention waves, let alone some
>way of reconciling the famous wave/particle dualism.

The "wave" explanation of interference effects breaks down when only one
photon at a time is involved. How does that one particle manage to interfere
with itself? A straightforward answer is that "something" is nudging that
photon, one or more "ghost" photons from parallel universes.
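
Put another way (this sketch and its numbers are mine, not Deutsch's): in
the quantum bookkeeping, whatever the interpretation, the single photon gets
an amplitude for each available path, and the amplitudes are added *before*
squaring to get a probability. That is why one photon at a time still builds
fringes, and it is that addition the parallel-universe picture is explaining.

import cmath
from math import pi

def detection_probability(path_phases):
    """Relative detection probability when the amplitudes for each path are
    added first and only then squared (one photon, several possible paths)."""
    amplitude = sum(cmath.exp(1j * phase) for phase in path_phases)
    return abs(amplitude / len(path_phases)) ** 2

# Two paths in phase reinforce; half a wavelength out of phase they cancel,
# even though each path taken alone would give the same probability.
print(detection_probability([0.0, 0.0]))   # constructive: 1.0
print(detection_probability([0.0, pi]))    # destructive: ~0.0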

Bob adds,

>Unless I have been totally oblivious, he has provided no explanation for
>interference.

Starting on p. 41, you can read about an interesting variation of the famous
two-slit experiment, that is, a four-slit experiment, in which half of the
interference bands that you get with two slits are virtually cancelled out.
The explanation of this is, again, that "ghost" photons from parallel
universes are nudging "our" photons so they don't strike where they
otherwise would. So how is this "no explanation for interference"? (The
"ghosts," by the way, are just as "real" as the "real" particles, just parts
of other "real" universes than our own.)
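
If anyone wants to see that pattern in numbers, here is a rough sketch of my
own of the standard far-field intensity for equally spaced identical slits
(the sample phase values are arbitrary choices of mine, not figures from
Deutsch's book). At a screen position where the slit-to-slit phase
difference is a quarter cycle, the spot is lit with two slits open but goes
dark with four: opening the extra slits removes light from a place it used
to reach.

import math

def n_slit_intensity(n_slits, phi):
    """Relative far-field intensity for n equally spaced identical slits,
    where phi is the phase difference between adjacent slits
    (phi = 2*pi*d*sin(theta)/lambda).  Normalized so the central peak is 1."""
    if abs(math.sin(phi / 2)) < 1e-12:
        return 1.0
    return (math.sin(n_slits * phi / 2) / (n_slits * math.sin(phi / 2))) ** 2

# phi = pi/2: bright with two slits open (0.5), dark with four (0.0).
for phi in (0.0, math.pi / 2, math.pi):
    print(round(phi, 3), round(n_slit_intensity(2, phi), 3),
          round(n_slit_intensity(4, phi), 3))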

Mike Perry
