First --- this is not science. I know what science is, I work
very hard at it at times, and I respect it deeply. But there is room in life
for some parallel thoughts, which may be informed and enriched by science but are
not science at all, in any sense. Still, I will talk about some science here today.
Tonight, in meditation, a thought emanated from me:
"Universes are like rabbits. Once you have two of them, beware,
a billion more can't be far behind. It's hard enough to cope with
one little world of a billion people or so, let alone billions of parallel universes."
It reminds me of the poet who stared at lots of little pebbles on the ground and thought "If just one of these pebbles should suddenly rise up from the ground, without being pushed or pulled, our whole universe of thought would be totally wrenched out of place." A similar thought...
There were reasons why I thought about multiple universes this morning, and more thoughts in that stream kept coming back to me.
But first: this was not motivated by the strong, mainstream thought about parallel universes
and "multiverse" as that thought is expressed in today's quantum mechanics. Still, it's close enough that I should say something about the connection.
Lately, when I get deep into the theories of the universe that people actually use in a practical way,
to make predictions about advanced electronics and the technology of light, and to design new
systems, I focus on just three core theories of physics which have become popular: one is called
"Feynman path," one is called "canonical" and a third is called "cQED," which stands for cavity or circuit quantum electrodynamics. One of these theories, the oldest version, canonical quantum mechanics, summarizes "the law of everything" as a single strange mathematical operator, H,
called "the normal form Hamiltonian." More and more, as I review proposals from different parts of engineering and physics, I tend to be convinced that very few people truly understand all three of these theories. They form a lot of beliefs about the three (such as the convenient belief that they are equivalent) based on what is socially convenient, based on the same kind of mass psychology effects
which also brought us beliefs in epicycles, superstrings, creation science and suicide bombing.
(To think that scientists once used the expression "Holy Cow!"....) Based on the humble criterion of what works and what fits empirical reality, I certainly take cQED more seriously than the other two.
But... this morning I was reminded again that it's not just those three theories, or the radical revision I propose to them. WITHIN the world of canonical QED, there are still different views or interpretations of what's going on. One of the most important views is called "The Many Worlds Interpretation of Quantum Mechanics," expounded in a classic book from Princeton University Press edited by DeWitt. According to that view, we do not live in a mere three dimensions of space. We live in a "multiverse" which is infinite dimensional. The whole vast three-dimensional universe of space which we see around us is not actually the whole cosmos; it is just one strand or thread or track WITHIN that larger infinite dimensional cosmos or multiverse. What's more, the strands all interact with each other according to the laws of quantum mechanics.
This is very much a mainstream theory of physics (unlike the heresy which I have
developed and promoted over time). It is this theory of physics which led to the
field of "quantum computing" or "quantum information science."
It is really sad, and tragic, and an example of behavior which seriously degrades our progress in science, that people do not give more credit to David Deutsch of Oxford, the inventor of quantum computing. Deutsch proved, decades ago, that a "universal quantum Turing machine" provides a whole new level of power in computing, in a very basic way, beyond what classical Turing machine computers are capable of. The whole idea was based on the many worlds version of quantum theory.
One could say: "Neural networks tell us how to harness the power of thousands or millions of processors all working in parallel, which is inherently more powerful than what you can do with just one processor chip. Quantum computing extends that still further, by letting us harness the power of millions of universes all coordinated to work in parallel, to divide up a computing task."
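To make that point concrete, here is a minimal sketch of Deutsch's original algorithm, written in plain Python with numpy (the function names and structure here are my own illustration, not anyone's production code). A single quantum query decides whether f: {0,1} → {0,1} is constant or balanced; any classical method needs two evaluations. The two possible inputs are processed "in parallel" in superposition:

```python
import numpy as np

# Deutsch's problem: given f: {0,1} -> {0,1}, decide whether f is
# constant or balanced using only ONE query to f.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def deutsch(f):
    # Start in the two-qubit state |0>|1>  (basis index = 2x + y)
    state = np.kron([1.0, 0.0], [0.0, 1.0])
    # Hadamard on both qubits: puts the query qubit in superposition
    state = np.kron(H, H) @ state
    # Oracle U_f |x>|y> = |x>|y XOR f(x)> -- built as a permutation matrix
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    state = U @ state
    # Hadamard on the first qubit, then "measure" it
    state = np.kron(H, np.eye(2)) @ state
    p0 = state[0] ** 2 + state[1] ** 2  # probability first qubit reads 0
    return "constant" if p0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))  # prints "constant"
print(deutsch(lambda x: x))  # prints "balanced"
```

The interference in the final Hadamard is what makes the answer deterministic: the "two universes" that evaluated f(0) and f(1) recombine into a single measurable sign.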
So here are my zingers.
First, I do not believe that this original, first generation version of quantum computing
is the most powerful one. Almost everyone working on quantum computing in a serious way is taking that approach now, but they are encountering very fundamental problems in going very far with it.
One can accomplish MUCH more by shifting to a kind of second generation quantum computing, which exploits the symmetry between forwards time and backwards time. cQED is a modest step in that direction. It was exciting for me to learn of the paper by Blais (and the 900 people who cite it!)
showing how cQED can point the way to solving the most intractable problems people have been having lately in quantum computing, in a fundamental way. But we can go much further in the same direction by a full-fledged understanding and use of backwards time physics, as described in my open access article published in IJTP about 5 years ago. A lot of initial work has been done, both on the theory side and on the experimental side...
The first step in that new process, on the theoretical side, is to get back to a new formulation
of the "laws of everything" which does what Feynman path tried to do but does not succeed in doing: get us back to a viable formulation of physics as a theory over nice, symmetric four-dimensional space-time. No more multiverse. Just one vast space-time continuum. Maybe with "dice" included, but I see no real need for them right now. The effects of chaos and the like seem to be quite enough to explain what looks like random phenomena to us. (Lots of people say that kind of thing, but here it is what we see directly in the math; it can be shown.)
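As a toy illustration of that last point (a standard textbook example, not the mathematics of my own papers): the logistic map is fully deterministic, yet its coarse-grained output is hard to distinguish from coin flips.

```python
# Deterministic chaos masquerading as randomness: the logistic map
# at r = 4 follows a fixed rule, yet thresholding its output
# produces a bit stream that looks like a fair coin.
def logistic_bits(x0, n, r=4.0):
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1 - x)          # deterministic update rule
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = logistic_bits(0.123456789, 10_000)
print(sum(bits) / len(bits))  # close to 0.5, like 10,000 coin flips
```

No dice anywhere in the rule; the apparent randomness comes entirely from sensitivity to initial conditions.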
Even as we need to strive for a clearer and simpler understanding of how the universe of physics actually works...
Well, not having forever to live, I still think ahead on my own beyond that next step.
The great task of physics in the coming decades (or centuries?) is to get back to
three dimensions of space and one of time (or rather, four-dimensional space-time). Back to reality.
But in actuality -- the IDEA of multiverse or multiple dimensions might well turn out to be right in the end, even if all the present mathematics and theories embodying it are understood to be completely wrong. It may turn out to be like what they say in Voyage to Arcturus:
"The way out of this world of illusions is not by running away, but by going through it, all the way from here to the end, and out the other side."
Realistic time-symmetric physics (the heresy I have fought for) seems to be essential to letting us build hardware which actually lets us TEST the idea that there is only one universe.
There is a wonderful science fiction story by Connie Willis, told in "Blackout" and "All Clear,"
which explores some of the key concepts about time. In a clear-thinking concrete way it raises the question: "Is it possible to change the past?" We cannot even do the experiment until we develop
the basic technology of backwards time telegraphs (sketched out in some of my posted papers).
Realistic time-symmetric physics points the way to how to do this. And then what?
Well, I like to do some experiments outside the physics lab, to try to get some feeling for how these things might actually work.
This past year -- ironically, in the week when I was speaking at Singularity University, housed at NASA Ames -- I did one little experiment (if you can call it that) which altered my feelings about
the probability that it is possible to change the past. I am certainly not convinced as yet...
there are lots of grounds for being skeptical about that kind of thing... but bit by bit evidence
is beginning to accumulate in my mind that it may in fact be possible to change the past.
(Oops: I hear some folks saying I should cite another sci-fi, the saga of Oversoul Seven.)
If that is true, it would immediately imply that the cosmos is more than just four dimensions.
It sounds at first like "hypertime" (a concept I published in Nuovo Cimento in 1973 or 1974,
which like many things I published, was "later reinvented," though it's so basic I wouldn't want to waste time on clearing up the histories). In the "hypertime" idea, the cosmos consists of the entire space-time continuum "as it now exists," at the present moment in hypertime, but it also includes the earlier versions of the space-time continuum as it existed before. The hypertime model is basically the simplest possible way to express the idea that we can change the past.
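As a toy sketch only (the names and data structure here are my illustrative assumptions, not physics), the hypertime picture can be caricatured as a growing list of complete histories: "changing the past" never edits a history in place, it appends a new version of the whole space-time continuum at the next hypertime index.

```python
# Toy model of "hypertime": the cosmos as a sequence of entire
# space-time histories, indexed by a second time parameter.
# A "history" maps ordinary time t to the state of the world at t.
history_0 = {1955: "original event", 1985: "timeline A"}

cosmos = [history_0]  # index into this list = hypertime

def change_past(cosmos, t, new_event):
    """Advance one step in hypertime: copy the latest complete
    history, alter it at ordinary time t, and append the result."""
    latest = dict(cosmos[-1])   # the past is copied, not overwritten
    latest[t] = new_event
    cosmos.append(latest)
    return cosmos

change_past(cosmos, 1985, "timeline B")
# Both versions of 1985 still exist, at different hypertime indices:
print(cosmos[0][1985])  # prints "timeline A"
print(cosmos[1][1985])  # prints "timeline B"
```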
But is it so simple?
In 2009, I was intrigued by the last Star Trek movie, in which Spock in one universe, as an old man, communicates back in time to another universe, where there is a younger version of himself.
Actually, I thought a lot about that movie that year... and later wished I had taken it more seriously;
as the science officer of a certain office in the Senate, I should have worked harder to help the tough guy who ran the place. He really could have used more help, and the entire earth might be in better shape if I had pushed harder to give it to him.
And yet I can't help thinking... if I HAD done that, a lot of crucial real-world problems in energy and space and economy might be in much better shape today... but I myself probably would NOT have learned the many really important things I have learned as a result of taking a different path.
I wondered: which is really the better path, in the end?
And then I remember the spirit of David Deutsch's ideas, which remain valid even if we just chuck out the old Hamiltonian version of the dynamics of the multiverse. Could it be that NO SINGLE
choice of path or thread or time track is best? Could it be that things really get worked out only
with parallel processing, with two universes benefitting from each other?
And that's what led me to react, logically, as I started out here: once you admit two universes like that,
it suddenly implies a billion. I have enough troubles with a billion people at once, let alone a billion
universes. "Time for me to check out. No way can I handle anything like that!"
But then the voice of reason started to assert itself...
"Hey, think about neural networks, think about parallel processing, think about David Deutsch."
"With neural networks, no one expects a single neuron to synapse with every other neuron in the entire brain. You can even build decent CNN chips, perhaps, in which it is good enough for
neurons to connect with just a couple of their neighbors, if they do it well. But if they DON'T connect with their neighbors, at all, you don't have a brain, you have a frothing soup."
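The local-connectivity point can be sketched in a few lines (purely illustrative, with made-up weights): each unit in a 1-D layer listens only to its immediate neighbors, the way a convolutional layer does, and a signal still propagates across the whole layer step by step.

```python
import numpy as np

# Locally connected update: each "neuron" i sees only neighbors
# i-1, i, i+1 -- not the entire layer.
def local_step(x, w=(0.25, 0.5, 0.25)):
    padded = np.pad(x, 1, mode="edge")  # clamp the boundary values
    return w[0] * padded[:-2] + w[1] * padded[1:-1] + w[2] * padded[2:]

x = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # a single spike of activity
print(local_step(x))  # the spike spreads only to adjacent cells
```

With the neighbor weights set to zero there is no spread at all: no communication, no brain, just the "frothing soup."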
Oh God, do we really have to?
And what about those clearly visible time tracks in which all humans die an unspeakable death?
Whatever. Back to doing my job... paperwork to catch up on. And human conundrums which are already complex enough to stretch (or break?) my ability to get it right...
Best of luck,
By the way, for those interested in the three theories I mentioned...
my impression is that Feynman path is largely equivalent to something which might be called "raw form canonical" as opposed to "normal canonical."
Here's how it works. Feynman path and both types of canonical quantum field theory (QFT) begin by mapping a classical Lagrangian or Hamiltonian into the "corresponding quantum field theory."
In principle for canonical field theory, this is really just a kind of heuristic device for
trying to come up with an interesting version of the Hamiltonian operator, which needs to be tested empirically in any case.
In raw canonical QFT, we do the mapping by replacing each classical field, like phi or psi,
by the "corresponding field operator." And we map multiplication into ordinary matrix multiplication.
But in normal canonical QFT, as we map classical fields into different objects (field operators),
we also map classical multiplication into something called "the normal product," which is really fundamental to canonical field theory.
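For a single bosonic mode, the difference between the two mappings is easy to see numerically (a standard harmonic-oscillator sketch, truncated to a finite Fock space; this is the textbook toy case, not full QED):

```python
import numpy as np

# Truncated Fock space for one bosonic mode (dimension N).
N = 20
a = np.diag(np.sqrt(np.arange(1.0, N)), k=1)  # annihilation operator
adag = a.T                                     # creation operator

# "Raw" mapping of the classical oscillator energy (omega = 1):
# symmetric ordering gives H_raw = (a†a + a a†)/2 = a†a + 1/2.
H_raw = (adag @ a + a @ adag) / 2

# "Normal" mapping: move every a† to the left of every a before
# quantizing, giving :H: = a†a, with no zero-point term.
H_norm = adag @ a

vac = np.zeros(N); vac[0] = 1.0   # the vacuum state |0>
print(vac @ H_raw @ vac)   # 0.5 -- zero-point energy per mode
print(vac @ H_norm @ vac)  # 0.0 -- normal ordering removes it
```

One mode gives a harmless 1/2; a field has infinitely many modes, so the raw mapping piles up an infinite vacuum energy density, which is exactly the kind of egregious term the normal form never contains in the first place.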
And so, for basic QED, some folks would look at raw canonical QED, note some weird terms which are mathematically intractable (implying infinite energy density in raw empty space), wave their hands vigorously, and say "we all know that infinity minus infinity equals zero, so we can assume all this stuff cancels out." That's their analysis of the fuzzy stuff which is sort of equivalent to Feynman path (to the extent that either is mathematically well-defined). The alternative, more traditional approach (see Mandl and Shaw) is just to START from the normal form Hamiltonian,
which does not contain the most egregious terms to start with. WHY assume that the egregious terms are there?
There are some very well-placed but ill-informed people who will tell you "but wait, we need those vacuum terms to explain basic stuff like the Lamb shift." But, sadly, lots of folks get confused by
the differences between vacuum energy, vacuum terms and vacuum polarization. ("Vacuum" is a popular term, as is the letter mu, which I have at times seen used for three different things in the
same equation.) Mandl and Shaw show perfectly clearly how we can replicate the Lamb shift and plenty of other things, using the normal form Hamiltonian.
There are other differences. For example, Rajaraman shows how Feynman path requires that we add "quantum corrections" to the classical mass of a soliton in a bosonic field theory, to get the full path mass. But the "P" mapping (see Glauber, etc...) shows that the quantum mechanical mass is exactly the same as the classical mass in these cases, if mass is defined by the NORMAL FORM HAMILTONIAN, rather than the raw form.
Also, Feynman path predicts certain very interesting transitions called instantons
beyond what canonical would predict. Bezryadin, in thorough empirical studies of superconducting nanowires, has shown definitively that those instantons are not there. As for cQED... it is basically just a minimal variation of normal canonical QED to accommodate empirical reality (very important classes of devices such as VCSELs). Again, see my IJTP paper, 2008 as I recall.