Wednesday, May 28, 2014

protecting your local power system from hackers

Cybersecurity of your microgrid

An old friend stopped by, said he is building a microgrid, and wondered whether we have any thoughts about how to protect it.

Well, I haven’t been trying to build a coherent defense strategy. Not my job, and I have enough other jobs. But I have heard a whole bunch of things here and there. Yes, I reviewed the new NIST cybersecurity framework/”standards” for power systems, and I remember the three-way turf war between the commerce, defense and security folks over cybersecurity for power. Security of critical infrastructure, Clancy games… yes, I have heard lots.

Four quick thoughts came to my mind for a friend:

1.     Whatever they tell you, the first priority is to use only fully compliant, verified-compliant SE
2.     Use fiber, no wireless.
3.     I know a guy who seems to be leading the pack in current ordinary intrusion detection… so contact him, but I think we could do better. (Sandboxing and pattern recognition… one can do a lot better in pattern recognition if one really exploits the underlying principles, from epistemology to game theory.)
4.      Don’t forget the issue of hardware backdoors.

I have normally thought of hardware backdoors as “impossible” in a way, though I suppose a real US-China partnership might be able to do something.

Impossible… if something is lurking there in a piece of physical hardware, how, in computer science, could one possibly hope to elicit it? Well, OK, a bit of the usual sandboxing analysis does apply… but
it’s hard. (Maybe serious… sorry if I have underestimated it, subconsciously.)

But then comes an amusing idea: 2QC is not just a kind of fast computer.  It’s physical. It’s using physics as its computer. So hooking up a 2QC to a suspect chip…. endogenizes the chip into the physical computational process. What is a black box to us is a white box to physics. So things become feasible which would have seemed infeasible.

HOWEVER… problem formulation in specifying what a back door IS might even be the harder issue. This is not like factoring a number, where we have a clean formulation of the thing to be computed.

And in any case, building the basic 2QC comes first, before such interesting possible extensions/applications.

By the way, my friend said "we HAVE to use some wireless, to get all the sensors we need." OK, that makes life more complicated, if economics DEMAND a mixed system.  For some systems, the security realities just create infeasibility, but for others some compartmentalization and immune system make sense.

What of national policy issues to harden the power grid? Not this morning. I’ve made some noises through proper channels in routine government business already anyway… for whatever that may be worth.

Clancy's Threat Vector can mislead people in some ways, but his "four vectors" are familiar, as are the types of hacker attacks -- including those which can physically destroy big machinery.

Tuesday, May 27, 2014

The Science of the Shower

About a year ago, I understood that I needed to change how I take a shower, because I finally internalized what I have seen around the world and what we hear from science – even though a lot of hotels in the US have plumbing which makes it hard to do right, plumbing which encourages less effectiveness, waste of water, and more time.

The basic idea: get wet quickly, and turn off the water. Then soap up effectively all over. Then… in no hurry… turn it back on, to rinse off the soap. The optimal details depend somewhat on the needs of the day; on some days, this can be done quickly, but on leisurely days I add some use of one of those long rubbing tools (like a brush on a long handle, or the bright blue and white plastic thing we have now to serve the same purpose).

More detail: on a typical kind of day, the minimum steps:

(1)  unless it is summer, be sure fans are off, windows and doors closed, and a clean pair of socks and underpants are near at hand.
(2)  turn shower water to maximum heat and full strength, and feel the water
(3)  when it is hot enough, immediately turn the temperature knob to a level which will be bearably hot as it gets to equilibrium
(4)  get wet enough all over as soon as possible, and turn it off as soon as possible. Experience gives a feeling for how wet is wet enough, anticipating the next stage.
(5)  Soap up all over, from the bottom up. First, sweep soap all over the left leg, quickly moving to use the fingers to be sure soap is between the toes and on all the fingers, and then enough on the rest of the leg. Then the same on the right. Then the thighs – possibly using the left hand if there is a dirty area one must be strict about. Then quickly arms, front, back, neck, face. Be sure you see where on the floor of the shower the soap to be used on hair is. Then face and hair.
(6)  Rinse off, just by pulling the volume back up, at the same temperature as before. Do rub with hands on the body as the water/soap mix starts to flow… make sure to use it for the moments it is there. Rinse off hair and bottom orifice last. IF any part of the body seems to want extra hot water, and if there is time, do it.
(7)  The floor should be dry enough that soap stayed between the toes until step 6. Then, drying off… top down… maybe don’t dry the feet or the bottoms of the feet yet.
(8)  Get out… and sit with feet to the right. Dry feet thoroughly first. (“Angelical cats dry between the toes.”) Immediately put on the socks, most days. Then put the towel back up, put on underpants, go to the next station.
(9)  At next station, brush hair, and use Qtips dipped in vinegar to clean out ears.
Not enough time this morning to summarize all the observations which led to this. Japan, Ukraine, Fareed Zakaria, the interagency committee on WMD terrorism, simple logic, watching what works in washing floors. Zakaria reciting a doctor’s rules about letting the soap act on the skin for a minimum of 30 seconds. Still wondering about the solvent efficiency of how I soap up now, versus the more watery stuff in stage two. And of course, I recognize that the optimum is a function of the needs of the specific day – “a parametrized decision block or skill,” as I would say in brain math or robotics respectively.

All for now.

Having some knowledge of engineering, I have at times thought about how one could use a shower as a testbed or problem set for a few basic principles. I have also thought of using additional degrees of freedom here, like volume of water flow, and mobile showerhead. However, even good showerheads from Costco seem to have some maintenance problems when one does fancy things, so I mostly don't.
On rare occasions, however, I have found it important to use the mobile showerhead to direct high temperature to some areas.

Complementing this is rinsing the mouth with the hottest water I can get, salt water, at the time of getting into bed or out, when I suspect bad bugs in the mouth or other parts inside connected to the mouth.
Easier and cheaper than chlorhexidine, which has its uses at rarer times. I am glad I have not needed, for years, to mix vinegar with stronger stuff in the ears, but there were times when that was necessary,
and a bit tricky in the US medical system, which has obvious problems.

Saturday, May 10, 2014

Update on quantum computing and secure communication

This past week, and a month ago, I attended two conferences (SPIE and Princeton) which represented the best current work anywhere, by certain metrics, in the areas of quantum computing and secure communications. I learned a lot about where things are really going now – both from the talks, and from things which were said by people who attend “all the conferences.”

There is a lot of grumbling about funding now. That was not true about five to ten years ago, when I went to bigger conferences on the subject, and quantum computing was really unusual for the amount of money pouring in, in the US, the EU, China, and other places. Here, people said “Since DARPA and NSF cut back, a lot of us have been squeezed very badly, and have changed what we are working on.”
But of course, there is still NSA and IARPA, and at least one talk gave them great credit for their ongoing support – though others said they have become a bit short-term product oriented in a way which makes it hard to do major breakthroughs. But funding in China is a different story. 

In talks people said – “It comes down to how many qubits you have. For really solid secure communications, you only need about 8 or 10 qubits. But for quantum computing, to factor large numbers and break codes, you need more like 100. People have done 8 to 10, but with 100 not in sight there is not so much concern.” Thus quantum communications has become a practical, industrial-strength area, but computing is another matter.

8 qubits or 8 entangled photons? That needs pinning down.  They are certainly related, but yes, a more precise story is possible. Sorry.

People have persisted with the story that only three groups in the world have ever produced more than TWO entangled photons – the group of Yanhua Shih at Maryland, the group of Anton Zeilinger in Vienna, and the group of a former student of Zeilinger, who now has far more funding than the other two now that he has moved to Sichuan province in China and helped them develop an edge in space communications, among other things. Shih and Zeilinger have gotten up to three entangled photons (“GHZ states”), but a guy who visited the Sichuan lab reports that they are up to 8.

But 100 (and code breaking) is not so way out at this point. Shih has announced a new way to produce entangled photons – but there is an issue about funding to move it to the next stage. Gerald Gilbert of MITRE said we need to pay attention to the work of Pfister of UVA, who has a way to generate 1000 qubits, but as yet can address only about 60 of them, apparently with photonic lattices. Michael Leuenberger, a former NSF grantee, described his new approach to generate hundreds of entangled photons in lattices of cavities connected by “optical wires.”

Someone else has been paying attention. A major group at Nanjing University has had experimental results with optical lattices (cavities connected by a lattice of waveguides); this was reported by their collaborator, Xiaodong Li, of CUNY Brooklyn. They can adjust or adapt refractory coefficients in the coupling of these lattices. As with adapting weights in a neural network, they say they have proven this gives them universal computing power. But still they refer to this as a kind of “linear quantum computing,” echoing Dowling’s talk a few years ago about linear quantum computing with optics.

My own paper (previous post) raised questions about just how linear optical computing has to be. Are polarizers best represented by the usual superoperator assumed in previous theory, or will new experiments force us to change the model to a nonlinear superoperator or to something completely different (MRF)? The nonlinear superoperator model I propose would only involve a “small nonlinearity” – but a small nonlinearity like the small nonlinearity in neurons in neural networks, enough to give truly universal nonlinear computing power. I asked Leuenberger: “With all this spin and spintronic kind of stuff in these lattices, don’t you have polarizers in here too? How do you model them, for purposes of systems design?”
He referred to a book by Joos, which I clearly need to follow up on, though I have no idea as yet how that affects things. It seems really important here that the underlying physics is not well known enough to justify high confidence in what comes out of theory when it is not checked empirically.

Yesterday (Friday, the last day of the conference) also included a tribute to Howard Brandt, who set up the SPIE series of conferences in quantum computing and information, but died suddenly just a few weeks ago. This particular track of SPIE was initially only attended by about 15 people, but grew to 35 to 50 under him. More important, they said, it is truly unique in reaching out to a cross-cutting interdisciplinary perspective, bringing together the diversity of backgrounds necessary to look beyond short-term things to larger directions and possibilities in the field.  Howard also played a key role in much larger meetings, such as the earlier DARPA QUIST meetings or interagency review meetings which I went to in earlier years. (I stopped going when they moved them far from Washington, and travel limited me. This one was in Baltimore.) Howard was clearly a great guy in many ways, but… on to details of this one.

It is clear that the old idea that we can encode our knowledge into wave functions, as opposed to density matrices, causes confusion even now, even in mathematicians and quantum theorists oriented towards practical systems who are doing serious important work. My IJTP paper a few years back was pretty explicit about this basic issue, but a hundred years of misleading introductory courses have not been overcome yet. Many theoreticians still simply assume that macroscopic objects like polarizers can be adequately modeled by operators, even unitary operators, when solid state empirical reality shows that we need to encode information into density matrices, whose transformations are superoperators, not operators. My SPIE paper and my other recent arxiv paper on Bell’s Theorem experiments spell out the simple algebra involved. One important source on this subject is Carmichael’s two-volume book on quantum optics.
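A tiny numerical illustration of the algebra involved (my own toy example, not from any of the papers; all names are mine): a 50/50 mixture of horizontal and vertical polarization has density matrix ρ = I/2, which no single wave function can reproduce – its purity Tr(ρ²) is 0.5, while any pure state has purity 1 – and a polarizer acts on it as a superoperator, ρ → PρP/Tr(PρP), not as a unitary operator:

```python
def matmul(a, b):
    # 2x2 matrix product, enough for polarization states
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(m):
    return m[0][0] + m[1][1]

rho_mixed = [[0.5, 0.0], [0.0, 0.5]]           # 50/50 H/V mixture: rho = I/2
purity = trace(matmul(rho_mixed, rho_mixed))    # Tr(rho^2) = 0.5 < 1: not pure

P_h = [[1.0, 0.0], [0.0, 0.0]]                  # projector for a horizontal polarizer
num = matmul(matmul(P_h, rho_mixed), P_h)       # P rho P
p_pass = trace(num)                             # fraction of light that passes
rho_out = [[x / p_pass for x in row] for row in num]  # renormalized output state

print(purity)   # 0.5
print(p_pass)   # 0.5: half the mixed light gets through
print(rho_out)  # [[1.0, 0.0], [0.0, 0.0]]: a pure horizontal state
```

The renormalization by Tr(PρP) is exactly what makes the map a superoperator rather than a linear operator on wave functions.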

Years ago, I was really sad when a guy I knew named Kak fell into that same intellectual trap. He read lots of stuff about quantum computing, from quantum theorists who were imagining the world in terms of wave functions. At the time, he argued people needed to pay attention to the question “How do we know what the starting phase really is?” From the viewpoint of density matrices, this is a silly red herring, and working quantum computing systems (like seminal early stuff by Gershenfeld) did require translating the early concepts into real stuff with density matrices. I strongly respected Kak’s interest in the Upanishads, but…

But at this conference, a woman named Darunkar from Oklahoma presented some recent work by Kak which I was not aware of, on secure communications, and presented an extension which was very exciting to some of the folks in secure communications. While most of us in the US have been almost mesmerized by the beautiful vision of quantum ones and zeros, and excited by our ability to work out all the many things which can be done in that mental space, she quietly suggested a way to get beyond the limits of that space. These photons are not really just ones and zeroes, after all; they may be at any linear polarization angle from zero to pi. If an eavesdropper Eve doesn’t know what ANGLE is used to define zeroes and ones, she may be badly confused if she uses the same angle.

“And so,” she said, “why not exchange theta along an expensive triple channel like what Kak devised, which is totally secure, and then simply encode using theta for some stretch of time, to achieve absolute security at lower cost?”
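To make the intuition concrete, here is a classical toy simulation of my own (not Kak’s or Darunkar’s actual protocol; all names are mine): bit b is sent as polarization at angle θ + b·90°, and a measurement in a basis at angle φ reads the intended bit with probability cos²(θ − φ), the usual Malus-law projection. A receiver who knows θ reads perfectly; an eavesdropper 45° off sees a coin flip:

```python
import math, random

def readout_error_rate(theta, phi, bits, rng):
    """Toy model: bit b is polarized at theta + b*90 degrees; a measurement
    in a basis at angle phi yields the intended bit with probability
    cos^2(theta - phi). Returns the empirical error rate over `bits`."""
    errors = 0
    for b in bits:
        p_correct = math.cos(math.radians(theta - phi)) ** 2
        guessed = b if rng.random() < p_correct else 1 - b
        if guessed != b:
            errors += 1
    return errors / len(bits)

rng = random.Random(0)
bits = [rng.randint(0, 1) for _ in range(10000)]

aligned = readout_error_rate(30.0, 30.0, bits, rng)  # receiver knows theta
blind = readout_error_rate(30.0, 75.0, bits, rng)    # Eve guesses 45 deg off

print(aligned)  # 0.0: the aligned basis reads every bit correctly
print(blind)    # ~0.5: a 45-degree mismatch makes the channel look random
```

The 45° case is the worst for Eve, since cos²(45°) = 1/2; any unknown θ leaves her somewhere between useless and merely noisy.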

That was a good lead-in to the next talk, from a collaboration of Beijing and Australia, which was far less lucid and photogenic.  (I halfway wondered whether I was the only one to catch the basic drift of it, and that only because I was so fully engrossed in the previous presentation.) My guess at the thoughts I think I heard, amidst many many slides of small print, equations and lemmas with a thick soft-spoken Chinese accent: “Why just do one theta rotation? Why not break the data into blocks, and try out random unitary transformations – rotations – in the higher dimensional space which defines a block of data? Why do Kak stuff? Bob can give Alice a unitary transformation when he first meets her. Then, after each transmission, they can end with an encoded definition of a new randomly generated unitary transformation to be used next time.  As it keeps changing, there is no way that Eve can keep up.” (Especially if Eve is thinking all in 1’s and 0’s!)
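Here is a classical cartoon of that rotating-key idea (entirely my own sketch, with invented names; the actual proposal uses quantum unitaries on blocks of qubits, not classical rotations): sender and receiver share a seed, draw a fresh random rotation for every block, and the receiver undoes each rotation by drawing the same sequence:

```python
import math, random

def rotation_2x2(angle):
    # Plane rotation matrix: a 2-D real stand-in for a random unitary.
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s], [s, c]]

def apply_rotation(mat, block):
    return [mat[0][0] * block[0] + mat[0][1] * block[1],
            mat[1][0] * block[0] + mat[1][1] * block[1]]

def transform(blocks, seed, invert=False):
    """Rotate each 2-value block by a fresh angle drawn from a PRNG seeded
    with the shared secret; invert=True applies the inverse rotations,
    so only a holder of the seed can recover the data."""
    rng = random.Random(seed)
    out = []
    for b in blocks:
        angle = rng.uniform(0, 2 * math.pi)
        out.append(apply_rotation(rotation_2x2(-angle if invert else angle), b))
    return out

data = [[1.0, 0.0], [0.5, -2.0]]
scrambled = transform(data, seed=42)
recovered = transform(scrambled, seed=42, invert=True)
wrong_key = transform(scrambled, seed=99, invert=True)  # wrong seed: rotations do not cancel

print(recovered)  # ~data: the correct seed inverts every rotation
```

Because the rotation changes on every block, an eavesdropper who treats the channel as fixed 1’s and 0’s has nothing stable to lock onto – which is the drift of the talk as I heard it.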

In general, both here and in Princeton, I notice that a lot of research is focusing on how to keep out Eve – in Europe and in China. I wonder how the influence of Snowden has affected research in those countries. At Princeton, I heard of very sophisticated and rigorous German research, in the land of 1’s and 0’s, where people seemed to say “We used to worry about simple kinds of eavesdroppers, but now, after Snowden, we are doing research on what we can do to keep from being overheard by the very most intelligent and diabolical eavesdroppers which could possibly exist.”

At times, I was reminded of a brief passage in Clancy’s one-sided but interesting book Threat Vector, where the US is nearly destroyed by a cyberattack from China – which was possible, he said, because of all the bright young Chinese students whom we educate but refuse to hire here, who are given almost no choice but to return home. (Soon after reading Clancy, I immediately grabbed an antidote, Stapledon’s novel Last and First Men, where muddling through to a breakthrough in US-China relations was crucial to creating a prosperous world civilization which endured for 4,000 years after that. And then died, due to overdependence on fossil fuels.)
Lots of other things here. There was a fascinating, lucid and unique talk by Michael Frey of Bucknell which I need to follow up on – about extracting energy from lattices through global or local quantum operations, relevant both to decoherence and to energy as such. There were practical talks by Alan Mink of NIST, on how to do ordinary error correction together with QKD, aiming to move from the megabit rates now possible to gigabit, with a search for polarization codes to make that possible. (He had practical comments on GPU (SIMD) versus FPGA, of interest to me for other reasons.) The MITRE talk on remote sensing discussed the need to search for suitable diffraction-free light patterns to make the power of quantum imaging really work in remote sensing (1/N versus classical 1/sqrt(N), which means about 3 times the resolution if you get to 10 entangled photons). I wondered why they did not use more automated search technology to find a good pattern.
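The resolution arithmetic in that remote-sensing talk is easy to check (a one-liner of my own, assuming the usual Heisenberg 1/N versus shot-noise 1/sqrt(N) scaling):

```python
import math

def resolution_gain(n_photons):
    """Ratio of classical (1/sqrt(N)) to quantum (1/N) phase uncertainty:
    the factor by which N entangled photons sharpen resolution."""
    classical = 1.0 / math.sqrt(n_photons)
    quantum = 1.0 / n_photons
    return classical / quantum  # equals sqrt(n_photons)

print(resolution_gain(10))  # ~3.16, i.e. roughly "3 times the resolution"
```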

John Myers (retired from Harvard) commented, among other things, on how it was proved in 2005 that you can’t just encode your information into a wave function. I have the impression that this is another side of the wave function versus density matrix story. He also argued for the need for adaptive clocks, when clocks are crucially embedded into computer systems and need to respond to what is actually happening in the computer system.  Would this extend even to clock cells like in the nonspecific thalamus? I emphatically said I don’t know, and haven’t seen discussion of that before.

One guy asked: “How real ARE those Bell states that Yanhua Shih’s group generated with thermal light?” I began to hope that the new triphoton experiment can be done IN PARALLEL in Maryland and in other places with SPDC apparatus doing multiphoton states, so that results can be announced jointly, so as to prevent nasty “antibodies” forming, as in the movie Inception, making it hard to sustain the new direction even after it is confirmed LATER in another place. Life has taught me a lot about how important the “mental antibody” phenomenon can be. Will our entire civilization be destroyed by a kind of mental autoimmune disease? It seems that way at times.

Best of luck,


Added: of course, in quantum computing, "decoherence" (really disentanglement) has been a crucial, huge barrier. Quantum demolition spaces haven't been the breakthrough many hoped, and quantum error correction seems to have exponential costs offsetting quantum improvements in performance. Quantum modeling is crucial with or without quantum computing at nanoscale, simply because things become quantum mechanical there, like it or not.

One guy asked me offline: "What of the cavity or circuit QED breakthrough hope in decoherence?"
Yes, that's one of one to three. (For reasons of time, I won't check my notes on the others.)
Yes, I said, from VCSELs we know we can use cavity effects to suppress the usual "unavoidable background noise." But to do it, we need proper models for systems design. The usual infinite reservoir of noise model, which Carmichael refers to, won't do it; it's basically MORE noise. That's why new models are crucial in these new breakthrough quantum lattice semi-optical computers. (I have to review Rabi-Hubbard as one small part of this, as Princeton folks taught me. And of course work by Glauber and Scully on noise, the theme at Princeton.)

Many of the folks we used to fund in electronic quantum modeling at NSF disappeared to the EU, because of the great funding there, albeit more near-term industry market oriented.

There were other interesting offline conversations in both places, but must run.

At some point, I may need to write a paper on the type of NP-hard quantum Boltzmann machines which easily become available if my MRF models are vindicated. Also more on the x y z aspects of some types of triphoton experiments. For later.

Friday, May 9, 2014

Faster than Light and a real experiment

Yesterday I gave a talk at SPIE, proposing a new experiment, probably to be performed soon, where standard quantum mechanics does seem to predict a way to do faster-than-light (FTL) communication. This was my second talk on this subject; about a month ago, at a conference in Princeton for leaders in quantum optics and communication, I ran across a team of experimenters which can do the experiment, using a new low-cost technology to produce entangled photons.
  The talk last month went very well, and it went well enough yesterday -- but, thanks to feedback, I see how to explain some key points more clearly. (My SPIE paper and two related recent papers are easy to find on arXiv, searching on Werbos.)
   The new experiment is basically a straightforward enhancement of the famous "Bell's Theorem" experiments, which helped give rise to all the huge new efforts in quantum communication, quantum computing, quantum imaging and so on. My paper begins by explaining the classic review article by Clauser and Shimony, which described the first decade of Bell's Theorem experiments, predictions and analysis. But it does more than just explain all that. When people say that "quantum mechanics correctly predicted those experiments," where did those predictions come from? They come from standard assumptions made by Clauser and Shimony -- but that paper did not include all the algebra! My paper used more elegant mathematics to show how their assumptions about quantum mechanics led to their famous prediction formula (R2/R0 = (1/2) cos^2(theta_a - theta_b)) -- in just a few pages.
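As a quick sanity check on that formula (a toy script of my own, not from the paper), one can evaluate R2/R0 = (1/2) cos^2(theta_a - theta_b) at a few standard polarizer settings:

```python
import math

def coincidence_rate(theta_a_deg, theta_b_deg):
    """Clauser-Shimony two-photon prediction:
    R2/R0 = (1/2) cos^2(theta_a - theta_b), angles in degrees."""
    return 0.5 * math.cos(math.radians(theta_a_deg - theta_b_deg)) ** 2

print(coincidence_rate(0, 0))   # 0.5: aligned polarizers, maximum coincidences
print(coincidence_rate(0, 45))  # 0.25: halfway case
print(coincidence_rate(0, 90))  # ~0.0: crossed polarizers, no coincidences
```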
       The most important assumption was that polarizers produce an effect called "collapse of the wave function" or "operator projection." Using that exact same assumption, I calculate the predicted THREE-PHOTON counting rate, for a certain set of polarizer angles. These assumptions of traditional quantum mechanics predict that the counting rate will be different, depending on which polarizer/counter the light reaches first. My paper also mentions three different models, based on simple probability theory, which also make correct predictions of the Bell's Theorem experiment -- but a different prediction for the triphoton experiment.
   Thus: when the experiment is done, there are only three possibilities. Maybe it will agree with the version of quantum mechanics which Clauser and Shimony used. Maybe it will agree with my new MRF models (especially MRF3). Maybe it will disagree with both. All three possibilities would have huge implications.
  If Clauser's way of computing the quantum mechanical prediction does not work here, does that invalidate quantum mechanics AS SUCH? No. Neither did Bell's Theorem experiments invalidate classical field theory.
Clauser's prediction of the Bell experiments relied on "the collapse of the wave function," the model of the polarizer as a projection operator (or, really, a superoperator, which is different -- explained in my paper).
You can still believe in the Schrodinger equation in Fock space without believing in the collapse of the wave function. Likewise, the Bell's Theorem papers use the powerful loaded word "causality" for a specific assumption about statistics, coming from a kind of untutored common sense, which certainly cannot be derived as a consequence of classical field theory (PDE).
   HOWEVER... if the methods which Clauser used break down, what can we do to actually predict the crucial function here, R3/R0(theta_a, theta_b, theta_c, p) (where p is one of the six possibilities for which polarizer/counter pair is reached first)? The only alternatives NOW on the table are the three MRF models
I have given (and the modified nonlinear superoperator I propose in my paper as an alternative to the usual one). For now, if the old way fails, my four alternatives are the only game in town, and I do hope they work. (By the way, Clauser is a great guy, and I hope I don't make it sound as if I identify him with the calculations he used in this one paper in the past.)
  Of course, many people would be most excited if R3/R0 DOES depend on p. My first concern is to help make sure we have DATA on the whole function R3/R0, for the type of entangled photon source I discuss in the paper. Then we can say more about what predicts it.
  If R3/R0 doesn't depend on p, what then? No FTL. But maybe some backwards-time communication of information, new possibilities for quantum computing and other such things. Maybe even more than FTL, ONCE we understand enough to correctly model such things.
  At the conference, I heard a talk from Leuenberger, who has ANOTHER way to generate multiple entanglement. (So far, everyone still says there are only three groups -- one in Maryland, which has the new method, one in Austria, and one in China which gets lots of money in Sichuan province, where people
are directly aware of major national security applications.) In essence, he wants to use a big lattice
of wires of light and cavities, and has a new way. He recommends that I read a book by Joos, for a different
way to model the kinds of polarizers which are a crucial part of "spintronics" and this kind of massive quantum computing, which may indeed be enough to break the codes now in use. I do not know how solid the empirical data is yet on the ways of modeling those more complicated systems. A talk today from U. Nanjing discussed work they do there on this kind of optical neural network, using refractoriness as a way to add tunable weights and get universal computing; however, they still think they are stuck with linear quantum computing. If triphoton works, we are not.
    All for now -- but lots more to say.

Sunday, May 4, 2014

how does time connect with consciousness? -- new connections

Deepak Chopra and Menas Kafatos are working together to put together a new special issue of the journal Cosmology,
called "the Time Machine of Consciousness." From quantum physics, we know that time does not work the way people
used to think it does -- but how does that affect US? Do our minds depend on nonclassical aspects of time, or can they give us an opening into new aspects of time? They are collecting a variety of serious viewpoints, on this question, including mine.

I haven't yet written the paper... but here is a concise summary of my position, from my email reply to the invitation. (If you have questions...
I can try to make sure they are explained in the full paper, within reason.)

These are uncensored personal views, and I suppose I will tone it down for the journal paper. 


I feel a responsibility to do justice to this challenging invitation,
and I intend to do my best, probably just barely on time for the
deadline. I have so many urgent deadlines right now,
I should perhaps not spare even a thought for this AT THIS TIME... but
I actually tried to start writing this weekend at home, and realized
it will take some iteration to try to figure out how to write about
such a complex and entangled subject in a way which your audience and
others would appreciate.

To begin with, the task is made especially challenging for me because
I have been leading "separate lives" in parallel, pushing the envelope
on the physics of time, and on the mathematics of intelligent systems,
and on direct practical dealing as best I can with the "noosphere"
itself. So please forgive me,  but it may help if I think of how in
the world  to try to explain how I see these issues  to real people
like you.

To begin with, let me assure you I am not being schizophrenic here. I
really have figured out how to integrate these different aspects, and
to pay serious attention to the "paranormal" or "spiritual" side of
life, whatever we call it, viewed from multiple perspectives at once.

Next -- it does seem clear to me that "backwards causality" is a fact
of physics as we see it in the laboratory; I am hoping that THIS WEEK
I will be able to secure funding for a university researcher capable
of doing an experiment as decisive as Michelson-Morley, which should
put an end to some of the speculation. As a matter of social
convention and civility, I am taking a stance of "we will see" on the
outcome of this experiment
(though I am more sure about the outcome than about willingness of the
management chain to let it go through -- probably it will, but there
are times one must "run scared" and work to avoid roiling the waters).
But... I have three recent papers in quant-ph on arXiv, including
two which describe the triphoton experiment and what it tells us.
It seems likely to get rid of the last dying vestiges of the
"metaphysical observer" concept, but the MRF or stochastic path
replacement solidly violates the specific axiom of (time-forwards)
causality in the original Bell's Theorem (of Clauser, Horne, Shimony
and Holt, CHSH).

But what does this tell us about consciousness?

In my feeble inadequate partial draft from the weekend, I do at least
talk about the central importance of LEVELS OF CONSCIOUSNESS. Having
studied neural networks, brains and consciousness for a long time, I
do think I see the basic picture there. In fact, in the machine
learning (CS) part of, I posted another new paper, to be
presented at
the World Conference on Computational Intelligence this summer (for
which I need to reserve my flight and hotel and so on), which cites
some of that.

In essence, I have my own version of a "standard model" for
intelligence and for physics,
which I DO NOT ADHERE TO religiously. I view that standard model as a
step 'way beyond where most of the world is today, and as something
which MIGHT be the whole truth -- but probably it is just a
steppingstone to the whole truth. For example, in my "standard model,"
I view the cosmos as totally four-dimensional (as in Einstein's
general relativity).
I have no real respect at all for the usual superstring or brane
models, and believe they have very little probability of being the
whole truth - yet I still tend to feel intuitively, that the cosmos is
most likely to have 8 or 11 dimensions or something like that. I
resolve this paradox by saying we need to understand four dimensions
much better, and push the empirical limits, before we can begin to get
the empirical data which we would need to get any serious handle on
how 8-dimensional reality is different from the best of four.

I do believe that almost all of the paranormal or spiritual phenomena
we have ever seen and validated COULD be understood within the limits
of the "standard model" -- yea even unto 3+1D nonlinear PDE
(Lagrangian field theory), precisely or approximately.
(I even have a candidate Lagrangian, in a paper joint with my wife
Luda, which I promised not to publish until a really proper journal
option for that subject opens up.)

In that view, all intelligence or mind is an emergent property or form
within a cosmos which obeys mathematical "laws of physics," except
perhaps the universe itself which may or may not be viewed as a kind
of mind in itself (depending how one feels about that mathematics). I
see no reason at present to give up on the Pythagorean program of
trying to understand the laws of nature more precisely.

I see us being called to lead at least "three lives" in parallel. On
a scientific level, for the next century, we are called to try to
develop a really solid mathematical understanding of that LEVEL of
intelligence/consciousness/mind which exists even in the isolated
brain of the smallest mouse or "soulless rat" (a term suggested by
Cytowic). This has a lot to tell us about higher levels, because in
many ways the higher levels are built on top of the lower levels, and
the old adage "as above, so below" (or vice-versa) is relevant. Again,
in that arxiv paper I talk about that challenge, for which I have
worked out quite a bit of the mathematics (much of it validated in
engineering applications).

On the "normal level" (as in my 2012 Neural Networks journal article
cited in the arxiv paper), we are called to make better use of the
several levels of "mirror neurons" which humans possess but mice do
not, which give us some ability to LEARN to use symbolic reasoning in
a rational way, and to achieve a state I usually call "sanity" (which
Confucians call zheng qi or integrity). Basically, sanity includes
learning to use the scientific method in coping with the large
database of first-person experience and feeling, which goes beyond the
more constrained shared database of replicable science, as discussed
by Kuhn. In essence, Kuhnian socialized science is to first person
what poetry is to prose -- an art which imposes severe disciplines and
constraints, but which derives great value as ONE DISTINCT PART of our
endeavors in life.

But the "third life" is essentially paranormal or spiritual. Nothing
in the first or second life really requires concepts of time beyond
classical time-forwards causality -- but in my view, if people are
truly sane and truly open to all the human experience around them,
the pressure of first person experience will drive people to this. I
often cite the classic article "Are We a Nation of Mystics?" by
Greeley, on how large the base is of people who have been driven to
add a "third life" quietly. His paper is actually a lead-in to
important issues beyond the scope of this email.

In my view, it is not possible to explain the full variety of
well-tested paranormal phenomena with "quick fixes" like the idea of
the pineal gland or the chakras as some kind of "third eye." We have
to "pay a big price" in complexity of what we assume (just as the
Bayesians say); thus it takes a major empirical stimulus to jump over
the cliff and be truly open to paranormal phenomena. Once we do... the
lowest cost model, in my view, is to assume we humans are a symbiotic
life form. Like lichen -- we are a symbiosis of TWO physical
biological organisms (a more or less many-one symbiosis), one which we
see easily in our microscopes, one involving fields or particles which
we do not yet understand in physics.  The "noosphere" is the "nervous
system" of the "invisible entity" we are all part of.

Nonclassical causality is not needed to understand consciousness at
the level of the first life or even the second life... but what of the
third life?

In my view, the noosphere is "two levels" of consciousness beyond the
level of sanity which human brains are evolving towards but have not
yet reached. The most important qualitative advance in the level of
consciousness is due to a special kind of "multimodular architecture,"
which implements the basic principle of exploiting symmetry in
learning beyond the level we see in the sane brain. The old Upanishads
idea of the Self seeing through many pairs of eyes at the same time
begins to capture this. Much of what we see as precognition may
actually be a matter of ideas in the noosphere which may seem
prophetic, but have yet to be really decided upon.  However, since the
time-symmetry is there in the physics, and we assume the noosphere is
the product of a greater field of natural selection than the brain, it
would be surprising if the additional capabilities of quantum
information were not present as well.

The curious thing is that there is no reason why we could not build
new types of computer which also embody higher-level multimodular
architecture and quantum principles -- more "conscious" than the naked
human brain. One of the greatest challenges to humanity is to find
ways to get human beings to make more use of their
inner higher level of consciousness, as may be required to cope with
the near-hopeless-looking challenges we are facing today.


Beyond that....

This "standard model" would come out with a picture of time
communication and paradoxes somewhat similar to the picture given in
the science fiction novels The Chronoliths (Wilson)
and Blackout/All Clear (Willis). I could elaborate -- as you might
expect, since I have mathematics to refer to.

But is the picture true?

Willis' novel spends a lot of time questioning "the standard (future)
Oxford model of time-travel." If you follow it closely, it seems that
one must probably actually start building time travel -- or at least
probably actually start building time travel -- or at least
communication -- systems in order to begin to develop the empirical
tests needed to seriously and usefully challenge the standard model.
That requires developing some new technologies first -- and then still
poses some open questions in experimental design.

In the world of esoteric experience, I do see some VERY elusive yet
significant suggestions that it just might be possible to change the
past after all (implying more than four dimensions in the overall
cosmos). But what does that really tell us?
In any case, I would feel somewhat uncomfortable getting too deep into
the specifics, except perhaps for an example or two which is not from
the set which strikes me as most veridical -- and may not be
appropriate in a journal paper anyway.
I also think sometimes about some of the better science fiction
novels, which hint at possible well-posed models different from the
"standard model" I usually fall back on.


Thanks very much for your patience, if you got this far.

Best regards,