Because science tends to be highly political these days, only about half the writing out there gives proper credit to Deutsch. Many writers like to give credit to the most famous or friendly person they can imagine. Thus many people quote how Feynman once said "there is plenty of room at the bottom." But those words did not actually produce the new technology! In Google Scholar, it is easy to locate Deutsch's seminal papers, where he proved that his concept of (digital) quantum computing, based on superposition and on the many-worlds version of quantum field theory (QFT), provides a new universal platform for computing, just like the old Turing machine concept but more universal. This kind of failure of credit seriously hurts science, because it causes people to pay less attention to important new work by the same person, like the book I just read this past week.
In essence, the book describes how explanations (narratives) are crucial to solving the problems now facing us, not only in science but in society (where the threats to our very survival are serious indeed, but Deutsch builds a message of hope).
But first, what IS science? It is scary how many people claim to know how to manage science and science agencies without even an adequate understanding of what science actually is! Understanding what it is becomes more and more important as science grows more complex and as larger vested interests try to filter the discussion!
Many of us have already learned a lot of the basics, by paying real attention to the history parts of our science courses, and by understanding basic facts about Francis Bacon and the Renaissance. Still, many people who study science itself say it is important to go past the simple basics, by seriously studying the work of two key thinkers, Thomas Kuhn and Karl Popper, who shaped the deepest reassessment of science in the twentieth century. For myself, I read only Kuhn's work. More and more, I have learned how Kuhn's vision, passionately implemented by Dr. Joe Bordogna (a former director of the National Science Foundation, NSF), was crucial to the greatest golden age of NSF, and the loss of that vision due to Congressman Lamar Smith was a huge tragedy beyond what I had imagined likely even as late as 2013.
But what of Popper? Deutsch quotes extensively from Popper, and gives a new, more modern version of Popper's ideas and their implications. Here are my comments on those thoughts, addressed to a listserv of professors who teach Popperism:
=================================
Yesterday I read moderately far into David Deutsch's book, The Beginning of Infinity, which argues very strongly and clearly for a vision of what Popperism really is and where it might lead us. Given the scope of the book, and the solid achievements of the author, it might be more useful for your discussion group than what I have seen so far.
It is true indeed, as the book says, that the mainstream of quantum computing and quantum communications today all flows from the fundamental analytic ("critical analysis") work of Deutsch, giving a kind of sequel to the analytical work of Turing, which until recently totally dominated the mainstream of computer science. You could even choose to cite it as an example of how self-conscious applied Popperism can actually work.
Before I observed your group and read into Deutsch's book, I never used the word Popperism. Popper was just a person to me, not a person I argued with, a person who made a couple of important points we agreed on. Deutsch's book is serious, modern, and substantive, in a way I find easier to come to grips with constructively than other written expressions of Popperism I have seen (other than the Popper experiment in quantum optics).
At the end of the day, however, I view his position (his version of Popperism) as very useful but incomplete and overextended in an important way, exactly as I view his position on the multiverse.
These are tricky issues, requiring a kind of careful splitting of hairs in mathematics and in concepts, but not in words, where splitting hairs over semantics is most often a waste of time and a distraction from reality. In fact, it is worth everyone's time to remember that "semantic fascism", behavior which puts too much importance on demanding the absolute truth and supremacy of one community's definition of a word over another's, is a clear common warning sign of a gross failure of logical, rational critical analysis. I have seen that over and over again across ever so many areas of science and life. Semantic fascism is a back-door manifestation of authoritarianism as a form of "reasoning." Deutsch was clear in rejecting authoritarianism, and I never saw Popper support it, but it is clear that some folks who claim to be Popperists vigorously implement semantic fascism.
Deutsch's book addresses many important topics, but I will focus mainly on epistemology, which ties it all together, and the multiverse, which I knew about long before this book. Crudely, the epistemology part is an interpretation or extension of Popper, quoting Popper extensively. The multiverse part is an extension of famous work by Everett and Wheeler, which Deutsch understood well enough to build a whole new technology based on that strong and clear understanding.
On epistemology, Deutsch uses the word "induction" with the same kind of definition your discussion group assumes, but he explains in much more detail precisely what he attaches to the definition and, most important, what his alternative is.
Unlike some of the postmodernist hermeneuticists in the group, Deutsch is firmly committed to the concept of objective reality and to the quest to understand and explain it. Popper and I have/had the same commitment. Some in the group might question that, but for God's sake, why do they imagine Popper pushed the experiment he proposed to try to refute quantum philosophies of unreality? (I previously gave a URL to physorg, describing that experiment and how it seems to support Popper, but remains controversial.)
Deutsch talks about other types of epistemology which stifled progress in past centuries and in other cultures, but he mainly focuses on "empiricism" versus Popperism, and he does let us know he is seeing these things through the eyes of a physicist. He defines the word "induction" as an aspect of "empiricism", in which response to the incoming flow of data is "the whole game" in determining what we do or should believe (or predict or conclude) in response to that flow of sensory input.
Deutsch's alternative epistemology based on Popper is to focus mainly on explanation, in the kind of critical analysis needed to make sense of what has been observed and of the reality which lies behind what has been observed.
In my view, his concept of empiricism and his concept of Popperism are like thesis and antithesis, both pure (and useful) extremes, but both incomplete. Both are capable of being extended (or redefined by folks who say "we knew it already and we only talked about it half the time") in a way which unifies the two extremes into the kind of synthesis we really need.
One should really not underestimate the relevance of the most well-grounded neural network theory to this issue. There is a very close connection between the questions "how can or should humans best learn from experience and reason?" versus "how DO human brains learn from experience and reason?" versus "how do mice and rats learn from experience and from whatever happens in their brains?" Neural network theory has gotten very deep into all three questions and into their interconnections. In truth, as brilliant and broad as Deutsch is, I do know more than he does about neural network theory. (As I type this, I am returning home from giving a plenary at this year's international conference on neural networks -- but yes, Deutsch should be offered the same if he were interested, because it is a cross-disciplinary conference.)
What Deutsch calls empiricism or induction is essentially what neural network folks would call naive forgetful real-time or online learning.
Deutsch stresses that we do not just learn to predict, but also learn the dynamics which explain what we see and reconstruct the underlying reality. In fact, all three aspects are essential to what we learn and to each other, not only in neural network models of intelligent systems like brains but also in their more primitive linear ancestors like Kalman filtering. Attention to prediction error really is essential as a kind of reality testing. Without it, computer programs can totally diverge from reality. When humans try to reason with words, without paying enough attention to the preverbal information common to both human and rodent brains, they too are capable of diverging from reality and from progress in a very florid way. That is ever so common today! How could any evolved biological organism ever lose touch with reality so much as humans often do? The key explanation is discussed in www.werbos.com/Mind_in_Time.
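To make that mapping into neural network language concrete, here is a minimal sketch of what I mean by naive forgetful online learning, with prediction error doing the reality testing. It is purely my own illustration: the learning rate and the toy data stream are made-up assumptions, not anything taken from Deutsch's book or from any particular neural network model.

import random

def forgetful_online_estimate(stream, learning_rate=0.1):
    """Track a drifting quantity from a stream of noisy observations.

    Each update is driven by the prediction error, so the estimate keeps
    being tested against incoming reality, and the fixed learning rate
    means older experience is gradually forgotten.
    """
    estimate = 0.0
    for observation in stream:
        prediction_error = observation - estimate      # how wrong was the last prediction?
        estimate += learning_rate * prediction_error   # nudge the estimate toward the data
        yield estimate

random.seed(0)
# Toy data: an underlying signal that jumps from 0 to 5 halfway through, plus noise.
observations = [(0.0 if t < 50 else 5.0) + random.gauss(0.0, 0.5) for t in range(100)]
estimates = list(forgetful_online_estimate(observations))
print("estimate near the end: %.2f" % estimates[-1])   # tracks the jump to about 5

The same structure, with a gain that adapts to estimated uncertainty instead of a fixed learning rate, is essentially a scalar Kalman filter; the only point here is that the prediction error term is what keeps the estimate tethered to incoming reality.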
In the neural network field and in kindred parts of AI, we generally embrace "empiricism", "induction", and Occam's Razor as essential foundations, but we assume different definitions from what Deutsch assumes.
In past decades, many of us believed in a narrow synthesis of Deutsch's "empiricism" and Popperism (without reference to Deutsch or to Popper). Let us call it the Solomonoff epistemology (though I independently developed it myself in undergraduate days, and discussed it with Minsky).
In the Solomonoff epistemology, we all have a substantial need to do exactly what Deutsch proposes, the real core of his practical approach to science: to work hard to develop a promising and coherent list (explicit or generative) of possible, competing theories of how reality works. (I will assume my variant of this, where the operations include a random number generator.) That needs to be done over and over again, as experience is accumulated and remembered. Yet we also need to be able to evaluate the choice between theories, which is also important when we upgrade the list. We do this by applying Bayes' Law, an extremely important theorem, and we hope that folks who claim to be devotees of rational analysis would properly respect such a fundamental theorem. We MUST evaluate, in numbers, the relative probabilities of competing theories to be true ...
Notice the fundamental assumptions:
1. We of finite brains and minds can never legitimately claim certain knowledge of one theory being true. The best we can do is to maintain a list, and a shorter list of explicit and complete possibilities than of implicit possibilities.
2. The best we can do is to continually update the probabilities which we attribute to the possible theories, and keep updating the list itself. We must assess probabilities in order to have a rational basis for making decisions. (The need to make decisions is present even in the mouse brain, which is >90% homologous to the human brain.)
We view the application of Bayes' Law, explicit or implicit, as part of the core of induction, and of any ability of humans or other mammals to learn from experience.
Bayes' Law has a very tricky property which cannot be avoided, and which has caused endless confusion. The problem is that the probability of a theory being true, after a long stream of experience which we may call "memory", always depends on pr(theory), the probability which the theory had PRIOR to any experience. More precisely, we may write:
pr(theory | memory) = pr(memory | theory) * pr(theory) / Z
where pr(theory | memory) is the probability that the theory is true AFTER the experiences recorded in memory, where pr(memory | theory) is the probability that "memory" would have happened according to the theory being evaluated, and Z is a general scaling factor slightly beyond the scope of this email.
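To make the bookkeeping concrete, here is a toy sketch of Bayes' Law applied to a small explicit list of competing theories. The three "theories" and all of the numbers are invented purely for illustration, and the uniform prior is just one possible choice; none of this comes from Deutsch's book.

# Toy sketch of Bayes' Law applied to an explicit list of competing theories.
# The "theories" here are three hypotheses about the bias of a coin; all of
# the numbers are made up purely for illustration.

theories = {
    "fair coin":    0.5,   # pr(heads) under each theory
    "heads-biased": 0.8,
    "tails-biased": 0.2,
}

# pr(theory) PRIOR to any experience. A uniform prior is used here; in the
# spirit of Occam's Razor, one could instead penalize more complex theories.
prior = {name: 1.0 / len(theories) for name in theories}

def posterior(memory, prior):
    """pr(theory | memory) = pr(memory | theory) * pr(theory) / Z."""
    unnormalized = {}
    for name, p_heads in theories.items():
        likelihood = 1.0                     # pr(memory | theory), flips treated as independent
        for flip in memory:
            likelihood *= p_heads if flip == "H" else (1.0 - p_heads)
        unnormalized[name] = likelihood * prior[name]
    Z = sum(unnormalized.values())           # scaling factor over the current explicit list
    return {name: value / Z for name, value in unnormalized.items()}

memory = list("HHTHHHHTHH")                  # an accumulated stream of experience
for name, p in posterior(memory, prior).items():
    print("pr(%s | memory) = %.3f" % (name, p))

In this toy setting, Z is simply whatever scaling factor makes the probabilities over the current explicit list sum to one; the deeper subtleties arise precisely because the explicit list can never be complete, which is why the list itself has to keep being updated.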
My short time of email access is running out. Some key summary points --
1. www.werbos.com/Erdos.pdf explains why we really need some version of Occam's Razor in order to explain or replicate how brains can learn from complex experience.
2. Yes, we have learned to do a bit better than the simple Solomonoff version we started from. Modern concepts of analog priors and robustness are crucial, and begin to reflect how our current explicit list of theories can never be complete.
3. As Deutsch says, multiverse physics and further concepts of physics are crucial to rounding out the picture. I tend to think of human learning as a kind of ouroboros work in which objective and subjective views are equally fundamental in different ways, linked together. New experiments beyond Popper I will be crucial to cultural acceptance of reality.
All for now. I apologize both for excessive brevity and for excessive length.
Best of luck,
Paul