Monday 19 November 2012

Probability theory as the physics of classical reality

John O. Campbell

In a previous post I discussed a startling new view concerning the relationship between quantum and classical realities, an issue which has troubled thoughtful observers since the time of Einstein. This view is supported by the work of Wojciech Zurek and by papers by Muller and Masanes, which suggest that classical reality is a subset of quantum reality: not a faithful mirroring of quantum systems, but consisting only of the information quantum systems are able to exchange. Classical reality is composed of this quantum information exchange. The process of information exchange has also been described within quantum theory as 'measurement' or 'wave function collapse'.



Wojciech Zurek of Los Alamos National Laboratory.

Only information describing an extremely small portion of the quantum reality of one system may be exchanged with another, but it is this information which forms classical reality. Any entity within classical reality can only affect or influence another through one of the four fundamental quantum forces: electromagnetism, the strong nuclear force, the weak nuclear force and gravity. An entity can participate in classical reality only via these forces, and an entity that does not communicate via one of these forces cannot be said to have an existence within classical reality.

It turns out that, in a mathematical sense, classical reality is governed by a form of logic which is likewise a small subset of quantum logic: probability theory. Hence the claim that probability theory is the logic of classical reality. But how are we to understand this claim; what does it mean?


Fortunately a clear path is already documented in the scientific literature. Within information theory, and perhaps contrary to popular usage, information and probability are defined in terms of one another.


The mathematical relationship is:


I = -log(P)

Information is the negative logarithm of probability. Intuitively this means that if we assign a probability that a particular outcome will take place, and then find out that the event did actually occur, the amount of information we receive (in bits, when the logarithm is taken to base 2) grows as the probability we initially assigned shrinks. In other words, if we considered something likely to happen and it does, we receive little information; if we considered it unlikely but it does occur, we receive a greater amount of information.
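As a concrete illustration, here is a minimal sketch in Python; the two probabilities are invented purely for illustration:

    import math

    def surprisal(p: float) -> float:
        """Information received, in bits, when an outcome assigned probability p occurs."""
        return -math.log2(p)

    print(surprisal(0.9))    # a likely outcome:    ~0.15 bits
    print(surprisal(0.001))  # an unlikely outcome: ~9.97 bits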

Assigning probabilities to discernible states or outcomes is only feasible if all the possible states are known and are assigned probabilities which sum to 1. Such a collection of probabilities is called a probability distribution, and we may consider it a model for the possible outcomes of an event.


In the beginning, if the model has no prior data to determine which outcomes are more plausible than others, it must assign an equal probability to each possible outcome. If there are n possible outcomes described by the model, then each has probability 1/n.


A little magic takes place when the model does receive information pertinent to the outcome which actually occurs. Mathematically this data implies changes to the model: on reception of the new data some outcomes must now be considered more likely and others less likely, but the probabilities must still sum to 1.


The mathematical procedure for calculating the new probability is called the Bayesian Update.


P2 = R*P1

That is, the initial probability (P1) for the occurrence of each outcome is multiplied by a ratio R to produce a new probability for that outcome (P2). The ratio R is the likelihood that the new information would be received if the outcome in question were the actual outcome, divided by the average of that likelihood across all outcomes; this division ensures the updated probabilities again sum to 1. By incorporating information in this fashion a model comes to describe the event it models more accurately. This process of building knowledge is called inference.
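A minimal sketch of the update in Python may make this concrete; the three outcomes and their likelihoods are invented for illustration:

    def bayesian_update(prior, likelihood):
        # Multiply each prior probability by the likelihood of the new
        # information under that outcome, then re-scale so the updated
        # probabilities again sum to 1.
        unnormalized = [p * l for p, l in zip(prior, likelihood)]
        total = sum(unnormalized)
        return [u / total for u in unnormalized]

    prior = [1/3, 1/3, 1/3]          # no prior data: n outcomes, each 1/n
    likelihood = [0.8, 0.15, 0.05]   # chance of the data under each outcome

    print(bayesian_update(prior, likelihood))  # ~[0.8, 0.15, 0.05], summing to 1

Here the ratio R for each outcome is its likelihood divided by the average likelihood, so the normalization is built into the procedure.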

To recap this short excursion into information theory: we have seen that the existence of information mathematically implies probability theory and inference. However, mathematics is not reality, so we are still left with the question 'in what sense is probability theory the logic of classical reality?'


I will argue, over several posts, that probability theory and inference are instantiated within classical reality through the actions of Darwinian processes: Darwinian processes are the physical implementation of the mathematics of inference, and they have created the sum of classical reality. In this view Darwinian processes may be characterized as the engineers of classical reality.


At the bottom of classical reality we have quantum information exchange, which Wojciech Zurek has described as the process of Quantum Darwinism. In a sense this information exchange entirely explains classical reality, and it may be understood as a Darwinian process. However, a complete description of classical systems in terms of quantum systems is only possible in the micro-realm of sub-atomic particles, atoms, molecules and chemistry.


Further Darwinian processes are involved in the creation and evolution of more complex classical systems, including life, nervous systems and culture. I will leave a detailed treatment of quantum Darwinism and these other important Darwinian processes for later posts, but will attempt, in the remainder of this post, to demonstrate the plausibility of an isomorphism between Darwinian processes in general and the mathematics of inference. A more formal argument to this effect may be found in my paper Bayesian methods and universal Darwinism.


Natural selection was the original Darwinian process described by science and has come to be understood as the 'central organizing principle' of biology. Indeed Darwin himself, in On the Origin of Species, speculated that the principle involved in natural selection might operate in other realms, such as the evolution of languages, and was thus the first to generalize natural selection into what we now recognize as a Darwinian process.


The isomorphism between natural selection and inference is clear. Population biologists understand that many characteristics of living things are determined by the particular genetic alleles, the possible variations at the site of each gene, actually possessed by an organism. Within a given population there are only a finite number of alleles at the site of each gene, and these alleles may each be assigned a probability describing their relative frequency within the population.


Of particular interest to population biologists is the change due to natural selection which takes place in these probabilities between generations. The formula they use to describe this shift in allele frequency is:


p' = (RA/R) * p

where p’ is the probability of the particular allele in the later generation, p is its probability in the earlier generation, RA is the fitness of the particular allele and R is the average fitness of all competing alleles. This is clearly a Bayesian update of the probabilities making up the allele distribution.

Thus natural selection may be understood as a form of inference; the process by which species gain knowledge concerning the implementation of strategies for reproductive success.
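To make the correspondence explicit, the following sketch (with invented allele frequencies and fitness values) computes the next generation's frequencies; it is the same computation as the Bayesian update above, with fitness playing the role of the likelihood and average fitness the role of the normalizer:

    def next_generation(freqs, fitness):
        # p' = (RA / R) * p, where R is the population's average fitness.
        avg_fitness = sum(p * w for p, w in zip(freqs, fitness))
        return [p * w / avg_fitness for p, w in zip(freqs, fitness)]

    freqs = [0.5, 0.3, 0.2]      # current allele frequencies, summing to 1
    fitness = [1.1, 1.0, 0.8]    # relative fitness of each allele

    print(next_generation(freqs, fitness))  # ~[0.545, 0.297, 0.158]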


Another persuasive illustration of the isomorphism between inferential systems and Darwinian processes involves a device designed by Francis Galton (1877) and used in a lecture to the Royal Society. The device, pictured below, has recently been re-discovered within the statistics community and re-purposed as a ‘visualization of Bayes’ theorem’ (Stigler 2011).

The device contains three compartments: a top one representing the initial probabilities, a middle one representing the application of the likelihood to those probabilities and a third representing the re-scaling of the resulting probabilities so that they sum to 1. Beads are loaded in the top compartment to represent the initial distribution and then are allowed to fall into the second compartment. The trick is in the second compartment where there is a vertical division in the shape of the likelihood distribution. Some of the beads fall behind this division and are ‘wasted’; they are removed from sight and the remaining beads represent the ‘Bayesian update’ of the initial probabilities.
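In software, the device's operation amounts to rejection sampling; here is a minimal sketch, with a two-outcome prior and likelihood invented for illustration:

    import random

    prior = {'A': 0.7, 'B': 0.3}       # beads loaded into the top compartment
    likelihood = {'A': 0.2, 'B': 0.9}  # chance a bead clears the division

    survivors = []
    for _ in range(100000):
        bead = random.choices(list(prior), weights=list(prior.values()))[0]
        if random.random() < likelihood[bead]:  # bead falls in front of the division
            survivors.append(bead)
        # otherwise the bead is 'wasted' behind the division

    for kind in prior:
        print(kind, survivors.count(kind) / len(survivors))
    # A ~ 0.34, B ~ 0.66: the surviving beads approximate the Bayesian update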

Perhaps the most remarkable thing about this demonstration is that Galton never mentioned Bayes in his lecture; he had probably never heard of him or of the Bayesian update. Rather, his purpose in building the device was to demonstrate the principle of natural selection.  The top compartment represents the child generation with variations and the middle compartment represents the individuals selected from that generation.

That this device may be used to model both Bayesian inference and Darwinian processes illustrates an isomorphism between the two. The pertinent implication is that any system driven by a Darwinian process is performing inference; the predictive accuracy of its internal model is increased by processing the data available to it, its experiences in the broader world. 

Part of human nature, expressed across all cultures, is to seek out explanations of our origins. Science has gone further than any other human endeavour in providing satisfying answers to this quest. In general the view offered by science is that we are children of the universe; we are not freaks or a special case but conform to the laws of the universe like everything else. In particular, an understanding of Darwin's natural selection allows us to trace our ancestry back to the first life form, reveals all life forms to be an extended family, and details our relationships to all other living things within this family of life.

It is my belief that in recent years science has made great progress in extending knowledge of our origins all the way back to the basic emergence of classical reality from the quantum substrate. While this view has great import for our quest to discover our origins, it surprisingly remains little noticed. A particularly elegant feature of this understanding is that the processes which produced both our reality and ourselves may be viewed as a generalization of natural selection.

This view provides detailed understanding in support of philosopher Daniel Dennett's claim that Darwin's idea was the best one ever:


if I could give a prize to the single best idea anybody ever had, I’d give it to Darwin—ahead of Newton, ahead of Einstein, ahead of everybody else. Why? Because Darwin’s idea put together the two biggest worlds, the world of mechanism and material, and physical causes on the one hand (the lifeless world of matter) and the world of meaning, and purpose, and goals. And those had seemed really just—an unbridgeable gap between them and he showed “no,” he showed how meaning and purposes could arise out of physical law, out of the workings of ultimately inanimate nature. And that’s just a stunning unification and opens up a tremendous vista for all inquiries, not just for biology, but for the relationship between the second law of thermodynamics and the existence of poetry.

Saturday 17 November 2012

Science and our big questions

John O. Campbell

Science is our evidence-based method of answering questions concerning the nature of reality and the other 'big questions' which have always been on the human mind. I have long been captivated by the drama of this unfolding story. A trusted aid in this quest has been a subscription to Scientific American magazine. When the December 2012 edition arrived I quickly selected the first article I would read, one titled 'The Unquantum Quantum'. The article's subtitle reads: 'Contrary to the conventional wisdom of quantum mechanics, the physical world may be continuous after all - more analog than digital'.

The article traces the history of scientific thought on the ultimate nature of reality as fluctuating between discrete and continuous models. Some early Greeks speculated that matter was composed of discrete atoms; Newton endorsed the continuous; the modern atomic theory indicated the discrete, as did Bohr in his early version of quantum theory; but then the quantum wave equation of Schrödinger seemed to describe a continuous reality. The author of the Scientific American article comes down on the side of the continuous.

One might come away from this shaking one's head at the fickleness of science, but that would be a mistake. Each flip-flop in this succession actually provided a new depth of knowledge: Greek ideas of atomism were based on scant evidence; Newton provided continuous laws of motion to which the calculus applied; modern atomism explained the periodic table; and quantum theory remains the most exquisitely accurate theory in the history of science.

Still, given the perseverance of the apparent discrete/continuous duality, one might suspect that it is a false dichotomy, and that its resolution may resemble the resolution of the nature/nurture debate: the two are best thought of as faces of the same coin.

A couple of recent papers in the foundations of quantum theory by Muller and Masanes shed new light on this question and may have finally provided a near-complete explanation:



These papers have had high impact within the physics community and have stimulated a number of further papers. One in particular, 'Are quantum states real?' by Lucien Hardy of the Perimeter Institute, may be of special importance. It demonstrates that, given some reasonable assumptions, the wave function of a quantum state must describe an actual state of reality. It may go a long way toward clarifying in what sense quantum theory describes the actual world, an issue which has plagued the theory for nearly a century.

The resolution of the discrete vs. continuous conundrum described in the Muller and Masanes (M&M) papers is that reality is both. However, the papers bring clarity, perhaps for the first time, to the relationship between the two.

Quantum theory is important to anyone wishing to understand how the world works, as at bottom information exchange between entities only occurs as a quantum process, via one of the four fundamental forces of physics. All other ways in which one entity may affect another are merely different organizations of information exchange at this fundamental quantum level.

For example, the information about the world we gain from sight is at bottom due to quantum interactions between photons from our environment and receptor molecules in our retinas. This is a general principle and not restricted to humans; it is not an anthropocentric phenomenon. Any entity can only be affected by another via a quantum interaction.

The M&M papers go to the heart of information exchange. Quantum theory is necessary to describe systems between interactions, when they are not exchanging information with their environments. The information that may be exchanged between quantum systems is only a small subset of a system's full description. Some theories put this in a Darwinian context: only a small subset of the information describing the system can survive the exchange.

The portion of information which is exchanged forms classical reality. Classical reality is the sum total of the effects which one quantum system has on another, and the gap between a full quantum description and the information which can actually be transferred between systems is the basic discontinuity between quantum and classical realities.

The M&M papers show that the transition from quantum to classical involves a restriction on the quantum information which may survive, and that the continuous evolution of quantum states is approximated as a succession of discrete states in classical reality. This transition from continuous to discrete involves a transition in logic, from quantum logic to the classical logic of probability theory. Thus the fundamental relationship between the continuous and the discrete at the basis of reality is revealed.

M&M show that quantum theory belongs to a generalized set of theories which has only two members: quantum theory itself and probability theory. This may seem puzzling, as quantum theory is a branch of physics while probability theory is a branch of mathematics. A possible resolution is to shift our view and consider probability theory as a branch of physics: the physics of knowledge.

To motivate this shift we might consider that the exchange of information between quantum systems provides them with a form of knowledge of each other. This is evidence-based knowledge, provided by actual information received. As all information exchange is at bottom an exchange between quantum systems, we must understand all evidence-based knowledge in a similar manner.

Science is the form of evidence based knowledge most accessible to us and shares a deep relationship with probability theory. Indeed E.T. Jaynes titled his great textbook 'Probability Theory: the logic of science'. The M&M paper suggest we must consider probability theory in an even more general sense, as the logic of classical reality; as the logic by which the universe comes to know itself and that science is but a recent rediscovery of this ancient, timeless process.