Wednesday, 17 August 2022

Map and Territory as viewed from the free energy principle.

 John O. Campbell 

Charles Darwin clearly understood that a complete theory of biological evolution must be based on the processes of heredity, variation, and selection (1):

From these considerations, I shall devote the first chapter of this Abstract to Variation under Domestication. We shall thus see that a large amount of hereditary modification is at least possible, and, what is equally or more important, we shall see how great is the power of man in accumulating by his Selection successive slight variations.

His theory of natural selection is a brilliant explanation of selection, supported by many observable phenotypic examples, such as artificial selection, that were well known to the biologists of his day. However, as we now know, heredity and variation have their basis in molecular biology, a micro realm beyond the reach of the science of his time. Although natural selection is conceptually based upon heredity, variation, and selection, Darwin was only able to explain selection and had to accept heredity and variation as facts lacking explanation.

Fortunately, science, especially since the discovery that organisms’ heritable information is encoded by DNA molecules, has developed a detailed understanding of heredity and variation. But a complete synthesis between selection at the phenotypic level and heredity and variation at the genetic level has remained elusive (2). Indeed, Richard Lewontin considered connecting the phenotypic and genetic accounts to be the primary task facing the field of population genetics (3):

According to Lewontin (1974), the theoretical task for population genetics is a process in two spaces: a "genotypic space" and a "phenotypic space".

The challenge of a complete theory of population genetics is to provide a set of laws that predictably map a population of genotypes (G1) to a phenotype space (P1), where selection takes place, and another set of laws that map the resulting population (P2) back to genotype space (G2) where Mendelian genetics can predict the next generation of genotypes, thus completing the cycle. 
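
Lewontin’s scheme can be drawn compactly (a sketch reconstructed here from Lewontin 1974; the labels T1-T4 are his four transformation laws):

    G1 →(T1: development) P1 →(T2: selection) P2 →(T3) G2 →(T4: Mendelian genetics) G1′

Here T1 maps genotypes into phenotypes, T2 acts within phenotypic space, T3 maps the selected phenotypes back into genotypic space, and T4 produces the next generation of genotypes, closing the cycle.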

Now a theory of biological evolution, called evolutionary developmental biology, has been proposed in which this relationship between genotypic and phenotypic spaces is driven by the surprise-reduction imperative of the free energy principle. This theory places biological evolution within a mathematical framework similar to the physical principle of least action. What saves it from the status of a mathematical tautology, perhaps disconnected from actual phenomena, is the variational fitness lemma it proves. The lemma assumes a genotypic and a phenotypic space and demonstrates that if the likelihood of a genotype is proportional to the fitness of its phenotypic trajectory, then the system’s autonomous dynamics will be a gradient descent on negative fitness, in agreement with natural selection.
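
A minimal sketch of the lemma’s content, in notation assumed here rather than drawn from the paper: let g denote a genotype, φ(g) its phenotypic trajectory, and F(φ) the fitness of that trajectory. If the likelihood of a genotype scales with the fitness of its trajectory,

    p(g) ∝ exp F(φ(g)),  so that  −ln p(g) = −F(φ(g)) + constant,

then a gradient descent on surprise,

    ġ = −∇g[−ln p(g)] = ∇g F(φ(g)),

is precisely a gradient descent on negative fitness.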

As all existing process theories of biological evolution, including those dealing with natural selection, genetics, and developmental biology, share these fundamental assumptions, the theory of evolutionary developmental biology may subsume those process theories and provide guidance for their further development.

Perhaps most exciting is that the assumptions of this theory may be applicable to natural systems beyond biology. For example, some neuroscientific process theories treat mental models as composed of beliefs constructed through inference; such models may be likened to a genotypic space, and the actions or behaviours emanating from them to a phenotypic space in which the likelihood of the model is proportional to the fitness of its behavioural trajectory. These theories are thus subsumed within the evolutionary developmental theory. In fact, all existing things may fulfill these basic assumptions, supporting the notion of universal Darwinism (4).

Evolutionary development is but one theory utilizing the mathematics of the free energy principle to describe specific natural phenomena in a scientifically unprecedented manner. While many researchers view the FEP as a revolutionary framework promising to transform areas of scientific understanding, such as evolutionary biology, it has also met with some confusion and skepticism concerning its validity as a scientific framework. A 2022 paper is typical of this deep yet thoughtful skepticism (5):

In this paper, we take up this debate in relation to the free energy principle (FEP) - a mathematical framework in computational neuroscience, theoretical biology and the philosophy of cognitive science. We shall ask: what is the relationship between the scientific models constructed using the FEP and the realities these models purport to represent? Our focus will be specifically on the FEP and what, if anything, it tells us about the systems it is used to model. We call this issue the map problem: how does the map (theory, model) relate to the territory (real-world, target system) of which it is a map?

Understanding both the FEP’s answer to this skepticism and its revolutionary power requires a rather deep dive into the levels of scientific understanding, the top stratum of which is composed of scientific principles, such as the FEP, that with sufficient empirical support may be considered laws. Luckily, we have Albert Einstein as our guide on this portion of the dive, as he was a great philosopher of science as well as the author of several scientific revolutions. First, Einstein describes how scientific principles are formed (6):

The scientist has to worm these general principles out of nature by perceiving in comprehensive complexes of empirical facts certain general features which permit of precise formulation.

For example, a precursor of his theory of relativity was his principle of the constancy of the velocity of light, based on Maxwell’s well-tested electromagnetic theory. He then describes how general principles may lead to testable theoretical conclusions, conclusions that in physics may be considered components of a theory of mechanics:

The theorist’s method involves his using as his foundation general postulates or ‘principles’ from which he can deduce conclusions. His work thus falls into two parts. He must first discover his principles and then draw the conclusions which follow from them.

And once the principle is formulated, how are the pertinent conclusions formed?

           Once this formulation is successfully accomplished, inference follows on inference, often revealing unforeseen relations which extend far beyond the province of the reality from which the principles were drawn.

Thus, scientific principles act as broad-based maps or models, perhaps lacking clear connections to the territory they describe. But a model’s implications may be explored to reveal pertinent relations capable of empirical testing. For example, Einstein’s principle of the constant speed of light, together with the principle of relativity, formed the launching point of his own ‘inferences following on inferences’, arriving at the Lorentz factor

    γ = 1/√(1 − v²/c²).

This derived factor transforms Newtonian mechanics into relativistic mechanics: parameters such as time, energy, and momentum are modified by γ to yield their relativistic counterparts.

Using deductive steps such as these, his principle of relativity transforms into a set of testable hypotheses. For example, the lifetime of high-speed cosmic-ray muons may be measured and the applicability of the Lorentz factor confirmed. Empirical conformity established via these inferred mechanical hypotheses establishes a correspondence between the model, or map, and the territory it describes.
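
A worked instance using textbook numbers (the figures below are standard values, supplied here for illustration): a muon at rest has a mean lifetime of about 2.2 μs. For a cosmic-ray muon moving at v = 0.98c,

    γ = 1/√(1 − 0.98²) ≈ 5.0,  so  τ_observed = γ × 2.2 μs ≈ 11 μs.

Without time dilation such a muon would travel on average only about 650 m before decaying; with it, roughly 3.3 km, which is why muons created in the upper atmosphere reach ground-level detectors in numbers Newtonian mechanics cannot explain.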

Einstein contrasted this top-down creation of theories of mechanics from scientific principles with a second method of theory formation, which he called constructive; such theories are sometimes called process theories.

They attempt to build up a picture of the more complex phenomena out of the materials of a relatively simple formal scheme from which they start out. Thus the kinetic theory of gases seeks to reduce mechanical, thermal, and diffusional processes to movements of molecules, i.e., to build them up out of the hypothesis of molecular motion.

These process theories explain complex natural phenomena in terms of simpler ones. For example, the kinetic theory of gases explains complex phenomena, such as the weather, in terms of simpler natural components such as molecular motion. Such theories are no longer abstract maps or models; they are theories about the territory in terms of other aspects of the territory. They describe not scientific understanding but how the world-in-itself functions, how nature implements mechanics and scientific principles.
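
The reduction can be made concrete in a single line (the standard elementary result, included here for illustration): the pressure of a gas is the averaged momentum transfer of its molecules,

    p = (1/3)(N/V) m⟨v²⟩,  and with  (1/2)m⟨v²⟩ = (3/2)k_B T  this becomes  pV = N k_B T,

recovering the empirical ideal gas law from nothing but the hypothesis of molecular motion.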

And of course, we may move smoothly from principles to mechanics to processes and back again. For example, the principle of relativity transforms into relativistic mechanics, which in turn transforms into process theories of the behaviour of the high-temperature gases found in the solar corona (7).

It is at the level of process theories that science converges with engineering. Once a phenomenon's components and relationships are understood, they can be engineered to produce innovative structures and technologies. In this sense engineered structures share similarities with scientific experiments. Scientific experiments must be reproducible, meaning the same outcomes and behaviours must occur each time the same set of conditions is prepared; engineered technologies are, in essence, such consistently reproduced outcomes.

Both scientific experiments and engineered technologies establish empirical links between theoretical maps and the territory composed of the phenomena themselves. A sufficient density of these confirming empirical connections leaves little room for significant differences between the logical structure of the map and that of the territory. And it is this shared logical structure between map and territory that some philosophers of science consider crucial. As Alfred Korzybski wrote (8):

A map is not the territory it represents, but, if correct, it has a similar structure to the territory, which accounts for its usefulness.

Things become more complex when both the map and the territory are dynamic rather than static and subject to continual change. Unless some mechanism operates to enforce a correspondence, they will deviate from each other to the extent that the map becomes useless. But at the principle level of scientific mapping, invariant principles may be identified that are not subject to change in a dynamic world. For example, the FEP identifies the minimization of surprise as this kind of invariant principle, serving as a constraint on the dynamics of any mechanistic or process theories derived from it. Formally, the surprise minimized under the free energy principle is the discrepancy between map and territory.
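
In the notation standard in the FEP literature (assumed here): write o for observations, s for hidden states, p for the system's generative model (the map) and q for its approximate posterior. Surprise is −ln p(o), and the variational free energy bounds it from above:

    F = E_q[ln q(s) − ln p(o, s)] = D_KL[q(s) ‖ p(s|o)] − ln p(o) ≥ −ln p(o).

Minimizing F therefore does two things at once: it fits the map to the territory (shrinking the KL term) and, when action can change o, makes the territory less surprising under the map.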

 

Or is it?

The FEP adds a novel twist to this paradigm of principle-mechanics-process. It is a principle, providing a model, from which a Bayesian mechanics, currently under active development, has been inferred (9; 10). And it has further transformed into process theories, such as predictive coding (11) and evolutionary development. Predictive coding describes brain function as a relationship between neurons, exchanging predictions and prediction errors, that implements the surprise-reduction imperative of the FEP.
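
A toy sketch of this surprise-reduction dynamic (an illustrative minimal model, not the scheme of reference (11); all names are invented for the example): a single belief mu about a hidden cause is nudged down the free-energy gradient, balancing a sensory prediction error against a prior prediction error.

def predictive_coding_step(mu, o, prior_mu, pi_o=1.0, pi_p=1.0, lr=0.1):
    """One gradient step on F = 0.5*(pi_o*(o - mu)**2 + pi_p*(mu - prior_mu)**2),
    a single-cause model with the identity generative mapping g(mu) = mu."""
    eps_o = o - mu             # sensory prediction error (territory vs. map)
    eps_p = mu - prior_mu      # prior prediction error (map vs. prior belief)
    dF_dmu = -pi_o * eps_o + pi_p * eps_p
    return mu - lr * dF_dmu    # descend the free-energy gradient

# The belief settles at the precision-weighted compromise
# (pi_o*o + pi_p*prior_mu) / (pi_o + pi_p) between prior and observation.
mu, prior_mu, o = 0.0, 0.0, 2.0
for _ in range(100):
    mu = predictive_coding_step(mu, o, prior_mu, pi_o=4.0, pi_p=1.0)
print(round(mu, 2))   # prints 1.6

For this Gaussian toy case the fixed point is exact Bayesian fusion of prior and likelihood, which is the sense in which descending free energy implements inference.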

It is at the level of process theories that the FEP’s revolutionary power is revealed, in a twist where the target systems are considered to employ the same principle-mechanics-process transformations as does science. In other words, the FEP process theories postulate models within the natural systems they describe. These models, the genotype within evolutionary development and the generative model within predictive coding, are principled and encompass the system’s prior beliefs. These natural principled models logically transform into mechanical processes, such as gene expression within evolutionary development and backward and forward processing between neurons within predictive coding. And in the end these mechanisms transform into stable processes: phenotypes within evolutionary development, and behaviours or actions within predictive coding.

In this manner the FEP treats maps and territories not as unrelated entities but as two components sharing the same logical structure, and it enforces this correspondence, under active inference (12), by physically transforming the map into the territory: the system acts to ensure that the map transforms into a territory with a corresponding logical structure.

But there is one further twist. Not only does the map transform into the territory; the territory also transforms into the map. Within evolutionary development the territory, or phenotype, transforms the genotype, or map, by updating it with the results of demonstrated fitness or phenotypic success; within predictive coding the actions, or territory, transform the generative model, or map, through learning from successful behaviours.
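
In the evolutionary case this back-transformation has a compact form (the replicator-as-Bayes identity argued in reference (4); notation assumed here): treat the population frequency p(g) of genotype g as a prior and its realized fitness f(g) as a likelihood. Then the update

    p′(g) = f(g) p(g) / Σ f(g′) p(g′)

is at once the discrete replicator equation of population genetics and Bayes’ rule, with mean fitness playing the role of the model evidence.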

In this manner the FEP provides an evolutionary context for natural phenomena. The model or map transforms into the process or territory and the process transforms into the model, but in each cycle the model at the beginning is not the same as the model at the end; it has evolved or learned through the process of existence that it instigates – much as science evolves or learns through conducting experiments and simulations.

Due to the enforced correspondence between map and territory, the territory evolves adaptations in support of its existence while, in parallel, the map or model evolves knowledge specifying those adaptations and the transformational processes that instantiate them. The resulting accumulation of knowledge, or information gain, within the system’s model is required by the FEP, as maximizing information gain is equivalent to reducing the free energy, or surprise, of the system (13).
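
Reference (13) makes this equivalence explicit through the expected free energy G(π) of a policy π, which (in the standard notation, assumed here) decomposes as

    G(π) = −E_q[ln p(o|C)] − E_q[D_KL[q(s|o, π) ‖ q(s|π)]],

that is, expected cost minus expected information gain, so minimizing expected free energy necessarily maximizes the information gained about hidden states alongside the pursuit of preferred outcomes C.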

Knowledge accumulation is widely considered the reason for the existence of science. And yet science largely relies on a definition of knowledge introduced by Plato two thousand years before the scientific revolution, one that has changed little since. Although philosophers continue to quibble over the details (14), Plato’s definition of knowledge as justified true belief remains the dominant definition in both science and philosophy. The primary clarification offered by science is that ‘justified’ means justified by the evidence, leading to the conclusion that all knowledge is ultimately evidence-based.

As discussed above, from the vantage point of the FEP, science follows the same path of evidence-based knowledge accumulation as other natural systems do, leading to the conclusion that there may be only one method of knowledge accumulation. In this view science is ‘merely’ a rediscovery, at the cultural level, of an ancient process operating since things first came into existence. This FEP-induced alignment of science with other natural systems may be another facet of its power to explain them.

Finally, this discussion may raise the question of why there is so close a relationship between knowledge and existence. The short answer on offer here is that knowledge is an essential component of existence, or that existing is knowing. In this view, states in which an entity can exist are rare and fragile, and knowledge is required to achieve and maintain those states. In the words of the physicist David Deutsch:

           Everything that is not forbidden by the laws of nature is achievable with the right knowledge.

While this insight is rather straightforward in its application to complex entities such as organisms (it is well known that organisms cannot exist independently of their genetic knowledge), it may be less obvious for fundamental entities such as electrons. But quantum electrodynamics tells us that the bare electron cannot exist, as many of its properties involve infinities. Rather, it must be surrounded by a vast cloud of virtual particles, winking in and out of existence, finely tuned to cancel the infinities. And this intricate fine-tuning and cancellation is orchestrated by the knowledge contained in the wave function of the dressed electron (the bare electron plus its cloud of virtual particles). It appears that even at the most fundamental level, knowledge is essential to existence.

We might consider scientific knowledge to be an exception, beyond the pragmatic dictates of existence, a sort of cultural luxury or knowledge for the sake of knowledge. But this possibility fades if we ask how many people would exist on the planet today had the scientific revolution never occurred. Certainly, science is a prerequisite for the existence of modern society. This line of argument is also supported if we consider that humanity’s long march from scattered African tribes to world domination was due to the accumulated cultural knowledge (precursors to science and engineering) that underwrote the forms of niche construction we call culture (15). In other words, cultural existence is highly dependent upon accumulated knowledge, and science is a recent, refined, and powerful method of cultural knowledge accumulation. In this view science is just one of nature’s many methods for accumulating existential knowledge, the latest in a long succession of natural systems that function in accord with the free energy principle.

The FEP thus leads to a unified view of science as a typical process within the natural world, one that goes a long way toward addressing skepticism such as (5):

Our focus will be specifically on the FEP and what, if anything, it tells us about the systems it is used to model. We call this issue the map problem: how does the map (theory, model) relate to the territory (real-world, target system) of which it is a map?

The answer offered here is that under the FEP, natural systems, including science, involve maps (principles, models) that transform into territories (target systems). And the territories they produce, acting as evidence, inferentially transform into increasingly knowledgeable maps. These natural systems are evolutionary processes, custom-made for discovering and exploiting the possibilities for existence offered by the laws of nature.

This move to subsume science within the natural world it describes provides a radical mapping of the territories of the natural world, one sure to stimulate further skepticism, model building and ultimately increased knowledge.

References

1. Darwin, Charles. The Origin of Species. Sixth edition. New York: The New American Library, 1958 (first published 1872), pp. 391-392.

2. Noble, D. Neo-Darwinism, the modern synthesis and selfish genes: are they of use in physiology? J Physiol 589(Pt 5):1007-1015, 2011. doi: 10.1113/jphysiol.2010.201384.

3. Wikipedia. Population genetics. http://en.wikipedia.org/wiki/Population_genetics, as viewed Sept. 11, 2010.

4. Campbell, John O. Universal Darwinism as a process of Bayesian inference. Front. Syst. Neurosci. 10:49, 2016. doi: 10.3389/fnsys.2016.00049.

5. Kirchhoff, Michael, Kiverstein, Julian, and Robertson, Ian. The Literalist Fallacy and the Free Energy Principle: Model Building, Scientific Realism and Instrumentalism. The British Journal for the Philosophy of Science, University of Chicago Press, 2022.

6. Einstein, Albert. The Collected Papers of Albert Einstein, Volume 6 (English): The Berlin Years: Writings, 1914-1917 (English translation supplement). Princeton University Press, revised edition, 1997. ISBN-10: 0691017344.

7. Gonzalez-Narvaez, R., Díaz Figueroa, Elton, and Ares de Parga, Gonzalo. A Further Study of the Mixing of Relativistic Ideal Gases with Relative Relativistic Velocities: The Hot Plasma in the Sun’s Corona, the Type II Spicules and CMEs. Journal of Physics: Conference Series 1239:012002, 2019. doi: 10.1088/1742-6596/1239/1/012002.

8. Korzybski, Alfred. Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics. International Non-Aristotelian Library Publishing Company, 1933.

9. Ramstead, Maxwell, Sakthivadivel, Dalton, Heins, Conor, Koudahl, Magnus, Millidge, Beren, Da Costa, Lancelot, Klein, Brennan, and Friston, Karl. On Bayesian Mechanics: A Physics of and by Beliefs. arXiv, 2022. doi: 10.48550/arXiv.2205.11543.

10. Da Costa, Lancelot, Friston, Karl, Heins, Conor, and Pavliotis, Grigorios A. Bayesian mechanics for stationary processes. Proc. R. Soc. A 477:20210518, 2021. doi: 10.1098/rspa.2021.0518.

11. Friston, K., and Kiebel, S. Predictive coding under the free-energy principle. Philos Trans R Soc Lond B Biol Sci 364(1521):1211-1221, 2009. doi: 10.1098/rstb.2008.0300.

12. Friston, Karl, et al. Active Inference: A Process Theory. Neural Comput 29(1), 2017. doi: 10.1162/NECO_a_00912.

13. Friston, K., Rigoli, F., Ognibene, D., Mathys, C., Fitzgerald, T., and Pezzulo, G. Active inference and epistemic value. Cogn Neurosci 6(4):187-214, 2015. doi: 10.1080/17588928.2015.

14. de Grefte, J. Knowledge as Justified True Belief. Erkenntnis, 2021. https://doi.org/10.1007/s10670-020-00365-7.

15. Campbell, John O. The Knowing Universe. CreateSpace, 2021. ISBN-13: 979-8538325535.

 

 

