Monday 13 December 2021

Does science describe the world-in-itself?

John O. Campbell

This is the conclusion of The Knowing Universe

This book has proposed an inferential systems definition of existence. In this interpretation, the world-in-itself is portrayed as a vast, hierarchically nested series of inferential systems, each diligently investigating the possibilities for specific forms of existence. One benefit of this interpretation is its casting of the universe in an easily knowable form: a little knowledge of inferential systems provides a little knowledge of everything.

Our interpretation of scientific knowledge within an inferential systems framework reveals science itself in a new light: science is an inferential system that accumulates knowledge in essentially the same manner as all other naturally occurring inferential systems. This view of science may shed light on some problems that have long plagued western philosophy.

One of those problems is the general relationship between existing entities and our perceptions or mental concepts portraying them. During the scientific revolution, roughly between 1543 and 1687 (1), it became apparent that a combination of sensory experience and rational hypotheses could form synergistic intellectual models having powerful, pragmatic effects. Some scientific pioneers, such as Francis Bacon (1561 - 1626), assumed the scientific method would provide ultimate and infallible knowledge of the universe (2). But questions soon emerged regarding the relationship between scientific models and the entities they described.

David Hume

Fifty years after the scientific revolution, David Hume cast some shade on empirical claims of certain knowledge, noting that many natural processes, such as causation, are not entirely amenable to sensory evidence. However, he retained the critical caveat that, instead of certainty, sensory evidence can provide probabilistic knowledge. Using this essentially Bayesian insight, Hume argued that hypotheses judged unlikely in our prior experience require a greater weight of evidence, or as more recently framed, extraordinary claims require extraordinary evidence (3). He most famously used this insight as a basis for skepticism regarding Christian miracles. At the time, Hume’s views failed to carry the day, and although he is now rated the greatest philosopher in British history, his contemporaries excluded him from holding a university post due to his alleged atheistic tendencies (4). Scientific and philosophical insights were not yet strong enough to pose a severe threat to established religious models.
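
Hume’s intuition translates directly into Bayes’ rule. The short sketch below is my own illustration rather than anything from the book, and its probabilities are arbitrary assumptions; it simply shows that a hypothesis with a tiny prior is barely moved by ordinary evidence, while evidence that is nearly impossible under the alternative ("extraordinary evidence") shifts it decisively.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    # Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

# An ordinary claim: a moderate prior plus modest evidence gives a high posterior.
print(posterior(0.5, 0.9, 0.1))       # ~0.90

# An extraordinary claim (e.g. a miracle): the same evidence barely moves a tiny prior.
print(posterior(1e-6, 0.9, 0.1))      # ~0.000009

# Only evidence that is nearly impossible if the claim is false becomes persuasive.
print(posterior(1e-6, 0.9, 1e-8))     # ~0.99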

Forty years later, Immanuel Kant built upon Hume’s empirical skepticism, claiming an essential dichotomy between sensory-based knowledge and the true nature of things in themselves (5):

And we indeed, rightly considering objects of sense as mere appearances, confess thereby that they are based upon a thing in itself, though we know not this thing as it is in itself, but only know its appearances, viz., the way in which our senses are affected by this unknown something.

Kant describes an apparent gulf between things in themselves and any possible empirical knowledge we can have of them. A recent statement of this dichotomy uses the example of the gulf between maps and the territory they model (and admonishes us not to confuse the two). In Kant’s time, many perceived this as a trivial intellectual gulf posing no reason for despair, since it did not yet challenge the near-universal religious models. Although lacking empirical foundations, the Christian model of the universe was considered an accurate portrayal of the world-in-itself. Indeed, Kant’s contemporaries understood his radical philosophy as resonating quite well with Christian doctrine. One of his early commentators noted (6):

And does not this system itself cohere most splendidly with the Christian religion? Do not the divinity and beneficence of the latter become all the more evident?  

But scientific and philosophical developments were building bridges between perceptual models and the world-in-itself, bridges that would come to challenge those offered by religious teaching and to undermine our cozy place within Christian models of the universe. Astronomers such as William Herschel (1738 – 1822) found evidence of a vast universe, suggesting that both earth and humanity played apparently insignificant roles – a scenario challenging Christian doctrine. And then Darwin demonstrated that, counter to religious teaching, people have descended from earlier forms of life. These and countless other scientific findings undermined the Christian worldview among the intelligentsia.

Not only did this new learning contradict many religious teachings, it also illustrated the constraints that religion placed upon knowledge. Most Protestant sects identified the Bible as revealed truth and considered it a complete worldview for humanity. But the scope of biblical knowledge is relatively minimal. How far could knowledge grow within these confines? For example, the Bible makes only passing references to stars. One of the more explicit passages is:

And God made the two great lights—the greater light to rule the day and the lesser light to rule the night—and the stars.

It does not offer any detailed knowledge concerning stars-in-themselves; they are only bit players in this God-centric tale. And it provides no path to greater knowledge of stars. For that, we must look elsewhere.

Under these influences, acceptance of biblical teachings as literal descriptions of the world-in-itself became increasingly untenable. Finally, in 1882 Friedrich Nietzsche (1844 – 1900) announced God’s murder and held humanity responsible:

God is dead. God remains dead. And we have killed him. How shall we comfort ourselves, the murderers of all murderers? What was holiest and mightiest of all that the world has yet owned has bled to death under our knives: who will wipe this blood off us? What water is there for us to clean ourselves? What festivals of atonement, what sacred games shall we have to invent? Is not the greatness of this deed too great for us? Must we ourselves not become gods simply to appear worthy of it?  

After fifteen hundred years, Europe's foundational model of the universe crumbled, leaving no handy alternative. As Martin Heidegger (1889 – 1976), a leading 20th-century metaphysician, explained, humanity was left exposed to its most significant source of anxiety, the anxiety experienced when we face the finite nature of our existence.

And worse was to come. Ludwig Wittgenstein (1889 – 1951), perhaps the twentieth century’s most influential philosopher (7), described an even more profound metaphysical abyss – claiming that scientific understanding, the slayer of our old theological models, was incapable of offering a replacement model of the world-in-itself. He believed existence had no logical explanation and that its nature must forever remain a mystery (8).

It is not how things are in the world that is mystical, but that it exists.

Wittgenstein had experienced the terrors of WWI’s trench warfare, and although he rose to the occasion with remarkable valour, the experience left him deeply shaken, and his philosophy reads as practically a denial of any possibility of human meaning. Wittgenstein was not alone in his visceral reaction to the terrors of WWI. Many began to consider that, at best, God was only remotely concerned with the world's workings and that to understand those workings, and perhaps even shape them, we would have to look elsewhere.

Together, Nietzsche, Heidegger and Wittgenstein brought western philosophy to the brink of nihilism (9). We had outgrown the gods that had provided meaning for millennia and found ourselves instead in a vast, uncaring universe devoid of meaning. Even worse, we were unable to conjure up any convincing alternative model of the world-in-itself. This era perhaps marked a low point for western philosophy. Existentialism, which succeeded this philosophical movement, also lamented human meaninglessness but supplemented despair with a growing sense that meaning was within us, that it was only necessary to pull ourselves up by our bootstraps. Existentialists, such as Søren Kierkegaard (1813 – 1855), Jean-Paul Sartre (1905 – 1980) and Albert Camus (1913 – 1960), maintained that living an authentic life, or at least persevering through the absurdities of life, could lead to meaning.

But even while philosophy despaired, science had quietly begun developing models detailing the human relationship to the world-in-itself. This scientific awakening offered a revolutionary new perspective on our place in the universe, one that converged on several different fronts into a single idea: humans are a part of nature. As the old saying goes, just like the trees and the stars, you have a right to be here.

Just as western philosophy flirted with nihilism, science confirmed that everything in the universe, ourselves included, is made of the same hundred-odd elements. This discovery soon developed into the understanding that all elements are forged in stars from the same simple building blocks and spread to the rest of the universe when stars die. We, along with everything else, are composed of stardust.

But science soon moved beyond a simple unification of nature based on a shared universal composition. Perhaps the scientific finding with the greatest impact on our perceived relationship to the universe was the theory of natural selection developed by Charles Darwin. This theory offered an alternative to God’s special creation, which had placed humans above all other living things, instead describing humanity as a recent evolutionary development directly related to all other organisms. Darwin’s brilliant description of natural selection in On the Origin of Species included many examples from the natural world supporting his theory that are incompatible with the Christian theory of creation by design. Many open-minded readers found Darwin’s arguments devastating to the biblical account, causing a collapse in its credibility, a profound cultural shock recorded by pessimistic philosophers such as Nietzsche and Heidegger.

Gone was our favoured place among the gods, and we found ourselves instead exposed to a sense of meaninglessness within a vast, uncaring universe. Yet, once over that initial shock, a closer reading of Darwin reveals new, more profound meaning. As noted by the Darwinian champion Thomas Huxley (1825 – 1895) in his great book Evidence As to Man's Place In Nature (10):

Mr. Darwin's hypothesis is not, so far as I am aware, inconsistent with any known biological fact; on the contrary, if admitted, the facts of Development, of Comparative Anatomy, of Geographical Distribution, and of Palaeontology, become connected together, and exhibit a meaning such as they never possessed before

We were no longer the favoured children of an all-powerful God, but we had gained membership in the more tangible family of all living things. As Darwin succinctly noted, this context provides us with significant meaning (11):

There is grandeur in this view of life

Although scientific explanations such as natural selection offer meaning through our context within nature, they often provide only a vague summary of nature-in-itself. Natural selection reduces almost to tautology in its central claim that only the fittest organisms exist, because fitness is the relative frequency of existence. It escapes tautology only through the non-statistical, functional aspects of fitness: a vastly complex network of adaptations working in concert to retain an organism within existence. Although natural selection, by itself, does not go far in exposing the thing-in-itself of existence, science began developing knowledge of those mechanistic details.

Kant’s dichotomy, although softened by scientific understanding, remained stark. The modern scientific philosopher Alfred Korzybski (1879 – 1950) framed the Kantian dichotomy with a caveat offering a way forward (12):

A map is not the territory it represents, but, if correct, it has a similar structure to the territory, which accounts for its usefulness.

Here he makes two claims. The first supports Kant’s view that maps (models) and territories (things in themselves) are separate and should not be confused. He describes a dichotomy that is still somewhat faddishly used as a put-down of proposed explanations, dismissing them as confusing the map with the territory, mistaking mere description for the thing-in-itself. His second point is more substantial: any map is a good map to the extent that it shares its territory’s logical structure.

Ensuing developments during the scientific age have explored this caveat bridging Kant’s dichotomy. As one example, Darwinian evolution argues that animals’ sensory perceptions should accurately portray the world-in-itself; senses and perceptions have evolved for the practical purpose of allowing animals to navigate challenges posed by the world-in-itself. It identifies perception as an adaptation for accurately portraying aspects of the world-in-itself and describes a mechanism for improving this perceptual accuracy over evolutionary time. In other words, the world-in-itself and animals’ perceptions of it must share much the same logical structure.

Nature’s many domains, such as life, may be viewed within the framework of inferential systems as instances where autopoietic models, such as genomes, transform into existing entities, such as organisms. Here the map transforms into the territory, revealing both as composed of the same logical structure. This transformative relationship allows scientific models to extend natural models and transform them into natural processes wielding tremendous power. How could this kind of power be possible unless the underlying scientific models are, to a large extent, true?

A favourite, perhaps apocryphal, example involves one of the first physicists to understand the process of nucleosynthesis within stars. While stargazing with a girlfriend, he bragged that he was the only person who truly understood why stars shone[1]. His brag implied a close correspondence between the scientific model he had discovered and stellar things-in-themselves, a correspondence borne out when his newly discovered model served as a recipe for building thermonuclear bombs a few decades later. When carefully followed, this scientific model transforms into mini stars here on earth. How can models that prove so powerful fail to be faithful representations of, or to share the same logical structure as, the things-in-themselves?

The development of Covid-19 vaccines provides a further example illustrating convergence between scientific models and the world-in-itself. The virus’s genetic code, published just weeks after it became a concern in China, enabled a few medical labs to construct computer models of the genetic sequences expressing proteins on the virus’s surface. When these models are transcribed into mRNA and injected into people, they alert the body's immune system, just as an actual infection would, ramping up antibodies capable of defeating future viral infections. This transformation from scientific model to immune-system stimulation underlies the vaccine's effectiveness and indicates convergence between scientific models and the world-in-itself.

As a final and perhaps decisive example of the convergence between scientific models and the world-in-itself, we consider computation's power to model natural processes. The Church–Turing thesis states that a Turing machine, the technical name for an idealized classical computer with unlimited resources, can compute any computable function (13). As models describing natural systems are computable functions, we may say that a computational model is possible for every natural system. In other words, algorithms capturing scientific models of natural systems can simulate those systems. In turn, comparisons between these simulations and observations of natural systems become evidence, updating the scientific models and algorithms to greater accuracy. In this manner, scientific models and their simulations join in an inferential system that progressively bridges the chasm between scientific models and the world-in-itself.
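
The simulate, compare and update loop described above can be made concrete with a toy sketch. The code below is purely illustrative; the candidate models, rates and trial counts are my own assumptions, not anything specified in the book. Several candidate models of a simple chance process are each simulated, compared against one observation, and assigned posterior probabilities accordingly.

import random

random.seed(0)

candidate_rates = [0.2, 0.5, 0.8]                                    # competing models of the system
belief = {r: 1.0 / len(candidate_rates) for r in candidate_rates}    # uniform prior over models

def simulate(rate, n=50):
    # Toy "natural system": count of n chance events, each occurring with probability `rate`.
    return sum(random.random() < rate for _ in range(n))

observation = simulate(0.5)    # stands in for a measurement of the world-in-itself

def likelihood(rate, observed, trials=2000):
    # Estimate P(observation | model) by repeated simulation.
    return sum(simulate(rate) == observed for _ in range(trials)) / trials

for rate in candidate_rates:
    belief[rate] *= likelihood(rate, observation)

total = sum(belief.values())
belief = {r: b / total for r, b in belief.items()}    # normalise to a posterior
print(belief)    # probability mass shifts toward the model closest to the observation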

The recent theoretical discovery of quantum computing has propelled this thesis into the Church–Turing–Deutsch principle (14). Noting a one-to-one correspondence between quantum phenomena and quantum computation, and that all physical systems have a quantum description, this principle states that any physical system may be simulated to any degree of accuracy by a universal quantum computer. It is tantamount to claiming that the world-in-itself, at the quantum level, is equivalent to a computational process, leaving little distinction between scientific models and the world-in-itself.

As illustrated by these examples, scientific understanding has evolved beyond merely descriptive models to models capable of duplicating or simulating nature’s many mechanisms and bringing entities into existence within the world-in-itself. In other words, we can view the world-in-itself as essentially similar to our scientific models, resolving the Kantian dichotomy.

Scientific models simulate both nature’s models, or generalized genotypes, and their resulting physical forms, or generalized phenotypes, existing as the world-in-itself. But science goes beyond simulating existing forms and can itself serve as a generalized genotype that brings new technologies into existence.

The world-in-itself and its scientific models are vastly complex, hierarchical nestings of inferential systems; each system is engaged in a cyclical, two-step inferential process, evolving and implementing knowledge for existence. These two steps are consequences of the free-energy principle, which states that systems maximize the evidence for their models and thereby reduce the surprise they experience. Systems may do this in two ways:

1) They may accurately follow their models’ knowledge (active inference), causing the world-in-itself to conform to their models.

2) They may update their models to greater accuracy using evidence of their existence within the world-in-itself (learning or evolution), causing their models to conform to reality.

These two steps form cyclic inferential systems in which knowledgeable models create and maintain the world-in-itself (e.g. genotypes create and maintain phenotypes), and the phenotype’s experience within the world-in-itself updates model knowledge (e.g. natural selection). This universal dualism explains the accumulation of knowledge in the universe and identifies science as a recent, powerful, but essentially natural method of knowledge accumulation. We suggest that in this manner the chasm between scientific models and the natural world-in-itself is bridged.
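
A toy sketch of this two-step cycle is given below. It is my own illustration with made-up quantities and rates, not a formal statement of the free-energy principle: one step acts on a world state so that it conforms to a model's prediction (active inference), the other updates the prediction so that it conforms to the observed world (learning), and the prediction error, standing in for surprise, decays toward zero.

world_state = 10.0       # the state of the world-in-itself
model_prediction = 0.0   # the system's model (its predicted or preferred state)

act_rate = 0.3     # strength of action on the world (active inference)
learn_rate = 0.1   # strength of model updating from evidence (learning/evolution)

for step in range(20):
    error = world_state - model_prediction    # prediction error, a proxy for surprise

    # Step 1: active inference -- act on the world so that it conforms to the model.
    world_state -= act_rate * error

    # Step 2: learning/evolution -- update the model so that it conforms to the world.
    model_prediction += learn_rate * error

    print(f"step {step:2d}  world={world_state:7.3f}  model={model_prediction:7.3f}  error={error:7.3f}")

# The error shrinks each cycle: the world and the model converge on one another.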

Inferential systems provide a general scientific description of existence and an account of the world-in-itself, where each entity within the vast web of existence is inferentially engaged in exploring possibilities for existence. Here, the polar concepts of scientific description and the world-in-itself converge in the nascent field of Bayesian mechanics (421), which, like Newtonian mechanics, serves as a scientific description and an account of the world-in-itself. Science’s inferential nature provides optimal models, those sharing maximal logical structure with the vast inferential system composing the world-in-itself. In other words, science provides potent maps of the world’s territory because of its shared logical structure.

Part I of this book claimed that all forms of existence are examples of inferential systems. Part II explored and described inferential systems, and Part III interpreted modern scientific findings in the domains of cosmology, quantum phenomena, biology, neural-based behaviour and culture as inferential systems. Almost certainly, many of this account’s details are incorrect, but more accurate, universalist explanations of existence may soon emerge because many convergent frameworks are under development. And we may also have confidence that this eventual explanation will describe existence in terms of knowledge and knowledge in terms of inferential processes.

 

References

1. Wootton, David. The Invention of Science: A New History of the Scientific Revolution. Harper Perennial, reprint edition, 2016. ISBN-10: 0061759538.

2. Wikipedia. Francis Bacon. Wikipedia. [Online] [Cited: September 26, 2021.] https://en.wikipedia.org/wiki/Francis_Bacon.

3. —. Sagan standard. Wikipedia. [Online] [Cited: July 11, 2021.] https://en.wikipedia.org/wiki/Sagan_standard.

4. —. David Hume. Wikipedia. [Online] [Cited: November 22, 2020.] https://en.wikipedia.org/wiki/David_Hume.

5. Kant, Immanuel. Prolegomena to Any Future Metaphysics. 1783.

6. Wikipedia. Immanuel Kant. Wikipedia. [Online] [Cited: February 14, 2021.] https://en.wikipedia.org/wiki/Immanuel_Kant.

7. —. Ludwig Wittgenstein. Wikipedia. [Online] [Cited: February 15, 2021.] https://en.wikipedia.org/wiki/Ludwig_Wittgenstein.

8. Wittgenstein, Ludwig. Tractatus Logico-Philosophicus. New York: Harcourt, Brace, 1933 (reprinted, with a few corrections).

9. Wikipedia. Nihilism. Wikipedia. [Online] [Cited: February 24, 2021.] https://en.wikipedia.org/wiki/Nihilism.

10. Huxley, Thomas Henry. Evidence as to Man's Place in Nature. s.l. : Williams & Norgate, 1863.

11. Darwin, Charles. The Origin of Species. Sixth edition. New York: The New American Library, 1958 (first published 1872). pp. 391-392.

12. Korzybski, Alfred. Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics. International Non-Aristotelian Library Publishing Company, 1933.

13. Wikipedia. Church-Turing thesis. Wikipedia. [Online] [Cited: February 23, 2021.] https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis.

14. —. Church-Turing-Deutsch principle. Wikipedia. [Online] [Cited: February 23, 2021.] https://en.wikipedia.org/wiki/Church%E2%80%93Turing%E2%80%93Deutsch_principle.

 



[1] Sometimes these eureka-moment celebrations can be deflationary. The computer scientist Geoffrey Hinton (born 1947) is reported to have announced to his family that he believed he had finally figured out how the human brain worked. His fifteen-year-old daughter replied: ‘Oh Daddy, not again!’.


Friday 18 June 2021

Existential Knowledge

John O. Campbell

This is an excerpt from the book The Knowing Universe

Dictionary.com chose existential as its 2019 word of the year due to a spike in internet searches, partly in response to Bernie Sanders' and Greta Thunberg's descriptions of global warming as an existential crisis (1). Dictionary.com defines existential as:

concerned with existence, especially human existence as viewed in the theories of existentialism.

Historically, our species has tended to view its existence as given and unalterable, and this current interest in existential threats is heartening. Perhaps our view of existence is shifting from a static conception to a more dynamic one. Static concepts of existence dominated western philosophical traditions before Darwin; the consensus view held that all existing things came into existence in a single event and would retain their initial forms until the end. For example, western intellectuals broadly agreed that God created all species during the six days of creation and that this original group of species had since remained static.

Darwin's findings compel us to consider existence in a new, dynamic light; existence may have only brief durations even for entire species. Since the beginning, novel entities embodying new strategies have winked into existence and those no longer able to resist natural tendencies towards dissipation have winked out. The current popularity of the word existential might be due to some lingering, justified unease about prospects for our continued existence. 

As this book argues, existence is fragile and entirely dependent upon knowledge inferred from the evidence. Cultures are not immune to this winnowing process. Human history documents an unending succession of cultures whose shockingly brief time in the sun was cut short by fatal flaws in their inferential abilities. We have cause for worry when dire circumstances outmatch our nascent inferential abilities.

Some aspects of cultural inference are nearly flawless. Our many scientific inferences efficiently develop powerful economies and technologies; these scientific models are reliably updated with the latest available experimental evidence, becoming ever more powerful. However, other cultural models, especially models concerning our use of technology, are neglected, and there is resistance to updating them with the evidence. Models concerning the use of technologies form an essential part of cultural regulation, and neglect of these models may pose our greatest existential challenge.

It is common for entities to wink out of existence when their models for achieving existence fall into error and are no longer sufficient to maintain them within changed circumstances. For example, trilobites, among the most advanced life forms of the Late Ordovician, suffered mass extinction due to the volcanic release of greenhouse gases and subsequent planetary warming (2). The trilobites' genetic knowledge for maintaining their existence was overwhelmed by new and unforeseen circumstances. Our cultural existence is not currently challenged by volcanic greenhouse gases but by greenhouse gases whose production is entirely under our control. The intellectual gap threatening extinction is much narrower for us than it was for trilobites. It is something we are able to comprehend.

During almost all of our species' history, developing models for the wise application of technology has been relatively straightforward: use any available technology to maximize resource extraction in the service of biological existence. This strategy of maximizing resource extraction has been hard-wired into our biological ancestors since the beginning, and the exponential increase in our numbers evidences its success. Wisely, our conservative inclinations counsel us not to tinker with a successful strategy. After all, existence is difficult, and if we already have a model that successfully achieves existence, why fix something that is working?

The wisdom of conservative strategies raises an interesting question: when, if ever, is the right time to revise our basic cultural models? What weight of evidence is sufficient to motivate adjustments to our primary strategies for existence? The short answer is that we should heed the scientific evidence.

Science is by far our most powerful process for understanding the workings of natural systems, including culture. Powerful scientific models are responsible for the modern technologies that vastly increase our health and wealth and provide the improved circumstances in which we exist. It only stands to reason that if we are going to use science to alter our world, we should also use science to understand, as fully as possible, the implications these alterations hold for our existence. The failure to correctly apply scientific knowledge to strategies for existence opens the possibility of our extinction. Only scientific knowledge is capable of guiding us through this challenge.

The scientific method is routinely applied as a rigorous inferential system in laboratories and universities worldwide to understand natural processes and to employ that understanding in technological applications. Science and technology provide great power, but that power is inherently dangerous to us. Great power tends to bring increased forces towards dissipation, which is as true within the cultural domain as it is in any other. As we have seen, the only way great power can be compatible with existence is through sufficient regulation, regulatory mechanisms discovered through careful inference. One of my mother's wise sayings that has stuck with me is:

Fire is a good servant but a bad master.

Fire is an example of a great power bestowing great benefits, but if we have a fire in our homes, it is best contained in a stove where it can be regulated and held subservient. If we allow it to burn unconstrained outside of the stove, it follows an independent trajectory, one that may threaten our existence. Given that human actions are now the most powerful influences on our planet, we must model their effects as accurately as possible and regulate our actions so that our planet's future states provide for our existence.

Perhaps the primary obstacle preventing us from coming to grips with our predicament is a predilection for outmoded biological values such as resource accumulation and status. These values emerge as a striving for wealth in a cultural context, and although science and technology could easily provide sufficient resources for all, we remain obsessed with these strivings. Societies have become structured to facilitate competition for wealth, and this competition has concentrated wealth in a small portion of the population, often known as the one percent.


Within the US, for example, the top 1% now own much more wealth than does the bottom 90% of the population (3), a concentration of wealth and power that Pope Francis lambasted as obscene (4):

The earth, entire peoples and individual persons are being brutally punished. And behind all this pain, death and destruction there is the stench of what Basil of Caesarea called "the dung of the devil". An unfettered pursuit of money rules. The service of the common good is left behind. Once capital becomes an idol and guides people's decisions, once greed for money presides over the entire socioeconomic system, it ruins society, it condemns and enslaves men and women, it destroys human fraternity, it sets people against one another and, as we clearly see, it even puts at risk our common home.

His last point, implicating current levels of wealth inequality as a risk to our common home on mother earth, deserves some unpacking within the context of existential threats.

References

1. Dictionary.com. Dictionary.com's word of the year for 2019 is existential. Dictionary.com. [Online] [Cited: July 16, 2020.] https://www.dictionary.com/e/word-of-the-year/.

2. Bond, David P.G. and Grasby, Stephen E. Late Ordovician mass extinction caused by volcanism, warming, and anoxia, not cooling and glaciation. Geological Society of America, 2020, Vol. 48. https://doi.org/10.1130/G47377.1.

3. Saez, Emmanuel and Zucman, Gabriel. Wealth Inequality in the United States Since 1913: Evidence From Capitalized Income Tax Data. [NBER Working Paper] s.l. : National Bureau of Economic Research, 2014.

4. Kozlowska, Hanna. Pope Francis: Unfettered capitalism is "the dung of the devil". Quartz. [Online] [Cited: March 3, 2016.] http://qz.com/450445/pope-francis-unfettered-capitalism-is-the-dung-of-the-devil/.