Paul P. Mealing


Sunday 1 January 2023

The apparent dichotomous relationship between consciousness and determinism

 Someone (Graham C Lindsay) asked me a question on Quora:

Is it true that every event, without exception, is fully caused by its antecedent conditions?

Graham Lindsay is Scottish, a musician (50 years a keyboard player) and, by his own admission, has a lot of letters after his name. I should point out that I have none. The Quora algorithm gave me the impression that he asked me specifically, but maybe he didn't. As I say at the outset, David Cook gives a more erudite answer than I do. It so happens I've had correspondence with David Cook (he contacted me), and he sent me a copy of his book of poetry. He's a retired psychiatrist and lecturer.

In fact, I recommend that you read his answer in conjunction with mine - we take subtly different approaches without diverging too far.

I concede that there's not a lot that's new in this post, but I've found that rearranging pre-existing ideas can throw up new insights and provoke new thoughts.


Thanks for asking me, I feel flattered. To be honest, I think David Cook gives a better and more erudite answer than I can. I’d also recommend you ask Mark John Fernee (physicist with University of Queensland) who has some ideas on this subject.

I’ll start with Fernee, because he argues for determinism without arguing for superdeterminism, as Sabine Hossenfelder does. To answer the question directly, it appears to be true to the best of our knowledge. What do I mean by that? Everything in the Universe that has happened to date seems to have a cause, and it would appear that there is a causal chain going all the way back to the Big Bang. The future, however, is another matter. In the future we have multiple paths that are expressed in QM as probabilities. In fact, Freeman Dyson argued that QM can only describe the future and not the past. As another Quora contributor (David Moore) pointed out, you can only have a probability less than one for an event in the future. If it’s in the past, it has a probability of one.

In the Universe, chaos rules at virtually every level. A lot of people are unaware that even the orbits of the planets are chaotic, so they are only predictable within a range of hundreds of millions of years. Hossenfelder (whom I cited earlier) has a YouTube video where she demonstrates how a chaotic phenomenon always has a limited horizon of predictability (for want of a better phrase). With the weather it’s about 10 days. This doesn’t stop the Universe being deterministic up to the present, while being unpredictable in the future. The thing about chaotic phenomena is that they are so sensitive to initial conditions that if you could rerun them, you’d get a different outcome; and this applies to the Universe itself. The best-known example is the tossing of a coin, which is a chaotic event. It’s fundamental to probability theory that every coin toss is independent of previous tosses.
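
To make the ‘horizon of predictability’ concrete, here is a minimal sketch (my illustration, not Hossenfelder’s), using the logistic map as a stand-in for any chaotic system:

```python
# Two trajectories of the logistic map starting a hair's breadth apart:
# they track each other for a while, then diverge completely -- the
# 'limited horizon of predictability' described above.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x1, x2 = 0.400000000, 0.400000001   # initial conditions differing by 1e-9
for step in range(1, 51):
    x1, x2 = logistic(x1), logistic(x2)
    if step % 10 == 0:
        print(f"step {step:2d}: x1={x1:.6f}  x2={x2:.6f}  diff={abs(x1-x2):.2e}")
```

Run it and the two trajectories still agree to several decimal places at step 10, yet are completely unrelated by step 40 – which is the sense in which a rerun coin toss comes out differently.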

Regarding QM, we all know that Schrodinger’s equation is deterministic and time-reversible. However, as Fernee points out, the act of ‘measurement’ creates an irreversible event. To quote Paul Davies:

The very act of measurement breaks the time symmetry of quantum mechanics in a process sometimes described as the collapse of the wave function... the rewind button is destroyed as soon as that measurement is made.
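
For reference, the determinism and time-reversibility in question are properties of the standard time-dependent equation (textbook form):

\[ i\hbar\,\frac{\partial \psi}{\partial t} = \hat{H}\,\psi \]

For the usual real-valued Hamiltonians, replacing t with −t and ψ with its complex conjugate gives another valid solution, so the evolution can always be ‘rewound’; the collapse on measurement has no such inverse, which is Davies’ point.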

David Cook, in his answer, mentions the role of imagination in his closing paragraph and I don’t think that can be overstated. To quote another philosopher, Raymond Tallis:

Free agents, then, are free because they select between imagined possibilities, and use actualities to bring about one rather than another.

I feel this is as good a description of free will as you can get. And like David, I think imagination is key here. And this raises the issue of consciousness, because I’m unsure how it fits into the scheme of things. As Schrodinger pointed out, consciousness exists in a constant present, which means that without memory you wouldn’t know you are conscious. And this has actually happened: people have behaved consciously without being aware of it. It happened to my father when he was knocked unconscious in a boxing ring, and I know of other incidents. In my father’s case, he got back on his feet and knocked out his opponent – when he came to, he was standing over his opponent with no memory of what had happened.

I tell this anecdote because it raises a question. If we can respond to events that are harmful or life-threatening without conscious awareness, then why do we need consciousness?

All evidence of consciousness points to a dependency on a neural substrate. We don’t find consciousness in machines, despite predictions that we eventually will. But it seems to me that consciousness acts outside the causal chain of the Universe. We have the ability, as do other sentient creatures, to perform actions on our physical environment that are determined purely by imagination, and therefore thought. And we can even use thought to change the neural pathways in our brains, like a feedback loop, or as Douglas Hofstadter coined it, a ‘strange loop’.

 

Addendum: For my own benefit, I've coined the terms 'weak determinism' and 'strong determinism', to differentiate between deterministic causality and superdeterminism respectively. I know there's a term, 'compatibilism', associated with Hume, which, according to other sources, amounts to the same thing as weak determinism, as I expound on below.

The point is that weak determinism (causality) is compatible with free will, which is what Hume argued, according to the Stanford Encyclopedia reference (linked above). However, Hume famously challenged the very idea of causality, whereas I'd argue that 'weak determinism' is completely dependent on causality being true and a universal principle. On the other hand, 'strong determinism' or superdeterminism (as advocated by Sabine Hossenfelder) axiomatically rules out free will, so there is a fundamental difference.

For the sake of clarity, the determinism I refer to in my essay (and its title) is weak determinism.

Tuesday 20 December 2022

What grounds morality?

 In the most recent issue of Philosophy Now (No 153, Dec 2022/Jan 2023), they’ve published the answers to the last Question of the Month: What Grounds or Justifies Morality? I submitted an answer that wasn’t included, and having read the 10 selected, I believe I could have done better. In my answer, I said, ‘courage’, based on the fact that it takes courage for someone to take a stand against the tide of demonisation of the ‘other’, which we witness so often in history and even contemporary society.
 
However, that is too specific and doesn’t really answer the question, which arguably is seeking a principle, like the ‘Golden Rule’ or the Utilitarian principle of ‘the greatest happiness to the greatest number’. Many answers cited Kant’s appeal to ‘reason’, and some cited religion and others, some form of relativism. All in all, I thought they were good answers without singling any one out.
 
So what did I come up with? Well, partly based on observations of my own fiction and my own life, I decided that morality needed to be grounded in trust. I’ve written about trust at least twice before, and I think it’s so fundamental because both one-on-one relationships (of all types) and society as a whole can’t function properly without it. If you think about it, how well you trust someone is a good measure of your assessment of their moral character. But it functions at all levels of society. Imagine living in a society where you can’t say what you think, where you have to obey strict rules of secrecy and deception or you will be punished. Such societies exist.
 
I’ve noticed a recurring motif in my stories (not deliberate) of loyalties being tested and of moral dilemmas. Both in my private life and professional life, I think trust is paramount. It’s my currency. I realised a long time ago that if people don’t trust me, I have no worth.

Monday 14 November 2022

Kant and modern physics

 I wrote a post on Kant back in February 2020, but it was actually an essay I wrote more than 20 years earlier, when I was a student of philosophy. I would not be able to improve on that essay, and I’m not about to try now. In that essay, I argue that Kant’s great contribution to philosophy, and epistemology in particular, was his idea of the ‘thing-in-itself’, which may remain forever unknowable, as we only have our perceptions of ‘things’.
 
In other posts, I have sometimes argued that the ‘thing-in-itself’ is dependent on the scale at which we can observe it, but there is something deeper that I think only became apparent in the so-called golden age of physics in the 20th Century. In a more recent post, I pointed out that both relativity theory and quantum mechanics (the 2 pillars of modern physics) are observer dependent. I argue that there could be an objective ontology that they can’t describe. I think this is more obvious in the case of special relativity, where different observers literally measure different intervals of both space and time, but I’m getting ahead of myself.
 
On Quora, there are 4 physicists whom I ‘follow’ and read regularly. They are Viktor T Toth, Richard Muller, Mark John Fernee and Ian Miller. Out of these, Miller is possibly the most contentious, as he argues against non-locality in QM (quantum mechanics), a position I’m not aware of any other physicist sharing. Of course, it’s Bell’s Inequality that provides the definitive answer to this, of which Miller has this to say:
 
If you say it must because of violations of Bell’s Inequality, first note that the inequality is a mathematical relationship that contains only numbers; no physical concept is included.
 
But the ‘numbers’ compare classical statistical outcomes with Born statistical outcomes and experiments verify Born’s results, so I disagree. Having said that, Miller makes pertinent points that I find insightful and, like all those mentioned, he knows a lot more about this topic than me.
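
To spell out what those ‘numbers’ are (standard CHSH form, my summary rather than Miller’s words): any local hidden-variable account must satisfy

\[ S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2, \]

whereas the Born rule for an entangled singlet pair gives \(E(a,b) = -\cos(a-b)\), which at the settings \(a=0,\ a'=\pi/2,\ b=\pi/4,\ b'=3\pi/4\) yields \(|S| = 2\sqrt{2} \approx 2.83\) – the value experiments keep confirming.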
 
For example, concerning relativity, he argues that it’s the ruler that changes dimension and not the space being measured. He also points out, regarding the twin paradox, that only one twin gains energy, which is the one whose clock slows down. Note that clocks are also a form of ‘ruler’, but they measure time instead of space. So you can have 2 observers who ‘measure’ different durations of space and time, but agree on ‘now’, when they reunite, as is the case with the twin paradox thought experiment.
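
A worked example of the clock as ‘ruler’ (standard special relativity, with numbers chosen purely for illustration): proper time obeys

\[ \tau = t\sqrt{1 - v^2/c^2}, \]

so a twin travelling at 0.8c for 10 years of Earth time accumulates only \(\tau = 10 \times 0.6 = 6\) years on their own clock – and both twins agree on that asymmetry when they reunite.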
 
This point is slightly off-track, but not irrelevant to the main focus of this post. The main focus is an academic paper jointly written by Shaun Maguire and Richard Muller, titled Now, and the Flow of Time. This paper is arguably as contentious as Miller’s take on non-locality and Bell, because Muller and Maguire argue that ‘space’ can be created.
 
Viktor T Toth, meanwhile, is quite adamant that space is not created, because space is not an entity but a ‘measurement’ between entities called ‘objects’. It has to be said that Muller has stated publicly on Quora that he has the utmost respect for Toth, and neither of them has called the other out over this issue.
 
Toth argues that people confound the mathematical metric with ‘space’ or ‘spacetime’, but I’d argue that this mathematical metric has physical consequences. In another post, I reference another paper, recommended to me by Mark John Fernee (authored by Tamara M. Davis and Charles H. Lineweaver at the University of New South Wales) which describes how a GR Doppler shift intrinsically measures the expansion of space.
 
The general relativistic interpretation of the expansion interprets cosmological redshifts as an indication of velocity since the proper distance between comoving objects increases. However, the velocity is due to the rate of expansion of space, not movement through space, and therefore cannot be calculated with the special relativistic Doppler shift formula.
(My emphasis)
 
As I explain in that post: ‘What they are effectively saying is that there is a distinction between the movement of objects in space and the movement of space itself.’
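
The distinction can be put in formulas (standard cosmology, my paraphrase of the paper’s point). The cosmological redshift measures how much space has expanded between emission and observation,

\[ 1 + z = \frac{a(t_{\text{obs}})}{a(t_{\text{emit}})}, \]

where \(a(t)\) is the scale factor, whereas the special relativistic Doppler shift for motion through space is

\[ 1 + z = \sqrt{\frac{1 + v/c}{1 - v/c}}, \]

and the two only agree at small velocities.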
 
The spacetime metric that Toth refers to provides a reference frame for c, the speed of light. So, whilst a spacetime metric (‘space’ by another name) can travel faster than light with respect to us (so over the horizon of the observable universe), an observer situated in that metric would still measure light as c relative to them.
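
A rough illustration (assuming the usual Hubble constant of about 70 km/s per megaparsec, a figure I’ve supplied, not one from the post): recession velocity grows with distance as

\[ v_{\text{rec}} = H_0 D, \]

so beyond \(D = c/H_0 \approx 14\) billion light years the metric recedes from us faster than light, yet any observer embedded in it still measures light passing them at exactly c.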
 
Muller and Maguire’s paper goes even further, saying that space is created along with time, and they believe this can be measured as ‘a predicted lag in the emergence of gravitational radiation when two black holes merge.’ I won’t go into the details; you would need to read the paper.
 
A conclusion implicit in their theory is that there could be a universal now.
 
A natural question arises: why are the new nows created at the end of time, rather than uniformly throughout time, in the same way that new space is uniformly created throughout the universe?

 
The authors then provide alternative arguments, which I won’t go into, but they do ponder the fundamental difference between space and time, where one is uni-directional and the other is not. As far as we know, there is no ‘edge’ in space but there is in time. Muller and Maguire do wonder if space is ‘created’ throughout the Universe (as quoted above) or at an ‘edge’.
 
You may wonder how Kant fits into all this. It’s because all these discussions are dependent on what we observe and what we theorise, both of which are perceptions. And, in physics, theorising involves mathematics. I’ve argued that mathematics can be seen as another medium determining perceptions, along with all the instruments we’ve built, which now include the LHC and the Hubble and Webb telescopes.
 
Sabine Hossenfelder, whom I often reference on this blog these days, wrote a book called Lost in Math, where she interviews some of the brightest minds in physics and challenges the prevailing paradigm that mathematics can provide answers to questions that experimentation can’t – string theory being the most obvious example.

Before the revolution in cosmology, created by Copernicus and built on by Galileo, Kepler and Newton, people believed that the Sun went round the Earth and that some objects in the night sky would occasionally backtrack in their orbits, which was explained by epicycles. That was overturned, and now it seems obvious that, in fact, the Earth rotates on its axis and orbits the sun along with all the other planets, which explains our ‘perception’ that sometimes the planets go ‘backwards.’
 
I wonder if the next revolution in science and cosmology may also provide a ‘simpler’ picture, where there is a ‘universal now’ that explains the age of the Universe, the edge of time that we all experience and non-locality in QM.
 
Of course, I’m probably wrong.

Addendum: This is Richard Muller talking about time on Quora.

Wednesday 28 September 2022

Humanity’s Achilles’ heel

Good and evil are characteristics that imbue almost every aspect of our nature, which is why they’re the subject of so many narratives, including mythologies and religions, not to mention actual real-world histories. This duality effectively defines what we are, what we are capable of and what we are destined to be.
 
I’ve discussed evil in one of my earliest posts, and also its recurring motif in fiction. Humanity is unique, at least on this small world we call home, in that we can change it on a biblical scale, both intentionally and unintentionally – climate change being the most obvious and recent example. We are doing this in combination with creating the fastest-growing extinction event in the planet’s history, of which most of us are blissfully ignorant.
 
This post is already going off on tangents, but it’s hard to stay on track when there are so many ramifications; because none of these issues are the Achilles’ heel to which the title refers.
 
We have the incurable disease of following leaders who will unleash the worst of humanity onto itself. I wrote a post back in 2015, a year before Trump was elected POTUS, that was very prescient given the events that have occurred since. There are two traits such leaders have that not only define them but paradoxically explain their success.
 
Firstly, they are narcissistic in the extreme, which means that their self-belief is unassailable, no matter what happens. The entire world can collapse around them and somehow they’re untouchable. Secondly, they always come to power in times of division, which they exploit and then escalate to even greater effect. Humans are most irrational in ingroup-outgroup situations, which could be anything from a family dispute to a nationwide political division. Narcissists thrive in this environment, creating a narrative that only resembles the reality inside their head, but which their followers accept unquestioningly.
 
I’ve talked about leadership in other posts, but only fleetingly, and it’s an inherent and necessary quality in almost all endeavours; be it on a sporting field, on an engineering project, in a theatre or in a ‘house’ of government. There is a Confucian saying (so neither Western nor modern): If you want to know the true worth of a person, observe the effects they have on other people’s lives. I’ve long contended that the best leaders are those who bring out the best in the people they lead, which is the opposite of narcissists, who bring out the worst.
 
I’ve argued elsewhere that we are at a crossroads, which will determine the future of humanity for decades, if not centuries ahead. No one can predict what this century will bring, in the same way that no one predicted all the changes that occurred in the last century. My only prediction is that the changes in this century will be even greater and more impactful than the last. And whether that will be for the better or the worse, I don’t believe anyone can say.
 
Do I have an answer? Of course not, but I will make some observations. Virtually my whole working life was spent on engineering projects, which have invariably involved an ingroup-outgroup dynamic. Many people believe that conflict is healthy because it creates competition and by some social-Darwinian effect, the best ideas succeed and are adopted. Well, I’ve seen the exact opposite, and I witness it in our political environment all the time.
 
In reality, what happens is that one side will look for, and find, something negative about every engineering solution to a problem that is proposed. This means that there is continuous stalemate and the project suffers in every way imaginable – morale is depleted, everything is drawn out and we have time and cost overruns, which feed the blame-game to new levels. At worst, the sides end up in legal dispute, where, I should point out, I’ve had considerable experience.
 
By contrast, when sides work together collaboratively, people compromise and respect the expertise of their counterparts. What happens is that problems and issues are resolved and the project is ultimately successful. A lot of this depends on the temperament and skills of the project leader. Leadership requires good people skills.
 
Someone once did a study in the United States in the last century (I no longer have the reference) where they looked for the traits of individuals who were eminently successful. And what they found was that it was not education or IQ that was the determining factor, though that helped. No, the single most important factor was the ability to form consensus.
 
If one looks at prolonged conflicts, like we’ve witnessed in Ireland or the Middle East, people involved in talks will tell you that the ‘hardliners’ will never find peace, only the moderates will. So, if there is a lesson to be learned, it’s not to follow leaders who sow and reap division, but those who are inclusive. That means giving up our ingroup-outgroup mentality, which appears impossible. But, until we do, the incurable disease will recur and we will self-destruct by simply following the cult that self-destructive narcissists are so masterfully capable of growing.
 

Sunday 25 September 2022

What we observe and what is reality are distinct in physics

 I’ve been doing this blog for 15 years now, and in that time some of my ideas have changed or evolved, and, in some areas, my knowledge has increased. As I’ve said on Quora a few times, I read a lot of books by people who know a lot more than me, especially in physics.
 
There is a boundary between physics and philosophy, the shoreline of John Wheeler’s metaphorical ‘island of knowledge in the infinite sea of ignorance’. To quote: “As the island grows so does the shoreline of our ignorance.” And I think ignorance is the key word here, because it’s basically speculation, which means some of us are wrong, including me, most likely. As I’ve often said, ‘Only future generations can tell us how ignorant the current generation is’. I can say that with a lot of confidence, just by looking at the history of science.
 
If this blog has a purpose beyond promoting my own pet theories and prejudices, it is to make people think.
 
Recently, I’ve been pre-occupied with determinism and something called superdeterminism, which has become one of those pet prejudices among some physicists, who believe it’s the only conclusion one can draw from combining relativity theory, quantum mechanics, entanglement and Bell’s theorem. Sabine Hossenfelder is one such advocate, who went so far as to predict that one day all other physicists will agree with her. I elaborate on this below.
 
Mark John Fernee (physicist with Qld Uni), with whom I’ve had some correspondence, is one who disagrees with her. I believe that John Bell himself proposed that superdeterminism was possibly the only resolution to the quandaries posed by his theorem. There are two other videos worth watching, one by Elijah Lew-Smith and a 50min one by Brian Greene, who doesn’t discuss superdeterminism. Nevertheless, Greene’s video gives the best and easiest to understand description of Bell’s theorem and its profound implications for reality.
 
So what is superdeterminism, and how is it distinct from common-or-garden determinism? Well, if you watch the two relevant videos, you get two different answers. According to Sabine, there is no difference, and it’s not really to do with Bell’s theorem but with the measurement problem in QM. She argues that it’s best explained by looking at the double-slit experiment. Interestingly, Richard Feynman argued that all the problems associated with QM can be analysed, if not understood, by studying the double-slit experiment.
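
As a toy illustration of that point (my sketch, not Sabine’s): in the double-slit setup, the unmeasured particle’s two path amplitudes add before squaring, while a which-path measurement forces the squared probabilities to add instead, destroying the interference.

```python
import numpy as np

# Two unit-amplitude paths with a phase difference between them.
phases = np.linspace(0, 2 * np.pi, 5)
a1 = np.ones_like(phases)                  # amplitude via slit 1
a2 = np.exp(1j * phases)                   # amplitude via slit 2

p_interfering = np.abs(a1 + a2) ** 2 / 4   # no measurement: amplitudes add
p_measured = (np.abs(a1) ** 2 + np.abs(a2) ** 2) / 4  # which-path known: probabilities add

for ph, pu, pm in zip(phases, p_interfering, p_measured):
    print(f"phase={ph:4.2f}  interfering P={pu:.2f}  measured P={pm:.2f}")
```

The interfering probability swings between 0 and 1 with the phase (the fringes), while the measured case sits flat at 0.5 – the pattern vanishes once the path is ‘determined’.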
 
Sabine wrote an academic paper on the ‘measurement problem’, co-authored with Jonte R. Hance from the University of Bristol, which I’ve read and is surprisingly free of equations (not completely) but uses the odd term I’m unfamiliar with. I expect I was given a link by Fernee which I’ve since lost (I really can’t remember), but I still have a copy. One of her points is that as long as we have unsolved problems in QM, there is always room for different philosophical interpretations, and she and Hance discuss the most well-known ones. This is slightly off-topic, but only slightly, because even superdeterminism and its apparent elimination of free will is a philosophical issue.
 
Sabine argues that it’s the measurement that creates superdeterminism in QM, which is why she uses the double-slit experiment to demonstrate it. It’s because the ‘measurement’ ‘collapses’ the wave function and ‘determines’ the outcome, that it must have been ‘deterministic’ all along. It’s just that we don’t know it until a measurement is made. At least, this is my understanding of her argument.
 
The video by Elijah Lew-Smith gives a different explanation, focusing solely on Bell’s theorem. I found that it also required more than one viewing, but he makes a couple of points, which I believe go to the heart of the matter. (Greene’s video gives an easier-to-follow description, despite its length).
 
We can’t talk about an objective reality independent of measurement.
(Which echoes Sabine’s salient point in her video.)
 
And this point: There really are instantaneous interactions; we just can’t access them.
 
This is known as ‘non-locality’, and Brian Greene provides the best exposition I’ve seen, and explains how it’s central to Bell’s theorem and to our understanding of reality.
 
On the other hand, Lew-Smith explains non-locality without placing it at the centre of the discussion.
 
If I can momentarily go back to Sabine’s key argument, I addressed this in a post I wrote a few years back. Basically, I argued that you can only know the path an electron or photon takes retrospectively, after the measurement or observation has been made. Prior to that, QM tells us it’s in a superposition of states and we only have probabilities of where it will land. Curiously, I referenced a video by Sabine in a footnote, where she makes this point in her conclusion:
 
You don’t need to know what happens in the future because the particle goes to all points anyway. Except…  It doesn’t. In reality, it goes to only one point. So maybe the reason we need the measurement postulate is because we don’t take this dependency on the future seriously enough.
 
And to me, that’s what this is all about: the measurement is in the future of the wave function, and the path it takes is in the past. This, of course, is what Freeman Dyson claims: that QM cannot describe the past, only the future.
 
And if you combine this perspective with Lew-Smith’s comment about objective reality NOT being independent of the measurement, then objective reality only exists in the past, while the wave function and all its superpositional states exist in the future.
 
So how does entanglement fit into this? Well, this is the second point I highlighted, which is that ‘there really are instantaneous interactions, which we can’t access’, which is ‘non-locality’. And this, as Schrodinger himself proclaimed, is what distinguishes QM from classical physics. In classical physics, ‘locality’ means there is a relativistic causal connection, and in entanglement there is not, which is why Einstein called it ‘spooky action at a distance’.
 
Bell’s theorem effectively tells us that non-locality is real, supported by experiment many times over, but you can’t use it to transmit information faster than light, so relativity is not violated in practical terms. But it does raise questions about simultaneity, which is discussed in Lew-Smith’s video. He demonstrates graphically that different observers will observe a different sequence of measurement, so we have disagreement, even a contradiction, about which ‘measurement’ collapsed the wave function. And this leads to superdeterminism, because, if the outcome is predetermined, then the sequence of measurement doesn’t matter.
 
And this gets to the nub of the issue, because it ‘appears’ that ‘objective reality’ is observer dependent. Relativity theory always gives the result from a specific observer’s point of view, and different observers in different frames of reference can epistemically disagree. Is there a frame of reference that is observer independent? I always like to go back to the twin paradox, because I believe it provides an answer. When the twins reunite, they disagree on how much time has passed, yet they agree on where they are in space-time. There is no absolute time, but there is absolute space-time.
 
Did you know we can deduce the velocity that Earth travels relative to absolute space-time, meaning the overall observable Universe? By measuring the Doppler shift of the CMBR (cosmic microwave background radiation) in all directions, it’s been calculated that we are travelling at 350km/s in the direction of Pisces (ref. Paul Davies, About Time: Einstein’s Unfinished Revolution, 1995). They should teach this in schools.
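
The deduction works because our motion makes the CMBR look slightly hotter ahead of us and cooler behind. To first order, with the modern dipole amplitude of roughly 3.4 mK (my figure, not Davies’),

\[ \frac{v}{c} \approx \frac{\Delta T}{T} \approx \frac{3.4\times10^{-3}}{2.725} \approx 1.2\times10^{-3}, \]

giving v ≈ 370 km/s – the same ballpark as the figure Davies quotes.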
 
Given this context, is it possible that entanglement is a manifestation of objective simultaneity? Not according to Einstein, who argued that ‘the distinction between past, present and future is only a stubbornly persistent illusion’; which is based on the ‘fact’ that simultaneity is observer dependent. But Einstein didn’t live to see Bell’s theorem experimentally verified. Richard Muller, a prize-winning physicist and author (also on Quora), was asked what question he’d ask Einstein if he could hypothetically meet him NOW. I haven’t got a direct copy, but essentially Muller said he’d ask Einstein if he now accepted a ‘super-luminal connection’, given experimental confirmation of Bell’s theorem. In other words, entanglement is like an exception to the rule, where relativity strictly doesn’t apply.
 
Sabine and her co-author, Jonte Hance, make a passing comment that the discussion really hasn’t progressed much since Bohr and Einstein a century ago, and I think they have a point.
 
Mark Fernee, whom I keep mentioning on the sidelines, does make a distinction between determinism and superdeterminism, where determinism simply means that everything is causally connected to something, even if it’s not predictable – chaos being a case in point, which he describes thus:
 
Where this determinism breaks down is with chaotic systems, such as three body dynamics. Chaotic systems are so sensitive to the initial parameters that even a slight inaccuracy can result in wildly different predictions. That's why predicting the weather is so difficult.
Overall, complexity limits the ability to predict the future, even in a causal universe.

 
On the other hand, superdeterminism effectively means the end of free will, and, in his own words, ‘free will is a contentious issue, even among physicists’.
 
Fernee provided a link to another document by Sabine, where she created an online forum specifically to deal with less-than-knowledgeable people and their misguided ideas about physics – crackpots and cranks. It occurred to me that I might fall into this category, but it’s for others to judge. I’m constantly reminded of how little I really know, and that I’m only fiddling around the edges, or on the ‘shoreline of ignorance’, as Wheeler described it, where there are many others far more qualified than me.
 
I not-so-recently wrote a post where I challenged a specific scenario often cited by physicists, where two observers hypothetically ‘observe’ contradictory outcomes of an event on a distant astronomical body that is supposedly happening simultaneously with them.
 
As I said before, relativity is an observer-dependent theory, almost by definition, and we know it works just by using the GPS on our smart-phones. There are algorithms that make relativistic corrections to the signals coming from the satellites, otherwise the map on your phone would not match the reality of your actual location.
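
For the curious, the corrections are easy to estimate with textbook values (the numbers below are mine, not from the post):

```python
# Back-of-envelope GPS clock corrections. Special relativity makes the
# satellite clock run slow; general relativity (weaker gravity at altitude)
# makes it run fast; the net drift is roughly +38 microseconds per day.
GM = 3.986e14            # Earth's gravitational parameter, m^3/s^2
c = 2.998e8              # speed of light, m/s
R_earth = 6.371e6        # Earth's radius, m
r_orbit = 2.657e7        # GPS orbital radius, m
day = 86400              # seconds per day

v = (GM / r_orbit) ** 0.5                                  # orbital speed, ~3.9 km/s
sr_us = -(v**2 / (2 * c**2)) * day * 1e6                   # time dilation (runs slow)
gr_us = (GM / c**2) * (1/R_earth - 1/r_orbit) * day * 1e6  # gravitational shift (runs fast)

print(f"SR: {sr_us:+.1f} us/day  GR: {gr_us:+.1f} us/day  net: {sr_us + gr_us:+.1f} us/day")
```

Left uncorrected, a 38-microsecond daily drift corresponds to over 10 kilometres of ranging error per day, which is why your phone’s map would quickly become useless.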
 
What I challenge is the application of relativity theory to an event that the observer can’t observe, even in principle. In fact, relativity theory rules out a physical observation of a purportedly simultaneous event. So I’m not surprised that we get contradictory results. The accepted view among physicists is that each observer ‘sees’ a different ontology (one in the future and one in the past), whereas I contend that there is an agreed ontology that becomes observable at a later time, when it’s in both observers’ past. (Brian Greene has another video demonstrating the ‘conventional’ view among physicists.)
 
Claudia de Rham is Professor of Physics at Imperial College London, and earlier this year she gave a talk titled What We Don’t Know About Gravity, where she made the revelatory point that Einstein’s GR (general theory of relativity) predicted its own limitations. Basically, if you apply QM probabilities to extremely curved spacetime, you get probabilities over 100%, which is nonsense. GR and QM are mathematically incompatible if we try to quantise gravity, though QFT (quantum field theory) ‘works fine on the manifold of spacetime’, according to the expert Viktor T Toth.
 
Given that relativity theory, as it is applied, is intrinsically observer dependent, I question if it can be (reliably) applied to events that have no causal relation to the observer (meaning outside the observer's light cone, both past and future). Which is why I challenge its application to events the observer can't observe (refer 2 paragraphs ago).

 

Addendum: I changed the title so it's more consistent with the contents of the post. The previous title was Ignorance and bliss; philosophy and science. Basically, the reason we have different interpretations of the same phenomenon is because physics can only tell us about what we observe, and what that means for reality is often debatable; superdeterminism being a case in point. Many philosophers and scientists talk about a ‘gap’ between theory and reality, whereas I claim the gap is between the observation and reality, a la Kant.

Wednesday 7 September 2022

Ontology and epistemology; the twin pillars of philosophy

 I remember in my introduction to formal philosophy that there were 5 branches: ontology, epistemology, logic, aesthetics and ethics. Logic is arguably subsumed under mathematics, which has a connection with ontology and epistemology through physics, and ethics is part of all our lives, from politics to education to social and work-related relations to how one should individually live. Aesthetics is like an orphan in this company, yet art is imbued in all cultures in so many ways, it is unavoidable.
 
However, if you read about Western philosophy, the focus is often on epistemology and its close relation, if not utter dependence, on ontology. Why dependence? Because you can’t have knowledge of something without inferring its existence, even if the existence is purely abstract.
 
There are so many facets to this that it’s difficult to know where to start, but I will start with Kant, because he argued that we can never know ‘the-thing-in-itself’, only a perception of it, which, in a nutshell, is the difference between ontology and epistemology.
 
We need some definitions, and ontology is dictionary defined as the ‘nature of being’, while epistemology is ‘theory of knowledge’, and with these definitions, one can see straightaway the relationship, and Kant’s distillation of it.
 
Of course, one can also see how science becomes involved, because science, at its core, is an epistemological endeavour. In reading and researching this topic, I’ve come to the conclusion that, though science and philosophy have common origins in Western scholarship, going back to Plato, they’ve gone down different paths.
 
If one looks at the last century, which included the ‘golden age of physics’, in parallel with the dominant philosophical paradigm, heavily influenced, if not initiated, by Wittgenstein, we see that the difference can be definitively understood in terms of language. Wittgenstein effectively redefined epistemology as how we frame the world with language, while science, and physics in particular, frames the world in mathematics. I’ll return to this fundamental distinction later.
 
In my last post, I went to some lengths to argue that a fundamental assumption among scientists is that there is an ‘objective reality’. By this, I mean that they generally don’t believe in ‘idealism’ (like Donald Hoffman) which is the belief that objects don’t exist when you don’t perceive them (Hoffman describes it as the same experience as using virtual-reality goggles). As I’ve pointed out before, this is what we all experience when we dream, which I contend is different to the experience of our collective waking lives. It’s the word, ‘collective’, that is the key to understanding the difference – we share waking experiences in a way that is impossible to corroborate in a dream.
 
However, I’ve been reading a lot of posts on Quora by physicists, Viktor T Toth and Mark John Fernee (both of whom I’ve cited before and both of whom I have a lot of respect for). And they both point out that much of what we call reality is observer dependent, which makes me think of Kant.
 
Fernee, when discussing quantum mechanics (QM), keeps coming back to the ‘measurement problem’ and the role of the observer, and how it’s hard to avoid. He discusses the famous ‘Wigner’s friend’ thought experiment, which is an extension of the famous Schrodinger’s cat thought experiment, which implies you have the cat in 2 superpositional states: dead and alive. Eugene Wigner developed a thought experiment whereby 2 experimenters could get contradictory results. Its relevance to this topic is that the ontology is completely dependent on the observer. My understanding of the scenario is that it subverts the distinction between QM and classical physics.
 
I’ve made the point before that a photon travelling across the Universe from some place and time closer to its beginning (like the CMBR) is always in the future of whatever it interacts with, like, for example, an ‘observer’ on Earth. The point I’d make is that billions of years of cosmological time have passed, so in another sense, the photon comes from the observer’s past, who became classical a long time ago. For the photon, time is always zero, but it links the past to the present across almost the entire lifetime of the observable universe.
 
Quantum mechanics, more than any other field, demonstrates the difference between ontology and epistemology, and this was discussed in another post by Fernee. Epistemologically, QM is described mathematically, and is so successful that we can ignore what it means ontologically. This has led to diverse interpretations, from the many worlds interpretation (MWI) to so-called ‘hidden variables’ to the well-known ‘Copenhagen interpretation’.
 
Fernee, in particular, discusses MWI, not that he’s an advocate, but because it represents an ontology that no one can actually observe. Both Toth and Fernee point out that the wave function, which arguably lies at the heart of QM, is never observed, and neither is its ‘decoherence’ (which is the measurement problem by another name), which leads many to contend that it’s a mathematical fiction. I argue that it exists in the future, and that only classical physics is actually observed. QM deals with probabilities, which is purely epistemological. After the ‘observation’, Schrodinger’s equation, which describes the wave function, ceases to have any meaning. One is in the future, and the observation becomes the past as soon as it happens.
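
The epistemological machinery here is the Born rule (standard formulation), the sole bridge from the never-observed wave function to observable statistics:

\[ P(x) = |\psi(x)|^2 \]

Schrodinger’s equation evolves ψ deterministically right up to the observation; the Born rule is the only point of contact with what we actually see, and it only ever delivers probabilities.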
 
I don’t know enough about it, but I think entanglement is the key to its ontology. Fernee points out in another post that entanglement is to do with conservation, whether it be the conservation of momentum or, more usually, the conservation of spin. It leads to what is called non-locality, according to Bell’s Theorem, which means it appears to break with relativistic physics. I say ‘appears’, because it’s well known that it can’t be used to send information faster than light; so, in reality, it doesn’t break relativity. Nevertheless, it led to Einstein’s famous quote about ‘spooky action at a distance’ (which is what non-locality means in layperson’s terms).
 
But entanglement is tied to the wave function decoherence, because that’s when it becomes manifest. It’s crucial to appreciate that entangled particles are described by the same wave function and that’s the inherent connection. It led Schrodinger to claim that entanglement is THE defining feature of QM; in effect, it’s what separates QM from classical physics.
 
I think QM is the best demonstration of Kant’s prescient claim that we can never know the-thing-in-itself, but only our perception of it. QM is a purely epistemological theory – the ontology it describes still eludes us.
 
But relativity theory also suggests that reality is observer dependent. Toth points out that even the number of particles that are detected in some scenarios are dependent on the frame of reference of the observer. This has led at least one physicist (on Quora) to argue that the word ‘particle’ should be banned from all physics text books – there are only fields. (Toth is an expert on QFT, quantum field theory, and argues that particles are a manifestation of QFT.) I won’t elaborate as I don’t really know enough, but what’s relevant to this topic is that time and space are observer dependent in relativity, or appear to be.
 
In a not-so-recent post, I described how different ‘observers’ could hypothetically ‘see’ the same event happening hundreds of years apart, just because they are walking across a street in opposite directions. I use quotation marks, because it’s all postulated mathematically, and, in fact, relativity theory prevents them from observing anything outside their past and future light cones. I actually discussed this with Fernee, and he pointed out that it’s to do with causality. Where there is no causal relation between events, we can’t determine an objective sequence let alone one relevant to a time frame independent of us (like a cosmic time frame). And this is where I personally have an issue, because, even though we can’t observe it or determine it, I argue that there is still an objective reality independently of us.
 
In relativity there is something called proper time (τ), which is the time in the frame of reference of the observer. If spacetime is invariant, then it would logically follow that where you have proper time you should have an analogous ‘proper space’, yet it’s rarely discussed (the nearest standard concept is ‘proper length’). I also think there is a ‘true simultaneity’ but no one else does, so maybe I’m wrong.
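
For what it’s worth, the invariance appealed to above is usually written (standard convention):

\[ ds^2 = c^2\,dt^2 - dx^2 - dy^2 - dz^2, \]

which all observers agree on. For timelike separations, \(d\tau = ds/c\) is the proper time; the spacelike version of the same invariant is the proper length just mentioned.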
 
There is, however, something called the Planck length, and someone asked Toth if this changed relativistically with the Lorentz transformation, like all other ‘rulers’ in relativity physics. He said that a version of relativity was formulated that made the Planck length invariant, but it created problems and didn’t agree with experimental data. What I find interesting about this is that Planck’s constant, h, literally determines the size of atoms, and one doesn’t expect atoms to change size relativistically (but maybe they do). The point I’d make is that these changes are observer dependent, and I’d argue that there is a Planck length that is observer independent, which is the case when there is no observer.
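
For reference (standard formulas, my addition): the Planck length combines all three fundamental constants,

\[ l_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6\times10^{-35}\,\text{m}, \]

while the sense in which h sets the size of atoms shows up in the Bohr radius,

\[ a_0 = \frac{\hbar}{m_e c\,\alpha} \approx 5.3\times10^{-11}\,\text{m}. \]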
 
This has become a long-winded way of explaining how 20th Century science has effectively taken this discussion away from philosophy. Yet that is rarely acknowledged by philosophers, who take refuge in Wittgenstein’s conclusion that language effectively determines what we can understand of the world, because we think in a language and that limits what we can conceptualise. And he’s right, until we come up with new concepts requiring new language. Everything I’ve just discussed was completely unknown more than 120 years ago, when we had no language for it, let alone concepts.
 
Some years ago, I reviewed a book by Don Cupitt titled, Above Us Only Sky, which was really about religion in a secular world. But, in it, Cupitt repeatedly argued that things only have meaning when they are ‘language-wrapped’ (his term) and I now realise that he was echoing Wittgenstein. However, there is a context in which language is magical, and that is when it creates a world inside your head, called a story.
 
I’ve been reading Bryan Magee’s The Great Philosophers, based on a 1987 series of televised interviews with various academics, which started with Plato and ended with Wittgenstein. He discussed Plato with Myles Burnyeat, then Professor of Ancient Philosophy at Cambridge. Naturally, they discussed Socrates, the famous dialogues and the more famous Republic, but towards the end they turned to the Timaeus, which was a work on ‘mathematical science’, according to Burnyeat, that influenced Aristotle and Ptolemy.
 
It's worth quoting their last exchange verbatim:
 
Magee: For us in the twentieth century there is something peculiarly contemporary about the fact that, in the programme it puts forward for acquiring an understanding of the world, Plato’s philosophy gives a central role to mathematical physics.
 
Burnyeat: Yes. What Plato aspired to do, modern science has actually done. And so there is a sort of innate sympathy between the two which does not hold for Aristotle’s philosophy. (My emphasis)


Addendum: This is a very good exposition on the 'measurement problem' by Sabine Hossenfelder, which also provides a very good synopsis of the wave function (ψ), Schrodinger's equation and the Born rule.