Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.
Showing posts with label Quantum Mechanics.

Saturday, 7 December 2024

Mathematics links epistemology to ontology, but it’s not that simple

A recurring theme on this blog is the relationship between mathematics and reality. It started with the Pythagoreans (in Western philosophy) and was famously elaborated upon by Plato. I also think it’s the key element of Kant’s a priori category in his marriage of analytical philosophy and empiricism, though it’s rarely articulated that way.
 
I not-so-recently wrote a post about the tendency to reify mathematical objects into physical objects, and some may validly claim that I am guilty of that. In particular, I found a passage by Freeman Dyson, who warns specifically about doing that with Schrodinger’s wave function (Ψ, the Greek letter psi, pronounced ‘sy’). The point is that psi is one of the most fundamental concepts in QM (quantum mechanics), and is famous for the fact that it has never been observed, and specifically can’t be, even in principle. This is related to the equally famous ‘measurement problem’, whereby a quantum event becomes observable, and I would say, becomes ‘classical’, as in classical physics. My argument is that this is because Ψ only exists in the future of whoever (or whatever) is going to observe it (or interact with it). By expressing it specifically in those terms (of an observer), it doesn’t contradict relativity theory, quantum entanglement notwithstanding (another topic).
 
Some argue, like Carlo Rovelli (who knows a lot more about this topic than me), that Schrodinger’s equation and the concept of a wave function have led QM astray, arguing that if we’d just stuck with Heisenberg’s matrices, there wouldn’t have been a problem. Schrodinger himself demonstrated that his wave function approach and Heisenberg’s matrix approach are mathematically equivalent. And this is why we have so many ‘interpretations’ of QM, because they can’t be mathematically delineated. It’s the same with Feynman’s QED and Schwinger’s QFT, which Dyson showed were mathematically equivalent, along with Tomonaga’s approach, which got them all a Nobel prize, except Dyson.
 
As I pointed out in another post, physics is really just mathematical models of reality, and some are more accurate and valid than others. In fact, some have turned out to be completely wrong and misleading, like Ptolemy’s Earth-centric model of the solar system. So Rovelli could be right about the wave function. Speaking of reifying mathematical entities into physical reality, I had an online discussion with Qld Uni physicist, Mark John Fernee, who takes it a lot further than I do, claiming that 3-dimensional space (or 4-dimensional spacetime) is a mathematical abstraction. Yet, I think there really are 3 dimensions of space, because the number of dimensions affects the physics in ways that would be catastrophic in another hypothetical universe (see John Barrow’s The Constants of Nature). So it’s more than an abstraction. This was a key point of difference I had with Fernee (you can read about it here).
 
All of this is really a preamble, because I think the most demonstrable and arguably most consequential example of the link between mathematics and reality is chaos theory, and it doesn’t involve reification. Having said that, this again led to a point of disagreement between myself and Fernee, but I’ll put that to one side for the moment, so as not to confuse you.
 
A lot of people don’t know that chaos theory started out as purely mathematical, largely due to one man, Henri Poincare. The thing about physical chaotic phenomena is that they are theoretically deterministic yet unpredictable, simply because the initial conditions of a specific event can’t be ‘physically’ determined. Now some physicists will tell you that this is a physical limitation of our ability to ‘measure’ the initial conditions, and infer that if we could, it would be ‘problem solved’. Only it wouldn’t, because all chaotic phenomena have a ‘horizon’ beyond which it’s impossible to make accurate predictions, which is why weather predictions can’t go reliably beyond 10 days, while being very accurate over a few days. Sabine Hossenfelder explains this very well.
 
But here’s the thing: it’s built into the mathematics of chaos. It’s impossible to calculate the initial conditions because you need to do the calculation to infinite decimal places. Paul Davies gives an excellent description and demonstration in his book, The Cosmic Blueprint. (This was my point of contention with Fernee, talking about coin-tosses.)
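To make that concrete, here's a minimal illustration of my own (not Davies' actual example): iterate the logistic map, a standard textbook toy model of chaos, from two initial conditions that differ by only one part in a trillion.

```python
# A minimal sketch (mine, not Davies' example): the logistic map in its fully
# chaotic regime, run from two initial conditions differing by 1 part in 10^12.
r = 4.0
x_a, x_b = 0.400000000000, 0.400000000001

for step in range(1, 61):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f}")
# The two runs track each other for the first few dozen steps, then become
# completely uncorrelated (here by around step 40), even though the rule is
# fully deterministic. That is the 'horizon' in miniature.
```

The rule is completely deterministic, yet the two runs part company after a few dozen steps, which is the mathematical 'horizon' in miniature.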
 
As I discussed on another post, infinity is a mathematical concept that appears to have little or no relevance to reality. Perhaps the Universe is infinite in space – it isn’t in time – but if it is, we might never know. Infinity avoids empirical confirmation almost by definition. But I think chaos theory is the exception that proves the rule. The reason we can’t determine the exact initial conditions of a chaotic event, is not just physical but mathematical. As Fernee and others have pointed out, you can manipulate a coin-toss to make it totally predictable, but that just means you’ve turned a chaotic event into a non-chaotic event (after all it’s a human-made phenomenon). But most chaotic events are natural, like the orbits of the planets and biological evolution. The creation of the Earth’s moon was almost certainly a chaotic event, without which complex life would almost certainly never have evolved, so they can be profoundly consequential as well as completely unpredictable.
 

Monday, 18 November 2024

What’s inside a black hole?

 The correct answer is no one knows, but I’m going to make a wild, speculative, not fully-informed guess and suggest, possibly nothing. But first, a detour, to provide some context.
 
I came across an interview with very successful, multi-award-winning, Australian-Canadian actor, Pamela Rabe, who is best known (in Australia, at least) for her role in Wentworth (about a fictional female prison). She was interviewed by Benjamin Law in The Age Good Weekend magazine, a few weekends ago, where among many other questions, he asked, Is there a skill you wish you could acquire? She said there were so many, including singing better, speaking more languages and that she wished she was more patient. Many decades ago, I remember someone asking me a similar question, and I can still remember the answer: I said that I wish I was more intelligent, and I think that’s still true.
 
Some people might be surprised by this, and perhaps it’s a good thing I’m not, because I think I would be insufferable. Firstly, I’ve always found myself in the company of people who are much cleverer than me, right from when I started school, and right through my working life. The reason I wish I was more intelligent is that I’ve always been conscious of trying to understand things that are beyond my intellectual abilities. My aspirations don’t match my capabilities.
 
And this brings me to a discussion on black holes, which must, in some respects, represent the limits of what we know about the Universe and maybe what is even possible to know. Not surprisingly, Marcus du Sautoy spent quite a few pages discussing black holes in his excellent book, What We Cannot Know. But there is a short YouTube video by one of the world’s leading experts on black holes, Kip Thorne, which provides a potted history. I also, not that long ago, read his excellent book, Black Holes and Time Warps: Einstein’s Outrageous Legacy (1994), which gives a very comprehensive history, in which he was not just an observer, but one of the actors.
 
It's worth watching the video because it highlights the role mathematics has played in physics, not only since Galileo, Kepler and Newton, but increasingly so in the 20th Century, following the twin revolutions of quantum mechanics and relativity theory. In fact, relativity theory predicted black holes, yet most scientists (including Einstein, initially) preferred to believe that they couldn’t exist; that Nature wouldn’t allow it.
 
We all suffer from these prejudices, including myself (and even Einstein). I discussed in a recent post how we create mathematical models in an attempt to explain things we observe. But more and more, in physics, we use mathematical models to explain things that we don’t observe, and black holes are the perfect example. If you watch the video interview with Thorne, this becomes obvious, because scientists were gradually won over by the mathematical arguments, before there was any incontrovertible physical evidence that they existed.
 
And since no one can observe what’s inside a black hole, we totally rely on mathematical models to give us a clue. Which brings me to the title of the post. The best known equation in reference to black holes is the Bekenstein-Hawking equation, which gives us the entropy of a black hole and predicts Hawking radiation. The radiation is yet to be observed, but that’s not surprising, as detecting it is virtually impossible: it’s simply not ‘hot’ enough to distinguish from the CMBR (cosmic microwave background radiation), which permeates the entire universe.

Here is the formula:

S(BH) = kA/(4lp^2)

Where S(BH) is the entropy of the black hole, k is Boltzmann’s constant, A is the surface area of the sphere at the event horizon, and lp is the Planck length, given by this formula:

lp = √(Gh/(2πc^3))

Where G is the gravitational constant, h is Planck’s constant and c is the speed of light.

Hawking liked the idea that it’s the only equation in physics to incorporate the 4 fundamental natural constants (k, G, h and c) in one formula.
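To put some numbers on it, here's a minimal sketch of my own (not from any of the sources above), just plugging standard values of the constants into the formula for a black hole of one solar mass:

```python
# A minimal sketch (my own illustration): the Bekenstein-Hawking entropy of a
# one-solar-mass black hole, using the formula quoted above.
import math

G = 6.674e-11      # gravitational constant (m^3 kg^-1 s^-2)
h = 6.626e-34      # Planck's constant (J s)
c = 2.998e8        # speed of light (m/s)
k = 1.381e-23      # Boltzmann's constant (J/K)
M = 1.989e30       # one solar mass (kg)

lp = math.sqrt(G * h / (2 * math.pi * c**3))   # Planck length, ~1.6e-35 m
rs = 2 * G * M / c**2                          # Schwarzschild radius, ~3 km
A = 4 * math.pi * rs**2                        # area of the event horizon

S_BH = k * A / (4 * lp**2)                     # ~1.5e54 J/K
print(f"Planck length:        {lp:.2e} m")
print(f"Schwarzschild radius: {rs:.2e} m")
print(f"Entropy S(BH):        {S_BH:.2e} J/K")
```

Nothing in that calculation is directly observable, of course, which is precisely the point: it all hangs off that handful of natural constants.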

So, once again, mathematics predicts something that’s never been observed, yet most scientists believe it to be true. This led to what was called the ‘information paradox’: that all information falling into a black hole would be lost. But what intrigues me is that if a black hole can, in principle, completely evaporate by converting all its mass into radiation, then it implies that the mass is not in fact lost – it must be still there, even if we can’t see it. This means, by inference, that it can’t have disappeared down a wormhole, which is one of the scenarios conjectured.

One of the mathematical models proposed is the 'holographic principle' for black holes, for which I’ll quote directly from Wikipedia, because it specifically references what I’ve already discussed.

The holographic principle was inspired by the Bekenstein bound of black hole thermodynamics, which conjectures that the maximum entropy in any region scales with the radius squared, rather than cubed as might be expected. In the case of a black hole, the insight was that the information content of all the objects that have fallen into the hole might be entirely contained in surface fluctuations of the event horizon. The holographic principle resolves the black hole information paradox within the framework of string theory.

I know this is a long hop to make, but what if the horizon not only contains the information but actually contains all the mass? In other words, what if everything is frozen at the event horizon, because that’s where time ‘stops’? Most probably not true, and I don’t know enough to make a cogent argument. However, it would mean that the singularity predicted to exist at the centre of a black hole would not include its mass, but only spacetime.

Back in the 70s, I remember reading an article in Scientific American by a philosopher, who effectively argued that a black hole couldn’t exist. Now this was when their purported existence was mostly mathematical, and no one could unequivocally state that they existed physically. I admit I’m hazy about the details but, from what I can remember, he argued that it was self-referencing because it ‘swallowed itself’. Obviously, his argument was much more elaborate than that one-liner suggests. But I do remember thinking his argument flawed and I even wrote a letter to Scientific American challenging it. Basically, I think it’s a case of conflating the language used to describe a phenomenon with the physicality of it.

I only raise it now, because, as a philosopher, I’m just as ignorant of the subject as he was, so I could be completely wrong.


Addendum 1: I was of 2 minds whether to write this, but it kept bugging me - wouldn't leave me alone, so I wrote it down. I've no idea how true it might be, hence all the caveats and qualifications. It's absolutely at the limit of what we can know at this point in time. As I've said before, philosophy exists at the boundary of science and ignorance. It ultimately appealed to my aesthetics and belief in Nature’s aversion to perversity.

Addendum 2: Another reason why I'm most likely wrong is that there is a little-known quirk of Newton's theory of gravity: that the gravitational 'force' anywhere inside a perfectly symmetrical hollow sphere is zero. So the inside of a black hole exerting zero gravitational force would have to be the ultimate irony, which makes it highly improbable. I've no idea how that relates to the 'holographic principle' for a black hole. But I still don't think all the mass gets sucked into a singularity or down a wormhole. My conjecture is based purely on the idea that 'time' might well become 'zero' at the event horizon, though, from what I've read, no physicist thinks so. From an outsider's perspective, time dilation becomes asymptotically infinite (the rate of time passing effectively going to zero, but perhaps taking the Universe's lifetime to reach it). In this link, it raises a series of questions that seem to have no definitive answers. The alternative idea is that it's spacetime that 'falls' into a black hole, therefore taking all the mass with it.
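For what it's worth, that hollow-sphere quirk is easy to check numerically. Here's a rough Monte Carlo sketch of my own (a unit shell of equal point masses, with G and the masses set to 1):

```python
# A rough numerical check (mine) of Newton's shell theorem: the net
# gravitational force on a test point inside a uniform hollow sphere of
# point masses sums to approximately zero.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
v = rng.normal(size=(N, 3))
shell = v / np.linalg.norm(v, axis=1, keepdims=True)   # N points on a unit sphere

test_point = np.array([0.4, 0.1, -0.3])   # anywhere strictly inside the shell

r = shell - test_point                     # vectors from test point to each mass
d = np.linalg.norm(r, axis=1, keepdims=True)
force = (r / d**3).sum(axis=0) / N         # inverse-square contributions, G*m = 1

print(force)   # each component is ~0, and shrinks further as N grows
```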

Addendum 3: I came across this video by Tibbees (from a year ago), whom I recommend. She cites a book by Carlo Rovelli, White Holes, which is also the title of her video. Now, you can't talk about white holes without talking about black holes; they are just black holes time reversed (as she explicates). We have no evidence they actually exist, unless the Big Bang is a white hole (also mentioned). I have a lot of time for Carlo Rovelli, even though we have philosophical differences (what a surprise). Basically, he argues that, at a fundamental level, time doesn't exist, but it's introduced into the universe as a consequence of entropy (not the current topic). 

Tibbees gives a totally different perspective to my post, which is why I bring it up. Nevertheless, towards the end, she mentions that our view of a hypothetical person (she suggests Rovelli) entering a black hole is that their existence becomes asymptotically infinite. But what if, in this case, what we perceive is what actually happens? Then my scenario makes sense. No one else believes that, so it's probably incorrect.

Monday, 28 October 2024

Do we make reality?

 I’ve read 2 articles, one in New Scientist (12 Oct 2024) and one in Philosophy Now (Issue 164, Oct/Nov 2024), which, on the surface, seem unrelated, yet both deal with human exceptionalism (my term) in the context of evolution and the cosmos at large.
 
Starting with New Scientist, there is an interview with theoretical physicist, Daniele Oriti, under the heading, “We have to embrace the fact that we make reality” (quotation marks in the original). In some respects, this continues with themes I raised in my last post, but with different emphases.
 
This helps to explain the title of the post, but, even if it’s true, there are degrees of possibilities – it’s not all or nothing. Having said that, Donald Hoffman would argue that it is all or nothing, because, according to him, even ‘space and time don’t exist unperceived’. On the other hand, Oriti’s argument is closer to Paul Davies’ ‘participatory universe’ that I referenced in my last post.
 
Where Oriti and I possibly depart, philosophically speaking, is that he calls the idea of a reality independent of us ‘observers’, “naïve realism”. He acknowledges that this is ‘provocative’, but like many provocative ideas it provides food for thought. Firstly, I will delineate how his position differs from Hoffman’s (even though he never mentions Hoffman), because I think the distinction is important.
 
Both Oriti and Hoffman argue that there seems to be something even more fundamental than space and time, and there is even a recent YouTube video where Hoffman claims that he’s shown mathematically that consciousness produces the mathematical components that give rise to spacetime; he has published a paper on this (which I haven’t read). But, in both cases (by Hoffman and Oriti), the something ‘more fundamental’ is mathematical, and one needs to be careful about reifying mathematical expressions, which I once discussed with physicist, Mark John Fernee (Qld University).
 
The main issue I have with Hoffman’s approach is that space-time is dependent on conscious agents creating it, whereas, from my perspective and that of most scientists (although I’m not a scientist), space and time exist external to the mind. There is an exception, of course, and that is when we dream.
 
If I was to meet Hoffman, I would ask him if he’s heard of proprioception, which I’m sure he has. I describe it as the 6th sense we are mostly unaware of, but which we couldn’t live without. Actually, we could, but with great difficulty. Proprioception is the sense that tells us where our body extremities are in space, independently of sight and touch. Why would we need it, if space is created by us? On the other hand, Hoffman talks about a ‘H sapiens interface’, which he likens to ‘desktop icons on a computer screen’. So, somehow our proprioception relates to a ‘spacetime interface’ (his term) that doesn’t exist outside the mind.
 
A detour, but relevant, because space is something we inhabit, along with the rest of the Universe, and so is time. In relativity theory there is absolute space-time, as opposed to absolute space and time separately. It’s called the fabric of the universe, which is more than a metaphor. As Viktor Toth points out, even QFT seems to work ‘just fine’ with spacetime as its background.
 
We can do quantum field theory just fine on the curved spacetime background of general relativity.

 
[However] what we have so far been unable to do in a convincing manner is turn gravity itself into a quantum field theory.
 
And this is where Oriti argues we need to find something deeper. To quote:
 
Modern approaches to quantum gravity say that space-time emerges from something deeper – and this could offer a new foundation for physical laws.
 
He elaborates: I work with quantum gravity models in which you don’t start with a space-time geometry, but from more abstract “atomic” objects described in purely mathematical language. (Quotation marks in the original.)
 
And this is the nub of the argument: all our theories are mathematical models and none of them are complete, in as much as they all have limitations. If one looks at the history of physics, we have uncovered new ‘laws’ and new ‘models’ when we’ve looked beyond the limitations of an existing theory. And some mathematical models even turned out to be incorrect, despite giving answers to what was ‘known’ at the time. The best example being Ptolemy’s Earth-centric model of the solar system. Whether string theory falls into the same category, only future historians will know.
 
In addition, different models work at different scales. As someone pointed out (Mile Gu at the University of Queensland), mathematical models of phenomena at one scale are different to mathematical models at an underlying scale. He gave the example of magnetism, demonstrating that mathematical modelling of the magnetic forces in iron could not predict the pattern of atoms in a 3D lattice as one might expect. In other words, there should be a causal link between individual atoms and the overall effect, but it could not be determined mathematically. To quote Gu: “We were able to find a number of properties that were simply decoupled from the fundamental interactions.” Furthermore, “This result shows that some of the models scientists use to simulate physical systems have properties that cannot be linked to the behaviour of their parts.”
 
This makes me sceptical that we will find an overriding mathematical model that will entail the Universe at all scales, which is what theories of quantum gravity attempt to do. One of the issues that some people raise is that a feature of QM is superposition, and the superposition of a gravitational field seems inherently problematic.
 
Personally, I think superposition only makes sense if it’s describing something that is yet to happen, which is why I agree with Freeman Dyson that QM can only describe the future, which is why it only gives us probabilities.
 
Also, in quantum cosmology, time disappears (according to Paul Davies, among others) and this makes sense (to me), if it’s attempting to describe the entire universe into the future. John Barrow once made a similar point, albeit more eruditely.
 
Getting off track, but one of the points that Oriti makes is whether the laws and the mathematics that describes them are epistemic or ontic. In other words, are they reality or just descriptions of reality? I think it gets blurred, because while they are epistemic by design, there is still an ontology that exists without them, whereas Oriti calls that ‘naïve realism’. He contends that reality doesn’t exist independently of us. This is where I always cite Kant: that we may never know the ‘thing-in-itself,’ but only our perception of it. Where I diverge from Kant is that the mathematical models are part of our perception. Where I depart from Oriti is that I argue there is a reality that exists independently of us.
 
Both QM and relativity theory are observer-dependent, which means they could both be describing an underlying reality that continually eludes us. Whereas Oriti argues that ‘reality is made by our models, not just described by them’, which would make it subjective.
 
As I pointed out in my last post, there is an epistemological loop, whereby the Universe created the means to understand itself, through us. Whether there is also an ontological loop as both Davies and Oriti infer, is another matter: do we determine reality through our quantum mechanical observations? I will park that while I elaborate on the epistemic loop.
 
And this finally brings me to the article in Philosophy Now by James Miles titled, We’re as Smart as the Universe gets. He argues that, from an evolutionary perspective, there is a one-in-one-billion possibility that a species with our cognitive abilities could arise by natural selection, and there is no logical reason why we would evolve further, from an evolutionary standpoint. I have touched on this before, where I pointed out that our cultural evolution has overtaken our biological evolution and that would also happen to any other potential species in the Universe who developed cognitive abilities to the same level. Dawkins coined the term, ‘meme’, to describe cultural traits that have ‘survived’, which now, of course, has currency on social media way beyond its original intention. Basically, Dawkins saw memes as analogous to genes, which get selected; not by a natural process but by a cultural process.
 
I’ve argued elsewhere that mathematical theorems and scientific theories are not inherently memetic. This is because they are chosen because they are successful, whereas memes are successful because they are chosen. Nevertheless, such theorems and theories only exist because a culture has developed over millennia which explores them and builds on them.
 
Miles talks about ‘the high intelligence paradox’, which he associates with Darwin’s ‘highest and most interesting problem’. He then discusses the inherent selection advantage of co-operation, not to mention specialisation. He talks about the role that language has played, which is arguably what really separates us from other species. I’ve argued that it’s our inherent ability to nest concepts within concepts ad infinitum (which is most obvious in our facility for language, like I’m doing now) that allows us not only to tell stories, compose symphonies and explore an abstract mathematical landscape, but also to build motor cars and aeroplanes and fly men to the moon. Are we the only species in the Universe with this super-power? I don’t know, but it’s possible.
 
There are 2 quotes I keep returning to:
 
The most incomprehensible thing about the Universe is that it’s comprehensible. (Einstein)
 
The Universe gave rise to consciousness and consciousness gives meaning to the Universe.
(Wheeler)
 
I haven’t elaborated, but Miles makes the point, while referencing historical antecedents, that there appears no evolutionary 'reason’ that a species should make this ‘one-in-one-billion transition’ (his nomenclature). Yet, without this transition, the Universe would have no meaning that could be comprehended. As I say, that’s the epistemic loop.
 
As for an ontic loop, that is harder to argue. Photons exist in zero time, which is why I contend they are always in the future of whatever they interact with, even if they were generated in the CMBR some 13.8 billion years ago. So how do we resolve that paradox? I don’t know, but maybe that’s the link that Davies and Oriti are talking about, though neither of them mentions it. But here’s the thing: when you do detect such a photon (for which time is zero) you instantaneously ‘see’ back to 380,000 years after the Universe’s birth.





Thursday, 29 August 2024

How scale demonstrates that mathematics is intrinsically entailed in the Universe

 I momentarily contemplated another title: Is the Planck limit an epistemology or an ontology? Because that’s basically the topic of a YouTube video that’s the trigger for this post. I wrote a post some time ago where I discussed whether the Universe is continuous or discrete, and basically concluded that it was continuous. Based on what I’ve learned from this video, I might well change my position. But I should point out that my former opposition was based more on the idea that it could be quantised into ‘bits’ of information, whereas now I’m willing to acknowledge that it could be granular at the Planck scale, which I’ll elaborate on towards the end. I still don’t think that the underlying reality of the Universe is in ‘bits’ of information, therefore potentially created and operated by a computer.
 
Earlier this year, I discussed the problem of reification of mathematics, so I want to avoid that if possible. By reification, I mean treating a mathematical entity as if it were physically real. Basically, physics works by formulating mathematical models that we then compare to reality through observations. But as Freeman Dyson pointed out, the wave function (Ψ), for example, is a mathematical entity and not a physical entity, which is sometimes debated. The fact is that if it does exist physically, it’s never observed, and my contention is that it ‘exists’ in the future; a view that is consistent with Dyson’s own philosophical viewpoint that QM can only describe the future and not the past.
 
And this brings me to the video, which has nothing to say about wave functions or reified mathematical entities, but uses high school mathematics to explore such esoteric and exotic topics as black holes and quantum gravity. There is one step involving integral calculus, which is about as esoteric as the maths becomes, and if you allow that 1/∞ = 0, it leads to the formula for the escape velocity from any astronomical body (including Earth). Note that the escape velocity literally allows an object to escape a gravitational field to infinity (∞). And the escape velocity for a black hole is c (the speed of light).
 
All the other mathematics is basic algebra using some basic physics equations, like Newton’s equation for gravity, Planck’s equation for energy, Heisenberg’s Uncertainty Principle using Planck’s Constant (h), Einstein’s famous equation for the equivalence of energy and mass, and the equation for the Coulomb Force between 2 point electric charges (electrons). There is also the equation for the Schwarzschild radius of a black hole, which is far easier to derive than you might imagine (despite the fact that Schwarzschild originally derived it from Einstein’s field equations).
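For what it's worth, the energy-balance version of that derivation (my own paraphrase, so treat it as a sketch rather than a transcript of the video) goes like this. Equate a projectile's kinetic energy with the gravitational potential energy it has to overcome to reach infinity (which is where the 1/∞ = 0 step comes in):

½mv^2 = GMm/r, which gives the escape velocity v = √(2GM/r)

Setting v = c and solving for r then gives the Schwarzschild radius, r(s) = 2GM/c^2, mentioned above.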
 
Back in May 2019, I wrote a post on the Universe’s natural units, which involves the fundamental natural constants, h, c and G. This was originally done by Planck himself, which I describe in that post, while providing a link to a more detailed exposition. In the video (embedded below), the narrator takes a completely different path to deriving the same Planck units before describing a method that Planck himself would have used. In so doing, he explains how at the Planck level, space and time are not only impossible to observe, even in principle, but may well be impossible to remain continuous in reality. You need to watch the video, as he explains it far better than I can, just using high school mathematics.
 
Regarding the title I chose for this post, Roger Penrose’s Conformal Cyclic Cosmology (CCC) model of the Universe, exploits the fact that a universe without matter (just radiation) is scale invariant, which is essential for the ‘conformal’ part of his theory. However, that all changes when one includes matter. I’ve argued in other posts that different forces become dominant at different scales, from the cosmological to the subatomic. The point made in this video is that at the Planck scale all the forces, including gravity, become comparable. Now, as I pointed out at the beginning, physics is about applying mathematical models and comparing them to reality. We can’t, and quite possibly never will, be able to observe reality at the Planck scale, yet the mathematics tells us that it’s where all the physics we currently know is compatible. It tells me that not only is the physics of the Universe scale-dependent, but it's also mathematically dependent (because scale is inherently mathematical). In essence, the Universe’s dynamics are determined by mathematical parameters at all scales, including the Planck scale.
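As a back-of-the-envelope check on that claim (my own sketch, using standard values of the constants, not the video's own derivation), compare the gravitational and Coulomb forces between two electrons, and then between two hypothetical particles of Planck mass carrying the same elementary charge:

```python
# A rough check (mine) that gravity only becomes comparable to the other forces
# at the Planck scale: the ratio of gravitational to Coulomb force for two
# electrons, versus two hypothetical Planck-mass particles with charge e.
import math

G    = 6.674e-11     # gravitational constant (m^3 kg^-1 s^-2)
hbar = 1.055e-34     # reduced Planck constant (J s)
c    = 2.998e8       # speed of light (m/s)
k_e  = 8.988e9       # Coulomb constant (N m^2 C^-2)
e    = 1.602e-19     # elementary charge (C)
m_e  = 9.109e-31     # electron mass (kg)

m_planck = math.sqrt(hbar * c / G)    # Planck mass, ~2.2e-8 kg

def gravity_over_coulomb(mass):
    # The separation r cancels, since both forces fall off as 1/r^2
    return G * mass**2 / (k_e * e**2)

print(f"Two electrons:           {gravity_over_coulomb(m_e):.1e}")      # ~2.4e-43
print(f"Two Planck-mass charges: {gravity_over_coulomb(m_planck):.0f}") # ~137
```

For electrons, gravity loses by more than forty orders of magnitude; for Planck-mass charges the two forces are within a couple of orders of magnitude of each other, which is the sense in which all the forces become comparable at that scale.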
 
Note that the mathematical relationships in the video use ~ not = which means that they are approximate, not exact. But this doesn’t detract from the significance that 2 different approaches arrive at the same conclusion, which is that the Planck scale coincides with the origin of the Universe incorporating all forces equivalently.
 
 
Addendum: I should point out that Viktor T Toth, who knows a great deal more about this than me, argues that there is, in fact, no limit to what we can measure in principle. Even the narrator in the video frames his conclusion cautiously and with caveats. In other words, we are in the realm of speculative physics. Nevertheless, I find it interesting to contemplate where the maths leads us.



Sunday, 7 April 2024

What does physics really tell us about reality?

A little while ago I got into another discussion with Mark John Fernee (see previous post), but this time dealing with the relationship between ontology and epistemology as determined by physics. It came about in reference to a paper in Physics Today that someone cited, by N. David Mermin, a retired professor of physics in Ithaca, New York, titled What’s bad about this habit. In particular, he talked about our tendency to ‘reify’ mathematically determined theories into reality. It helps if we have some definitions, which Fernee conveniently provided, and which are both succinct and precise.

Epistemology - concerning knowledge.

Ontology - concerning reality.

Reify - to think of an idea as real.


It so happens that around the same time I read an article in New Scientist (25 Mar 2024, pp.32-5) Strange but true? by philosopher, Eric Schwitzgebel, which covers similar territory. The title tells you little, but it’s really about how modern theories in physics don’t really tell us what reality is; instead giving us a range of possibilities to choose from.

I will start with Mermin, who spends the first page talking about quantum mechanics (QM), as it’s the most obvious candidate for a mathematical theory that gets reified by almost everyone who encounters it. This selected quote gives a good feel for what he’s talking about.

When I was a graduate student learning quantum field theory, I had a friend who was enchanted by the revelation that quantum fields were the real stuff that makes up the world. He reified quantum fields. But I hope you will agree that you are not a continuous field of operators on an infinite-dimensional Hilbert space. Nor, for that matter, is the page you are reading or the chair you are sitting in. Quantum fields are useful mathematical tools. They enable us to calculate things.

I found another quote by Freeman Dyson (2014), who makes a similar point to Mermin’s about the wave function (Ψ).

Unfortunately, people writing about quantum mechanics often use the phrase "collapse of the wave-function" to describe what happens when an object is observed. This phrase gives a misleading idea that the wave-function itself is a physical object. A physical object can collapse when it bumps into an obstacle. But a wave-function cannot be a physical object. A wave-function is a description of a probability, and a probability is a statement of ignorance. Ignorance is not a physical object, and neither is a wave-function. When new knowledge displaces ignorance, the wave-function does not collapse; it merely becomes irrelevant.


But Mermin goes on to challenge even the reality of space and time, arguing that it is a mathematical abstraction.

What about spacetime itself? Is that real? Spacetime is a (3+1) dimensional mathematical continuum. Even if you are a mathematical Platonist, I would urge you to consider that this continuum is nothing more than an extremely effective way to represent relations between distinct events.

He then goes on to explain that ‘an event… can be represented as a mathematical point in spacetime.’

He elaborates on how this has become so reified in ordinary language that we’re no longer aware that it is an abstraction.

So spacetime is an abstract four-dimensional mathematical continuum of points that approximately represent phenomena whose spatial and temporal extension we find it useful or necessary to ignore. The device of spacetime has been so powerful that we often reify that abstract bookkeeping structure, saying that we inhabit a world that is such a four (or, for some of us, ten) dimensional continuum. The reification of abstract time and space is built into the very languages we speak, making it easy to miss the intellectual sleight of hand.


And this is where I start to have issues with his overall thesis, whereas Fernee said, ‘I completely concur with what he has written, and it is well articulated.’ 

When I challenged Fernee specifically on Mermin’s points about space-time, Fernee argued:

His contention was that even events in space-time are an abstraction. We all assume the existence of an objective reality, and I don't know of anyone who would seriously challenge that idea. Yet our descriptions are abstractions. All we ask of them is that they are consistent, describe the observed phenomena, and can be used to make predictions.

I would make an interesting observation on this very point, one that distinguishes an AI’s apparent perspective of space and time from ours. Even using the word ‘apparent’ implies there is a difference that we don’t think about.

I’ve made the point in other posts, including one on Kant, that we create a model of space and time in our heads which we use to interact with the physical space and time that exists outside our heads, and so do all living creatures with eyes, ears and touch. In fact, the model is so realistic that we think it is the external reality.

When we throw or catch a ball on the sporting field, we know that our brains don’t work out the quadratic equations that determine where it’s going to land. But imagine an AI-controlled artillery device, which would make those calculations and use a 3-dimensional grid to determine where its ordnance was going to hit. Likewise, imagine an AI-controlled drone using GPS co-ordinates – in other words, a mathematical abstraction of space and time – to navigate its way to a target. And that demonstrates the fundamental difference that I think Mermin is trying to delineate. The point is that, from our perspective, there is no difference.
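Purely for illustration (my own toy example, with made-up numbers, and nothing to do with Mermin or Fernee), this is the kind of explicit calculation such a device would perform, which our brains demonstrably don't do when we catch a ball:

```python
# An illustrative sketch (mine): the quadratic-equation calculation an
# AI-controlled device would do to find where a projectile lands,
# ignoring air resistance.
import math

v0 = 20.0                  # launch speed (m/s) -- made-up value
angle = math.radians(35)   # launch angle -- made-up value
g = 9.81                   # gravitational acceleration (m/s^2)

vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)

# Solve y(t) = vy*t - g*t^2/2 = 0 for the non-zero root (time of flight)
t_flight = 2 * vy / g
landing_x = vx * t_flight

print(f"Time of flight: {t_flight:.2f} s, landing distance: {landing_x:.1f} m")
```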

This quote gives a clearer description of Mermin’s philosophical point of view.

Space and time and spacetime are not properties of the world we live in but concepts we have invented to help us organize classical events. Notions like dimension or interval, or curvature or geodesics, are properties not of the world we live in but of the abstract geometric constructions we have invented to help us organize events. As Einstein once again put it, “Space and time are modes by which we think, not conditions under which we live.”

Whereas I’d argue that they are both, and the mathematics tells us things about the ‘properties of the world [universe]’ which we can’t directly perceive with our senses – like ‘geodesics’ and the ‘curvature’ of spacetime. Yet they can be measured as well as calculated, which is why we know GR (Einstein’s general theory of relativity) works.

My approach to understanding physics, which may be misguided and would definitely be the wrong approach according to Mermin and Fernee, is to try and visualise the concepts that the maths describes. The concept of a geodesic is a good example. I’ve elaborated on this in another post, but I can remember doing Newtonian-based physics in high school, where gravity made no sense to me. I couldn’t understand why the force of gravity seemed to be self-adjusting so that the acceleration (g) was the same for all objects, irrespective of their mass.

It was only many years later, when I understood the concept of a geodesic using the principle of least action, that it all made sense. The objects don’t experience a force per se, but travel along the path of least action, which is also the path of maximum relativistic time. (I’ve described this phenomenon elsewhere.) The point is that, in GR, the falling object’s mass doesn’t appear in the equations (unlike in Newton’s mathematical representation), and the force we all experience comes from whatever it is that stops us falling, which could be a chair you’re sitting on or the Earth.

So, the abstract ‘geodesic’ explains what Newton couldn’t, even though Newton gave us the right answers for most purposes.
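To illustrate that 'maximum relativistic time' claim with a toy calculation of my own (using the standard weak-field approximation, so it's a sketch, not anything from Mermin's paper): compare the relativistic correction to elapsed time along several trial paths between the same two endpoints. The free-fall parabola comes out on top.

```python
# A toy check (mine, weak-field approximation) that a freely falling ball
# follows the path of maximum proper time. We compare the small correction
# dtau - dt ~ (g*y/c^2 - v^2/(2c^2)) dt along trial paths y(t) = a*t*(T - t)
# that all start and end at the same place and time.
import numpy as np

g, c, T = 9.81, 3.0e8, 2.0            # gravity, light speed, total flight time
t = np.linspace(0.0, T, 100_001)
dt = t[1] - t[0]

def extra_proper_time(a):
    y = a * t * (T - t)               # trial trajectory
    v = a * (T - 2 * t)               # its velocity
    integrand = g * y / c**2 - v**2 / (2 * c**2)
    return np.sum(integrand) * dt     # proper time minus coordinate time (s)

for a in [0.0, 0.25 * g, 0.5 * g, 0.75 * g]:
    note = "  <- free fall (a = g/2)" if a == 0.5 * g else ""
    print(f"a = {a:5.2f}: {extra_proper_time(a):+.3e} s{note}")
# The free-fall parabola (a = g/2) gains the most proper time; flatter or
# higher paths between the same endpoints gain less.
```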

And this leads me to extend the discussion to include the New Scientist article. The author, Eric Schwitzgebel, ponders 3 areas of scientific inquiry: quantum mechanics (are there many worlds?); consciousness (is it innate in all matter?) and computer simulations (do we live in one?). I’ll address them in reverse order, because that’s easiest.

As Paul Davies pointed out in The Goldilocks Enigma, the so-called computer-simulation hypothesis is a variant on Intelligent Design. If you don’t believe in ID, you shouldn’t believe that the universe is a computer simulation, because some entity had to design it and produce the code.

'Is consciousness innate?' is the same as panpsychism, as Schwitzgebel concurs, and I’d say there is no evidence for it, so not worth arguing about. Basically, I don’t want to waste time on these 2 questions, and, to be fair, Schwitzgebel’s not saying he’s an advocate for either of them.

Which brings me to QM, and that’s relevant. Schwitzgebel makes the point that all the scientific interpretations have bizarre or non-common-sensical qualities, of which MWI (many worlds interpretation) is just one. Its relevance to this discussion is that they are all reifications that are independent of the mathematics, because the mathematics doesn’t discern between them. And this gets to the nub of the issue for me. Most physicists would agree that physics, in a nutshell, is about creating mathematical models that are then tested by experimentation and observation, often using extremely high-tech, not to mention behemoth, instruments, like the LHC and the James Webb telescope.

It needs to be pointed out that, without exception, all these mathematical models have limitations and, historically, some have led us astray. The most obvious being Ptolemy’s model of the solar system involving epicycles. String theory, with its 10 dimensions and 10^500 possible universes, is a potential modern-day contender, but we don’t really know.

Nevertheless, as I explained with my brief discourse on geodesics (above), there are occasions when the mathematics provides an insight we would otherwise be ignorant of.

Basically, I think what Schwitzgebel is really touching on is the boundary between philosophy and science, which I believe has always existed and is an essential dynamic, despite the fact that many scientists are dismissive of its role.

Returning to Mermin, it’s worth quoting his final passage.

Quantum mechanics has brought home to us the necessity of separating that irreducibly real experience from the remarkable, beautiful, and highly abstract super-structure we have found to tie it all together.


The ‘real experience’ includes the flow of time; the universality of now which requires memory for us to know it exists; the subjective experience of free will. All of these are considered ‘illusions’ by many scientists, not least Sabine Hossenfelder in her excellent book, Existential Physics. I tend to agree with another physicist, Richard Muller, that what this tells us is that there is a problem with our theories and not our reality.

In an attempt to reconcile QM with reality, I like the notion proposed by Freeman Dyson that it’s a mathematical model that describes the future. As he points out, it gives us probabilities, and it provides a logical reason why the infinite number of ‘paths’ in Feynman’s abstraction are never observed.

Curiously, Fernee provides tacit support for the idea that the so-called ‘measurement’ or ‘observation’ provides an ‘abstract’ distinction between past and future in physics, though he doesn’t use those specific words.

In quantum mechanics, the measurement hypothesis, which includes the collapse of the wave function, is an irreversible process. As we perceive the world through measurements, time will naturally seem irreversible to us.


Very similar to something Davies said in another context:

The very act of measurement breaks the time symmetry of quantum mechanics in a process sometimes described as the collapse of the wave function…. the rewind button is destroyed as soon as that measurement is made.

Lastly, I would like to mention magnetism, because, according to SR, it’s mathematically dependent on a moving electric charge. Only it’s not always, as this video explicates. You can get a magnetic field from electron spin, which is an abstraction, as no one suggests that electrons do physically spin, even though they produce measurable magnetic moments.

What most people don’t know is that our most common experience of a magnetic field, which is a bar magnet, is created purely by electron spin and not moving electrons.

Sunday, 3 March 2024

Is randomness real or illusion?

 Let’s look at quantum mechanics (QM). I watched a YouTube video on Closer To Truth with Fred Alan Wolf, a theoretical physicist, whom I admit I’d never heard of. It’s worth watching the first 7 mins before he goes off on a speculative tangent that maybe dreams are a more fundamental level of reality, citing Australian Aboriginal ‘dreamtime’ mythology, of which I have some familiarity, though no scholarship.
 
In the first 7 mins he describes QM: its conceptual frustrations juxtaposed with its phenomenal successes. He gives a good synopsis, explaining how it describes a world we don’t actually experience, yet apparently underpins (my term, not his) the one we do. In particular, he explains:
 
There is a simple operation that takes you out of that space into (hits the table with his hand) this space. And that operation is simply multiplying what that stuff - that funny stuff – is, by itself (waves his hands in circles) in a time-reverse manner, called psi star psi (Ψ*Ψ) in the language of quantum physics.
 
What he’s describing is called the Born rule, which gives probabilities of finding that ‘stuff’ in the real world. By ‘real world’ I mean the one we are all familiar with and that he can hit his hand with. Ψ (pronounced sy) is of course the wave function in Schrodinger’s eponymous equation, and Schrodinger himself wrote a paper (in 1941) demonstrating that Born’s rule effectively multiplies the wave function by itself running backwards in time.
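For anyone who wants to see what Ψ*Ψ looks like in practice, here's a minimal sketch of my own (a made-up three-outcome state, not Wolf's example):

```python
# A minimal sketch (mine) of the Born rule: multiply each complex amplitude
# of the wave function by its complex conjugate (psi* psi) to get the
# probability of that outcome being observed.
import numpy as np

psi = np.array([0.6 + 0.0j, 0.0 + 0.48j, 0.64 + 0.0j])   # a toy superposition

probabilities = (np.conj(psi) * psi).real   # psi* psi, i.e. |amplitude|^2
print(probabilities)          # [0.36   0.2304 0.4096]
print(probabilities.sum())    # ~1.0 -- the probabilities of all outcomes sum to one
```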
 
Now, some physicists argue that Ψ is just a convenient mathematical fiction and Carlo Rovelli went so far as to argue that it has led us astray (in one of his popular books). Personally, I think it describes the future, which explains why we never see it, or as soon as we try to, it disappears, and if we’re lucky, we get a particle or some other interaction, like a dot on a screen, all of which exist in our past. Note that everything we observe, including our own reflection in a mirror, exists in the past.
 
Wolf then goes on to speculate that the infinite possibilities we use for our calculations are perhaps the true reality. In his own words: What I’m interested in are the things we can’t see… And he makes an interesting point that most people don’t know: that if we don’t take into account the things we can’t see, ‘we get the wrong answers’.
 

And this is where it gets interesting, because he’s alluding to Feynman’s sum-over-histories methodology, which takes into account all the infinite paths that the particle (as wave function) can take. In fact, the more paths that are allowed for, the more accurate the calculation. Wolf doesn’t mention Feynman, but I’m sure that’s what he’s referring to.
 
Feynman’s key insight into QM was that it obeys the least-action principle, which is mathematically expressed as a Lagrangian. It’s the ‘least-action principle’ that determines where light goes through a change in medium (like glass), obeying Fermat’s principle, where it takes the path of ‘least time’. It also determines the path a ball follows if you throw it into the air by following the path of ‘maximum relativistic time’. I elaborate on this in another post.
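Fermat's least-time idea is easy to demonstrate numerically. Here's a small sketch of my own (an air-to-glass interface with made-up geometry) that finds the crossing point minimising the travel time and checks it against Snell's law:

```python
# A small sketch (mine) of the least-time principle: light travelling from A
# (in air) to B (in glass) crosses the interface at the point that minimises
# total travel time, and that point satisfies Snell's law.
import numpy as np

n1, n2 = 1.0, 1.5            # refractive indices: air, glass
A = (0.0, 1.0)               # source, 1 m above the interface (y = 0)
B = (1.0, -1.0)              # destination, 1 m below the interface

x = np.linspace(0.001, 0.999, 100_000)   # candidate crossing points
travel = n1 * np.hypot(x - A[0], A[1]) + n2 * np.hypot(B[0] - x, B[1])
x_best = x[np.argmin(travel)]            # the least-time crossing point

sin1 = (x_best - A[0]) / np.hypot(x_best - A[0], A[1])
sin2 = (B[0] - x_best) / np.hypot(B[0] - x_best, B[1])
print(n1 * sin1, n2 * sin2)   # the two sides of Snell's law agree
```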
 
There is something teleological about this principle, as if the ball, particle or light ‘knows’ where it has to go. Freeman Dyson, who was a close collaborator with Feynman, argued that QM cannot describe the past, but only the future, and that only classical physics describes the past. So this infinitude of paths, which forms part of the calculation to determine the probability of where the particle will actually be ‘observed’, makes more sense to me if those paths exist in the future. I don’t think we need a ‘dream state’ unless that’s a euphemism for the future.
 
Like Dyson, I don’t think we need consciousness to make a quantum phenomenon become real, but it does provide the reference point. In his own words:
 
We do not need a human observer to make quantum mechanics work. All we need is a point of reference, to separate past from future, to separate what has happened from what may happen, to separate facts from probabilities.
 
The thing about consciousness is that it exists in a ‘constant present’, as pointed out by Schrodinger himself (when he wasn’t talking about QM), so it logically correlates with 'a point of reference, to separate past from future', that Dyson refers to.
 
Schrodinger coined a term, ‘statistico-deterministic’, to describe quantum phenomena, because, at a statistical level, it can be very predictable, otherwise we wouldn’t be able to call it ‘successful’. He gives the example of radioactive decay (exploited in his eponymous cat thought experiment) whereby we can’t determine the decay of a single isotope, yet we can statistically determine the half-life of astronomical numbers of atoms very accurately, as everyone knows.
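Here's a quick Monte Carlo sketch of my own (using carbon-14's half-life as the example) of that 'statistico-deterministic' character: individual decay times are all over the place, yet the fraction surviving one half-life comes out at almost exactly one half, run after run.

```python
# A quick Monte Carlo sketch (mine): individual radioactive decay times are
# unpredictable, but the statistics of a large sample are very predictable.
import numpy as np

rng = np.random.default_rng(42)
half_life = 5730.0                    # years (carbon-14, as an example)
tau = half_life / np.log(2)           # mean lifetime

decay_times = rng.exponential(tau, size=1_000_000)   # one random lifetime per atom

print("Five individual decay times (years):", np.round(decay_times[:5]))
print("Fraction remaining after one half-life:", np.mean(decay_times > half_life))
# The individual times scatter wildly; the surviving fraction is ~0.5 every run.
```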
 
I contend that real randomness, that we all observe and are familiar with, is caused by chaos, but even this is a contentious idea. I like to give the example of tossing a coin, but a lot of physicists will tell you that tossing a coin is not random. In fact, I recently had a lengthy, but respectful, discourse with Mark John Fernee (physicist at Qld Uni) on Quora on this very topic. When I raised the specific issue of whether tossing a coin is ‘random’, he effectively argued that there are no random phenomena in physics. To quote him out of context:
 
Probability theory is built from statistical sampling. There is no assumed underlying physics.
 
The underlying physics can be deterministic, while a statistical distribution of events can indicate random behaviour. This is the assumption that is applied to every coin toss. Because this is just an assumption, you can cheat the system by using specific conditions that ensure deterministic outcomes.
 
What I am saying is that randomness is a statistical characterisation of outcomes that does not include any physical mechanism. As such, it is not a fundamental property of nature.
(Emphasis in original)
 
I get the impression from what I’ve read that mathematicians have a different take on chaos to physicists, because they point out that you need to calculate initial conditions to infinite decimal places to achieve a 100% predicted outcome. Physicist, Paul Davies, provided a worked example in his 1988 book, The Cosmic Blueprint. I quoted Davies to Fernee during our ‘written’ conversation:
 
It is actually possible to prove that the activity of the jumping particle is every bit as random as tossing a coin.
 
The ‘jumping particle’ Davies referred to was an algorithm using clock arithmetic which, when graphed, produced chaotic results, and he demonstrated that it would take a calculation to infinity to get it ‘exactly right’. Fernee was dismissive of this and gave it as an example of a popular science book leading laypeople (like myself) astray, which I thought was a bit harsh, as Davies actually goes into the mathematics in some detail, and I possibly misled Fernee by quoting just one sentence.
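To give a flavour of the kind of thing Davies demonstrates (this is my own toy clock-arithmetic, i.e. mod 1, map, not Davies' actual algorithm), take two starting values that agree to 15 decimal places:

```python
# A toy clock-arithmetic (mod 1) map -- my own illustration, not Davies'
# algorithm. Each step multiplies by 10 and keeps only the fractional part, so
# every iteration drags one more decimal digit of the starting value into view.
x_a = 0.735482916304857      # two starting values agreeing to 15 decimal places
x_b = 0.735482916304858

for step in range(1, 19):
    x_a = (10 * x_a) % 1.0   # 'clock arithmetic': wrap around at 1
    x_b = (10 * x_b) % 1.0
    if step % 3 == 0:
        print(f"step {step:2d}: {x_a:.6f}  {x_b:.6f}")
# By about step 15 the two runs bear no resemblance to each other (and the
# computer's own ~16 digits of precision give out at around the same point).
# To predict n steps ahead you need n decimal places of the initial condition;
# to get it 'exactly right' indefinitely, you need infinitely many.
```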
 
Just to be clear, Fernee doesn’t disagree that chaotic phenomena are impossible to predict; just that they are fully deterministic and, in his words, only ‘indicate random behaviour’.
 
Sabine Hossenfelder, who argues very strongly for superdeterminism, has a video demonstrating how predicting chaotic phenomena (like the weather) has a horizon (my term, not hers) of predictability that can never be exceeded, even in principle (10 days in the case of the weather).
 
So Fernee and Hossenfelder distinguish between what we ‘cannot know’ and what physically transpires. But my point is that chaotic phenomena, if rerun, will always produce a different result – it’s built into the mathematics underlying the activity – and includes significant life-changing phenomena like evolutionary biology and the orbits of the planets, as well as weather and earthquakes. Even the creation of the moon is believed to be a consequence of a chaotic event, without which life on Earth would never have evolved.
 
Note that both QM and chaos have mathematical underpinnings, and whilst most see that as modelling or a very convenient method of making predictions, I see it as more fundamental. I contend that mathematics transcends the Universe, yet it’s also a code that allows us to plumb Nature’s deepest secrets and fathom the dynamics of the Universe on all scales.

 

Follow-up (30 Mar 2024)

Following my discourse with Fernee, I reread Davies’ book, The Cosmic Blueprint (for the third time since I bought it in the late 80s), or at least the part that was relevant. I really did Davies a disservice by just quoting one sentence out of context. In fact, Davies goes to a lot of trouble to try and define what randomness means. He also acknowledges that, despite being totally unpredictable, chaotic phenomena are still ‘deterministic’ – it’s just the initial conditions that are unattainable (mathematically as well as physically). That is why, when you rerun a chaotic event, you get a different result, despite being so-called ‘deterministic’.
 
As well as the mathematical example I gave above, Davies discusses in detail 2 physical systems that are chaotic – the population of certain species of animals and the forcing of a pendulum (where a constant force is applied to a pendulum at a different frequency to its natural frequency). Marcus du Sautoy in his book, What We Cannot Know, interviews ex-pat Australian, Robert May (now a Member of the House of Lords), who did pioneering work on chaos theory in animal populations.
 
Davies quotes Ilya Prigogine concerning ‘…the conviction that the future is determined by the present…  We may perhaps even call it the founding myth of classical science.’
 
He also quotes Joseph Ford: ‘…the fact that determinism actually reigns only over a quite finite domain; outside this small haven of order lies a largely uncharted, vast wasteland of chaos where determinism has faded into an ephemeral memory of existence theorems and only randomness survives.’
 

And then Davies himself:
 
But in reality, our universe is not a linear Newtonian mechanical system; it is a chaotic system… No finite intelligence, however powerful, could anticipate what new forms or systems may come to exist in the future. The universe is in some sense open; it cannot be known what new levels of variety or complexity may be in store.
 
In light of these comments from last century, and considering that under Newton and Laplace everyone thought that, given enough information, the entire universe’s future could be foreseen, I see ‘strong determinism’ (as opposed to weak determinism) as a scientific ‘fashion’ that’s come back into favour. By ‘weak determinism’, I mean that all physical phenomena have a causal relationship; it’s just impossible to predict beyond a horizon, which is dependent on the nature of the phenomenon (whether it be the weather or the planets). Therefore, I think randomness is built into the Universe, and its principal mechanism is chaos, not quantum.

Monday, 23 October 2023

The mystery of reality

Many will say, ‘What mystery? Surely, reality just is.’ So, where to start? I’ll start with an essay by Raymond Tallis, who has a regular column in Philosophy Now called, Tallis in Wonderland – sometimes contentious, often provocative, always thought-expanding. His latest in Issue 157, Aug/Sep 2023 (new one must be due) is called Reflections on Reality, and it’s all of the above.
 
I’ve written on this topic many times before, so I’m sure to repeat myself. But Tallis’s essay, I felt, deserved both consideration and a response, partly because he starts with the one aspect of reality that we hardly ever ponder, which is doubting its existence.
 
Actually, not so much its existence, but whether our senses fool us, which they sometimes do, like when we dream (a point Tallis makes himself). And this brings me to the first point about reality that no one ever seems to discuss, and that is its dependence on consciousness, because when you’re unconscious, reality ceases to exist, for You. Now, you might argue that you’re unconscious when you dream, but I disagree; it’s just that your consciousness is misled. The point is that we sometimes remember our dreams, and I can’t see how that’s possible unless there is consciousness involved. If you think about it, everything you remember was laid down by a conscious thought or experience.
 
So, just to be clear, I’m not saying that the objective material world ceases to exist without consciousness – a philosophical position called idealism (advocated by Donald Hoffman) – but that the material objective world is ‘unknown’ and, to all intents and purposes, might as well not exist if it’s unperceived by conscious agents (like us). Try to imagine the Universe if no one observed it. It’s impossible, because the word, ‘imagine’, axiomatically requires a conscious agent.
 
Tallis proffers a quote from celebrated sci-fi author, Philip K Dick: 'Reality is that which, when you stop believing in it, doesn’t go away' (from The Shifting Realities of Philip K Dick, 1995). And this allows me to segue into the world of fiction, which Tallis doesn’t really discuss, but it’s another arena where we willingly ‘suspend disbelief’ to temporarily and deliberately conflate reality with non-reality. This is something I have in common with Dick, because we have both created imaginary worlds that are more than distorted versions of the reality we experience every day; they’re entirely new worlds that no one has ever experienced in real life. But Dick’s aphorism expresses this succinctly. The so-called reality of these worlds, in these stories, only exists while we believe in them.
 
I’ve discussed elsewhere how the brain (not just human but animal brains, generally) creates a model of reality that is so ‘realistic’, we actually believe it exists outside our head.
 
I recently had a cataract operation, which was most illuminating when I took the bandage off, because my vision in that eye was so distorted it made me feel seasick. Everything had a lean to it and it really did feel like I was looking through a lens; I thought they had botched the operation. With both eyes open, it looked like objects were peeling apart. So I put a new eye patch on, and distracted myself for an hour by doing a Sudoku problem. When I had finished it, I took the patch off and my vision was restored. The brain had made the necessary adjustments to restore the illusion of reality as I normally interacted with it. And that’s the key point: the brain creates a model so accurately, integrating all our senses – especially sight, sound and touch – that we think the model is the reality. And all creatures have evolved that facility simply so they can survive; it’s a matter of life and death.
 
But having said all that, there are some aspects of reality that really do only exist in your mind, and not ‘out there’. Colour is the most obvious, but so are sound and smell, which may all be experienced differently by other species – how are we to know? Actually, we do know that some animals can hear sounds that we can’t and see colours that we don’t, and vice versa. And I contend that these sensory experiences are among the attributes that keep us distinct from AI.
 
Tallis makes a passing reference to Kant, who argued that space and time are also aspects of reality that are produced by the mind. I have always struggled to understand how Kant got that so wrong. Mind you, he lived more than a century before Einstein all but proved that space and time are fundamental parameters of the Universe. Nevertheless, there are more than a few physicists who argue that the ‘flow of time’ is a purely psychological phenomenon. They may be right (but arguably for different reasons). If consciousness exists in a constant present (as expounded by Schrodinger) and everything else becomes the past as soon as it happens, then the flow of time is guaranteed for any entity with consciousness. However, many physicists (like Sabine Hossenfelder), if not most, argue that there is no ‘now’ – it’s an illusion.
 
Speaking of Schrodinger, he pointed out that there are fundamental differences between how we perceive light and sound, even though both arrive as waves. In the case of colour, we can blend colours to get a new colour, and in fact, as we all know, all the colours we can see can be generated by just 3 primaries, which is how the screens on all your devices work. However, that’s not the case with sound, otherwise we wouldn’t be able to distinguish all the different instruments in an orchestra. Just think: all that complexity is generated by a single vibrating membrane (in the case of a speaker) and somehow our hearing separates it all. Of course, it can be done mathematically with a Fourier transform, but I don’t think that’s how our brains work, though I could be wrong.
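As a rough demonstration of what that Fourier transform does, here is a minimal sketch that mixes two pure tones and then recovers them from the combined signal; the sample rate, the note frequencies (roughly A4 and E5) and the detection threshold are all illustrative choices of mine:

```python
import numpy as np

fs = 44_100                                  # sample rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)              # one second of 'audio'
signal = np.sin(2*np.pi*440*t) + 0.5*np.sin(2*np.pi*659*t)   # two tones mixed together

spectrum = np.abs(np.fft.rfft(signal))                 # Fourier transform of the mixture
freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)         # frequency of each spectrum bin
peaks = freqs[spectrum > 0.25 * spectrum.max()]        # keep only the strong components
print(peaks)      # peaks near 440 Hz and 659 Hz: the single waveform is resolved again
```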
 
And this leads me to discuss the role of science, and how it challenges our everyday experience of reality. Not surprisingly, Tallis also took his discussion in that direction. Quantum mechanics (QM) is the logical starting point, and Tallis references Bohr’s Copenhagen interpretation, ‘the view that the world has no definite state in the absence of observation.’ Now, I happen to think that there is a logical explanation for this, though I’m not sure anyone else agrees. If we go back to Schrodinger again, but this time his eponymous equation, it describes events before the ‘observation’ takes place, albeit with probabilities. What’s more, all the weird aspects of QM, like the Uncertainty Principle, superposition and entanglement, are all mathematically entailed in that equation. What’s missing is relativity theory, which has since been incorporated into QED or QFT.
 
But here’s the thing: once an observation or ‘measurement’ has taken place, Schrodinger’s equation no longer applies. In other words, you can’t use Schrodinger’s equation to describe something that has already happened. This is known as the ‘measurement problem’, because no one can explain it. But if QM only describes things that are yet to happen, then all the weird aspects aren’t so weird.
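To give a feel for what the equation delivers before any observation, here is a minimal sketch using the textbook result for the spreading width of a free electron’s Gaussian wave packet; the initial width and the sample times are my illustrative choices. All Ψ gives us is a probability distribution that keeps widening, until a measurement finds the electron at one particular place and the original wave function no longer describes it:

```python
import numpy as np

hbar = 1.054571817e-34      # reduced Planck constant, J.s
m_e = 9.1093837015e-31      # electron mass, kg
sigma0 = 1e-10              # initial packet width, m (about the size of an atom)

def width(t):
    """Width of a free Gaussian wave packet after time t seconds (standard analytic result)."""
    return sigma0 * np.sqrt(1 + (hbar * t / (2 * m_e * sigma0**2))**2)

for t in (0.0, 1e-16, 1e-15, 1e-14):
    print(f"t = {t:.0e} s  ->  width = {width(t):.2e} m")
```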
 
Tallis also mentions Einstein’s 'block universe', which implies that past, present and future all exist simultaneously. In fact, that’s what Sabine Hossenfelder says in her book, Existential Physics:
 
The idea that the past and future exist in the same way as the present is compatible with all we currently know.

 
And:

Once you agree that anything exists now elsewhere, even though you see it only later, you are forced to accept that everything in the universe exists now. (Her emphasis.)
 
I’m not sure how she reconciles this with cosmological history, but it does explain why she believes in superdeterminism (meaning the future is fixed), which axiomatically leads to her other strongly held belief that free will is an illusion; but Einstein believed that too, so she’s in good company.
 
In a passing remark, Tallis says, ‘science is entirely based on measurement’. I know from other essays Tallis has written that he believes the entire edifice of mathematics only exists because we can measure things, and that we then applied it to the natural world, which is why we have so-called ‘natural laws’. I’ve discussed his ideas on this elsewhere, but I think he has it back-to-front, whilst acknowledging that our ability to measure things, which is an extension of counting, is how humanity was introduced to mathematics. In fact, the ancient Greeks put geometry above arithmetic because it’s so physical. This is also why there were no negative numbers in their mathematics: the idea of a negative volume or area made no sense.
 
But, in the intervening 2 millennia, mathematics took on a life of its own, with such exotic entities as the square roots of negative numbers and non-Euclidean geometry, which in turn found unexpected homes in QM and relativity theory respectively. All of a sudden, mathematics was informing us about reality before measurements were even made. Take Schrodinger’s wave function, which lies at the heart of his equation, and can’t be measured because it only exists in the future, assuming what I said above is correct.
 
But I think Tallis has a point, and I would argue that consciousness can’t be measured, which is why it might remain inexplicable to science, correlation with brain waves and their like notwithstanding.
 
So what is the mystery? Well, there’s more than one. For a start, there is consciousness, without which reality would not be perceived or even known, which seems to me to be pretty fundamental. Then there are the aspects of reality that have only recently been discovered, like the fact that time and space can have different ‘measurements’ depending on the observer’s frame of reference. Then there is the increasing role of mathematics in our comprehension of reality at scales both cosmic and subatomic. In fact, given the role of numbers and mathematical relationships in determining fundamental constants and natural laws of the Universe, it would seem that mathematics is an inherent facet of reality.

 

Addendum:

As it happens, I wrote a letter to Philosophy Now on this topic, which they published and also passed on to Raymond Tallis. As a consequence, we had a short correspondence - all very cordial and mutually respectful.

One of his responses can be found, along with my letter, under Letters, Issue 160. Scroll down to Lucky Guesses.
 

Friday, 18 August 2023

The fabric of the Universe

Brian Greene wrote an excellent book with a similar title (The Fabric of the Cosmos), which I briefly touched on here. Basically, it’s space and time, and the discipline of physics can’t avoid it. In fact, if you add mass and charge, you’ve got the whole gamut that we’re aware of. I know there’s the standard model along with dark energy and dark matter, but as someone said, if you throw everything into a black hole, the only things you can know about it are its mass, charge and angular momentum, which is why they say, ‘a black hole has no hair.’ That was before Stephen Hawking applied the laws of thermodynamics and quantum mechanics and came up with Hawking radiation, but I’ve gone off-track, so I’ll come back to the topic at hand.
 
I like to tell people that I read a lot of books by people a lot smarter than me, and one of those books that I keep returning to is The Constants of Nature by John D Barrow. He makes a very compelling case that the only Universe that could be both stable and predictable enough to support complex life would be one with 3 dimensions of space and 1 of time. A 2-dimensional universe means that any animal with a digestive tract (from mouth to anus) would fall apart. Only a 3-dimensional universe allows planets to maintain orbits for millions of years. As Barrow points out in his aforementioned tome, Einstein’s friend, Paul Ehrenfest (1880-1933), was able to demonstrate this mathematically. It’s the inverse square law of gravity that keeps planets in orbit, and that’s a direct consequence of everything happening in 3 dimensions. Interestingly, Kant thought it was the other way around – that 3 dimensions were a consequence of Newton’s universal law of gravity being an inverse square law. Mind you, Kant thought that both space and time were a priori concepts that only exist in the mind:
 
But this space and this time, and with them all appearances, are not in themselves things; they are nothing but representations and cannot exist outside our minds.
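Coming back to Ehrenfest’s point for a moment, you can get a feel for it numerically. Below is a rough sketch (not a proof) that integrates a planet’s orbit under an inverse-square force, the 3-dimensional case, and under an inverse-cube force, which is what gravity would follow with 4 spatial dimensions; the units, starting position and starting speed are all illustrative choices of mine:

```python
import numpy as np

def simulate(power, steps=20000, dt=1e-3):
    """Semi-implicit Euler integration of a planet under a force ~ 1/r**power (with GM = 1)."""
    pos = np.array([1.0, 0.0])
    vel = np.array([0.0, 1.05])            # slightly faster than a circular orbit
    for _ in range(steps):
        r = np.linalg.norm(pos)
        acc = -pos / r**(power + 1)        # inward acceleration of magnitude 1/r**power
        vel += acc * dt
        pos += vel * dt
    return np.linalg.norm(pos)

print("inverse-square (3D): final radius ~", round(simulate(2), 2))   # stays near 1: bound orbit
print("inverse-cube  (4D): final radius ~", round(simulate(3), 2))    # drifts away: no stable orbit
```

The inverse-square case keeps wobbling around its starting radius, whereas the inverse-cube case never settles into a stable orbit – which is the nub of Ehrenfest’s argument.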
 
And this gets to the nub of the topic alluded to in the title of this post: are space and time ‘things’ that are fundamental to everything else we observe?
 
I’ll start with space, because, believe it or not, there is an argument among physicists that space is not an entity per se, but just dimensions between bodies that we measure. I’m going to leave aside, for the time being, that said ‘measurements’ can vary from observer to observer, as per Einstein’s special theory of relativity (SR).
 
This argument arises because we know that the Universe is expanding (by measuring the Doppler shift of distant galaxies); but does space itself expand or is it just objects moving apart? In another post, I referenced a paper by Tamara M. Davis and Charles H. Lineweaver from UNSW (Expanding Confusion: Common Misconceptions of Cosmological Horizons and the Superluminal Expansion of the Universe), which I think puts an end to this argument, when they explain the difference between an SR and GR Doppler shift interpretation of an expanding universe.
 
The general relativistic interpretation of the expansion interprets cosmological redshifts as an indication of velocity since the proper distance between comoving objects increases. However, the velocity is due to the rate of expansion of space, not movement through space, and therefore cannot be calculated with the special relativistic Doppler shift formula. (My emphasis)
 
I’m now going to use a sleight-of-hand and attempt a description of GR (general theory of relativity) without gravity, based on my conclusion from their exposition.
 
The Universe has a horizon that’s directly analogous to the horizon one observes at sea, because it ‘moves’ as the observer moves. In other words, other hypothetical ‘observers’ in other parts of the Universe would observe a different horizon to us, including hypothetical observers who are ‘over-the-horizon’ relative to us.
 
But the horizon of the Universe is a direct consequence of bodies (or space) moving faster-than-light (FTL) over the horizon, as expounded upon in detail in Davis and Lineweaver’s paper. But here’s the thing: if you were an observer on one of these bodies moving FTL relative to Earth, the speed of light would still be c. How is that possible? My answer is that the light travels at c relative to the ‘space’* (in which it’s observed), but the space itself can travel faster than light.
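Here is a minimal numerical sketch of the distinction Davis and Lineweaver draw. It assumes a crude linear Hubble law, v = H0 × D, purely for illustration (real calculations integrate over the expansion history), and the value of H0 and the distances are my illustrative numbers:

```python
c = 299_792.458        # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per Mpc (approximate)

def v_sr_doppler(z):
    """Velocity inferred if a redshift z were a special-relativistic Doppler shift (always < c)."""
    return c * ((1 + z)**2 - 1) / ((1 + z)**2 + 1)

def v_recession(distance_mpc):
    """Recession velocity from a naive Hubble law v = H0 * D (no upper limit)."""
    return H0 * distance_mpc

print(v_sr_doppler(10) / c)     # ~0.98: never reaches c, however large the redshift
print(v_recession(6000) / c)    # ~1.4: 'space' beyond roughly 4300 Mpc recedes faster than light
```

The SR formula saturates below c no matter how large the redshift, whereas the recession velocity of comoving objects has no such cap, which is why the two interpretations are not interchangeable.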
 
There are, of course, other horizons in the Universe, namely the event horizons of black holes. Now, you have the same dilemma at these horizons as you do at the Universe’s horizon. According to an external observer, time appears to ‘stop’ at the event horizon, because the light emitted by an object there can’t reach us. However, for an observer at the event horizon, the speed of light is still c, and if the black hole is big enough, it’s believed (obviously no one can know) that someone could cross the event horizon without knowing they had. But what if it’s spacetime itself that crosses the event horizon? Then both the external observer’s perception and the comoving observer’s perception would be no different than if the latter were at the horizon of the entire universe.
 
But what happens to time? Well, if you measure time by the frequency of light being emitted from an object at any of these horizons, that frequency gets Doppler-shifted to zero, so time appears to ‘stop’ according to the ‘local’ observer (on Earth), but not for the observer at the horizon.
 
So far, I’ve avoided talking about quantum mechanics (QM), but something curious happens when you apply QM to cosmology: time disappears. According to Paul Davies in The Goldilocks Enigma: ‘…vanishing of time for the entire universe becomes very explicit in quantum cosmology, where the time variable simply drops out of the quantum description.’ This is consistent with Freeman Dyson’s argument that QM can only describe the future. Thus, if you apply a description of the future to the entire cosmos, there would be no time.
 
 
* Note: you can still apply SR within that ‘space’.

 

Addendum: I've since learned that in 1958, David Finkelstein (a postdoc with the Stevens Institute of Technology in Hoboken, New Jersey) wrote an article in Physical Review that gave the same explanation as I do above for how time appears different to different observers of a black hole. It immediately grabbed the attention (and approval) of Oppenheimer, Wheeler and Penrose (among others), who had struggled to resolve this paradox. (Ref. Black Holes And Time Warps: Einstein's Outrageous Legacy, Kip S. Thorne, 1994)
 

Monday, 14 November 2022

Kant and modern physics

 I wrote a post on Kant back in February 2020, but it was actually an essay I wrote more than 20 years earlier, when I was a student of philosophy. I would not be able to improve on that essay, and I’m not about to try now. In that essay, I argue that Kant’s great contribution to philosophy, and epistemology in particular, was his idea of the ‘thing-in-itself’, which may remain forever unknowable, as we only have our perceptions of ‘things’.
 
In other posts, I have sometimes argued that the ‘thing-in-itself’ is dependent on the scale at which we can observe it, but there is something deeper that I think only became apparent in the so-called golden age of physics in the 20th Century. In a more recent post, I pointed out that relativity theory and quantum mechanics (the 2 pillars of modern physics) are both observer dependent. I argue that there could be an objective ontology that they can’t describe. I think this is more obvious in the case of special relativity, where different observers literally measure different intervals of space and time, but I’m getting ahead of myself.
 
On Quora, there are 4 physicists whom I ‘follow’ and read regularly. They are Viktor T Toth, Richard Muller, Mark John Fernee and Ian Miller. Out of these, Miller is possibly the most contentious as he argues against non-locality in QM (quantum mechanics), which I’m not aware of any other physicist concurring with. Of course, it’s Bell’s Inequality that provides the definitive answer to this, of which Miller has this to say:
 
If you say it must because of violations of Bell’s Inequality, first note that the inequality is a mathematical relationship that contains only numbers; no physical concept is included.
 
But the ‘numbers’ compare classical statistical outcomes with the statistical outcomes predicted by the Born rule, and experiments verify the Born-rule predictions, so I disagree. Having said that, Miller makes pertinent points that I find insightful and, like all those mentioned, he knows a lot more about this topic than me.
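To show what those numbers look like, here is a minimal sketch of the standard CHSH form of Bell’s inequality, assuming the textbook singlet-state correlation E(a, b) = −cos(a − b) given by the Born rule; the analyser angles are the usual illustrative choices. Any local hidden-variable (classical) model must give |S| ≤ 2:

```python
import numpy as np

def E(a, b):
    """Singlet-state correlation between analyser settings a and b (Born rule)."""
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # ~2.828 (= 2*sqrt(2)), exceeding the classical bound of 2
```

Experiments come out on the quantum side of that bound.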
 
To give an example of those insights: concerning relativity, Miller argues that it’s the ruler that changes dimension and not the space being measured. He also points out, regarding the twin paradox, that only one twin gains energy, which is the one whose clock slows down. Note that clocks are also a form of ‘ruler’, but they measure time instead of space. So you can have 2 observers who ‘measure’ different durations of space and time, but agree on ‘now’ when they reunite, as is the case with the twin paradox thought experiment.
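For a concrete sense of the numbers, here is a minimal sketch of the twin-paradox arithmetic; the speed and trip duration are purely illustrative figures of mine:

```python
import math

v_over_c = 0.8                                 # travelling twin's speed, as a fraction of c
earth_years = 10.0                             # elapsed time for the stay-at-home twin

gamma = 1.0 / math.sqrt(1.0 - v_over_c**2)     # Lorentz factor (5/3 at 0.8c)
traveller_years = earth_years / gamma          # proper time for the travelling twin

print(gamma, traveller_years)                  # ~1.67 and 6.0: different elapsed times,
                                               # yet both twins agree on the reunion event
```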
 
This point is slightly off-track, but not irrelevant to the main focus of this post. The main focus is an academic paper jointly written by Shaun Maguire and Richard Muller, titled Now, and the Flow of Time. This paper is arguably as contentious as Miller’s take on non-locality and Bell, because Muller and Maguire argue that ‘space’ can be created.
 
Viktor T Toth is quite adamant that space is not created, because space is not an entity but a ‘measurement’ between entities called ‘objects’. It has to be said that Muller has stated publicly on Quora that he has the utmost respect for Toth, and neither of them has called the other out over this issue.
 
Toth argues that people confound the mathematical metric with ‘space’ or ‘spacetime’, but I’d argue that this mathematical metric has physical consequences. In another post, I reference another paper, recommended to me by Mark John Fernee (authored by Tamara M. Davis and Charles H. Lineweaver at the University of New South Wales) which describes how a GR Doppler shift intrinsically measures the expansion of space.
 
The general relativistic interpretation of the expansion interprets cosmological redshifts as an indication of velocity since the proper distance between comoving objects increases. However, the velocity is due to the rate of expansion of space, not movement through space, and therefore cannot be calculated with the special relativistic Doppler shift formula.
(My emphasis)
 
As I explain in that post: ‘What they are effectively saying is that there is a distinction between the movement of objects in space and the movement of space itself.’
 
The spacetime metric that Toth refers to provides a reference frame for c, the speed of light. So, whilst a spacetime metric (‘space’ by another name) can travel faster than light with respect to us (so over the horizon of the observable universe), an observer situated in that metric would still measure light as c relative to them.
 
Muller and Maguire’s paper goes even further, saying that space is created along with time, and they believe this can be measured as ‘a predicted lag in the emergence of gravitational radiation when two black holes merge.’ I won’t go into the details; you would need to read the paper.
 
A conclusion implicit in their theory is that there could be a universal now.
 
A natural question arises: why are the new nows created at the end of time, rather than uniformly throughout time, in the same way that new space is uniformly created throughout the universe.

 
The authors then provide alternative arguments, which I won’t go into, but they do ponder the fundamental difference between space and time, where one is uni-directional and the other is not. As far as we know, there is no ‘edge’ in space but there is in time. Muller and Maguire do wonder if space is ‘created’ throughout the Universe (as quoted above) or at an ‘edge’.
 
You may wonder how Kant fits into all this. It’s because all these discussions depend on what we observe and what we theorise, both of which are perceptions. And, in physics, theorising involves mathematics. I’ve argued that mathematics can be seen as another medium shaping our perceptions, along with all the instruments we’ve built, which now include the LHC and the Hubble and Webb telescopes.
 
Sabine Hossenfelder, whom I often reference on this blog these days, wrote a book called Lost in Math, where she interviews some of the brightest minds in physics and challenges the prevailing paradigm that mathematics can provide answers to questions that experimentation can’t – string theory being the most obvious example.

Before the revolution in cosmology, created by Copernicus and built on by Galileo, Kepler and Newton, people believed that the Sun went round the Earth and that some objects in the night sky would occasionally backtrack in their orbits, which was explained by epicycles. That was overturned, and now it seems obvious that, in fact, the Earth rotates on its axis and orbits the sun along with all the other planets, which explains our ‘perception’ that sometimes the planets go ‘backwards.’
 
I wonder if the next revolution in science and cosmology may also provide a ‘simpler’ picture, where there is a ‘universal now’ that explains the age of the Universe, the edge of time that we all experience and non-locality in QM.
 
Of course, I’m probably wrong.

Addendum: This is Richard Muller talking about time on Quora.

Sunday, 25 September 2022

What we observe and what is reality are distinct in physics

 I’ve been doing this blog for 15 years now, and in that time some of my ideas have changed or evolved, and, in some areas, my knowledge has increased. As I’ve said on Quora a few times, I read a lot of books by people who know a lot more than me, especially in physics.
 
There is a boundary between physics and philosophy: the shoreline of John Wheeler’s metaphorical ‘island of knowledge in the infinite sea of ignorance’. To quote: “As the island grows so does the shoreline of our ignorance.” And I think ignorance is the key word here, because what goes on at that shoreline is basically speculation, which means some of us are wrong – most likely including me. As I’ve often said, ‘Only future generations can tell us how ignorant the current generation is’. I can say that with a lot of confidence, just by looking at the history of science.
 
If this blog has a purpose beyond promoting my own pet theories and prejudices, it is to make people think.
 
Recently, I’ve been preoccupied with determinism and something called superdeterminism, which has become a pet prejudice among some physicists, who believe it’s the only conclusion one can draw from combining relativity theory, quantum mechanics, entanglement and Bell’s theorem. Sabine Hossenfelder is one such advocate, who went so far as to predict that one day all other physicists will agree with her. I elaborate on this below.
 
Mark John Fernee (physicist with Qld Uni), with whom I’ve had some correspondence, is one who disagrees with her. I believe that John Bell himself proposed that superdeterminism was possibly the only resolution to the quandaries posed by his theorem. There are two other videos worth watching: one by Elijah Lew-Smith and a 50-minute one by Brian Greene, who doesn’t discuss superdeterminism. Nevertheless, Greene’s video gives the best and easiest-to-understand description of Bell’s theorem and its profound implications for reality.
 
So what is superdeterminism, and how is it distinct from common-or-garden determinism? Well, if you watch the two relevant videos, you get two different answers. According to Sabine, there is no difference, and it’s not really to do with Bell’s theorem but with the measurement problem in QM. She argues that it’s best explained by looking at the double-slit experiment. Interestingly, Richard Feynman argued that all the problems associated with QM can be analysed, if not understood, by studying the double-slit experiment.
 
Sabine wrote an academic paper on the ‘measurement problem’, co-authored with Jonte R. Hance from the University of Bristol, which I’ve read and is surprisingly free of equations (not completely) but uses the odd term I’m unfamiliar with. I expect I was given a link by Fernee which I’ve since lost (I really can’t remember), but I still have a copy. One of her points is that as long as we have unsolved problems in QM, there is always room for different philosophical interpretations, and she and Hance discuss the most well-known ones. This is slightly off-topic, but only slightly, because even superdeterminism and its apparent elimination of free will is a philosophical issue.
 
Sabine argues that it’s the measurement that creates superdeterminism in QM, which is why she uses the double-slit experiment to demonstrate it. Because the ‘measurement’ ‘collapses’ the wave function and ‘determines’ the outcome, it must have been ‘deterministic’ all along; it’s just that we don’t know it until a measurement is made. At least, this is my understanding of her argument.
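As a minimal sketch of why the double slit illustrates this so well, here is a toy calculation (the wavelength, slit separation and screen distance are all illustrative numbers of mine): before any which-path measurement the amplitudes from the two slits add and interfere; once the path is determined, only the probabilities add and the fringes vanish.

```python
import numpy as np

lam, d, L = 500e-9, 20e-6, 1.0           # wavelength (m), slit separation (m), screen distance (m)
x = np.linspace(-0.05, 0.05, 1000)       # positions across the screen (m)

phase = np.pi * d * x / (lam * L)        # half the path-difference phase between the two slits
psi1 = np.exp(1j * phase)                # amplitude via slit 1 (unnormalised)
psi2 = np.exp(-1j * phase)               # amplitude via slit 2

P_no_measurement = np.abs(psi1 + psi2)**2                 # amplitudes add: interference fringes
P_with_measurement = np.abs(psi1)**2 + np.abs(psi2)**2    # probabilities add: no fringes

print(P_no_measurement.min(), P_no_measurement.max())     # ~0 .. 4: bright and dark fringes
print(P_with_measurement.min(), P_with_measurement.max()) # 2 .. 2: uniform, fringes gone
```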
 
The video by Elijah Lew-Smith gives a different explanation, focusing solely on Bell’s theorem. I found that it also required more than one viewing, but he makes a couple of points, which I believe go to the heart of the matter. (Greene’s video gives an easier-to-follow description, despite its length).
 
We can’t talk about an objective reality independent of measurement.
(Which echoes Sabine’s salient point in her video.)
 
And this point: There really are instantaneous interactions; we just can’t access them.
 
This is known as ‘non-locality’, and Brian Greene provides the best exposition I’ve seen, and explains how it’s central to Bell’s theorem and to our understanding of reality.
 
On the other hand, Lew-Smith explains non-locality without placing it at the centre of the discussion.
 
If I can momentarily go back to Sabine’s key argument, I addressed this in a post I wrote a few years back. Basically, I argued that you can only know the path an electron or photon takes retrospectively, after the measurement or observation has been made. Prior to that, QM tells us it’s in a superposition of states and we only have probabilities of where it will land. Curiously, I referenced a video by Sabine in a footnote, where she makes this point in her conclusion:
 
You don’t need to know what happens in the future because the particle goes to all points anyway. Except…  It doesn’t. In reality, it goes to only one point. So maybe the reason we need the measurement postulate is because we don’t take this dependency on the future seriously enough.
 
And to me, that’s what this is all about: the measurement is in the future of the wave function, and the path it takes is in the past. This, of course, is what Freeman Dyson claims: that QM cannot describe the past, only the future.
 
And if you combine this perspective with Lew-Smith’s comment about objective reality NOT being independent of the measurement, then objective reality only exists in the past, while the wave function and all its superpositional states exist in the future.
 
So how does entanglement fit into this? Well, this is the second point I highlighted, which is that ‘there really are instantaneous interactions, which we can’t access’ – in other words, ‘non-locality’. And this, as Schrodinger himself proclaimed, is what distinguishes QM from classical physics. In classical physics, ‘locality’ means there is a relativistic causal connection, and in entanglement there is not, which is why Einstein called it ‘spooky action at a distance’.
 
Bell’s theorem effectively tells us that non-locality is real, supported by experiment many times over, but you can’t use it to transmit information faster than light, so relativity is not violated in practical terms. But it does raise questions about simultaneity, which is discussed in Lew-Smith’s video. He demonstrates graphically that different observers will observe a different sequence of measurements, so we have disagreement, even a contradiction, about which ‘measurement’ collapsed the wave function. And this leads to superdeterminism, because, if the outcome is predetermined, then the sequence of measurements doesn’t matter.
 
And this gets to the nub of the issue, because it ‘appears’ that ‘objective reality’ is observer dependent. Relativity theory always gives the result from a specific observer’s point of view, and different observers in different frames of reference can epistemically disagree. Is there a frame of reference that is observer independent? I always like to go back to the twin paradox, because I believe it provides an answer. When the twins reunite, they disagree on how much time has passed, yet they agree on where they are in space-time. There is no absolute time, but there is absolute space-time.
 
Did you know we can deduce the velocity at which Earth travels relative to that absolute space-time, meaning the rest frame of the overall observable Universe? By measuring the Doppler shift of the CMBR (cosmic microwave background radiation) in all directions, it’s been calculated that we are travelling at about 350 km/s in the direction of Pisces (ref. Paul Davies, About Time: Einstein’s Unfinished Revolution, 1995). They should teach this in schools.
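As a rough back-of-the-envelope check of how that works, our motion produces a dipole in the CMB temperature of amplitude ΔT/T ≈ v/c; the dipole amplitude below is the commonly quoted modern value, which lands close to the figure Davies gives:

```python
c = 3.0e5          # speed of light, km/s
T_cmb = 2.725      # mean CMB temperature, K
dT = 3.4e-3        # approximate dipole amplitude, K (modern measurements)

v = c * dT / T_cmb # our speed through the CMB rest frame
print(round(v))    # ~370 km/s, the same order as the 350 km/s Davies quotes
```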
 
Given this context, is it possible that entanglement is a manifestation of objective simultaneity? Not according to Einstein, who argued that the distinction between past, present and future is ‘only a stubbornly persistent illusion’; which is based on the ‘fact’ that simultaneity is observer dependent. But Einstein didn’t live to see Bell’s theorem experimentally verified. Richard Muller, a prize-winning physicist and author (also on Quora), was asked what question he’d ask Einstein if he could hypothetically meet him NOW. I haven’t got a direct copy, but essentially Muller said he’d ask Einstein if he now accepted a ‘super-luminal connection’, given the experimental confirmation of Bell’s theorem. In other words, entanglement is like an exception to the rule, where relativity strictly doesn’t apply.
 
Sabine and her co-author, Jonte Hance, make a passing comment that the discussion really hasn’t progressed much since Bohr and Einstein a century ago, and I think they have a point.
 
Mark Fernee, whom I keep mentioning on the sidelines, does make a distinction between determinism and superdeterminism, where determinism simply means that everything is causally connected to something, even if it’s not predictable. Chaos is a case in point, which he describes thus:
 
Where this determinism breaks down is with chaotic systems, such as three body dynamics. Chaotic systems are so sensitive to the initial parameters that even a slight inaccuracy can result in wildly different predictions. That's why predicting the weather is so difficult.
Overall, complexity limits the ability to predict the future, even in a causal universe.

 
On the other hand, superdeterminism effectively means the end of free will, and, in his own words, ‘free will is a contentious issue, even among physicists’.
 
Fernee provided a link to another document by Sabine, where she created an online forum specifically to deal with people who have less-than-well-informed, even deluded, ideas about physics – crackpots and cranks. It occurred to me that I might fall into this category, but that’s for others to judge. I’m constantly reminded of how little I really know, and that I’m only fiddling around the edges, or on the ‘shoreline of ignorance’, as Wheeler described it, where there are many others far more qualified than me.
 
I not-so-recently wrote a post where I challenged a specific scenario often cited by physicists, where two observers hypothetically ‘observe’ contradictory outcomes of an event on a distant astronomical body that is supposedly happening simultaneously with them.
 
As I said before, relativity is an observer-dependent theory, almost by definition, and we know it works just by using the GPS on our smart-phones. There are algorithms that make relativistic corrections to the signals coming from the satellites, otherwise the map on your phone would not match the reality of your actual location.
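For a sense of scale, here is a back-of-the-envelope sketch of the corrections involved; the constants are standard approximate values, and real GPS processing models much more than this (orbital eccentricity, for instance):

```python
GM = 3.986e14          # Earth's gravitational parameter, m^3/s^2
c = 2.998e8            # speed of light, m/s
r_earth = 6.371e6      # radius of Earth's surface, m
r_gps = 2.656e7        # GPS orbital radius, m
day = 86400            # seconds in a day

v_sat = (GM / r_gps) ** 0.5                              # orbital speed, ~3.9 km/s
sr_slowdown = (v_sat**2 / (2 * c**2)) * day              # special relativity: clock loses ~7 us/day
gr_speedup = (GM / c**2) * (1/r_earth - 1/r_gps) * day   # general relativity: clock gains ~46 us/day

print(f"net clock drift ~ {(gr_speedup - sr_slowdown) * 1e6:.0f} microseconds per day")
# Left uncorrected, a drift of ~38 us/day corresponds to positioning errors of kilometres.
```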
 
What I challenge is the application of relativity theory to an event that the observer can’t observe, even in principle. In fact, relativity theory rules out a physical observation of a purportedly simultaneous event. So I’m not surprised that we get contradictory results. The accepted view among physicists is that each observer ‘sees’ a different ontology (one in the future and one in the past), whereas I contend that there is an agreed ontology that becomes observable at a later time, when it’s in both observers’ past. (Brian Greene has another video demonstrating the ‘conventional’ view among physicists.)
 
Claudia de Rham is Professor of Physics at Imperial College London, and earlier this year she gave a talk titled What We Don’t Know About Gravity, where she made the revelatory point that Einstein’s GR (general theory of relativity) predicts its own limitations. Basically, if you apply QM probabilities to extremely curved spacetime, you get answers over 100%, which is nonsense. GR and QM are mathematically incompatible if we try to quantise gravity, though QFT (quantum field theory) ‘works fine on the manifold of spacetime’, according to expert Viktor T Toth.
 
Given that relativity theory, as it is applied, is intrinsically observer dependent, I question whether it can be reliably applied to events that have no causal relation to the observer (meaning outside the observer's light cone, both past and future), which is why I challenge its application to events the observer can't observe (refer two paragraphs ago).

 

Addendum: I changed the title so it's more consistent with the contents of the post. The previous title was Ignorance and bliss; philosophy and science. Basically, the reason we have different interpretations of the same phenomenon is that physics can only tell us about what we observe, and what that means for reality is often debatable; superdeterminism being a case in point. Many philosophers and scientists talk about a ‘gap’ between theory and reality, whereas I claim the gap is between the observation and reality, a la Kant.