There is an excellent series on YouTube called ‘Closer to Truth’, where the host, Robert Lawrence Kuhn, interviews some of the cleverest people on the planet (about existential and epistemological issues) in such a way that ordinary people, like you and me, can follow. I understand from Wikipedia that it’s really a television series, started in 2000 on America’s PBS.
In an interview with Gregory Chaitin, Kuhn asks the above question, which made me go back and re-read Chaitin’s book, Thinking about Godel and Turing, which I originally bought and read over a decade ago, and then posted about on this blog (not long after I created it). It’s really a collection of talks and abridged papers given by Chaitin from 1970 to 2007, so there’s a lot of repetition but also an evolution in his narrative and ideas. Reading it for the second time (from cover to cover) over a decade later has the benefit of the filter of all the knowledge I’ve accumulated in the interim.
More than one person (Umberto Eco and Jeremy Lent, for example) has wondered whether the discreteness we find in the world, and which we logically apply to mathematics, is a consequence of human projection rather than an objective reality. In other words, is it an epistemological bias rather than an ontological condition? I’ll return to this point later.
Back to Chaitin’s opus: he effectively takes us through the same logical and historical evolution over and over again, which ultimately leads to the same conclusions. I’ll summarise briefly. In 1931, Kurt Godel proved a theorem that effectively tells us that, within a formal axiom-based mathematical system, there will always be mathematical truths that can’t be proved. Then in 1936, Alan Turing proved, with a thought experiment that presaged the modern computer, that there will always be machine calculations that may never stop, and we can’t predict whether they will or not. For example, Riemann’s hypothesis can be checked by an algorithm to whatever limit you like (and is being checked somewhere right now, probably) but you can never know in advance whether the search will ever stop (by finding a false result). As Chaitin points out, this is an extension of Godel’s theorem, and Godel’s theorem can be deduced from Turing’s.
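To make Turing’s point concrete, here is a toy sketch of my own (using Goldbach’s conjecture as a stand-in, since testing Riemann’s hypothesis numerically is far more involved): the program halts only if it finds a counterexample, and Turing tells us there is no general way to know in advance whether it ever will.

def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def goldbach_holds(n):
    # True if the even number n is the sum of two primes
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

n = 4
while True:
    if not goldbach_holds(n):
        print("Counterexample:", n)   # the search halts only if the conjecture is false
        break
    n += 2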
Then Chaitin himself proved, by inventing (or discovering) a mathematical device called Omega (Ω), that there are innumerable numbers that can never be completely calculated (Omega gives the probability of a Turing program halting). In fact, there are more incomputable numbers than computable ones, even though both are infinite in extent: the computable Reals (which include all the rationals) are countably infinite, while the incomputable Reals are uncountably infinite. I’ve mentioned this previously when discussing Noson Yanofsky’s excellent book, The Outer Limits of Reason; What Science, Mathematics, and Logic Cannot Tell Us. Chaitin claims that this proves that Godel’s Incompleteness Theorem is not some aberration, but is part of the foundation of mathematics – there are infinitely more numbers that can’t be calculated than numbers that can.
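For the record (this is Chaitin’s standard definition, not spelled out in the book passage above): for a suitable self-delimiting programming language, Ω is simply the probability that a randomly generated program halts:

Ω = Σ 2^(−|p|), summed over every program p that halts, where |p| is the program’s length in bits.

Knowing the first n bits of Ω would settle the halting problem for every program up to n bits long, which is why Ω can never be fully computed.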
So that’s the gist of Chaitin’s book, but he draws some interesting conclusions on the side, so to speak. For a start, he argues that maths should be done more like physics, and that maybe we should accept some unproved conjectures (like Riemann’s) as new axioms, as one would in physics. In fact, this is happening almost by default, inasmuch as there already exist theorems that depend on Riemann’s conjecture being true. In other words, Riemann’s hypothesis has effectively morphed into a mathematical caveat so people can explore its consequences.
The other area of discussion that Chaitin gets into, which is relevant here, is whether the Universe is like a computer. He cites Stephen Wolfram (who created Mathematica) and Edward Fredkin.
According to Pythagoras everything is number, and God is a mathematician… However, now a neo-Pythagorean doctrine is emerging, according to which everything is 0/1 bits, and the world is built entirely out of digital information. In other words, now everything is software, God is a computer programmer, not a mathematician, and the world is a giant information-processing system, a giant computer [Fredkin, 2004, Wolfram, 2002, Chaitin, 2005].
Carlo Rovelli also argues that the Universe is discrete, but for different reasons: quantum mechanics (QM) has a Planck limit for both time and space, which suggests that even space-time is discrete, and would therefore seem to lend itself to being made up of ‘bits’. This fits in with the current paradigm that QM, and therefore reality, is really about ‘information’, and information, as we know, comes in ‘bits’.
Chaitin, at one point, goes so far as to suggest that the Universe calculates its future state from its current state. This is very similar to Newton’s clockwork universe, whereby Laplace famously claimed that, given the position of every particle in the Universe and all the relevant forces, one could, in principle, ‘read the future just as readily as the past’. These days we know that’s not correct, because we’ve since discovered QM, but some people argue that a quantum computer could do the same thing. David Deutsch is one who argues that (in principle).
There is a fundamental issue with all this that everyone seems to have either forgotten or ignored. Just before the turn of the last century, Henri Poincare discovered some mathematical gremlins that seemed of little relevance to reality, but they eventually led to a discipline of physics that became known as chaos theory.
So after re-reading Chaitin’s book, I decided to re-read Ian Stewart’s erudite and deeply informative book, Does God Play Dice? The New Mathematics of Chaos.
Not quite a third of the way through, Stewart introduces Chaitin’s theorem (of incomputable numbers) to demonstrate why the initial conditions in chaos theory can never be computed, which I thought was a very nice and tidy way to bring the two philosophically opposed ideas together. Chaos theory effectively tells us that a computer can never predict the future evolution of the Universe, and it’s Chaitin’s own theorem which provides the key.
At another point, Stewart quips that God uses an analogue computer. He’s referring to the fact that most differential equations (used by scientists and engineers) are linear whilst nature is clearly nonlinear.
Today’s science shows that nature is relentlessly nonlinear. So whatever God deals with… God’s got an analogue computer as versatile as the entire universe to play with – in fact, it is the entire universe. (Emphasis in the original.)
As all scientists know (and Yanofsky points out in his book), we mostly use statistical methods to understand nature’s dynamics, not the motion of individual particles, which would be impossible. Erwin Schrodinger made a similar point in his excellent tome, What is Life? To give just one example that most people are aware of: radioactive decay (an example Schrodinger used). Statistically, we know the half-lives of radioactive elements, which follow a precise exponential rule, but no one can predict when an individual atom will decay.
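Here is a minimal simulation of my own (my illustration, not Schrodinger’s) that shows the contrast: each atom decays with the same fixed probability per time step, so no individual decay can be predicted, yet the population as a whole tracks the exponential law almost exactly.

import random

p = 0.01            # decay probability per atom per time step (arbitrary choice)
atoms = 10_000      # starting population
for step in range(501):
    if step % 100 == 0:
        expected = 10_000 * (1 - p) ** step   # the statistical (exponential) prediction
        print(step, atoms, round(expected))
    atoms -= sum(1 for _ in range(atoms) if random.random() < p)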
Whilst on the subject of Schrodinger, his eponymous equation is both linear and deterministic, which seems to contradict the very idea of QM’s discrete and probabilistic effects. Perhaps that is why Carlo Rovelli contends that Schrodinger’s wavefunction has misled our attempts to understand what QM tells us about reality.
Roger Penrose explicates QM in three phases: U, R and C (he always displays them in bold), denoting the wave-function (unitary) phase, the measurement (reduction) phase and the classical physics phase. Logically, Schrodinger’s wave function only exists in the U phase, prior to measurement or observation. If it weren’t linear you couldn’t add the waves together (of all possible paths), which is essential for determining the probabilities and is also fundamental to QED (quantum electrodynamics, which extends QM). The fact that it’s deterministic means that it can be calculated symmetrically forward and backward in time.
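Both properties are easy to see in a toy two-state system (my own sketch, not Penrose’s): linearity means a superposition evolves as the sum of its evolved parts, and the deterministic (unitary) evolution can be run backwards simply by applying the inverse.

import numpy as np

theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])    # a unitary evolution matrix

psi1 = np.array([1.0, 0.0])    # basis state |0>
psi2 = np.array([0.0, 1.0])    # basis state |1>
a, b = 0.6, 0.8                # amplitudes, |a|^2 + |b|^2 = 1

# Linearity: evolving the superposition equals superposing the evolved states
print(np.allclose(U @ (a*psi1 + b*psi2), a*(U @ psi1) + b*(U @ psi2)))        # True

# Reversibility: the inverse of U (its conjugate transpose) runs the evolution backwards
print(np.allclose(U.conj().T @ (U @ (a*psi1 + b*psi2)), a*psi1 + b*psi2))     # True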
My own take on this is that QM and classical physics obey different rules, and the rules of classical physics are chaotic, being neither predictable nor linear. Both lead to unpredictability, but for different reasons and using different mathematics. Stewart has argued that just maybe you could describe QM using chaos theory, and David Deutsch has argued the opposite: that you could use the many-worlds interpretation of QM to explain chaos theory. I think they’re both wrong-headed, but I’m the first to admit that all these people know far more than me. Freeman Dyson (one of the smartest physicists not to win a Nobel Prize) is the only other person I know of who believes that maybe QM and classical physics are distinct. He’s pointed out that classical physics describes events in the past while QM provides future probabilities. It’s not a great leap from there to suggest that the wavefunction exists in the future.
You may have noticed that I’ve wandered away from my original question, so maybe I should wonder my way back. In my introduction, I mentioned the epistemological point, considered by some, that maybe our employment of mathematics, which is based on integers, has made us project discreteness onto the world.
Chaitin’s theorem demonstrates that most of mathematics is not discrete at all. In fact, he cites his hero, Gottfried Leibniz, in claiming that most of mathematics is ‘transcendental’, which means it’s beyond our intellectual grasp. This turns the general perception that mathematics is a totally logical construct on its head. We access mathematics using logic, but if there is an uncountable infinity of Reals that are not computable, then, logically, they are not accessible to logic, including computer logic. This is a consequence of Chaitin’s own theorem, yet he argues that this is precisely why that part of mathematics isn’t part of reality.
In fact, Chaitin would argue that it’s because of that inaccessibility that a discrete universe makes sense. In other words, a discrete universe would be computable. However, chaos theory suggests God would have to keep resetting his parameters. (There is such a thing as ‘chaotic control’, called ‘proportional perturbation feedback’ (PPF), which has been used to control cardiac arrhythmias.)
Ian Stewart has something to say on this, albeit while talking about something else. He makes the valid point that there is a limit to how many decimal places a computer can use, which has practical consequences:
The philosophical point is that the discrete computer model we end up with is not the same as the discrete model given by atomic physics.
Continuity requires calculus, as in the case of Schrodinger’s equation (referenced above) and also Einstein’s field equations, and calculus uses infinitesimals to maintain continuity mathematically. A computer doing calculus ‘cheats’ (as Stewart points out) by quite literally adding up differences, as the sketch further below illustrates.
This leads Stewart to make the following observation:
Computers can work with a small number of particles. Continuum mechanics can work with infinitely many. Zero or infinity. Mother Nature slips neatly into the gap between the two.
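To see what that ‘cheating’ looks like in practice, here is a minimal sketch of my own (assuming nothing beyond Stewart’s point): to solve dy/dt = y, whose exact continuous solution is e^t, the computer literally adds up finite differences, one small step at a time.

import math

dt = 0.001                 # a finite step; the smaller it is, the closer to the continuum
y, t = 1.0, 0.0
while t < 1.0:
    y += y * dt            # add the difference dy = y*dt, quite literally
    t += dt
print(y, math.exp(1.0))    # roughly 2.7169 versus the exact 2.71828...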
Wolfram argues that the Universe is pseudo-random, which would allow it to run on algorithms. But there are two levels of randomness: one caused by QM and one caused by chaos. (Chaos can create stability as well, which I’ve discussed elsewhere.) The point is that initial conditions would have to be specified to infinite precision to predict chaotic phenomena (like the weather), and this applies to virtually everything in nature. Even the orbits of the planets are chaotic, though over millions, even billions, of years. So at some level the Universe may be discrete, even at the Planck scale, but when it comes to evolutionary phenomena, chaos rules, and it’s neither computably determinable (long term) nor computably discrete.
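The textbook illustration of this is the logistic map (my example, not Wolfram’s): start two runs that differ by one part in a billion and within a few dozen steps they bear no resemblance to each other, which is why prediction would require infinite precision in the initial conditions.

x1 = 0.4
x2 = 0.4 + 1e-9                # initial conditions differing by one part in a billion
for step in range(1, 61):
    x1 = 4 * x1 * (1 - x1)     # the logistic map in its fully chaotic regime
    x2 = 4 * x2 * (1 - x2)
    if step % 15 == 0:
        print(step, round(x1, 6), round(x2, 6))
# By about step 30 the tiny initial difference has grown to swamp the values entirely.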
There is one aspect of this that I’ve never seen discussed, and that is the relationship between chaos theory and time. Carlo Rovelli, in his recent book, The Order of Time, argues that ‘time’s arrow’ can only be explained by entropy, but another physicist, Richard A. Muller, in his book, NOW; The Physics of Time, argues the converse. Muller provides a lengthy and compelling argument for why entropy doesn’t explain the arrow of time.
This may sound simplistic, but entropy is really about probabilities. As time progresses, a dynamic system, if left to its own devices, moves to states of higher probability. For example, perfume released from a bottle in one corner of a room soon dissipates throughout the room, because there is a much higher probability of that than of it accumulating in one spot. A broken egg has an infinitesimally low probability of coming back together again. The caveat, ‘left to its own devices’, simply means that the system is in equilibrium, with no external source of energy to keep it out of equilibrium.
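The arithmetic behind the perfume example is stark (my numbers, purely illustrative): if each molecule is equally likely to be in either half of the room, the probability of all of them gathering back in one half falls off as (1/2)^N.

for N in (10, 100, 1000):
    print(N, 0.5 ** N)      # probability of all N molecules sitting in one half of the room
# Even for a thousand molecules the odds are about 1e-301; a real whiff of perfume
# contains ~10^20 molecules, so the reverse process is never observed in practice.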
What has this to do with chaos theory? Well, chaotic phenomena are time asymmetrical (rerunning them doesn’t reproduce the same outcome). Take weather. If weather were time-reversal symmetric, forecasts would be easy. And weather is not in a state of equilibrium, so entropy is not the dominant factor. Take another example: biological evolution. It’s not driven by entropy, because it increases in complexity, but it’s definitely time asymmetrical and it’s chaotic. In fact, speciation appears to be fractal, which is a signature of chaos.
Now, I pointed out that the U phase of Penrose’s explication of QM is time symmetrical, but I would contend that the overall U, R, C sequence is not. I contend that there is a sequence from QM to classical physics that is time asymmetrical. This implies, of course, that QM and classical physics are distinct.
Addendum 1: This is slightly off-topic, but relevant to my own philosophical viewpoint. Freeman Dyson delivers a lecture on QM and, in the segment from 22:15 to 24:00, he argues that the wavefunction and QM can only tell us about the future and not the past.
Addendum 2 (Conclusion): Someone told me that this was difficult to follow, so I've written a summary based on a comment I gave below.
Chaitin's theorem arises from his derivation of omega (Ω), the 'halting probability', which extends Turing's famous halting problem. You can read about it here, including its significance to incomputability.

I agree with Chaitin 'mathematically' in that I think there are infinitely more incomputable Reals than computable Reals. Note that transcendental numbers like π and e are not themselves incomputable: like any computable number, they can be calculated to whatever resolution you like (though of course not to infinity), whereas Chaitin's Ω cannot be pinned down beyond a finite number of digits.

I disagree with him 'philosophically' in that I don't think the Universe is necessarily discrete and can be reduced to 0s and 1s (bits). In other words, I don't think the Universe is like a computer.

Curiously and ironically, Chaitin has proved that the foundation of mathematics consists mostly of incomputable Reals, yet he believes the Universe is computable. I agree with him on the first part but not the second.
Addendum 3: I discussed the idea of the Universe being like a computer in an earlier post, with no reference to Chaitin or Stewart.
Addendum 4: I recently read Jim Al-Khalili's chapter on 'Laplace's demon' in his book, Paradox; The Nine Greatest Enigmas in Physics, which is specifically a discussion of 'chaos theory'. Al-Khalili contends that 'the Universe is almost certainly deterministic', but I think his definition of 'deterministic' might be subtly different to mine. He rightly points out that chaos is deterministic but unpredictable. What this means is that everything in the past and everything in the future is connected by cause and effect. So there is a causal path from any current event to as far into the past as you want to go. And there will also be a causal path from that same event into the future; it's just that you won't be able to predict it because it's incomputable. In that sense the future is deterministic but not determinable. However (as Ian Stewart points out in Does God Play Dice?), if you re-run a chaotic experiment you will get a different result, which is almost a definition of chaos; tossing a coin is the most obvious example (cited by Stewart). My point is that if the Universe is chaotic, then it follows that if you were to rerun the Universe you'd get a different result. So it may be 'deterministic' but it's not 'determined'. I might elaborate on this in a separate post.