Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.

Saturday, 29 June 2024

Feeling is fundamental

 I’m not sure I’ve ever had an original idea, but I sometimes raise one that no one else seems to talk about. And this is one of them: I contend that the primary, essential attribute of consciousness is to be able to feel, and the ability to comprehend is a secondary attribute.
 
I don’t mind if this contentious idea triggers debate, but we tend to always discuss consciousness in the context of human consciousness, where we metaphorically talk about making decisions with the ‘head’ or the ‘heart’. I’m unsure of the origin of this dichotomy, but the implication is that our emotional and rational ‘centres’ (for want of a better word) have different loci in the body. No one believes that now, of course, but possibly people once did. The thing is that we are all aware that our emotional self and rational self can sometimes be in conflict. This is already going down a path I didn’t intend, so I may return to it at a later point.
 
There is some debate about whether insects have consciousness, but I believe they do, because they demonstrate behaviours associated with fear and desire, be it for sustenance or company. In other respects, I think they behave like automatons. Colonies of ants and bees can build a nest without a blueprint, except the one that apparently exists in their DNA. Spiders build webs and birds build nests, but they don’t do it the way we would – it’s all done organically, as if they have a model in their brain that they can follow, though we actually don’t know.
 
So I think the original role of consciousness in evolutionary terms was to feel, concordant with abilities to act on those feelings. I don’t believe plants can feel; besides, they’d have very limited ability to act on feelings even if they had them. They can communicate chemically, and they generally rely on the animal kingdom to propagate, which is why a global threat to bee populations is very serious indeed.
 
So, in evolutionary terms, I think feeling came before cognitive abilities – a point I’ve made before. It’s one of the reasons that I think AI will never be sentient – a viewpoint not shared by most scientists and philosophers, from what I’ve read.  AI is all about cognitive abilities; specifically, the ability to acquire knowledge and then deploy it to solve problems. Some argue that by programming biases into the AI, we will be simulating emotions. I’ve explored this notion in my own sci-fi, where I’ve added so-called ‘attachment programming’ to an AI to simulate loyalty. This is fiction, remember, but it seems plausible.
 
Psychological studies have revealed that we need an emotive component to behave rationally, which seems counter-intuitive. But would we really prefer it if everyone were a zombie or a psychopath, with no ability to empathise or show compassion? We see enough of this already. As I’ve pointed out before, in any ingroup-outgroup scenario, totally rational individuals can become totally irrational. We’ve all observed this, and possibly actively participated.
 
An oft-made point (by me) that I feel is not given enough consideration is that without consciousness, the universe might as well not exist. I agree with Paul Davies (who does espouse something similar) that the universe’s ability to be self-aware would seem to be a necessary condition for its existence (my wording, not his). I recently read a stimulating essay in the latest edition of Philosophy Now (Issue 162, June/July 2024), enigmatically titled Significance, by Ruben David Azevedo, a ‘Portuguese philosophy and social sciences teacher’. His self-described intent is to ‘Tell us why, in a limitless universe, we’re not insignificant’. In fact, that was the trigger for this post. He makes the point (which I’ve made elsewhere myself) that in both time and space, we couldn’t be more insignificant, which leads many scientists and philosophers to see us as a freakish by-product of an otherwise purposeless universe – a perspective Davies has dubbed ‘the absurd universe’. In light of this, it’s worth reading Azevedo’s conclusion:
 
In sum, humans are neither insignificant nor negligible in this mind-blowing universe. No living being is. Our smallness and apparent peripherality are far from being measures of our insignificance. Instead, it may well be the case that we represent the apex of cosmic evolution, for we have this absolute evident and at the same time mysterious ability called consciousness to know both ourselves and the universe.
 
I’m not averse to the idea that there is a cosmic role for consciousness. I like John Wheeler’s obvious yet pertinent observation:
 
The Universe gave rise to consciousness, and consciousness gives meaning to the Universe.

 
And this is my point: without consciousness, the Universe would have no meaning. And getting back to the title of this essay, we give the Universe feeling. In fact, I’d say that the ability to feel is more significant than the ability to know or comprehend.
 
Think about the role of art in all its manifestations, and how it’s totally dependent on the ability to feel. In some respects, I consider AI-generated art a perversion, because any feeling we have for its products is of our own making, not the AI’s.
 
I’m one of those weird people who can even find beauty in mathematics, while acknowledging only a limited ability to pursue it. It’s extraordinary that I can find beauty in a symphony, or a well-written story, or the relationship between prime numbers and Riemann’s Zeta function.


Addendum: I realised I can’t leave this topic without briefly discussing the role of biochemistry in emotional responses and behaviours. I’m thinking of the brain’s drugs-of-choice, like serotonin, dopamine, oxytocin and endorphins. Some may argue that these natural ‘messengers’ are all that’s required to explain emotions. However, there are other drugs, like alcohol and caffeine (arguably the most common), that also affect us emotionally, sometimes to our detriment. My point is that the former are nature’s target-specific mechanisms to influence the way we feel, without being the genesis of feelings per se.

Wednesday, 19 June 2024

Daniel C Dennett (28 March 1942 - 19 April 2024)

 I only learned about Dennett’s passing in the latest issue of Philosophy Now (Issue 162, June/July 2024), where Daniel Hutto (Professor of Philosophical Psychology at the University of Wollongong) wrote a 3-page obituary. Not that long ago, I watched an interview with him, following the publication of his last book, I’ve Been Thinking, which, from what I gathered, is basically a memoir, as well as an insight into his philosophical musings. (I haven’t read it, but that’s the impression I got from the interview.)
 
I should point out that I have fundamental philosophical differences with Dennett, but he’s not someone you can ignore. I must confess I’ve only read one of his books (decades ago), Freedom Evolves (2003), though I’ve read enough of his interviews and commentary to be familiar with his fundamental philosophical views. It’s something of a failing on my part that I haven’t read his most famous tome, Consciousness Explained (1991). Paul Davies once nominated it among his top 5 books, along with Douglas Hofstadter’s Gödel, Escher, Bach. But then Davies gave a tongue-in-cheek compliment by quipping, ‘Some have said that he explained consciousness away.’
 
Speaking of Hofstadter, he and Dennett co-published a book, The Mind’s I, which is really a collection of essays by different authors, upon which Dennett and Hofstadter commented. I wrote a short review covering only a small selection of said essays on this blog back in 2009.
 
Dennett wasn’t afraid to tackle the big philosophical issues, in particular anything relating to consciousness. He was unusual for a philosopher in that he took more than a passing interest in science, and appreciated the discourse that naturally arises between the 2 disciplines, while many others (on both sides) emphasise the tension, which often morphs into antagonism.
 
What I found illuminating in one of his YouTube videos was how Dennett’s views of the world hadn’t really changed that much over time (mind you, neither have mine), and it got me thinking that it reinforces an idea I’ve long held, but which was once articulated by Nietzsche: that our original impulses are intuitive or emotive, and we then rationalise them with argument. I can’t help but feel that this is what Dennett did, though he did it extremely well.
 
I like the quote at the head of Hutto’s obituary: “The secret of happiness is: Find something more important than you are and dedicate your life to it.”

 


Saturday, 15 June 2024

The negative side of positive thinking

This was a topic in last week’s New Scientist (8 June 2024) under the heading The Happiness Trap, an article written by Conor Feehly, a freelance journalist based in Bangkok. Basically, he talks about the plethora of ‘self-help’ books and, in particular, the ‘emergence of the positive psychology movement in 1998’. I was surprised he could provide a year, when one would tend to think it was a generational transition. At least, that’s my experience.
 
He then discusses the backlash (my term, not his) that’s occurred since, and mentions a study, ‘published in 2022, [by] an international group of psychologists exploring how societal pressure to be happy affects people in 40 countries’ (my emphasis). He cites Brock Bastian at the University of Melbourne, who was part of the study: “When we are not willing to accept negative emotions as a part of life, this can mean that we may see negative emotions as a sign there is something wrong with us.” And this gets to the nub of the issue.
 
I can’t help but think that there is a generational effect, if not a divide. I see myself as being in between, generationally speaking. My parents lived through the Great Depression and WW2, so they experienced enough negative emotion for all of us. Growing up in rural NSW, we didn’t have much but neither did anyone else, so we didn’t think that was exceptional. There was a lot of negative emotion in our lives as a consequence of the trauma that my Dad experienced as both a wartime serviceman and a prisoner-of-war. It was only much later, as an adult, that I realised this was not the norm. Back then, PTSD wasn’t a term.
 
One of the things that struck me in Feehly’s article was the idea of ‘acceptance’. To quote:
 
Research shows that when people accepted their negative emotions – rather than judging mental experience as good or bad – they become more emotionally resilient, experiencing fewer negative feelings in response to environmental stressors and attaining a greater sense of well-being.
 
He also says in the same context:
 
The good news is that, as we age, we increasingly rely on acceptance – which might help to explain why older people tend to report better emotional well-being.

 
As one of that cohort (older people), I can identify with that sentiment. Acceptance is a multi-faceted word, because one of the unexpected benefits of getting older is that we learn to accept ourselves, becoming less critical and judgemental, and hopefully extending that to others.
 
In our youth, acceptance by one’s peers is a prime driver of self-esteem and associated behaviours, and social media has to a large extent hijacked that impulse, which was also highlighted by Brock Bastian (cited above).
 
I’ve become side-tracked, to the point where this post reads as the antithesis of the so-called ‘positive psychology movement’ – possibly because I think my generation largely avoided that trap. We are more likely to see that a ‘think positive’ attitude in the face of all of life’s dilemmas and problems is a delusion. What’s obvious is that negative emotional states have evolutionary value, given their ancient roots. The other point that’s obvious to me is that we are all addicted to stories, where we vicariously experience negative emotions on a regular basis. In fact, a story that contained only positive emotions would never be read, or watched.
 
What has always been obvious to me, and which I’ve written about before, including in the very early history of this blog, is that we need adversity to gain wisdom. As I keep saying, it’s the theme of virtually every story ever told. When I look back on my early adult years and how insurmountable their problems seemed, my older self is so grateful I persevered. There is a hypothetical often raised: what advice would you give your younger self? I’d just say, ‘Hang in there; it gets better.’

Sunday, 9 June 2024

More on radical ideas

As you can tell from the title, this post carries on from the last one, because I got a bit bogged down on one issue when I really wanted to discuss more. One of the things that prompted me was watching a 1-hour presentation by cosmologist Claudia de Rham, whom I’ve mentioned before, when I had the pleasure of listening to an on-line lecture she gave, courtesy of New Scientist, during the COVID lockdown.
 
Claudia’s particular field of study is gravity, and, by her own admission, she has a ‘crazy idea’. Now here’s the thing: I meet a lot of people on Quora and in the blogosphere who, like me, live (in a virtual sense) on the fringes of knowledge rather than as academic or professional participants. And what I find is that they often have an almost zealous confidence in their ideas. To give one example, I recently came across someone who argued quite adamantly that the Universe is static, not expanding, and has even written a book on the subject. This is contrary to virtually everyone I’m aware of who works in cosmology and astrophysics. And I can’t help but compare this to Claudia de Rham, who is well aware that her idea is ‘crazy’, even though she’s fully qualified to argue it.
 
In other words, it’s a case of the more you know about a subject, the less you claim to know, because experts are more aware of their limitations than non-experts. I should point out, in case you didn’t already know, I’m not one of the experts.
 
Specifically, Claudia’s crazy idea is that not only do gravitational waves exist, but so do gravitons, and that gravitons have an extremely tiny amount of mass, which would alter the effect of gravity at very long range. I should say that at present the evidence is against her, because if she’s right, gravitational waves would travel not at the speed of light, as predicted by Einstein, but ever-so-slightly slower.
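To put a number on ‘ever-so-slightly slower’, the standard relativistic dispersion relation shows why any mass, however tiny, drags a wave’s speed below c. This is my textbook-level sketch, not de Rham’s own formulation:

E^2 = p^2c^2 + m^2c^4 \implies v = \frac{pc^2}{E} = c\sqrt{1 - \left(\frac{mc^2}{E}\right)^2} < c \quad \text{for } m > 0

The lag is largest for the lowest-energy (longest-wavelength) waves, which is why comparing the arrival times of gravitational waves and light from the same event can put an upper bound on any graviton mass.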
 
Freeman Dyson, by the way, argued that if gravitons do exist, they would be impossible to detect; but if Claudia is right, then they would be detectable.
 
In her talk, Claudia also discusses the vacuum energy, which, according to particle physics, should be 28 orders of magnitude greater than the relativistic effect of ‘dark energy’. She calls it ‘the biggest discrepancy in the entire history of science’. This suggests there is something rotten in the state of theoretical physics, along with the fact that what we can physically observe only accounts for 5% of the Universe.
 
It should be pointed out that at the end of the 19th Century no one saw or predicted the 2 revolutions in physics that were just around the corner – relativity theory and quantum mechanics. They were examples of what Thomas Kuhn called paradigm shifts, in his book The Structure of Scientific Revolutions. And I’d suggest that the current empirical anomalies in cosmology are harbingers of the next Kuhnian revolution.
 
Roger Penrose, whom I’ve referenced a number of times on this blog, is someone else with some ‘crazy’ ideas compared to the status quo, for which I admire him even if I don’t agree with him. One of Penrose’s hobby horses is his own particular inference from Gödel’s Incompleteness Theorem, which he learned as a graduate (under Steen, at Cambridge) and which he discusses in this video. He argues that it provides evidence that humans don’t think like computers. If one takes the example of Riemann’s Hypothesis (really a conjecture), we know that a computer can’t tell us whether it’s true or not (my example, not Penrose’s).* However, most mathematicians believe it is true, and it would be an enormous shock if it were proven untrue, or if a counterexample were found by a computer. This has been the case with other conjectures that were eventually proven true, like Fermat’s Last Theorem and Poincaré’s conjecture. Penrose’s point, if I understand him correctly, is that it takes a human mind, and not a computer, to make this leap into the unknown and grasp a ‘truth’ out of the aether.
 
Anyone who has engaged in some artistic endeavour can identify with this, even if it’s not mathematical truths they are seeking but the key to unravelling a plot in a story.
 
Penrose makes the point in the video that he’s a ‘visual’ person, which he thinks is unusual in his field. Penrose is an excellent artist, by the way, and does all his own graphics. This is something else I can identify with, as I was quite precocious at drawing as a very young child (I could draw in perspective, though no one taught me), even though it never went anywhere.
 
Finally, some crazy ideas of my own. I’ve pointed out on other posts that I have a predilection (for want of a better term) for Kant’s philosophical proposition that we can never know the ‘thing-in-itself’ but only a perception of it.
 
With this in mind, I contend that this philosophical premise applies not only to what we can physically detect via instruments, but also to what we theoretically infer from the mathematics we use to explore nature. As heretical as it may seem, I argue that mathematics is yet another 'instrument' we use to probe the secrets of the Universe – quantum mechanics and relativity theory being the most obvious examples.
 
As I’ve tried to expound in other posts, relativity theory is observer-dependent, inasmuch as different observers will both measure and calculate different values of time and space, dependent on their specific frame of reference. I believe this is a pertinent example of Kant’s proposition that the thing-in-itself escapes our perception. In particular, physicists (including Penrose) will tell you that an event ostensibly simultaneous with us (in a galaxy far, far away) will be assigned to the past by one observer and to the future by another, even when those 2 observers are simply crossing a street in opposite directions. I’ve written about this elsewhere as ‘the impossible thought experiment’.
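To make that concrete, here is the arithmetic behind the claim – a back-of-the-envelope version of what is sometimes called the Andromeda paradox (the numbers are mine, not Penrose’s). The Lorentz transformation assigns a distant event the time

t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}

For a pedestrian, v is roughly 1 m/s, but for a galaxy at a distance x of about 2.5 million light-years, the offset vx/c^2 works out to a few days. Reversing the walker’s direction flips the sign of v, so the 2 observers’ ‘now’ on that galaxy differ by days – one shifted towards its past, the other towards its future.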
 
The fact is that relativity theory rules out the event being observed at all: it lies outside both observers’ light cones. In other words, distant simultaneous events can’t be observed (according to relativity). For this reason, virtually all physicists will tell you that simultaneity is an illusion – there is no universal now.
 
But here’s the thing: if there is an edge in either space or time, it can only be observed from outside the Universe. Relativity theory, logically enough, can only tell us what we can observe from within the Universe.
 
But to extend this crazy idea: what’s stopping the Universe from existing within a higher dimension that we can’t perceive? Imagine being a fish: you spend your entire existence in a massive body of water, which is your entire universe. But then one day you are plucked out of that environment and you suddenly become aware that there is another, even bigger universe that exists right alongside yours.
 
There is a tendency for us to think that everything that exists we can learn and know about – it’s what separates us from every other living thing on the planet. But perhaps there are other dimensions, or even worlds, that lie forever beyond our comprehension.


*Footnote: Actually, Penrose, in his book The Emperor’s New Mind, discusses this in depth and at length over a number of chapters. He makes the point that Turing’s ‘proof’ that it’s impossible to predict whether a machine attempting to compute all the Riemann zeros (for example) will stop is a practical demonstration of the difference between ‘truth’ and ‘proof’ (as Gödel’s Incompleteness Theorem tells us). Quite simply, if the theorem is true, the computer will never stop, so it can never be proven algorithmically. It can only be proven (or disproven) if one goes ‘outside the [current] rules’, to use Penrose’s own nomenclature.
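To illustrate the point in runnable form, here’s a hypothetical sketch (mine, not Penrose’s), substituting Goldbach’s conjecture for the Riemann zeros purely because it’s simpler to code. The loop halts if and only if the conjecture is false; if the conjecture is true, it runs forever, and no amount of running it can establish that:

def is_prime(n):
    # trial division: slow but correct
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def goldbach_holds(n):
    # True if the even number n is the sum of two primes
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

n = 4
while True:
    if not goldbach_holds(n):
        print(f'Counterexample found: {n}')  # halts only if the conjecture is FALSE
        break
    n += 2
# If the conjecture is true, this loop never terminates, so running the
# machine can demonstrate falsity but never truth - Penrose's distinction
# between 'truth' and 'proof' in miniature.

(Don’t run it expecting an answer: Goldbach has been verified far beyond anything this naive loop will reach.)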

Sunday, 2 June 2024

Radical ideas

It’s hard to think of anyone I admire in physics and philosophy who doesn’t have at least one radical idea – even Richard Feynman, who avoided hyperbole and embraced doubt as part of his credo: "I’d rather have doubt and be uncertain, than be certain and wrong."
 
But then you have this quote from his good friend and collaborator, Freeman Dyson:

Thirty-one years ago, Dick Feynman told me about his ‘sum over histories’ version of quantum mechanics. ‘The electron does anything it likes’, he said. ‘It goes in any direction at any speed, forward and backward in time, however it likes, and then you add up the amplitudes and it gives you the wave-function.’ I said, ‘You’re crazy.’ But he wasn’t.
 
In fact, his crazy idea led him to a Nobel Prize. That exception aside, most radical ideas are either still-born or yet to bear fruit, and that includes mine. No, I don’t compare myself to Feynman – I’m not even a physicist – and the truth is I’m unsure whether I have an original idea to begin with, radical or otherwise. I just read a lot of books by people much smarter than me, and cobble together a philosophical approach that I hope is consistent, even if sometimes unconventional. My only consolation is that I’m not alone: most, if not all, of the people smarter than me also hold unconventional ideas.
 
Recently, I re-read Robert M. Pirsig’s iconoclastic book, Zen and the Art of Motorcycle Maintenance, which I originally read in the late 70s or early 80s, so within a decade of its publication (1974). It wasn’t how I remembered it, not that I remembered much at all, except it had a huge impact on a lot of people who would never normally read a book that was mostly about philosophy, albeit disguised as a road-trip. I think it keyed into a zeitgeist at the time, where people were questioning everything. You might say that was more the 60s than the 70s, but it was nearly all written in the late 60s, so yes, the same zeitgeist, for those of us who lived through it.
 
Its relevance to this post is that Pirsig had some radical ideas of his own – at least, radical to me and to virtually anyone with a science background. I’ll give you a flavour with some selective quotes. But first, some context: the story’s protagonist – who, we assume, is Pirsig himself, telling the story in the first person – is having a discussion with his fellow travellers, a husband and wife who have their own motorcycle (Pirsig is travelling with his teenage son as pillion), so there are 2 motorcycles and 4 companions for at least part of the journey.
 
Pirsig refers to a time (in Western culture) when ghosts were considered a normal part of life. But then he introduces his iconoclastic idea that we have our own ghosts.
 
Modern man has his own ghosts and spirits too, you know.
The laws of physics and logic… the number system… the principle of algebraic substitution. These are ghosts. We just believe in them so thoroughly they seem real.

 
Then he specifically cites the law of gravity, saying provocatively:
 
The law of gravity and gravity itself did not exist before Isaac Newton. No other conclusion makes sense.
And what that means, is that the law of gravity exists nowhere except in people’s heads! It’s a ghost! We are all of us very arrogant and conceited about running down other people’s ghosts but just as ignorant and barbaric and superstitious about our own.
Why does everybody believe in the law of gravity then?
Mass hypnosis. In a very orthodox form known as “education”.

 
He then goes from the specific to the general:
 
Laws of nature are human inventions, like ghosts. Laws of logic, of mathematics are also human inventions, like ghosts. The whole blessed thing is a human invention, including the idea it isn’t a human invention. (His emphasis)
 
And this is philosophy in action: someone challenges one of your deeply held beliefs, which forces you to defend it. Of course, I’ve argued the exact opposite, claiming that ‘in the beginning there was logic’. And it occurred to me right then, that this in itself, is a radical idea, and possibly one that no one else holds. So, one person’s radical idea can be the antithesis of someone else’s radical idea.
 
Then there is this, which I believe holds the key to our disparate points of view:
 
We believe the disembodied 'words' of Sir Isaac Newton were sitting in the middle of nowhere billions of years before he was born and that magically he discovered these words. They were always there, even when they applied to nothing. Gradually the world came into being and then they applied to it. In fact, those words themselves were what formed the world. (again, his emphasis)
 
Note his emphasis on 'words', as if they alone make some phenomenon physically manifest.
 
My response: don’t confuse or conflate the language one uses to describe some physical entity, phenomenon or manifestation with what it describes. The natural laws, including gravity, are mathematical in nature, obeying sometimes obtuse and esoteric mathematical relationships, which we have uncovered over eons of time – which doesn’t mean they only came into existence when we discovered them and created the language to describe them. Mathematical notation only exists in the mind, correct, including the number system we adopt, but the mathematical relationships that the notation describes exist independently of mind, in the same way that nature’s laws do.
 
John Barrow, cosmologist and Fellow of the Royal Society, made the following point about the mathematical ‘laws’ we formulated to describe the first moments of the Universe’s genesis (Pi in the Sky, 1992).
 
Specifically, he says our mathematical theories describing the first three minutes of the Universe predict specific abundances of the earliest ‘heavier’ elements – deuterium, 2 isotopes of helium, and lithium – which are 1/1,000, 1/1,000, 22% and 1/100,000,000 respectively, with the remainder (roughly 78%) being hydrogen. And this has been confirmed by astronomical observations. He then makes the following salient point:



It confirms that the mathematical notions that we employ here and now apply to the state of the Universe during the first three minutes of its expansion history at which time there existed no mathematicians… This offers strong support for the belief that the mathematical properties that are necessary to arrive at a detailed understanding of events during those first few minutes of the early Universe exist independently of the presence of minds to appreciate them.
 
As you can see this effectively repudiates Pirsig’s argument; but to be fair to Pirsig, Barrow wrote this almost 2 decades after Pirsig’s book.
 
In the same vein, Pirsig then goes on to discuss Poincaré’s Foundations of Science (which I haven’t read), specifically talking about Euclid’s famous fifth postulate concerning parallel lines never meeting, and how it created problems because it couldn’t be derived from more basic axioms, yet didn’t seem self-evident enough to function as one. Euclid himself seemed aware of this, and avoided invoking it for as long as he could – his first 28 propositions don’t rely on it.
 
It was only in the 19th Century, with the advent of Riemannian and other non-Euclidean geometries on curved surfaces, that this was resolved. According to Pirsig, it led Poincaré to question the very nature of axioms.
 
Are they synthetic a priori judgements, as Kant said? That is, do they exist as a fixed part of man’s consciousness, independently of experience and uncreated by experience? Poincare thought not…
Should we therefore conclude that the axioms of geometry are experimental verities? Poincare didn’t think that was so either…
Poincare concluded that the axioms of geometry are conventions, our choice among all possible conventions is guided by experimental facts, but it remains free and is limited only by the necessity of avoiding all contradiction.

 
I have my own view on this, but it’s worth seeing where Pirsig goes with it:
 
Then, having identified the nature of geometric axioms, [Poincare] turned to the question, Is Euclidean geometry true or is Riemann geometry true?
He answered, The question has no meaning.
[One might] as well ask whether the metric system is true and the avoirdupois system is false; whether Cartesian coordinates are true and polar coordinates are false. One geometry can not be more true than another; it can only be more convenient. Geometry is not true, it is advantageous.
 
I think this is a false analogy, because the adoption of a system of measurement (i.e. units) and even the adoption of which base arithmetic one uses (decimal, binary, hexadecimal being the most common) are all conventions.
 
So why wouldn’t I say the same about axioms? Pirsig and Poincaré are right inasmuch as both Euclidean and Riemann geometry are true: which one applies depends on the topology one is describing, and both are used to describe physical phenomena. In fact, in a twist that Pirsig probably wasn’t aware of, Einstein used Riemann geometry to describe gravity in a way that Newton could never have envisaged, because Newton only had Euclidean geometry at his disposal. Einstein formulated a mathematical expression of gravity that is dependent on the geometry of spacetime, and it has been empirically verified to explain phenomena that Newton’s couldn’t. Of course, there are also limits to what Einstein’s equations can explain, so there are more mathematical laws still to uncover.
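To show in symbols what ‘gravity dependent on the geometry of spacetime’ means, it’s worth comparing the 2 formulations. These are the standard textbook forms; the notation is my choice:

Newton (flat, Euclidean space): \nabla^2\Phi = 4\pi G\rho

Einstein (curved spacetime): G_{\mu\nu} = \frac{8\pi G}{c^4}T_{\mu\nu}

In Newton’s equation, the mass density \rho sources a gravitational potential \Phi within a fixed geometry; in Einstein’s, the energy-momentum tensor T_{\mu\nu} determines the curvature G_{\mu\nu} of spacetime itself – which is why the mathematics of curved, non-Euclidean geometry was indispensable to general relativity.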
 
But where Pirsig states that we adopt the axiom that is convenient, I contend that we adopt the axiom that is necessary, because axioms inherently expand the area of mathematics we are investigating. This is a consequence of Gödel’s Incompleteness Theorem, which states that there are limits to what any axiom-based, consistent, formal system of mathematics can prove to be true. Gödel himself pointed out that the resolution lies in expanding the system by adopting further axioms. The expansion of Euclidean to non-Euclidean geometry is a case in point. The example I like to give is the adoption of √-1 = i, which gave us complex algebra and the means to mathematically describe quantum mechanics. In both cases, the new axioms allowed us to solve problems that had hitherto been impossible to solve. So it’s not just a convenience but a necessity.
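As a minimal sketch of what that axiom buys us – a toy example of mine, not a derivation of quantum mechanics – the following shows √-1 = i making an ‘impossible’ equation solvable, and complex amplitudes interfering in the way quantum mechanics requires:

import cmath

# The adopted axiom: i * i = -1
i = complex(0, 1)
print(i * i)            # (-1+0j)

# x^2 + 1 = 0 has no real solution, but it has complex ones
print(cmath.sqrt(-1))   # 1j

# Toy interference: two paths with equal magnitude but opposite phase.
# Quantum probabilities come from |sum of complex amplitudes|^2,
# so phases can cancel - something real-valued weights cannot do.
a1 = cmath.exp(1j * 0) / 2 ** 0.5          # amplitude of path 1
a2 = cmath.exp(1j * cmath.pi) / 2 ** 0.5   # amplitude of path 2, phase-shifted
print(abs(a1 + a2) ** 2)                   # ~0.0: destructive interference
print(abs(a1) ** 2 + abs(a2) ** 2)         # 1.0: what ignoring phase would give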
 
I know I’ve belaboured the point, but both of these – non-Euclidean geometry and complex algebra – were at one time radical ideas in the mathematical world, and they ultimately led to radical ideas in the scientific world: general relativity and quantum mechanics. Are they ghosts? Perhaps ghost is an apt metaphor, given that they appear timeless and have outlived their discoverers, not to mention the rest of us. Most physicists and mathematicians tacitly believe that they not only continue to exist beyond us, but existed prior to us, and possibly prior to the Universe itself.
 
I will briefly mention another radical idea, which I borrowed from Schrödinger, though I drew conclusions that he didn’t formulate: that consciousness exists in a constant present, and hence creates the psychological experience of the flow of time, because everything else becomes the past as soon as it happens. I contend that only consciousness provides the reference point for past, present and future that we all take for granted.

Sunday, 19 May 2024

It all started with Euclid

 I’ve mentioned Euclid before, but this rumination was triggered by a post on Quora that someone wrote about Plato, where they argued, along with another contributor, that Plato is possibly overrated because he got a lot of things wrong, which is true. Nevertheless, as I’ve pointed out in other posts, his Academy was effectively the origin of Western philosophy, science and mathematics. It was actually based on the Pythagorean quadrivium of geometry, arithmetic, astronomy and music.
 
But Plato was also a student and devoted follower of Socrates and the mentor of Aristotle, who in turn mentored Alexander the Great. So Plato was a pivotal historical figure and without his writings, we probably wouldn’t know anything about Socrates. In the same way that, without Paul, we probably wouldn’t know anything about Jesus. (I’m sure a lot of people would find that debatable, but, if so, it’s a debate for another post.)
 
Anyway, in my own comment (on Quora) I mentioned Euclid, who was the Librarian at Alexandria around 300BC and thus a product of Plato’s school of thought. Euclid wrote The Elements, which I contend is arguably the most important book written in the history of humankind – more important than any religious text, including the Bible, Homer’s Iliad and the Mahabharata – which, I admit, is quite a claim. It’s generally acknowledged as the most copied text in the secular world. In fact, according to Wikipedia:
 
It was one of the very earliest mathematical works to be printed after the invention of the printing press and has been estimated to be second only to the Bible in the number of editions published since the first printing in 1482.
 
Euclid was revolutionary in one very significant way: he was able to demonstrate what ‘truth’ was, using pure logic, albeit in a very abstract and narrow field of inquiry, which is mathematics.
 
Before then, and in other cultures, truth was transient and subjective and often prescribed by the gods. But Euclid changed all that, and forever. I find it extraordinary that I was examined on Euclid’s theorems in high school in the 20th Century.
 
And this mathematical insight has become, millennia later, a key ingredient (for want of a better term) in the hunt for truths in the physical world. In the 20th Century, in what has become known as the Golden Age of Physics, the marriage between mathematics and scientific inquiry at all scales, from the cosmic to the infinitesimal, uncovered deeply held secrets of nature that the Pythagoreans, and Euclid for that matter, could never have dreamed of. Look no further than quantum mechanics (QM) and the General Theory of Relativity (GR). Between them, these 2 iconic developments underpin every theory we currently have in physics, and both rely on mathematics that was pivotal to their development from the outset. In other words, without the mathematics of complex algebra and Riemann geometry respectively, these theories would have been stillborn.
 
I like to quote Richard Feynman from his book, The Character of Physical Law, in a chapter titled, The Relation of Mathematics to Physics:
 
…what turns out to be true is that the more we investigate, the more laws we find, and the deeper we penetrate nature, the more this disease persists. Every one of our laws is a purely mathematical statement in rather complex and abstruse mathematics... Why? I have not the slightest idea. It is only my purpose to tell you about this fact.
 
The strange thing about physics is that for the fundamental laws we still need mathematics.
 
Physicists cannot make a conversation in any other language. If you want to learn about nature, to appreciate nature, it is necessary to understand the language that she speaks in. She offers her information only in one form.

 
And this has only become more evident since Feynman wrote those words.
 
There was another revolution in the 20th Century, involving Alan Turing, Alonzo Church and Kurt Gödel; this time involving mathematics itself. Basically, each of them independently demonstrated that some mathematical truths elude proof: some conjectures cannot be proved within the mathematical system from which they arose. The most famous example cited is Riemann’s Hypothesis, involving primes. But the Goldbach conjecture (also involving primes) and the twin-primes conjecture also fit into this category. While most mathematicians believe them to be true, they are yet to be proven. I won’t elaborate on them, as they can easily be looked up, but I give a flavour of 2 of them below.
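For concreteness, here’s what 2 of these conjectures assert, in checkable form – a sketch of mine, using the sympy library. A computer can verify any finite stretch, as below, but no finite run amounts to a proof of the universal claim:

from sympy import isprime

# Goldbach: every even number >= 4 is the sum of two primes.
def goldbach_holds(n):
    return any(isprime(p) and isprime(n - p) for p in range(2, n // 2 + 1))

# Twin primes: there are (conjecturally) infinitely many primes p
# such that p + 2 is also prime.
def twin_primes_up_to(limit):
    return [(p, p + 2) for p in range(2, limit) if isprime(p) and isprime(p + 2)]

# Finite verification - evidence, not proof:
assert all(goldbach_holds(n) for n in range(4, 10_000, 2))
print(twin_primes_up_to(100))  # [(3, 5), (5, 7), (11, 13), (17, 19), ...]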
 
But there is more: according to Gregory Chaitin, there are infinitely more incomputable Real numbers than computable Real numbers, which means that most of mathematics is inaccessible to logic.
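The reasoning behind Chaitin’s claim is a short cardinality argument (my compression of it). Every computable number requires a finite program, and finite programs over a finite alphabet form a countable set, whereas Cantor showed the Real numbers are uncountable:

|\{\text{programs}\}| = \aleph_0 < 2^{\aleph_0} = |\mathbb{R}|

So the computable Reals are a countable subset of an uncountable whole: almost every Real number can never be output by any algorithm.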
 
So, when I say it all started with Euclid, I mean that all the technology and infrastructure we take for granted – which allows me to write this so that virtually anyone, anywhere in the world, can read it – only exists because Euclid was able to derive ‘truths’ that stood for centuries and ultimately led to this.