Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.

Sunday 17 April 2016

Eye in the Sky

Two movie reviews in a row – but quite different ones – one arguably the latest incarnation of my generation’s best-known comic book icons, and the other a serious intellectual debate on the moral dimension of modern warfare.

This is a really good movie: one where you can’t leave the cinema without internally debating the pros and cons of a military operation, where you know the consequences are real for those who take part in this very new ‘theatre of war’ involving drone strikes, electronic intelligence surveillance, and high-tech Western military powers pitted against third-world terrorist enclaves. This is one of those movies where you ask yourself, many times over: What would I do?

You insert yourself in so many points of view; a credit to the filmmakers and the actors who create them for you. Only 2 of the actors are known to me: Helen Mirren and Alan Rickman; but they all acquit themselves well, with events taking place simultaneously in 3 geographically separate parts of the world. Such is the nature of modern warfare and communications availability that one can imagine the co-operation of 3 different countries’ governments and military personnel performing one tactically precise operation.

It’s a British production, with Colin Firth as one of the producers (which is how it came to Helen Mirren, according to an interview with her), and you wonder why he’s not in it. One can imagine him playing any one of the British roles, such is his versatility. Apparently, the Mirren character was written for a man, so it’s a master stroke to give it to her. Sadly, it’s Alan Rickman’s last film, so it seems fitting to me that he has arguably the best line in the movie: “Never tell a soldier that he doesn’t know the cost of war.” Seeing ‘In Loving Memory of Alan Rickman’ in the credits was as emotional for me as any moment in the movie itself. And the movie certainly has its moments.

I’m not giving anything away by telling you the premise: a drone strike on a house in Nairobi is compromised by the presence of a young innocent girl (just watch the trailer). And it was the trailer that compelled me to go and see this film. In some respects this is a perfectly realistic and believable recasting of Philippa Foot’s famous trolley thought experiment: would you sacrifice the life of 1 innocent man to save the lives of 4 others? In this case, do you sacrifice the life of 1 innocent girl to save the 80+ potential victims of a suicide bomber? Really, that’s it in a nutshell. You empathise with everyone in the so-called chain of command, but, in particular, with the young drone pilots, who must perform the actual kill, one of whom is a woman on her very first operation.

Like the military personnel (played by Rickman and Mirren) you get frustrated by the Public Service mentality of avoiding a decision for fear of yet-to-be-realised consequences. But what struck me was that the entire decision-making process was driven purely by legal and political considerations, not moral ones. I’ve never been in a war so I really can’t judge. The truth is that in a war one’s moral compass is changed, not least because you are trained to kill; something you’ve been taught never to do for your entire life. The other truth is that when one side escalates atrocities, the opposing side tends to escalate in turn. Concepts of right and wrong that seem so solid and dependable in civilian life can suddenly become slippery and even obsolete. I’ve never been there but I can imagine.

A few years back I wrote a post on drone warfare after reading an article (in the Weekend Australian) citing David Kilcullen, who opposed it, arguing that it would recruit terrorists. One of the many arguments that takes place in the movie is about winning the propaganda war. At the time, watching the scene, I thought: who cares? But by the end of the movie, I realised that collateral damage is always a propaganda win for the opponent. This is the biggest risk of drone warfare. There is another side to this as well. Someone once pointed out (no, I don’t remember who) that when one side of a conflict is technically superior to the other, the other side invariably uses tactics that are considered unethical by the superior side, but the inferior side knows that such tactics are its only advantage. This is the case in the so-called ‘War on Terror’, where the technological might of Western military power is thwarted by suicide-bomber attacks in public places.

In movies, it’s not difficult to create a character whom the audience roots for, and in this case, it’s the young girl. Alongside that is the imperative to stop terrorist attacks by ideologues whose stated aim is to eradicate Western political and educational norms in whichever way they can. The film makes it clear that the young girl represents the future that these ideologues oppose.

Monday 28 March 2016

Superheroes for adults – Batman v Superman: Dawn of Justice

This is not the first movie review I’ve written on this blog; not even the first about superheroes. I wrote a review of Watchmen in October 2009 – an exceptional movie in my view, based on an exceptional graphic novel by Alan Moore, which I have to confess I read some years after I saw the movie.

One really shouldn’t reference other reviewers when writing a review (an unwritten rule of reviewing) but Stephen Romei, writing in the Weekend Australian Review (26-27 Mar., 2016), makes the pertinent point that our superheroes have evolved over the best part of a century (the ones in this movie were all created pre-WW2). As someone who was born immediately post-WW2, I grew up with these heroes in the form in which they were born: comic books. Like many of my generation (including Romei, I suspect), I find they are embedded in my psyche, especially Superman.

Romei makes the point that he’s glad he didn’t take his 10-year-old son (so maybe not my generation) because the movie is long and the characters’ relationships are complex. But the truth is that when you see Lois in a bath you know this isn’t a movie for kids. And no, it’s not a gratuitous nude scene – it’s a very clever way of demonstrating her relationship with Clark without showing them in bed. Our superheroes have grown up – they have sex. It’s a bit like the point in your life when you realise your parents have been at it for at least as long as you’ve been alive. Bruce Wayne has someone in his bed as well, but we never meet her. In fact, she’s so unobtrusive that I now wonder if I imagined her.

This is a very noirish film, and not only in subtext. The first thing that struck me about this movie was the cinematography: it’s darkly lit, even the outdoor shots. But what makes this film worthy of a blog post is that it has a moral dimension that reflects the current world we live in. It’s about fear and trust and how we are manipulated by politicians and media. Our heroes are flawed, suffer doubt and have to deal with real moral dilemmas. All of these factors are dealt with a level of authenticity that we would not expect from a superhero movie. It’s also about being judged by association; very relevant in the current global environment.

One of the themes of this movie, which is spelt out in some of the dialogue, as well as in gestures, is that these heroes are effectively gods. Bryan Singer brought this home to us as well in Superman Returns (a movie that you either loved or hated; it’s one of my favourites, I confess). This is a point I’ve raised myself (when I discussed Watchmen): the superheroes are our ‘Greek Gods’. And like the Greek Gods of literature, they exhibit human traits, dabble in human affairs and even have human lovers. I am a storyteller by nature, and the whole point of storytelling is to be able to stretch our imaginations to worlds and beings that only exist in that realm. But storytelling only resonates with us when it deals with human affairs, not only of the heart, but of politics and moral crises.

Chris Nolan’s second Dark Knight movie is a case in point, where Heath Ledger’s Joker makes Christian Bale’s Batman become, albeit fleetingly, as morally compromised as he is. This is the lesson: do we have to become as bad as our enemies in order to defeat them? Consider the Republicans’ current leading contender for the White House saying on national television that in order to defeat ISIS we need to attack their families. Cringeworthy doesn’t cover it.

And this movie, in its own way, challenges our prejudices, our inherent distrust in anyone who is ‘not one of us’, especially when we can associate them with atrocities occurring in remote locations and on our doorstep. We are tribal – it’s our strength and our downfall. And this fear and mistrust is manipulated blatantly (in the movie) which is why it is relevant and meaningful to the present day. Science fiction stories, always set in the future, always have something relevant to say about the time in which they are written.

And this brings me to the introduction of Wonder Woman, who has very little screen time, yet promises much for the future. I have a particular interest in her character, because she influenced one of my own creations, albeit subconsciously (I wasn’t aware of the obvious references until after I’d written it). I have to confess I was worried that she would come across as a lightweight, but Gal Gadot gives the role the gravitas it deserves. Gadot is a former Miss Israel, and the fact that she’s served in the military is maybe why she convinces us that she is a genuine warrior and not just someone who looks good in tight-fitting clothes.

Remember that Sean Connery was a Mr Universe contender before he became the first and (50 years later) still the most iconic James Bond. But the reason for her relevance is that female superheroes have historically been in short supply, and there is a sense that their time has come. Looking on the Internet, the biggest concern seemed to be whether her boobs were big enough. And, in fact, a radio interviewer asked her that very question. She pointed out that the mythical Amazons only had one breast, which may have made the role ‘problematic if one really wanted authenticity’. (I remember being told that as a kid: that they cut off their left breast so they could draw and release a bow string. It seemed plausible to me then and it sounds plausible to me now.) That slightly irrelevant point aside, the original Wonder Woman was based on Greek mythology; she is Hellenic, so possibly has more in common with the Greek Gods than any other 20th Century fictional creation. Anyway, I think Gadot is perfect for the role, and I only hope the scriptwriters have done her justice in her own story.

Just one bit of trivia: there is a piece of dialogue by Alfred (played pitch-perfect by Jeremy Irons) that has been lifted straight out of Frank Miller’s The Dark Knight Returns (1986) of which I still have a copy. A subtle but respectful salute.

Monday 14 March 2016

Argument, reason and belief

Recently in New Scientist (5 March 2016) there was a review of a book, The Persuaders: The Hidden Industry that Wants to Change Your Mind by James Garvey (which I must read), tackling the issue of why argument by reason so often fails. I’ve experienced this first hand on this blog, which has led me to the conclusion that you can’t explain something to someone who doesn’t want it explained. The book referenced above is more about how propaganda works, apparently, but that is not what I wish to discuss.

In the same vein, I’ve recently watched a number of YouTube videos covering excerpts from debates and interviews with scientists on the apparent conflict between science and religion. I say apparent because not all scientists are atheists and not all theologians are creationists, yet that is the impression one often gets from watching these.

The scientists I’ve been watching include Richard Dawkins, Neil deGrasse Tyson, Bill Nye (aka the science guy) and Michio Kaku. I think Tyson presents the best arguments (from the small sample I watched) but Michio Kaku comes closest to presenting my own philosophical point of view. In one debate (see links at bottom) he says the ‘God question’ is ‘undecidable’ and predicts that, unlike many scientific questions of today, the God question will be no further advanced in 100 years’ time than it is in the current debate.

The issue, from my perspective, is that science and religion deal with completely different things. Science is an epistemology – it’s a study of the natural world in all its manifestations. Religion is something deeply personal and totally subjective, and that includes God. God is something people find inside themselves which is why said God has so many different personalities and prejudices depending on who the believer is. I’ve argued this before, so I won’t repeat myself.

At least 2 of the scientists I reference above (Dawkins and Tyson) point out something I’ve said myself: once you bring God into epistemology to explain some phenomenon that science can’t currently explain, you are saying we have come to the end of science. History has revealed many times over that something inexplicable in the past becomes explicable in the future. As I’ve said more than once on this blog: only people from the future can tell us how ignorant we are in the present. Tyson made the point that the aptly named God-of-the-Gaps is actually a representation of our ignorance – a point I’ve made myself.

This does not mean that God does not exist; it means that God can’t help us with our science. People who argue that science can be replaced with Scripture are effectively arguing that science should be replaced by ignorance. The Old Testament was written by people who wanted to tell a story of their origins and it evolved into a text to scare people into believing that they are born intrinsically evil. At least that’s how it was presented to me as a child.

Of all the videos I watched, the most telling was an excerpt from a debate between Bill Nye (the science guy) and Ken Ham (the architect of the Creation Museum in Petersburg, Kentucky). Ham effectively argued that science can only be done in the present. So-called historical science, giving the age of the Earth or the Universe, or its origins, cannot be determined using the same methods we use for current scientific investigations. When asked if any evidence could change his beliefs, he said there was no such evidence and the Bible was the sole criterion for his beliefs.

And this segues back into my introduction: you cannot explain something to someone who doesn’t want it explained. When I argue with someone or even present an argument on this blog, I don’t expect to change people’s points of view to mine; I expect to make them think. Hence the little aphorism in the blog’s title block.

One of the points made in the New Scientist review, referenced in my opening, is that people rarely if ever change their point of view even when presented with indisputable evidence or a proof. This is true even among scientists. We all try to hang on to our pet theories for as long as possible until they are no longer tenable. I’m as guilty of this as anyone, even though I’m not a scientist.

One of the things that helps perpetuate our stubbornness is confirmation bias (mentioned by New Scientist) whereby we tend to only read or listen to people whom we agree with. We do this with politics all the time. But I have read contrary points of view, usually given to me by people who think I’m biased. I’ve even read C.S. Lewis. What I find myself doing in these instances is arguing in my head with the authors. To give another example, I once read a book by Colin McGinn (Basic Structures of Reality) that only affirmed for me that people who don’t understand science shouldn’t write books about it, yet I still read it and even wrote a review of it on Amazon UK.

There is a thing called philosophy and it’s been married to science for many centuries. Despite what some people claim (Richard Dawkins and Stephen Hawking to mention 2) I can’t see a divorce any time soon. To use a visual metaphor, our knowledge is like an island surrounded by a sea of unsolved mysteries. The island keeps expanding but the sea is infinite. The island is science and the shoreline is philosophy. To extend the metaphor, our pet theories reside on the beach.

Noson Yanofsky, a Professor of Computer and Information Science in New York, wrote an excellent book called The Outer Limits of Reason, in which he explains how we will never know everything – it’s cognitively and physically impossible. History has demonstrated that every generation believes we almost know everything that science can reveal, yet every revelation only reveals new mysteries.

This is a video of Michio Kaku and Richard Dawkins, amongst others, giving their views on science, God and religion.

This is a short video of Leonard Susskind explaining 2 types of agnosticism, one of which he seems to concur with.

Saturday 27 February 2016

In Nature, paradox is the norm, not the exception

I’ve just read Marcus Chown’s outstanding book for people wanting their science served without equations, Quantum Theory Cannot Hurt You. As the title suggests, half the book covers QM and half covers relativity. Chown is a regular contributor to New Scientist, and this book reflects his journalistic ease at discussing esoteric topics in physics. He says, right at the beginning, that he brings his own interpretation to these topics, but it’s an erudite and well-informed one.

Nowhere is Nature’s paradoxical character more apparent than in the constant speed of light, which was predicted by Maxwell’s equations, not empirical evidence. Of course this paradox was resolved by Einstein’s theories of relativity; both of them (the special theory and the general theory). Other paradoxes that seem built into the Universe are not so easily resolved, but I will come to them.

As Chown explicates, the constant speed of light has the same psychological effect as if it were infinite, and the Lorentz factor, the mathematical device used to describe relativistic effects, tends to infinity at its limit (the limit being the speed of light). If one could travel at the speed of light, a light beam would appear stationary and time would literally stand still. In fact, this is what Einstein imagined in one of his famous thought experiments that led him to his epiphanic theory.
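
To see what that divergence looks like numerically, here’s a minimal Python sketch (my own illustration, not from Chown’s book) of the Lorentz factor, with speeds expressed as fractions of c:

```python
# The Lorentz factor, gamma = 1/sqrt(1 - v^2/c^2), governs time dilation and
# length contraction; it diverges as v approaches c. Speeds are given as
# beta = v/c.
import math

def lorentz_factor(beta):
    return 1.0 / math.sqrt(1.0 - beta ** 2)

for beta in (0.1, 0.5, 0.9, 0.99, 0.9999):
    print(f"v = {beta}c -> gamma = {lorentz_factor(beta):.2f}")
# v = 0.1c    -> gamma = 1.01 (everyday speeds: effects negligible)
# v = 0.9999c -> gamma = 70.71; at v = c the factor is undefined (infinite)
```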

The paradox is easily demonstrated if one imagines a spacecraft travelling at very high speed, measured as a fraction of the speed of light. This craft transmits a laser both in front of it and behind it. Intuition tells us that a stationary observer ahead of the craft (say on Earth, with the craft approaching) should receive the signal at the speed of light plus the fraction at which the craft is travelling relative to Earth. On the other hand, if the spacecraft were travelling away from Earth at the same relative speed, one would expect to measure the laser at the speed of light minus the relative speed of the craft. However, contrary to intuition, the speed of light is exactly the same in both cases, and the same as measured by anyone on the spacecraft itself. The paradox is resolved by Einstein’s theory of special relativity, which tells us that whilst the speed of light is constant for both observers (one on the spacecraft and one on Earth), their measurements of time and length will not be the same, which is entirely counter-intuitive. This is not only revealed in the mathematics but has been demonstrated by comparing clocks in real spacecraft with clocks on Earth. In fact, the Sat-Nav you use in your car or on your phone takes relativistic effects into account to give you the accuracy you’ve become accustomed to. (Refer Addendum 2 below, which explains the role of the Doppler effect in detecting moving light sources.)
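
The arithmetic behind this can be made explicit with Einstein’s velocity-addition formula, w = (u + v)/(1 + uv/c²), which replaces simple addition in special relativity. A minimal sketch, again my own illustration with c set to 1:

```python
# Einstein's velocity-addition formula with c = 1: adding any speed to the
# speed of light still yields exactly the speed of light.
def add_velocities(u, v):
    return (u + v) / (1 + u * v)

print(add_velocities(1.0, 0.5))      # 1.0: laser fired forward from a 0.5c craft
print(add_velocities(1.0, -0.5))     # 1.0: laser fired from a receding craft
print(add_velocities(0.001, 0.002))  # ~0.003: everyday speeds add almost classically
```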

However, there are other paradoxes associated with relativity that have not been resolved, including time itself. Chown touches on this and so did I, not so long ago, in a post titled, What is now? According to relativity, there is no objective now, and Chown goes so far as to say: ‘”now” is a fictitious concept’ (quotation marks in the original). He quotes Einstein: “For us physicists, the distinction between past, present and future is only an illusion.” And Chown calls it ‘one of [Nature’s] great unsolved mysteries.’

Reading this, one may consider that Nature’s paradoxes are simply a consequence of the contradiction between our subjective perceptions and the reality that physics reveals. However, there are paradoxes within the physics itself. For example, we give an age to the Universe which does suggest that there is a universal “now”, and quantum entanglement (which Chown discusses separately) implies that simultaneity can occur over any distance in the Universe.

Quantum mechanics, of course, is so paradoxical that no one can agree on what it really means. Do we live in a multiverse, where every possibility predicted mathematically by QM exists, of which we experience only one? Or do things only exist when they are ‘observed’? Or is there a ‘hidden reality’ which the so-called real ‘classical’ world interacts with? I discussed this quite recently, so I will keep this discussion brief. If there is a multiverse (which many claim is the only ‘logical’ explanation) then they interfere with each other (as Chown points out) and some even cancel each other out completely, for every single quantum event. But another paradox, which goes to the heart of modern physics, is that quantum theory and Einstein’s general theory of relativity cannot be reconciled in their current forms. As Chown points out, String Theory is seen as the best bet but it requires 10 dimensions of which all but 3 cannot be detected with current technology.

Now I’m going to talk about something completely different, which everyone experiences, but which is also a paradox when analysed scientifically. I’m referring to free will, and like many of the topics I’ve touched on above, I discussed this recently as well. The latest issue of Philosophy Now (Issue 112, February/March 2016) has ‘Free Will’ as its theme. There is a very good editorial by Grant Bartley, who discusses on one page all the various schools of thought on this issue. He makes a point that I’ve made many times myself: ‘Why would consciousness evolve if it didn’t do anything?’ He also makes this statement: ‘So if there is free will, then there must be some way for a mind to direct the state of its brain.’ However, all the science tells us that the ‘mind’ is completely dependent on the ‘state of its brain’, so the reverse effect must be an illusion.

This interpretation would be consistent with the notion I mooted earlier that paradoxes are simply the consequence of our subjective experience contradicting the physical reality. However, as I pointed out in my above-referenced post, there are examples of the mind affecting states of the brain. In New Scientist (13 February 2016) Anil Ananthaswamy reviews Eliezer Sternberg’s Neurologic: The brain’s hidden rationale behind our irrational behaviour (which I haven’t read). According to Ananthaswamy, Sternberg discusses in depth the roles of the conscious and subconscious and concludes that the unconscious ‘can get things wrong’. He then asks the question: ‘Can the conscious right some of these wrongs? Can it influence the unconscious? Yes, says Sternberg.’ He gives the example of British athlete, Steve Backley, ‘imagining the perfect [javelin] throw over and over again’ even though a sprained ankle stopped him from practising, and he won silver at the 1996 Atlanta Olympics.

My point is that paradoxes are a regular feature of the Universe at many levels, from quantum mechanics to time to consciousness. In fact, consciousness is arguably the least understood phenomenon in the entire Universe, yet, without it, the Universe’s existence would be truly meaningless. Consciousness is subjectivity incarnate yet we attempt to explain it with complete objectivity. Does that make it a paradox or an illusion?


Addendum 1: Since writing this post, I came across this video of John Searle discussing the paradox of free will. He introduces the subject by saying that no progress has been made on this topic in the last 100 years. Unlike my argument, he discusses the apparent contradiction between free will and cause and effect.

Addendum 2: It should be pointed out that the Doppler effect allows an observer to know if a light source is moving towards them or away from them. In other words, there is change in frequency even though there isn't a change in velocity (of the light). It's for this reason that we know the Universe is expanding with galaxies moving away from us.

Tuesday 2 February 2016

Creation Science: a non sequitur

A friend of mine – someone whom I’d go to for help – lent me a ‘Creation’ magazine to prove that there are creationists who are real scientists. And, I have to admit, my friend was right: the magazine was full of contributors who had degrees in science, including one who has a PhD and honours and works at a run-of-the-mill university; but who wrote the following howler: ‘Cosmology is unscientific because you can’t do an experiment in cosmology.’ I wonder if said writer would be willing to say that to Australian Nobel Prize winner, Brian Schmidt. Only humans can be living contradictions.

Creation science is an epistemological contradiction – there’s no such thing – by definition. Science does not include magic – I can’t imagine anyone who would disagree with that, but I might be wrong. Replacing a scientific theory with supernaturally enhanced magic is anti-science, yet creationists call it science – as the Americans like to say: go figure.

The magazine was enlightening in that the sole criterion for these ‘scientists’ as to the validity of any scientific knowledge was whether or not it agreed with the Bible. If this criterion were applied literally, we would still believe that the Sun goes round the Earth, rather than the other way round. After all, the Book of Joshua tells us how God stopped the Sun moving in the sky. It doesn’t say that God stopped the Earth spinning, which is what he would have had to do.

One contributor to the magazine even allows for ‘evolution’ after ‘creation’, because God programmed ‘subroutines’ into DNA, but was quick to point out that this does ‘not contradict the Bible’. Interesting to note that DNA wouldn’t even have been discovered if all scientists were creationists (like the author).

Why do you think the ‘Dark Ages’ are called the dark ages? Because science, otherwise known as ‘natural philosophy’, was considered pagan, as the Greeks’ neo-Platonist philosophy upon which it was based was pagan. Someone once pointed out that Hypatia’s murder by a Christian mob (in 415 AD) signalled the start of the dark ages, which lasted until around 1200, when Fibonacci introduced the West to the Hindu-Arabic number system. In fact, it was Muslim scholars who kept that knowledge alive in the interim, otherwise it may well have been lost to us forever.

So science and Christianity have a long history of contention that goes back centuries before Copernicus, Galileo and Darwin. If anything, the gap has grown wider, not narrower; the two have only managed to co-exist by staying out of each other’s way.

There are many religious texts in the world, a part of our collective cultural and literary legacy, but none of them are scientific or mathematical texts, which also boast diverse cultural origins. It is an intellectual conceit (even deceit) to substitute religious teaching for scientifically gained knowledge. Of course scientifically gained knowledge is always changing, advancing, being overtaken, and is never finished. In fact, I would contend that science will never be complete, as history has demonstrated, so there will always be arguments for supernatural intervention, otherwise known as the ‘God-of-the-Gaps’. Godel’s Incompleteness Theorem implies that mathematics is a never-ending epistemological mine, and I believe the same goes for science.

Did I hear someone say: what about Intelligent Design (ID)? Well, it’s still supernatural intervention, isn’t it? Same scenario, different description.

Religion is not an epistemology, it’s a way of life. Whichever way you look at it, it’s completely subjective. Religion is part of your inner world, and that includes God. So the idea that the God you’ve found within yourself is also the Creator of the entire Universe is a non sequitur. Because everyone’s idea of God is unique to them.

Tuesday 19 January 2016

Is this the God equation?

Yes, this is a bit tongue-in-cheek, but like most things tongue-in-cheek it just might contain an element of truth. I’m not a cosmologist or even a physicist, so this is just me being playful yet serious in as much as anyone can be philosophically serious about the origins of Everything, otherwise known as the Universe.

Now I must make a qualification, lest people think I’m leading them down the garden path. When people think of ‘God’s equation’, they most likely think of some succinct equation or set of equations (like Maxwell’s equations) from which everything we know about the Universe can be derived mathematically. For many people this is a desired outcome, founded on the belief that one day we will have a TOE (Theory Of Everything) – itself a misnomer – which will incorporate all the known laws of the Universe in one succinct theory. Specifically, said theory will unite the Electromagnetic force, the so-called Weak force, the so-called Strong force and Gravity as all being derived from a common ‘field’. Personally, I think that’s a chimera, but I’d be happy to be proven wrong. Many physicists believe some version of String Theory or M Theory will eventually give us that goal. I should point out that the Weak force has already been united with the Electromagnetic force.

So what do I mean by the sobriquet, God’s equation? Last week I watched a lecture by Allan Adams as part of MIT Open Courseware (8.04, Spring 2013) titled Lecture 6: Time Evolution and the Schrodinger Equation, in which Adams made a number of pertinent points that led me to consider that perhaps Schrodinger’s Equation (SE) deserved such a title. Firstly, I need to point out that Adams himself makes no such claim, and I don’t expect many others would concur.

Many of you may already know that I wrote a post on Schrodinger’s Equation nearly 5 years ago and it has become, by far, the most popular post I’ve written. Of course Schrodinger’s Equation is not the last word in quantum mechanics – more like a starting point. By incorporating relativity we get Dirac’s equation, which predicted anti-matter – in fact, anti-matter is a direct consequence of relativity and SE. Schrodinger himself also had a go at a relativistic version (as did Oskar Klein and Walter Gordon, after whom the Klein-Gordon equation is named) but rejected it because it gave answers with negative energy. But Richard Feynman (and independently, Ernst Stuckelberg) pointed out that these were mathematically equivalent to ordinary particles travelling backwards in time. Backwards in time is not an impossibility in the quantum world, and Feynman even incorporated it into his famous QED (Quantum Electro-Dynamics), which won him a joint Nobel Prize with Julian Schwinger and Sin-Itiro Tomonaga in 1965. QED, by the way, incorporates SE (just read Feynman’s book on the subject).
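
The ‘negative energy’ problem falls straight out of the relativistic energy-momentum relation, E² = (pc)² + (mc²)², which is quadratic in E and so always has two roots. A minimal sketch of that observation (my own, in natural units where c = 1):

```python
# E^2 = (pc)^2 + (mc^2)^2 is quadratic in E, so every momentum p admits a
# negative-energy root as well as a positive one. Feynman and Stuckelberg
# interpreted the negative branch as antiparticles travelling backwards in
# time. Natural units: c = 1.
import math

def energy_roots(p, m):
    E = math.sqrt(p ** 2 + m ** 2)
    return E, -E

print(energy_roots(p=1.0, m=1.0))  # (1.4142..., -1.4142...)
```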

This allows me to segue back into Adams’ lecture, which, as the title suggests, discusses the role of time in SE and quantum mechanics generally. You see ‘time’ is a bit of an enigma in QM.

Adams’ lecture, in his own words, is to provide a ‘grounding’ so he doesn’t go into details (mathematically) and this suited me. Nevertheless, he throws terms around like eigenstates, operators and wave functions, so familiarity with these terms would be essential to following him. Of those terms, the only one I will use is wave function, because it is the key to SE and arguably the key to all of QM.

Right at the start of the lecture (his Point 1), Adams makes the salient point that the wave function, Ψ, contains ‘everything you need to know about the system’. Only a little further into his lecture (his Point 6) he asserts that SE is ‘not derived, it’s posited’. Yet it’s completely ‘deterministic’ and experimentally accurate. Now (as discussed by some of the students in the comments) to say it’s ‘deterministic’ is a touch misleading, given that it only gives us probabilities, which are empirically accurate (more on that later). But it was a remarkable feat: Schrodinger formulated a mathematical expression based on a hunch that all quantum objects, be they light or matter, should obey a wave function.

But it’s at the 50-55min stage (of his 1hr 22min lecture) that Adams delivers his most salient point when he explains so-called ‘stationary states’. Basically, they’re called stationary states because time remains invariant (doesn’t change) for SE which is what gives us ‘superposition’. As Adams points out, the only thing that changes in time in SE is the phase of the wave function, which allows us to derive the probability of finding the particle in ‘classical’ space and time. Classical space and time is the real physical world that we are all familiar with. Now this is what QM is all about, so I will elaborate.

Adams effectively confirmed for me something I had already deduced: superposition (the weird QM property that something can exist simultaneously in various positions prior to being ‘observed’) is a direct consequence of time being invariant or existing ‘outside’ of QM (which is how it’s usually explained). Now Adams makes the specific point that these ‘stationary states’ only exist in QM and never exist in the ‘Real’ world that we all experience. We never experience superposition in ‘classical physics’ (which is the scientific pseudonym for ‘real world’). This highlights for me that QM and the physical world are complementary, not just versions of each other. And this is incorporated in SE, because, as Adams shows on his blackboard, superposition can be derived from SE, and when we make a measurement or observation, superposition and SE both disappear. In other words, the quantum state and the classical state do not co-exist: either you have a wave function in Hilbert space or you have a physical interaction called a ‘wave collapse’ or, as Adams prefers to call it, ‘decoherence’. (Hilbert space is a theoretical space of possibly infinite dimensions where the wave function theoretically exists in its superpositional manifestation.)
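
To make the point concrete, here’s a minimal numerical sketch (my own illustration, not from Adams’ lecture) using the first two energy eigenstates of a particle in a box: the eigenstate’s probability density is frozen in time because only its phase rotates, whereas a superposition of two eigenstates has a density that visibly evolves.

```python
# Stationary states: time evolution multiplies an energy eigenstate by the
# phase exp(-i*E*t/hbar), so its probability density psi * conj(psi) never
# changes. A superposition of two eigenstates does change, because the two
# phases rotate at different rates. Particle in a box of width 1; hbar = m = 1.
import numpy as np

x = np.linspace(0, 1, 201)
psi1 = np.sqrt(2) * np.sin(np.pi * x)      # ground state, energy E1
psi2 = np.sqrt(2) * np.sin(2 * np.pi * x)  # first excited state, E2 = 4 * E1
E1, E2 = np.pi ** 2 / 2, 2 * np.pi ** 2

def evolve(psi, E, t):
    return psi * np.exp(-1j * E * t)

for t in (0.0, 0.3):
    eigen = evolve(psi1, E1, t)
    combo = (evolve(psi1, E1, t) + evolve(psi2, E2, t)) / np.sqrt(2)
    print(f"t = {t}: peak density (eigenstate) = {np.abs(eigen).max() ** 2:.4f}, "
          f"(superposition) = {np.abs(combo).max() ** 2:.4f}")
# The eigenstate's density is identical at both times; the superposition's is not.
```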

Adams calls the so-called Copenhagen interpretation of QM the “Cop Out” interpretation which he wrote on the board and underlined. He prefers ‘decoherence’ which is how he describes the interaction of the QM wave function with the physical world. My own view is that the QM wave function represents all the future possibilities, only one of which will be realised. Therefore the wave function is a description of the future yet to exist, except as probabilities; hence the God equation.

As I’ve expounded in previous posts, the most popular interpretation at present seems to be the so-called ‘many worlds’ interpretation where all superpositional states exist in parallel universes. The most vigorous advocate of this view is David Deutsch, who wrote about it in a not-so-recent issue of New Scientist (3 Oct 2015, pp.30-31). I also reviewed his book, Fabric of Reality, in September 2012. In New Scientist, Deutsch advocated for a non-probabilistic version of QM, because he knows that reconciling the many worlds interpretation with probabilities is troublesome, especially if there are an infinite number of them. However, without probabilities, SE becomes totally ineffective in making predictions about the real world. It was Max Born who postulated the ingenious innovation of squaring the modulus of the wave function (actually multiplying it with its complex conjugate, as I explain here) which provides the probabilities that make SE relevant to the physical world.
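
Born’s trick is a one-liner, though with a subtlety: for a complex amplitude, ‘squaring the modulus’ means multiplying by the complex conjugate, not naively squaring. A throwaway illustration with a made-up amplitude:

```python
# Born rule on a single complex amplitude: probability = psi * conj(psi),
# which equals the squared modulus |psi|^2 and is always real and non-negative.
psi = 0.6 + 0.8j
print(psi * psi.conjugate())  # (1+0j), up to floating-point rounding
print(abs(psi) ** 2)          # ~1.0: same number, as the squared modulus
print(psi ** 2)               # (-0.28+0.96j): naive squaring gives a complex number
```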

As I’ve explained elsewhere, the world is fundamentally indeterministic due to asymmetries in time caused by both QM and chaos theory. Events become irreversible after QM decoherence, and also in chaos theory, because the initial conditions are indeterminable. Now Deutsch argues that chaos theory can be explained by his many worlds view of QM, and mathematician Ian Stewart suggests that maybe QM can be explained by chaos theory, as I expound here. Both these men are intellectual giants compared to me, yet I think they’re both wrong. As I’ve explained above, I think that the quantum world and the classical world are complementary. The logical extension of Deutsch’s view, by his own admission, requires the elimination of probabilities, making SE ineffectual. And Stewart’s circuitous argument to explain QM probabilities with chaos theory eliminates superposition, for which we have indirect empirical evidence (using entanglement, which is well researched). Actually, I think superposition is a consequence of the wave function effectively being everywhere at once, or ‘permeat[ing] all of space’ (to quote Richard Elwes in MATHS 1001).

If I’m right in stating that QM and classical physics are complementary (and Adams seems to make the same point, albeit not so explicitly) then a TOE may be impossible. In other words, I don't think classical physics is a special case of QM, which is the current orthodoxy among physicists.


Addendum 1: Since writing this, I've come to the conclusion that QM and, therefore, the wave function describe the future - an idea endorsed by none other than Freeman Dyson, who was instrumental in formulating QED with Richard Feynman.

Addendum 2: I've amended the conclusion in my 2nd last paragraph, discussing Deutsch's and Stewart's respective 'theories', and mentioning entanglement in passing. Schrodinger once said (in a missive to Einstein, from memory) that entanglement is what QM is all about. Entanglement effectively challenges Einstein's conclusion that simultaneity is a non sequitur according to his special theory of relativity (and he's right, provided there's no causal relationship between the events). I contend that neither Deutsch nor Stewart can resolve entanglement with their respective 'alternative' theories, and neither of them addresses it from what I've read.

Tuesday 12 January 2016

How to write a story so it reads like a movie in your head

I’ve written about writing a few times now, including Writing’s 3 Essential Skills (Jul. 2013) and How to Create an Imaginary, Believable World (Aug. 2010), the last one being a particularly popular post. Also, I taught a creative writing course in 2009 and have given a couple of talks on the subject, but never intentionally to provide advice on how to make a story read like a movie in your head.

This post has arisen from a conversation I had when I realised I had effectively taught myself how to do this. It’s not something that I deliberately set out to do but I believe I achieved it inadvertently and comments from some readers appear to confirm this. At YABooksCentral, a teenage reviewer has made the point specifically, and many others have said that my book (Elvene) would make a good movie, including a filmmaker. Many have said that they ‘could see everything’ in their mind’s eye.

Very early in my writing career (though it’s never been my day job) I took some screenwriting courses and even wrote a screenplay. I found that this subconsciously influenced my prose writing in ways that I never foresaw and that I will now explain. The formatting of a screenplay doesn’t lend itself to fluidity, with separate headings for every scene and dialogue in blocks interspersed with occasional brief descriptive passages. Yet a well written screenplay lets you see the movie in your mind’s eye and you should write it as you’d imagine it appearing on a screen. However, contrary to what you might think, this is not the way to write a novel. Do not write a novel as if watching a movie. Have I confused you? Well, bear this in mind and hopefully it will all make sense before the end.

Significantly, a screenplay needs to be written in ‘real time’, which means descriptions are minimalistic and exposition non-existent (although screenwriters routinely smuggle exposition into their dialogue). Also, all the characterisation is in the dialogue and the action – you don’t need physical descriptions of a character, including their attire, unless it’s significant; just gender, generic age and ethnicity (if it’s important). It was this minimalistic approach that I subconsciously imported into my prose fiction.

There is one major difference between writing a screenplay and writing a novel and the two subsequent methods require different states of mind. In writing a screenplay you can only write what is seen and heard on the screen, whereas a novel can be written entirely (though not necessarily) from inside a character’s head. I hope this clarifies the point I made earlier. Now, as someone once pointed out to me (fellow blogger, Eli Horowitz) movies can take you into a character’s head through voiceover, flashbacks and dream sequences. But, even so, the screenplay would only record what is seen and heard on the screen, and these are exceptions, not the norm. Whereas, in a novel, getting inside a character’s head is the norm.

To finally address the question implicit in my heading, there are really only 2 ‘tricks’, for want of a better term: write the story in real time and always from some character’s point of view. Even description can be given through a character’s eyes, and the reader subconsciously becomes an actor. By inhabiting a character’s mind, the reader becomes fully immersed in the story.

Now I need to say something about scenes, because, contrary to popular belief, scenes are the smallest component of a story, not words or sentences or paragraphs. It’s best to think of the words on the page like the notes on a musical score. When you listen to a piece of music, the written score is irrelevant, and, even if you read the score, you wouldn’t hear the music anyway (unless, perhaps, if you’re a musician or a composer). Similarly, the story takes place in the reader’s mind where the words on the page conjure up images and emotions without conscious effort.

In a screenplay a scene has a specific definition, defined by a change in location or time. I use the same definition when writing prose. There are subtle methods for expanding and contracting time psychologically in a movie, and these can also be applied to prose fiction. I’ve made the point before that the language of story is the language of dreams, and in dreams, as in stories, sudden changes in location and time are not aberrant. In fact, I would argue that if we didn’t dream, stories wouldn’t work because our minds would continuously and subconsciously struggle with the logic.

Tuesday 5 January 2016

Free will revisited

I’ve written quite a lot on this in the past, so one may wonder what I could add.

I’ve just read Mark Balaguer’s book, Free Will, which I won when Philosophy Now published my answer to their Question of the Month in their last issue (No 111, December 2015). It’s the fourth time I’ve won a book from them (out of 5 submissions).

It’s a well written book, not overly long or over-technical in a philosophical sense, so very readable whilst being well argued. Balaguer makes it clear from the outset where he stands on this issue, by continually referring to those who argue against free will as ‘the enemies of free will’. Whilst this makes him sound combative, the tone of his arguments is measured and not antagonistic. In his conclusion, he makes the important distinction that in ‘blocking’ arguments against free will, he’s not proving that free will exists.

He makes the distinction between what he calls Hume-style free will and Non-predetermined free will (NDP), a term I believe he’s coined himself. Hume-style free will is otherwise known as ‘compatibilism’, which means it’s compatible with determinism. In other words, even if everything in the world is deterministic from the Big Bang onwards, it doesn’t rule out you having free will. I know it sounds like a contradiction, but I think it’s to do with the fact that a completely deterministic universe doesn’t conflict with the subjective sense we all have of having free will. As I’ve expressed in numerous posts on this blog, I think there is ample evidence that the completely deterministic universe is a furphy, so compatibilism is not relevant as far as I’m concerned.

Balaguer also coins another term, ‘torn decision’, which he effectively uses as a litmus test for free will. In a glossary in the back he gives a definition which I’ve truncated:

A torn decision is a conscious decision in which you have multiple options and you’re torn as to which option is best.

He gives the example of choosing between chocolate or strawberry flavoured ice cream and not making a decision until you’re forced to, so you make it while you’re still ‘torn’. This is the example he keeps coming back to throughout the book.

In recent times, experiments in neuroscience have provided what some people believe are ‘slam-dunk’ arguments against free will, because scientists have been able to predict with 60% accuracy what decision a subject will make, seconds before they make it, simply by measuring neuron activity in certain parts of the brain. Balaguer provides the most cogent arguments I’ve come across challenging these contentions. In particular, the Haynes studies showed neuron activity up to 10 seconds prior to the conscious decision. Balaguer points out that the neuron activity in these studies occurs in the PC and BA10 areas of the brain, which are associated with the ‘generation of plans’ and the ‘storage of plans’ respectively. He makes the point (in greater elaboration than I do here) that we should not be surprised if we subconsciously use our ‘planning’ areas of the brain whilst trying to make ‘torn decisions’. The other experiments, known as the Libet studies (dating from the 1960s), showed neuron activity half a second prior to conscious decision-making, which was termed the ‘readiness potential’. Balaguer argues that there is ‘no evidence’ that the readiness potential causes the decision. Even so, it could be argued that, like the Haynes studies, it is subconscious activity happening prior to the conscious decision.

It is well known (as Balaguer explicates) that much of our thinking is subconscious. We all have the experience of solving a problem subconsciously, so that the answer comes to us spontaneously when we don’t expect it. And anyone who has pursued some artistic endeavour (like writing fiction) knows that a lot of it is subconscious, so that the story and its characters appear on the page with seemingly divine-like spontaneity.

Backtracking to so-called Hume-style free will, it does have relevance if one considers that our ‘wants’ - what we wish to do - are determined by our desires and needs. We assume that most of the animal kingdom behaves on this principle. Few people (including Balaguer) discuss other sentient creatures when they discuss free will, yet I’ve long believed that consciousness and free will go hand-in-hand. In other words, I really can’t see the point of consciousness without free will. If everything is determined subconsciously, without the need to think, then why have we evolved to think?

But humans take thinking to a new level compared to every other species on the planet, so that we introspect and cogitate and reason and internally debate our way to many a decision.

Back in February 2009, I reviewed Douglas Hofstadter’s Pulitzer Prize-winning book, Godel, Escher, Bach, where, among other topics, I discussed consciousness, as that’s one of the themes of his book. Hofstadter coins the term ‘strange loop’. This is what I wrote back then:

By strange loop, Hofstadter means that we can effectively look at all the levels of our thinking except the ground level, which is our neurons. In between we have symbols, which is language, which we can discuss and analyse in a dispassionate way, just like I’m doing now. I can talk about my own thoughts and ideas as if they weren’t mine at all. Consciousness, in Hofstadter’s model (for want of a better word) is the top level, and neurons are the hardware level. In between we have the software (symbols) which is effectively language.

I was quick to point out that ‘software’ in this context is a metaphor – I don’t believe that language is really software, even though we ‘download’ it from generation to generation and it is indispensable to human reasoning, which we call thinking.

The point I’d make is that this is a 2-way process: the neurons are essential to thoughts, yet our thoughts, I expect, can affect neurons. I believe there is evidence that we can and do rewire our brains simply by exercising our mental faculties, even in later years, and surely exercising consciously is the very definition of will.

Tuesday 15 December 2015

The battle for the future of Islam

There are many works of fiction featuring battles between ‘Good’ and ‘Evil’, yet it would not be distorting the truth to say that we are witnessing one now, though I think it is largely misconstrued by those of us who are on the sidelines. We see it as a conflict between Islam and the West, when it’s actually within Islam itself. This came home to me when I recently saw the biographical movie, He Named Me Malala (pronounced Ma-la-li, by the way).

Malala is well known as the 14-year-old Pakistani schoolgirl, shot in the head on a school bus by the Taliban for her outspoken views on education for girls in Pakistan. Now 18 years old (when the film was made), she has since won the Nobel Peace Prize and spoken at the United Nations, as well as having audiences with world leaders like Barack Obama. In a recent interview with Emma Watson (on Emma’s Facebook page) she appeared much wiser than her years. In the movie, amongst her family, she behaves like an ordinary teenager with ‘crushes’ on famous sports stars. In effect, her personal battle with the Taliban represents in microcosm a much wider battle between past and future that is occurring on the world stage within Islam. A battle for the hearts and minds of Muslims all over the world.

IS or ISIS or Daesh has arisen out of conflicts between Shiites and Sunnis in both Iraq and Syria, but the declaration of a Caliphate carries a much more serious, even sinister, connotation, because its followers believe they are fulfilling a prophecy that will only be resolved with the biblical end of the world. I’m not an Islamic scholar, so I’m quoting Audrey Borowski, currently doing a PhD at the University of London, who holds an MSt (Masters degree) in Islamic Studies from Oxford University. She asserts: ‘…one of the Prophet Muhammad’s earliest hadith (sayings) locates the fateful showdown between Christians and Muslims that heralds the apocalypse in the city of Dabiq in Syria.’

“The Hour will not be established until the Romans (Christians) land at Dabiq. Then an army from Medina of the best people on the earth at that time… will fight them.”

She wrote an article of some length in Philosophy Now (Issue 111, Dec. 2015/Jan. 2016) titled Al Qaeda and ISIS: From Revolution to Apocalypse.

The point is that if someone believes they are in a fight for the end of the world, then destroying entire populations and cities is not off the table. They could resort to any tactic, like contaminating the water supplies of entire cities or destroying food crops on a large scale. As I alluded to in the introduction, this apocalyptic ideology, in a fictional context, represents a classic contest between good and evil. From where I (and most people reading this blog) stand, anyone intent on destroying civilization as we know it would be considered the ultimate evil.

What is most difficult for us to comprehend is that the perpetrators, the people ‘on the other side’ would see the roles reversed. Earlier this year (April 2015), I wrote a post titled Morality is totally in the eye of the beholder, where I explained how two different cultures in the same country (India) could have completely opposing views concerning a crime against a young woman, who was raped and murdered on a bus returning from seeing a movie with her boyfriend. One view was that the girl was the victim of a crime and the other view was that the girl was responsible for her own fate.

Many people have trouble believing that otherwise ordinary people who commit evil acts in the form of atrocities would see themselves as not being evil. We have an enormous capacity to justify to ourselves the most heinous acts, and nowhere is this more evident than when one believes one is performing the ‘Will of God’. This is certainly the case with IS and their followers.

Unfortunately, this has led to a backlash in the West against all Muslims. In particular, we see both in social media and mainstream media, and even amongst mainstream politicians, a sentiment that Islam is fundamentally flawed and needs to be reformed. It seems to me that they are unaware that there is already a battle happening within Islam, where militant bodies like IS and Boko Haram and the Taliban represent the worst, and a young schoolgirl from Pakistan represents the best.

Ayaan Hirsi Ali (whom I wrote about in March 2011) said, when she was in Australia many years ago, that Islam was not compatible with a secular society, which is certainly true if Islamists wish to establish a religious-based government. There is a group, Hizb ut-Tahrir, which is banned in many Western countries, though not the UK or Australia, whose stated aim is to form a caliphate and whose political agenda, including the introduction of Sharia law, would clearly conflict with Australian law. But the truth is that there are many Muslims living active and productive lives in Australia, whilst still practising their religion. A secular society is not an atheistic society, yet it is religiously neutral by definition. In other words, there is room for variety in religious practice and that is what we see. Extremists of any religious persuasion are generally not well received in a pluralist multicultural society, yet that is the fear that is driving the debate in many secular societies.

Over a year ago (Aug 2014) I wrote a post titled Don’t judge all Muslims the same, based on another article I read in Philosophy Now (Issue 104, May/Jun 2014) by Terri Murray (Master of Theology, Heythrop College, London), who made a very salient point differentiating cultural values and ideals from individual ones. In particular, she asserted that an individual’s rights override the so-called rights of a culture or a community. Therefore, misogynistic practices like female genital mutilation, honour killings and child marriage, all of which are illegal in Australia, are abuses of individual rights that may be condoned, even considered normal practice, in some cultures.

Getting back to my original subject matter, like the case of the Indian girl (a medical graduate) who was murdered for going on a date, this really is a battle between past and future. IS and the Taliban and their variant Islamic ideologies represent a desire to regain a past that has no relevance in the 21st Century – it’s mediaeval, not only in concept but also in practice. One of the consequences of the Internet is that it has become a vehicle for both sides. So young women in far off countries are learning that there is another world where education can lead to a better life. And this is the key: education of women, as Malala has brought to the world’s attention, is the only true way forward. It’s curious that women are what these regimes seem to fear most, including IS, whose greatest fear is to be killed by a female Kurdish warrior, because then they won’t get to Paradise.

Tuesday 1 December 2015

Why narcissists are a danger to themselves and others

I expect everyone has met a narcissist, though, like all personality disorders, there are degrees of severity, from the generally harmless egotistical know-it-all to the megalomaniac who takes control of an entire nation. In between those extremes is the person who somehow self-destructs while claiming it’s everyone else’s fault. They’re the ones who are captain of the ship and totally in control, even when it runs aground, but suddenly claim it’s no longer their fault. I’m talking metaphorically, but this happened quite literally and spectacularly a couple of years back, as most of you will remember.

The major problem with narcissists is not their self-aggrandisement and over-inflated opinion of their own worth, but their distorted view of reality.

Narcissists have a tendency to self-destruct, not on purpose, but because their view of reality, based on their overblown sense of self-justification, becomes so distorted that they lose perspective and then control. Everyone around them can see the truth, but they are generally powerless to intervene.

They are particularly disastrous in politics but are likely to rise to power when things are going badly, because they are charismatic and their self-belief becomes contagious. Someone said (I don’t know who) that when things are going badly society turns on itself – they were referring to the European witch hunts, which coincided with economic and environmental tribulations. The recent GFC created ripe conditions for charismatic leaders to feed a population’s paranoia and promise miracle solutions with no basis in rationality. Look at what happened in Europe following the Great Depression of the 20th Century: World War 2. And who started it? Probably the most famous narcissist in recent history. The key element such leaders have in common with the aforementioned witch-hunters is that they can find someone to blame and, frighteningly, they are believed.

Narcissists make excellent villains, as I’ve demonstrated in my own fiction. But we must be careful of whom we demonise, lest we become as spiteful and destructive as those we wish not to emulate. In fact, we should not take them seriously; then all their self-importance and self-aggrandisement becomes comical. Unfortunately, they tend to divide society between those who see themselves as victims and those who see the purported culprits as the victims. In other words, they divide nations when they should be uniting them.

But there are exceptions. Having read Steve Jobs’ biography (by Walter Isaacson) I would say he had narcissistic tendencies, yet he was eminently successful. Many people have commented on his ‘reality-distortion field’, which I’ve already argued is a narcissistic trait, and he could be very egotistical at times, according to anecdotal evidence. Yet he could form deep relationships despite being very contrary in his dealings with his colleagues – building them up one moment and tearing them down the next. But Jobs was driven to strive for perfection, both aesthetically and functionally, and he sought out people who had the same aspiration. He was, of course, extraordinarily charismatic, intelligent and somewhat eccentric. He was a Buddhist, which may have tempered his narcissistic tendencies; but I’m just speculating – I never met him or worked with him – I just used and admired his products like many others. Anyway, I would cite Jobs as an example of a narcissist who broke the mould – he didn’t self-destruct, quite the opposite, in fact.


Addendum: When I wrote this I had recently read Isaacson's biography of Steve Jobs, but I've since seen a documentary, and he came perilously close to self-destruction. He was investigated for fraud over granting his employees backdated stock options (I think that was the charge, from memory). Anyway, according to the documentary, he only avoided prison because it would have destroyed the share price of Apple, which was the biggest company on the share market at the time. I don't know how true this is, but it rings true.

Tuesday 24 November 2015

The Centenary of Einstein’s General Theory of Relativity

This month (November 2015) marks 100 years since Albert Einstein published his milestone paper on the General Theory of Relativity, which not only eclipsed Newton’s equally revolutionary Theory of Universal Gravitation, but is still the cornerstone of every cosmological theory that has been developed and disseminated since.

It needs to be pointed out that Einstein’s ‘annus mirabilis’ (miraculous year), as it’s been called, occurred 10 years earlier in 1905, when he published 3 groundbreaking papers that elevated him from a patent clerk in Bern to a candidate for the Nobel Prize (eventually realised, of course). The 3 papers were his Special Theory of Relativity; his explanation of the photo-electric effect using the then-new concept of light quanta (later named photons); and a statistical analysis of Brownian motion, which effectively proved that molecules and atoms really exist and were not just a convenient theoretical concept.

Given the anniversary, it seemed appropriate that I should write something on the topic, despite my limited knowledge and despite the plethora of books that have been published to recognise the feat. The best I’ve read is The Road to Relativity: The History and Meaning of Einstein’s “The Foundation of General Relativity” (the original title of his paper) by Hanoch Gutfreund and Jurgen Renn. They have managed to include an annotated copy of Einstein’s original handwritten manuscript with a page-by-page exposition. But more than that, they take us on Einstein’s mental journey and, in particular, show how he found the mathematical language to portray the intuitive ideas in his head, and yet work within the constraints he believed were necessary for the theory to work.

The constraints were not inconsiderable and include: the equivalence of inertial and gravitational mass; the conservation of energy and momentum under transformation between frames of reference both in rotational and linear motion; and the ability to reduce his theory mathematically to Newton’s theory when relativistic effects were negligible.

Einstein’s epiphany, that led him down the particular path he took, was the realisation that one experienced no force when one was in free fall, contrary to Newton’s theory and contrary to our belief that gravity is a force. Free fall subjectively feels no different to being in orbit around a planet. The aptly named ‘vomit comet’ is an aeroplane that goes into free fall in order to create the momentary sense of weightlessness that one would experience in space.

Einstein learnt from his study of Maxwell’s equations for electromagnetic radiation that mathematics could sometimes provide a counter-intuitive insight, like the constant speed of light.

In fact, Einstein had to learn new mathematics (for him) and engaged the help of his close friend, Marcel Grossmann, who led him through the technical travails of tensor calculus and Riemannian geometry. It would seem, from what I can understand of his mental journey, that it was the mathematics, as much as any other insight, that led Einstein to realise that space-time is curved and not Euclidean, as we all generally believe. To quote Gutfreund and Renn:

[Einstein] realised that the four-dimensional spacetime of general relativity no longer fitted the framework of Euclidean geometry… The geometrization of general relativity and the understanding of gravity as being due to the curvature of spacetime is a result of the further development and not a presupposition of Einstein’s formulation of the theory.

By Euclidean, one means space is flat and light travels in perfectly straight lines. One of the confirmations of Einstein’s theory was his prediction that light passing close to the Sun would be literally bent, so that a background star would appear to shift position as the Sun approached the same line of sight. This could only be seen during a solar eclipse, and was duly observed by Arthur Eddington in 1919 from the island of Principe, off the west coast of Africa.

Einstein’s formulations led him to postulate that it’s the geometry of space that gives us gravity and the geometry, which is curved, is caused by massive objects. In other words, it’s mass that curves space and it’s the curvature of space that causes mass to move, as John Wheeler famously and succinctly expounded.

It may sound back-to-front, but, for me, Einstein’s Special Theory of Relativity only makes sense in the context of his General Theory, even though they were formulated in the reverse order. To understand what I’m talking about, I need to explain geodesics.

When you fly long distance on a plane, the path projected onto a flat map looks curved. You may have noticed this when they show the path on a screen in the cabin while you’re in flight. The point is that when you fly long distance you are travelling over a curved surface, because, obviously, the Earth is a sphere, and the shortest distance between 2 points (cities) lies on what’s called a great circle. A great circle through 2 points is the largest circle possible on the sphere: its centre is the centre of the Earth itself. Now, I know that sounds paradoxical, but the largest circle provides the shortest distance over the surface (we are not talking about tunnels) that one can travel, and there is only one such circle, therefore there is one shortest path. This shortest path is called the geodesic that connects those 2 points.
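
For the programmatically inclined, here’s a minimal sketch of the idea: the haversine formula gives the great-circle (geodesic) distance between 2 points on a sphere. The function name, coordinates and Earth radius below are my own illustrative choices.

```python
import math

def great_circle_distance(lat1, lon1, lat2, lon2, radius=6371.0):
    # Haversine formula: shortest surface distance between two points
    # on a sphere; radius defaults to Earth's mean radius in km.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2)**2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2)**2
    return 2 * radius * math.asin(math.sqrt(a))

# Sydney to London: roughly 17,000 km along the geodesic, which is
# why the flight path looks curved when projected onto a flat map.
print(round(great_circle_distance(-33.87, 151.21, 51.51, -0.13)))
```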

A geodesic in gravitation is the shortest distance in spacetime between 2 points and that is what one follows when one is in free fall. At the risk of information overload, I’m going to introduce another concept which is essential for understanding the physics of a geodesic in gravity.

One of the most fundamental principles discovered in physics is the principle of least action (formulated mathematically as a Lagrangian, which is the difference between kinetic and potential energy). The most commonly experienced example would be refraction of light through glass or water, because light travels at different velocities in air, water and glass (slower through glass or water than air). The extremely gifted 17th Century amateur mathematician, Pierre de Fermat (actually a lawyer), conjectured that light travels the path of least time (not necessarily the shortest distance), and the refractive index (Snell’s law) can be deduced mathematically from this principle. In the 20th Century, Richard Feynman developed his path integral method of quantum mechanics from the least action principle, and, in effect, confirmed Fermat’s principle.
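
Here’s a rough numerical sketch of Fermat’s principle (my own toy geometry, using water’s refractive index of about 1.33): scan for the crossing point on an air/water interface that minimises travel time, and check that it satisfies Snell’s law, n1 sin(θ1) = n2 sin(θ2).

```python
import math

H1, H2, D = 1.0, 1.0, 2.0   # source height, target depth, horizontal separation
N1, N2 = 1.0, 1.33          # refractive indices of air and water

def travel_time(x):
    # optical path length (time in units of 1/c) if the ray
    # crosses the interface at horizontal position x
    return N1 * math.hypot(x, H1) + N2 * math.hypot(D - x, H2)

# brute-force search for the least-time crossing point
x_min = min((i * D / 100000 for i in range(100001)), key=travel_time)

# Snell's law check: n1*sin(theta1) should equal n2*sin(theta2)
sin1 = x_min / math.hypot(x_min, H1)
sin2 = (D - x_min) / math.hypot(D - x_min, H2)
print(N1 * sin1, N2 * sin2)   # the two sides agree to about 4 decimal places
```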

Now, when one applies the principle of least action to a projectile in a gravitational field (like a thrown ball) one finds that it too follows a geodesic, but paradoxically this is the path of longest relativistic time (not unlike the paradox of the largest circle described earlier).

Richard Feynman gives a worked example in his excellent book, Six Not-So-Easy Pieces. In relativity, time can be subjective, so that a moving clock always appears to be running slow compared to a stationary clock, but, because motion is relative, the perception is reversed for the other clock. However, as Feynman points out:

The time measured by a moving clock is called its “proper time”. In free fall, the trajectory makes the proper time of an object a maximum.

In other words, the geodesic is the trajectory or path of longest relativistic time. Any deviation from the geodesic will result in the clock’s proper time being shorter, which means time literally slows down. So special relativity is not symmetrical in a gravitational field, and there is a gravitational field everywhere in space. As Gutfreund and Renn point out, Einstein himself acknowledged that he had effectively replaced the fictional aether with gravity.
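
Here’s a minimal numerical check of Feynman’s point, using the weak-field approximation dτ/dt ≈ 1 + gz/c² − v²/(2c²) (my simplification, valid for low speeds and weak gravity): a ball thrown straight up and caught T seconds later accumulates more proper time than a clock held at rest between the same two events.

```python
g, c, T, N = 9.81, 3.0e8, 10.0, 100_000
dt = T / N
v0 = g * T / 2                     # launch speed so the ball lands at t = T

correction = 0.0                   # proper-time gain relative to coordinate time
for i in range(N):
    t = (i + 0.5) * dt             # midpoint rule
    z = v0 * t - 0.5 * g * t * t   # height along the free-fall path
    v = v0 - g * t                 # velocity along the free-fall path
    correction += (g * z / c**2 - v * v / (2 * c**2)) * dt

print(correction)                  # positive: the geodesic maximises proper time
print(g**2 * T**3 / (24 * c**2))   # analytic value, ~4.5e-14 s, for comparison
```

The gain is tiny, a few tens of femtoseconds over 10 seconds, but it’s positive: any other trajectory between the same 2 events yields less proper time.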

This is most apparent when one considers a black hole. Every massive body has an escape velocity, which is the velocity a projectile must achieve to become free of the body’s gravitational field. Obviously, the escape velocity for Earth is larger than the escape velocity for the Moon and considerably less than the escape velocity of the Sun. Not so obvious, although logical from what we know, the escape velocity is independent of the projectile’s mass and therefore also applies to light (photons). We know that all bodies fall at exactly the same rate in a gravitational field. In other words, a geodesic applies equally to all bodies irrespective of their mass. In the case of a black hole, the escape velocity exceeds the speed of light, and, in fact, becomes the speed of light at its event horizon. At the event horizon time stops for an external observer because the light is red-shifted to infinity. One of the consequences of Einstein’s theory is that clocks run slower in a stronger gravitational field, and, at the event horizon, gravity is so strong the clock stops.
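
A quick sketch of the arithmetic (with rounded textbook masses and radii of my choosing): v_esc = √(2GM/r), with no dependence on the projectile’s mass, and setting v_esc = c gives the Schwarzschild radius, r = 2GM/c², which is the event horizon.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 3.0e8       # speed of light, m/s

def escape_velocity(mass, radius):
    # v_esc = sqrt(2GM/r); the projectile's mass cancels out entirely
    return math.sqrt(2 * G * mass / radius)

def schwarzschild_radius(mass):
    # the radius at which v_esc = c, i.e. the event horizon
    return 2 * G * mass / c**2

print(escape_velocity(5.97e24, 6.371e6))   # Earth: ~11,200 m/s
print(escape_velocity(7.35e22, 1.737e6))   # Moon: ~2,400 m/s
print(escape_velocity(1.99e30, 6.96e8))    # Sun: ~618,000 m/s
print(schwarzschild_radius(1.99e30))       # Sun squeezed into a black hole: ~3 km
```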

To appreciate why clocks slow down and rods become shorter (in the direction of motion), with respect to an observer, one must understand the consequences of the speed of light being constant. If light is a wave then the equation for a wave is very fundamental:

v = f λ , where v is velocity, f is the frequency and λ is the wavelength.

In the case of light the equation becomes c = f λ , where c is the speed of light.

One can see that if c stays constant then f and λ can change to accommodate it. Frequency measures time and wavelength measures distance. One can see how frequency can become stretched or compressed by motion if c remains constant, depending on whether an observer is travelling away from a source of radiation or towards it. This is called the Doppler effect, and on a cosmic scale it tells us that the Universe is expanding, because virtually all galaxies in all directions are travelling away from us. If a geodesic is the path of maximum proper time, we have a reference for determining relativistic effects, and we can use the Doppler effect to determine if a light source is moving relative to an observer, even though the speed of light is always c.
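
As a sketch (the source frequency below is just an illustrative value for green light), the relativistic Doppler formula for a source receding directly along the line of sight is f_obs = f √((1 − β)/(1 + β)), where β = v/c:

```python
import math

c = 3.0e8   # speed of light, m/s

def doppler_shift(f_source, v_recession):
    # relativistic Doppler factor for a source receding along the line
    # of sight; positive v means moving away, so the light is red-shifted
    beta = v_recession / c
    return f_source * math.sqrt((1 - beta) / (1 + beta))

f0 = 5.0e14                          # ~green light, Hz
f_obs = doppler_shift(f0, 0.1 * c)   # source receding at 10% of c
print(f_obs / f0)                    # ~0.905: frequency drops (red shift)
print(c / f_obs > c / f0)            # True: wavelength stretches, yet c = f*lambda holds
```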

I won’t go into it here, but the famous twin paradox can be explained by taking into account both relativistic and Doppler effects for both parties – the one travelling and the one left at home.

This is an exposition I wrote on the twin paradox.

Saturday 14 November 2015

The Unreasonable Effectiveness of Mathematics

I originally called this post: Two miracles that are fundamental to the Universe and our place in it. The miracles I’m referring to will not be found in any scripture and God is not a necessary participant, with the emphasis on necessary. I am one of those rare dabblers in philosophy who argues that science is neutral on the subject of God. A definition of miracle is required, so for the purpose of this discussion, I call a miracle something that can’t be explained, yet has profound and far-reaching consequences. ‘Something’, in this context, could be described as a concordance of unexpected relationships in completely different realms.

This is one of those posts that will upset people on both sides of the religious divide, I’m sure, but it’s been rattling around in my head ever since I re-read Eugene P. Wigner’s seminal essay, The Unreasonable Effectiveness of Mathematics in the Natural Sciences. I came across it (again) in a collection of essays under the collective title, Math Angst, contained in a volume called The World Treasury of Physics, Astronomy and Mathematics edited by Timothy Ferris (1991). This is a collection of essays and excerpts by some of the greatest minds in physics, mathematics and cosmology in the 20th Century.

Back to Wigner: in discussing the significance of complex numbers in quantum mechanics, specifically in Hilbert space, he remarks:

‘…complex numbers are far from natural or simple and they cannot be suggested by physical observations. Furthermore, the use of complex numbers in this case is not a calculated trick of applied mathematics but comes close to being a necessity in the formulation of the laws of quantum mechanics.’

It is well known among physicists that, in the language of mathematics, quantum mechanics not only makes perfect sense but is one of the most successful physical theories ever. But in ordinary language it is hard to make sense of it in any way that ordinary people would comprehend.

It is in this context that Wigner makes the following statement in the next paragraph following the quote above:

‘It is difficult to avoid the impression that a miracle confronts us here… or the two miracles of the existence of laws of nature and of the human mind’s capacity to divine them.’

Hence the 2 miracles I refer to in my introduction. The key that links the 2 miracles is mathematics. A number of physicists (Paul Davies, Roger Penrose and John Barrow are just the ones I’ve read) have commented on the inordinate correspondence we find between mathematics and the regularities found in natural phenomena that have been dubbed ‘laws of nature’.

The first miracle is that mathematics seems to underpin everything we know and learn about the Universe, including ourselves. As Barrow has pointed out, mathematics allows us to predict the makeup of fundamental elements in the first 3 minutes of the Universe. It provides us with the field equations of Einstein’s general theory of relativity, Maxwell’s equations for electromagnetic radiation, Schrodinger’s wave function in quantum mechanics and the four-letter code for all biological life we call DNA.

The second miracle is that the human mind is uniquely evolved to access mathematics to an extraordinarily deep and meaningful degree that has nothing to do with our everyday prosaic survival but everything to do with our ability to comprehend the Universe in all the facets I listed above.

The 2 miracles combined give us the greatest mystery of the Universe, which I’ve stated many times on this blog: It created the means to understand itself, through us.

So where does God fit into this? Interestingly, I would argue that when it comes to mathematics, God has no choice. Einstein once asked the rhetorical question, in correspondence with his friend, Paul Ehrenfest (if I recall it correctly): did God have any choice in determining the laws of the Universe? This question is probably unanswerable, but when it comes to mathematics, I would answer in the negative. If one looks at prime numbers (there are other examples, but primes are fundamental) it’s self-evident that they are self-selected by their very definition – God didn’t choose them.

The interesting thing about primes is that they are the ‘atoms’ of mathematics, because every other natural number can be built from primes, and from primes alone, all the way to infinity. The other interesting thing is that Riemann’s hypothesis indicates that primes have a deep and unexpected relationship with some of the most esoteric areas of mathematics. So, if one was a religious person, one might suggest that this is surely the handiwork of God, yet God can’t even affect the fundamentals upon which all this rests.
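
That ‘atoms’ metaphor is just the fundamental theorem of arithmetic: every natural number greater than 1 decomposes into a unique multiset of primes. A minimal sketch by trial division:

```python
def prime_factors(n):
    # decompose n into its prime 'atoms' by trial division; the fundamental
    # theorem of arithmetic guarantees the decomposition is unique
    factors, p = [], 2
    while p * p <= n:
        while n % p == 0:
            factors.append(p)
            n //= p
        p += 1
    if n > 1:
        factors.append(n)   # whatever remains is itself prime
    return factors

print(prime_factors(360))   # [2, 2, 2, 3, 3, 5]
print(prime_factors(97))    # [97]: 97 is an 'atom'
```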

Addendum: I changed the title to reflect the title of Wigner's essay, for web-search purposes.

Friday 23 October 2015

Freedom; a moral imperative

I wrote something about freedom recently, in answer to a question posed in Philosophy Now (Issue 108, Jun/Jul 2015) regarding what’s more important: Freedom, Justice, Happiness or Truth? My sequence of importance, starting at the top, was Truth, Justice, Freedom and Happiness, based on the argued premise that each was dependent on its predecessor. But this post is about something else: the relevance of John Stuart Mill’s arguments on ‘liberty’ in the 21st Century.

Once again, this has been triggered by Philosophy Now, this time Issue 110 (Oct/Nov 2015), though the context is quite different. Philosophy Now is a periodical and they always have a theme, and the theme for this issue is ‘Liberty & Equality’, so it’s not surprising to find articles on freedom. In particular, there are 2 articles: Mill, Liberty & Euthanasia by Simon Clarke and The Paradox of Liberalism by Francisco Mejia Uribe.

I haven’t read Mill’s book, On Liberty, which is cited in both of the aforementioned articles, but I’ve read his book, Utilitarianism, and what struck me was that he was a man ahead of his time. Not only is utilitarian philosophy effectively the default position in Western democracies (at least, in theory) but he seemed to predict findings in social psychology like the fact that one’s conscience is a product of social norms and not God whispering in one’s ear, and that social norms can be changed, like attitudes towards smoking, for example. I’ve written a post on utilitarian moral philosophy elsewhere, so I won’t dwell on it here.

The first essay by Clarke (cited above) deals with the apparent conflict between freedom to pursue one’s potential and the freedom to end one’s life through euthanasia, which is not the subject of this post either. It’s Clarke’s reference to Mill’s fundamental philosophy of individual freedom that struck a chord with me.

An objectively good life, on Mill’s (Aristotelian) view, is one where a person has reached her potential, realizing the powers and abilities she possesses. According to Mill, the chief essential requirement for personal well-being is the development of individuality. By this he meant the development of a person’s unique powers, abilities, and talents, to their fullest potential.

I’ve long believed that the ideal society allows this type of individualism: that each of us has the opportunity, through education, mentoring and talent-driven programmes, to pursue the goals best suited to our abilities. Unfortunately, the world is not an equitable place, and many people, indeed the vast majority, don’t have this opportunity.

The second essay (cited above), by Uribe, deals with the paradox that arises when liberal political and societal ideals meet fundamentalism. One may ask: what paradox? The paradox is that liberal attitudes towards freedom of expression and religious and cultural norms allow the rise of fundamentalist ideals that actually wish to curtail such freedoms. In the current age, fundamentalism is associated with Islamic fundamentalism, manifested by various ideologies all over the globe, which has led to a backlash in the form of Islamophobia. Some, like IS (Islamic State) and Boko Haram (in Nigeria), have extreme, intolerant views that they enforce on entire populations in the most brutal and violent manner imaginable. In other words, they could not be further from Mill’s ideal of freedom and liberation (Uribe, by the way, makes no reference to Islam).

In Western societies, there is a widely held fear, exploited by many right-wing and nationalist movements, that Islamic fundamentalism will overthrow our Western democratic systems of government and replace them with a religious totalitarian one. The reports of extreme human rights violations (including genocide, slavery and internet-posted executions) in far-off, politically unstable countries only add to this paranoia.

There are caveats to Mill’s manifesto (my term) on individual freedom, as pointed out by Clarke: ‘Excepting children and the insane, for whom intervention for their own sake is permissible…’ and ‘Freedom for the sake of individuality does not allow the harming of others, because that would damage the individuality of others.’

It’s this last point, ‘that would damage the individuality of others’, that, I would argue, goes to the crux of the issue. Totalitarian and fundamentalist ideologies should and can be opposed on this moral principle: political and social structures that unfairly inhibit the ability of individuals to pursue happiness should not be supported. This seems self-evident, yet it’s at the core of the current gay-marriage debate that is happening in many Western countries, including Australia (where I live). It’s also the reason that many Muslims oppose Islamic extremists, whose extremism curtails their own individualism.

On another freedom-related issue, Australia has for the past 15 years pursued a ruthless, not to mention contentious, policy of so-called ‘border protection’ against refugees arriving by boat. Both sides of the political spectrum in Australia pursue this policy because our politics have become almost completely poll-driven, and any change of policy by either side would immediately damage them in the polls, due to the paranoid nature of our society at large. This is related to the issue of Islamophobia I mentioned earlier, because a large portion of these refugees are from the unstable countries where atrocities are being committed. Not surprisingly, it’s the right-wing elements who exploit this issue as well. But it’s hard to imagine an issue that more strongly evokes Mill’s demand for individual freedom and liberty (except, possibly, the abolition of slavery).

As I said in an earlier post (the one I reference above), freedom and hope are partners. It’s the deliberate elimination of hope that drives my government’s policy, and the fact that this has serious mental health consequences is not surprising, yet it’s ignored.

Imprisonment is the most widely employed method of punishment for criminals because it eliminates freedom, though not necessarily hope. The Australian government’s rationalisation behind their extremely tough policy on asylum seekers is that they are ‘illegals’ and therefore deserve to be punished in this manner. However, the punishment is much worse than we dispense to convicted criminals under our justice system. It’s a sad indictment on our society that we have neither the political will nor the moral courage to reverse this situation.

Thursday 24 September 2015

What is now?

Notice I ask what and not when, because ‘now’, as we experience it, is the most ephemeral of all experiences. As I’ve explained in another post, to record anything at all requires a duration – there is no instantaneous moment in time – except in mathematical calculus, where a sleight-of-hand makes an infinitesimal disappear completely. It’s one of the most deceptive tricks in mathematics; but in mathematics you can have points with zero dimensions in space, so time with zero dimensions is just another idealism that allows one to perform calculations that would otherwise be impossible.
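
To show the sleight-of-hand with the simplest example I know (my own, differentiating x²): the quotient [(x + h)² − x²] / h equals 2x + h for any non-zero duration h, and the trick is then to let h vanish, leaving exactly 2x. A few lines of Python make the limit visible:

```python
# difference quotient of f(x) = x**2 at x = 3: [(x+h)**2 - x**2] / h = 2x + h
x = 3.0
for h in (1.0, 0.1, 0.001, 1e-6):
    print(h, ((x + h)**2 - x**2) / h)   # approaches 2x = 6 as h shrinks to nothing
```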

But another consequence of ‘now’ is that without memory we would not even know we have consciousness. Think about it: ‘now’ has no duration and consciousness exists in a continuous present, so no memory would mean no experience of consciousness, or of ‘now’ for that matter, because once a moment occurs it’s already in the past. Therefore memory is required to experience it at all.

But this post is not about calculus or consciousness per se; it arose from a quote I came across attributed to William Lawrence Bragg:

Everything that has already happened is particles, everything in the future is waves. The advancing sieve of time coagulates waves into particles at the moment ‘now’.

For those who don’t know, Sir William Lawrence Bragg was the son of Sir William Henry Bragg; as far as I know, they are the only father and son to be jointly awarded a Nobel Prize in physics, for their work on X-ray diffraction in crystals. Henry was born in England and Lawrence was born in Australia. I heard about them at school, naturally, but I only came across this quote earlier in the week. They were among the first to exploit short-wavelength photons (X-rays) to find the atomic-scale dimensions of crystal lattices, thus pioneering the discipline of crystallography.

In the same week, I came across this quote from Freeman Dyson recalling a conversation he had with Richard Feynman:

Thirty-one years ago Dick Feynman told me about his ‘sum over histories’ version of quantum mechanics. ‘The electron does anything it likes’, he said. ‘It goes in any direction at any speed, forward and backward in time, however it likes, and then you add up the amplitudes and it gives you the wave-function.’ I said, ‘You’re crazy.’ But he wasn’t.

I’ve discussed in some detail the mathematical formulation of the ‘wave-function’, known as Schrodinger’s equation, in another post, but what’s significant, in regard to the 2 quotes I’ve cited, is that the wave function effectively disappears, or becomes irrelevant, once an ‘observation’ or experimental ‘measurement’ occurs. In other words, the wave function ‘permeates all space’ (according to Richard Elwes in MATHS 1001) before the particle becomes part of the ‘classical physics’ real world. So Bragg’s quote makes perfect sense: the wave function represents the future and the particle ‘observation’, be it a photon or electron or whatever, represents the past, with the interface being ‘now’.

As I’ve explicated in my last post, the default interpretation of Feynman’s ‘sum over histories’ or ‘path integrals’ mathematical description of quantum mechanics is that all ‘histories’ occur in parallel universes, but I would argue that what we observe is a consequence of the irreversibility of time once the particle is ‘observed’. Now ‘observed’, in this context, means that the particle becomes part of the real world, or at least, that’s my prosaic interpretation. There is an extreme interpretation that it does require a ‘conscious observation’ in order to become real, but the fact that the Universe existed many billions of years prior to consciousness evolving makes this interpretation logically implausible, to say the least.

Brian Cox, in one of his BBC TV programmes (on ‘Time’), points out that one of the problems Einstein had with quantum mechanics is that, according to its ‘rules’, the future is indeterminate. Einstein’s mathematical formulation of space-time, which became fundamental to his General Theory of Relativity (albeit a consequence of his Special Theory), was that time could literally be treated like a dimension of space. This meant that the future was just as ‘real’ as the past. In other words, Einstein firmly believed that the universe, and therefore our lives, are completely deterministic – there was no place for free will in Einstein’s universe. Interestingly, this was a topic in a not-so-recent issue of Philosophy Now, though the author of the article didn’t explain that Einstein’s strict position on this was a logical consequence of his interpretation of space-time: the future was just as fixed as the past.

But, even without quantum mechanics, we know that chaos theory also contributes to the irreversibility of time, although Einstein was unaware of chaos theory in his lifetime. Paul Davies explains this better than most in his book on chaos theory, The Cosmic Blueprint.

The point is that, both in chaos theory and in Feynman’s multiple histories, there are many possibilities that can happen in the ‘future’, but the ‘past’ is only one path and it can’t be remade. According to David Deutsch and Max Tegmark, all the future possibilities occur, both in quantum mechanics and at a macro level. In fact, Deutsch has argued that chaotic phenomena are a consequence of quantum mechanics’ many worlds interpretation. In effect, they dissolve the asymmetry between the future and the past. According to their world-view, the future is just as inevitable as the past, because no matter which path is chosen, they all become reality somewhere in some universe, all of which bar one we can’t see. From my perspective, this is not an argument in support of the many worlds interpretation, but an argument against it.

In my last post but one, I discussed at length Paul Davies’ book, The Mind of God. One of his more significant insights was that the Universe allows evolution without dictating its end. In other words, it’s because of both chaos and quantum phenomena that there are many possible outcomes, yet they all arise from a fixed past, and this is a continuing process: it’s deterministic yet unpredictable.

One could make the same argument for free will. At many points in our lives we make choices based on a past that is fixed whilst conscious of a future that has many possibilities. I agree with Carlo Rovelli that free will is not a consequence of quantum mechanics, but the irreversibility of time applies to us as individual conscious agents in exactly the same way it applies to the dynamics of the Universe at both quantum and macro levels.

There is just one problem with this interpretation of the world, and that is, according to Einstein’s theories, there is no universal ‘now’. If there is no simultaneity, which is a fundamental outcome of the Special Theory of Relativity, then it’s difficult to imagine that people separated in space-time could agree on a ‘now’. And yet the fact that we give the Universe an age and a timeline effectively insists that there must be a ‘now’ for the Universe at large. I confess I don’t know enough physics to answer this, but quantum entanglement reintroduces simultaneity by stealth, even if we can’t use it to send messages.

One of the features of the Universe is causality. Despite the implications of both quantum mechanics and relativity theory on the physics of time, neither of them interferes with causality, despite what some may argue (and that includes entanglement). But causality requires the speed of light to separate causal events, which is why the ‘now’ we experience sees stars in the firmament up to billions of years old. So space-time makes ‘now’ a subjective experience, even to the extent that at the event horizon of a black hole ‘now’ can become frozen for an outside observer.

Addendum: I actually believe there is a universal 'now', which I've addressed in a later post (towards the end).

Tuesday 15 September 2015

Are Multiverses the solution or the problem?

Notice I use the plural for something that represents a collection of universes. That’s because there are multiple versions of them; according to Max Tegmark there are 4 levels of multiverse.

I’m about to do something that I criticise others for doing: I’m going to challenge the collective wisdom of those who are much smarter and more knowledgeable than me. I’m not a physicist, let alone a cosmologist, and I’m not an academic in any field – I’m just a blogger. My only credentials are that I read a lot, especially about physics by authors who are eminently qualified in their fields. But even that does not give me any genuine qualification for what I’m about to say. Nevertheless, I feel compelled to point out something that few others are willing to cognise.

This occurred to me after I wrote my last post. In the 2 books I reference by Paul Davies (The Mind of God and The Goldilocks Enigma) he discusses and effectively dismisses the multiverse paradigm, yet I don’t mention it. Partly, that was because the post was getting too lengthy as it was, and, partly, because I didn’t need to discuss it to make the point I wished to make.

But the truth is that the multiverse is by far the most popular paradigm in both quantum physics and cosmology, and this is a relatively recent trend. What I find interesting is that it has become the default position, epistemologically, to explain what we don’t know at both of the extreme ends of physics: quantum mechanics and the cosmos.

Davies makes the point, in Mind of God (and he’s not the only one to do so), that for many scientists there seems to be a choice between accepting the multiverse or accepting some higher metaphysical explanation that many people call God. In other words, it’s a default position in cosmology because it avoids trying to explain why our universe is so special for life to emerge. Basically, it’s not special if there are an infinite number of them.

In quantum mechanics, the multiverse (or many worlds interpretation, as it’s called) has become the most favoured interpretation after the so-called Copenhagen interpretation championed by Niels Bohr. It’s based on the assumption that the wave function, which describes a quantum particle in Hilbert space, doesn’t disappear when someone observes something or takes a measurement, but continues on in a parallel universe. So a bifurcation occurs for every electron and every photon every time it hits something. What’s more, Max Tegmark argues that if you have a car crash and die, in another universe you will continue to live. And likewise, if you have a near miss (as he did, apparently) in this universe, then in another parallel universe you died.

In both cases, cosmology and quantum mechanics, the multiverse has become the ‘easy’ explanation for events or phenomena that we don’t really understand. Basically, multiverses are signposts for the boundaries or limits of scientific knowledge as it currently stands. String Theory, or M Theory, is the most favoured cosmological model, but not only does it predict 10 spatial dimensions (as a minimum, I believe), it also predicts 10^500 universes.

Now, I’m sure many will say that, since the multiverse crops up in so many different places (caused by cosmological inflation, caused by string theory, caused by quantum mechanics), at least one of these multiverses must exist, right? Well no, they don’t have to exist – they’re just speculative, albeit consistent with everything we currently know about this universe, the one we actually live in.

Science, as best I understand it historically, has always homed in on things. In particle physics it homed in on electrons, protons and neutrons, then neutrinos and quarks in all their varieties. In biology, we had evolution by natural selection, then we discovered genes and then DNA, which underpinned it all. In mechanics, we had Galileo, Kepler and Newton, who finally gave us an equation for gravity; then Einstein gave us relativity theory, which equated energy with mass in the most famous equation in the world, plus the curvature of space-time, giving a completely geometric account of gravity that also provided a theoretical foundation for cosmology. Faraday, followed by Maxwell, showed us that electricity and magnetism are inherently related, and again Einstein took it further and gave an explanation of the photo-electric effect by proposing that light comes in photons.

What I’m trying to say is that we made advances in science by either finding more specific events, and therefore particles, or by finding common elements that tied together apparently different phenomena. Kepler found the mathematical formulation that told us that planets travel in ellipses, Newton told us that gravity’s inverse square law made this possible, and Einstein told us that it’s the curvature of space-time that explains gravity. Darwin and Wallace gave us a theory of evolution by natural selection, but Mendel gave us genes that explained how the inheritance happened, and Watson, Crick and Franklin gave us the DNA helix that is the key ingredient for all of life.

My point is that the multiverse explanation for virtually everything we don’t know is going in the opposite direction. Now the explanation for something happening, whether it be a quantum event or the entire universe, is that every possible variation or physical manifestation is happening but we can only see the one we are witnessing. I don’t see this as an explanation for anything; I only see it as a manifestation of our ignorance.


Addendum: This is Max Tegmark providing a very cogent counterargument to mine. I think his argument from inflation is the most valid and his argument from QM many worlds the most unlikely. Quantum computers won't prove parallel universes, because they are dependent on entanglement (as I understand it), which is not the same thing as multiple copies. Philip Ball makes this exact point in Beyond Weird, where he explains that so-called multiple particles only exist as probabilities, of which only one becomes 'real'.