Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.

Sunday, 17 January 2010

Quantum Entanglement: nature’s great tease

I’ve just read the best book on the history of quantum mechanics that I’ve come across, called The Age of Entanglement, subtitled When Quantum Physics was Reborn, by Louisa Gilder. It’s an even more extraordinary achievement when one discovers that it’s Gilder’s first book, yet one is not surprised to learn it had an eight-year gestation. It’s hard to imagine a better researched book in this field.

Gilder takes an unusual approach, in which she creates conversations between key players, as she portrays them, using letters and anecdotal references by the protagonists. She explains how she does this, by way of example, in a preface titled A Note To The Reader, so as not to mislead us into thinking that these little scenarios were actually recorded. Sometimes she quotes straight from letters.

When I taught a fiction-writing course early last year, someone asked me whether biography is fiction or non-fiction. My answer was that as soon as you add dialogue, unless it’s been recorded, it becomes fiction. An example is Schindler’s Ark by Thomas Keneally, who explained that he wrote it as a novel because ‘that is his craft’. In the case of Gilder’s book, I would call these scenarios quasi-fictional. The point is that they work very well, whether they are fictional or not, in giving flesh to the characters as well as the ideas they were exploring.

She provides an insight into these people and their interactions at a level of perspicuity that one rarely sees. In particular, she provides an insight into their personal philosophies and the prejudices that drove their explorations and their arguments. The heart of the book is Bell’s Theorem, or Bell’s Inequality, which I’ve written about before (refer Quantum Mechanical Philosophy, Jul.09). She starts the book off like a Hollywood movie, providing an excellent exposition of Bell’s Theorem for laypeople (first published in 1964), then jumps back in time to the very birth of quantum mechanics (1900), when Planck introduced the constant h (now known as Planck’s constant) to satisfactorily explain black body radiation. Proceeding from this point, Gilder follows the whole story and its amazing cast of characters right up to 2005.

In between there were two world wars, a number of Nobel Prizes, the construction of some very expensive particle accelerators and a cold war, all of which played their parts in the narrative.

David Mermin, a solid state physicist at Cornell, gave the best exposition of Bell’s Theorem to non-physicists, for which the great communicator, Richard Feynman, gave him the ultimate accolade: he told Mermin that he had achieved what Feynman himself had attempted but failed to do.

Bell’s Theorem, in essence, makes predictions about entangled particles. Entangled particles counter-intuitively suggest action-at-a-distance occurring simultaneously, contradicting everything else we know about reality, otherwise known as ‘classical physics’. Classical physics includes relativity theory which states that nothing, including communication between distinct objects, can occur faster than the speed of light. This is called ‘locality’. Entanglement, which is what Bell’s Theorem entails, suggests the opposite, which we call ‘non-locality’.

Gilder’s abridged version of Mermin’s exposition is lengthy and difficult to summarise, but, by the use of tables, she manages to convey how Bell’s Theorem defies common sense, and that’s the really important bit to understand. Quantum mechanics defies our expectations, and Bell’s great contribution to quantum physics was that his famous Inequality puts the conundrum into a succinct and testable formula.
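
Mermin’s tables aren’t reproduced here, but the testable content of Bell’s Inequality can be shown with a few lines of arithmetic. The following Python sketch is my own illustration, not from Gilder’s book, and it uses the CHSH form of the inequality: any ‘local’ theory must satisfy |S| ≤ 2, whereas quantum mechanics predicts a correlation of E(a, b) = −cos(a − b) for entangled particles, which pushes |S| up to 2√2.

```python
from math import cos, pi, sqrt

def correlation(a, b):
    """Quantum-mechanical prediction for the spin correlation of two
    entangled (singlet-state) particles measured at detector angles
    a and b: E(a, b) = -cos(a - b)."""
    return -cos(a - b)

# Standard CHSH detector settings (in radians)
a, a2 = 0, pi / 2
b, b2 = pi / 4, 3 * pi / 4

# CHSH quantity: any local hidden-variable theory obeys |S| <= 2
S = (correlation(a, b) - correlation(a, b2)
     + correlation(a2, b) + correlation(a2, b2))

print(f"|S| = {abs(S):.4f}")  # |S| = 2.8284, i.e. 2*sqrt(2)
print(f"classical bound = 2, quantum prediction = {2 * sqrt(2):.4f}")
```

The experiments that eventually vindicated Bell, such as Aspect’s, measured precisely this kind of excess above 2.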

Most people know that Bohr and Einstein were key players and philosophical antagonists over quantum theory. The general view is that Bohr ultimately won the argument, and was further justified by the successful verification of Bell’s Theorem, while Einstein was consigned to history as having discovered two of the most important theories in physics (the special and general theories of relativity) but stubbornly rejected the most empirically successful theory of all time, quantum mechanics. Gilder’s book provides a subtle but significantly different perspective. Whilst she portrays Einstein as unapologetically stubborn, he played a far greater role in the development of quantum theory than popular history tends to grant him. In particular, it could be argued that he understood the significance of Bell’s Theorem many decades before Bell actually conceived it.

Correspondence, referenced by Gilder, suggests that Schrodinger’s famous Cat thought experiment originally arose from a suggestion by Einstein, except that Einstein envisaged a box containing explosives that were both exploded and un-exploded at the same time. Einstein also supported de Broglie at a time when everyone else ignored him, and he acknowledged that de Broglie had ‘lifted a corner of the great veil’.

Curiously, the cover of her book contains three medallion-like photographic portraits, in decreasing size: Albert Einstein, Erwin Schrodinger and Louis de Broglie; all heretics of quantum mechanics. Gilder could easily have included David Bohm and John Bell as well, if that was her theme.

Why heretics? Because they all challenged the Copenhagen interpretation of quantum mechanics, led by Bohr and Heisenberg, and which remains the ‘conventional’ interpretation to this day, even though the inherent conundrum of entanglement remains its greatest enigma.

It was Bohr who apparently said that anyone who claims they understand quantum mechanics doesn’t comprehend it at all, or words to that effect. When we come across something that is new to us, that we don’t readily understand, the brain looks for an already known context in which to place it. In an essay I wrote on Epistemology (July 2008) I made the point that we only understand new knowledge when we incorporate it into existing knowledge. The best example is when we look up a word in a dictionary – it’s always explained by using words that we already know. I also pointed out that this plays a role in storytelling where we are continuously incorporating new knowledge into existing knowledge as the story progresses. Without this innate human cognitive ability we’d give up on a story after the first page.

Well the trap with quantum mechanics is that we attempt to understand it in the context of what we already know, when, really, we can’t. It’s only when you understand the mystery of quantum mechanics that you can truly say: I understand it. In other words, when you finally understand what can’t be known, or can’t be predicted, as we generally do with so-called ‘classical physics’. Quantum mechanics obeys different rules, and when you appreciate that they don’t meet our normal ‘cause and effect’ expectations, then you are on the track of appreciating the conundrum. It’s a great credit to Gilder that she conveys this aspect of quantum physics, both in theory and in experiment, better than any other writer I’ve read.

Some thumbnail sketches based on Gilder’s research are worth relaying. She consistently portrays Niels Bohr as a charismatic leader who dominated as much by personality as by intellect. The impression she gives is that people loved him but, consequently, found it difficult to oppose him. The great and famous exception was Einstein, who truly did have a mind of his own, but there was also Wolfgang Pauli, famously the most exacting critic among physicists.

John Wheeler, who became Bohr’s greatest champion in the latter part of the 20th Century, said of his early days with Bohr: “Nothing has done more to convince me that there once existed friends of mankind with the human wisdom of Confucius and Buddha, Jesus and Pericles, Erasmus and Lincoln, than walks and talks under the beech trees of Klampenborg Forest with Niels Bohr.” Could there be any greater praise?

Einstein wrote of Max Planck: “an utterly honest man who thinks of others rather than himself. He has, however, one fault: he is clumsy in finding his way about foreign trains of thought.” As for Lorentz, with whom he was corresponding at the same time as Planck, he found him “astonishingly profound… I admire this man as no other, I would say I love him.”

Much later in the story, Gilder relates an account of how a 75 year-old Planck made a personal presentation to Hitler, attempting to explain how his dismissal of Jewish scientists from academic positions would have disastrous consequences for Germany. Apparently, he barely opened his mouth before he was given a severe dressing-down by the dictator and told where to go. Nevertheless, the story supports Einstein’s appraisal of the man from a generation earlier.

Gilder doesn’t provide a detailed portrait of Paul Dirac, or P.A.M. Dirac as he’s better known, but we know he was a very reserved and eccentric individual, whose mathematical prowess effectively forecast the existence of anti-matter. The Dirac equation is no less significantly prophetic than Einstein’s famous equation, E=mc².

Wolfgang Pauli’s great contribution to physics was the famous Pauli exclusion principle, which I learnt in high school. It explains why atoms don’t all collapse in on each other and why, when you touch something, you don’t sink into it. He also predicted the existence of the neutrino. Pauli’s personal life went into a steep decline in the 1930s, when he suffered from chronic depression and alcoholism. His life turned around after he met Carl Jung, who became a lifelong friend. ‘In two years of Jung’s personal analysis and friendship, Pauli shed his depression. In 1934 he met and married Franca Bertram, who would be his companion for the rest of his life.’

This friendship with Jung led to a contradiction in the light of our 21st Century sensibilities, according to Gilder:

’Pauli could tell Bohr to “shut up” and Einstein that his ideas were “actually not stupid”… But in the words of Franca Pauli, “the extremely rational thinker subjected himself to total dependence on Jung’s magical personality.”’

Schrodinger is as well known for his libertine attitude towards sexual relationships as he is for his famous equation. His own wife became the mistress of Schrodinger’s close friend, the mathematician Hermann Weyl, whilst Schrodinger had a string of mistresses. But the identity of his lover-companion when he was famously convalescing from tuberculosis in the Alpine resort of Arosa, where he conjured up the wave equation that bears his name, is still unknown to this day.

When Schrodinger died in 1961, Max Born (another Nobel Prize winner in the history of quantum mechanics) wrote the following eulogy:

“His private life seemed strange to bourgeois people like ourselves. But all this does not matter. He was a most loveable person, independent, amusing, temperamental, kind, and generous, and he had a most perfect and efficient brain.”

It was Born who gave Schrodinger’s wave function the probability interpretation that every quantum theorist uses to this day. Born was a regular correspondent with Einstein, but is now almost as famously known in pop culture as the grandfather of Australian songstress Olivia Newton-John (not mentioned in Gilder’s book).

Gilder provides a relatively detailed and bitter-sweet history of the relationship between David Bohm and J. Robert Oppenheimer, both affected in adverse ways by the cold war and McCarthy’s ‘House Un-American Activities Committee’.

I personally identify with Gilder’s portrait of Bohm more than I anticipated, not because of his brilliance or his courage, but because of his apparent neurotic disposition and insecurity and his almost naïve honesty.

Gilder has obviously accessed transcripts of his interrogation, where he repeatedly declined to answer questions “on the ground that it might incriminate and degrade me, and also, I think it infringes on my rights as guaranteed by the First Amendment.”

When he was eventually asked if he belonged to a political party, he finally said, “Yes, I am. I would say ‘Yes’ to that question.”

This raised everyone’s interest, naturally, but when he was asked the follow-up question, “What party or association is that?” he said, “I would say definitely that I voted for the Democratic ticket.” ‘The representative from Missouri’, who asked the question, must have been truly pissed off when he pointed out that that wasn’t what he meant. To which Bohm said, in all honesty no doubt, “How does one become a member of the Democratic Party?”

Bohm lost his career, his income, his status and everything else at a time when he should have been at the peak of his academic abilities. Even Einstein’s letter of recommendation couldn’t get him a position at the University of Manchester, and he eventually went to Sao Paulo in Brazil, though he never felt at home there. Gilder sets one of her quasi-fictional scenarios in a bar, when Feynman was visiting Brazil and socialising with Bohm, deliberately juxtaposing the two personalities. She portrays Bohm as not being jealous of Feynman’s mind, but of his easy confidence in a foreign country and his sex-appeal to women. That’s the David Bohm I can identify with at a similar age.

Bohm eventually migrated to England where he lived for the rest of his life. I don’t believe he ever returned to America, though I can’t be sure how true that is. I do know he became a close friend to the Dalai Lama, because the Dalai Lama mentions their friendship in one of his many autobiographies.

According to Gilder, it’s unclear if Bohm ever forgave Oppenheimer for ‘selling out’ his friend, Bernard Peters, both of whom hero-worshipped Oppenheimer. Certainly, at the time that Oppenheimer ‘outed’ Peters as a ‘crazy red’, Bohm felt that he had betrayed him.

Bohm made a joke of the House Un-American Activities Committee based on the famous logic conundrum postulated by Bertrand Russell: “If the barber is the man who shaves all men who do not shave themselves, who shaves the barber?” Bohm’s version: “Congress should appoint a committee to investigate all committees that do not investigate themselves.”

But of all the characters, John Bell is the one about whom I knew the least, and yet he is the principal character in Gilder’s narrative, because he was not only able to grasp the essential character of quantum mechanics but to quantify it in a way that could be verified. I won’t go into the long story of how it evolved from the Einstein-Podolsky-Rosen (EPR) conjecture, except to say that Gilder covers it extremely well.

What I did find interesting was that after Bell presented his Inequality, the people who wanted to test it were not supported or encouraged on either side of the Atlantic. It was considered a career-stopper, and Bell himself even discouraged up-and-coming physicists from pursuing it. That all changed, of course, when results finally came out.

After reading Gilder’s account, I went back to the interview that Paul Davies had with Bell (The Ghost in the Atom, 1986) after the famous Alain Aspect experiment had confirmed Bell’s Inequality.

Bell is critical of the conventional Copenhagen interpretation because, he argues, where do you draw the line between the quantum world and the classical world when you make your ‘observation’? Is it at the equipment, in the optic nerve going to your brain, or at the neurons in the brain itself? He’s deliberately mocking the view that ‘consciousness’ is the cause of the ‘collapse’ of the quantum wave function.

In the interview he makes specific references to de Broglie and Bohm. Gilder, I noticed, sourced the same material.

“One of the things that I specifically wanted to do was to see whether there was any real objection to this idea put forward long ago by de Broglie and Bohm that you could give a completely realistic account of all quantum phenomena. De Broglie had done that in 1927, and was laughed out of court in a way that I now regard as disgraceful, because his arguments were trampled on. Bohm resurrected that theory in 1952, and was rather ignored. I thought that the theory of Bohm and de Broglie was in all ways equivalent to quantum mechanics for experimental purposes, but nevertheless it was realistic and unambiguous. But it did have the remarkable feature of action-at-a-distance. You could see that when something happened at one point there were consequences immediately over the whole of space unrestricted by the velocity of light.”

Perhaps that should be the last word in this dissertation, but I would like to point out that, according to Gilder, Einstein made exactly the same observation in 1927, when he tried to comprehend the double-slit experiment in terms of Schrodinger’s waves.

Monday, 4 January 2010

Jesus' philosophy

Normally, I wouldn’t look twice at a book with the title, Jesus & Philosophy, but when the author’s name is Don Cupitt, that changes everything. In September last year, I reviewed his book, Above Us Only Sky (under a post titled The Existential God) which is effectively a manifesto on the ‘religion of ordinary life’ to use his own words.

Cupitt takes a very scholarly approach to his topic, referencing The Gospel of Jesus, which arose from the ‘Jesus Seminar’ (1985 to 1995). And, in fact, Cupitt dedicates the book to the seminar’s founder, Robert W. Funk. He also references a document called ‘Q’. For those, like myself, who’ve never heard of Q, I quote Cupitt himself:

“Q, it should be said in parenthesis here, is the term used by Gospel critics to describe a hypothetical sayings-Gospel, written somewhere between the years 50 and 70 CE, and drawn upon extensively by both Matthew and Luke.”

Cupitt is a most unusual theologian in that he has all but disassembled orthodox Christian theology, and he now sees himself more as a philosopher. The overarching thesis of his book is that Jesus was the first humanist. From anyone else, this could be dismissed as liberal-theological claptrap, but Cupitt is not anyone else; he commands you to take him seriously by the simple merit of his erudition and his lack of academic pretension or arrogance. You don’t have to agree with him, but you can’t dismiss him as a ratbag either.

Many people, these days, even question whether Jesus ever existed. Stephen Law has posed the question more than once on his blog, but, besides provoking intelligent debate, he’s merely revealed how little we actually know. Cupitt doesn’t even raise this question; he assumes that there was an historical Jesus in the same way that we assume there was an historical Buddha, who, like Jesus, kept no records of his teachings. In fact, Cupitt makes this very same comparison. He argues that Jesus’ sayings, like the Buddha’s, would have been remembered orally before anyone wrote them down, and narratives were later attached to them, which became the gospels we know today. He doesn’t question that the biblical stories are fictional, but he believes that behind them was a real person, whose teachings have been perverted by the long history of the Church. He doesn’t use that term, but I do, because it’s what I’ve long believed. The distortion, if not the perversion, was started by Paul, who is the person most responsible for the concept of Jesus as saviour or messiah that we have today.

I actually disagree with Stephen Law’s thesis, and I’ve contended it on his blog, because a completely fictional Jesus doesn’t make a lot of sense to me. If you are going to create a fictional saviour (who is a Deity), then why make him a mortal first, and why make him a complete failure, which he was? On the other hand, deifying a mortal after their death at the hands of their enemy, to become a saviour for an oppressed people, makes a lot of sense. A failure in mortal flesh becomes a messiah in a future kingdom beyond death.

Also if Jesus is completely fictional, who was the original author? The logical answer is Paul, but records of Jesus precede Paul, so Paul must have known he was fictional, if that was the case. I’m not an expert in this area, but Cupitt is not the first person to make a distinction between a Jesus who took on the Church of his day and stood up for the outcast and disenfranchised in his society, and Paul’s version, who both knew and prophesied that he was the ‘Son of God’. H.G. Wells in his encyclopedic book, The Outline of History (written after WWI), remarks similarly on a discontinuity in the Jesus story as we know it.

But all this speculation is secondary, though not irrelevant, to Cupitt’s core thesis. Cupitt creates a simple imagery concerning the two conflicting strands of morality, theistic and humanistic, as being vertical and horizontal. The vertical strand comes straight from God or Heaven, which makes it an unassailable authority, and the horizontal strand stems from the human ‘heart’.

His argument, in essence, is that Jesus’ teachings, when analysed, appealed to the heart, not to God’s authority, and, in this respect, he had more in common with Buddha and Confucius than to Moses or Abraham or David. In fact, more than once, Cupitt likens Jesus to an Eastern sage (his words) who drew together a group of disciples, and through examples and teachings, taught a simple philosophy, not only of reciprocity, but of forgiveness.

In fact, Cupitt contends that reciprocity was not Jesus’ core teaching, and, even in his Preface, before he gets into the body of his text, he quotes from the ‘Gospel of Jesus’ to make his point: “If you do good to those who do good to you, what merit is there in that?” (Gospel of Jesus, 7.4; Q/Luke 7.33). Cupitt argues that one of Jesus’ most salient messages was to break the cycle of violence that afflicts all of humanity, and which we see, ironically, most prominently demonstrated in modern day Palestine.

Cupitt uses the term ‘ressentiment’ to convey this peculiar human affliction: the inability to let go of a grievance, especially when it involves a loved one, but also when it involves more amorphous forms of identity, like nation or race or creed (see my post on Evil, Oct. 07). According to Cupitt, “Jesus says: ‘Don’t let yourself be provoked into ressentiment by the prosperity of the wicked. Instead, be magnanimous, and teach yourself to see in it the grace of God, giving them time to repent. Too many people who have seen the blood of the innocent crying out for vengeance have allowed themselves to develop the revolting belief in a sadistic and vengeful God.’” (Cupitt doesn’t give a reference for this ‘saying’, however.)

I don’t necessarily agree with Cupitt’s conclusion that Jesus is the historical ‘hinge’ from the vertical strand to the horizontal strand, which is the case he makes over 90-odd pages. I think there have been others, notably Gautama Siddhartha (Buddha) and Confucius, who were arguably just as secular as Jesus was and preceded him by 500 years, though their spheres of influence were geographically distinct from Jesus’.

Obviously, I haven’t covered all of Cupitt’s thesis, including references to Plato and Kant, and the historical relationship between the vertical and horizontal strands of morality. He makes compelling arguments that Jesus has long been misrepresented by the Church, in particular, that Jesus challenged his society’s dependence on dogmatic religious laws.

One interesting point Cupitt makes, almost as a side issue, is that it was the introduction of the novel that brought humanist morality into intellectual discourse. Novels, and their modern derivatives in film and television, have invariably portrayed moral issues as being inter-human not God-human. As Cupitt remarks, you will go a long way before you will find a novel that portrays morality as being God-given. Even so-called religious writers, like Graham Greene and Morris West, were always analysing morality through human interaction (Greene was a master of the moral dilemma) and if God entered one of their narratives, ‘He’ was an intellectual concept, not a character in the story.

There is one aspect of Jesus that Cupitt doesn’t address, and it’s the fact that so many Christians claim to have a personal relationship with him. This, of course, is not isolated to Jesus. I know people who claim to have a personal relationship with Quan Yin (the Buddhist Goddess of Mercy), others claim a relationship with the Madonna, others with Allah, others with Yahweh, and so on. So what is all this? This phenomenon, so widespread, has fascinated me all my life, and the simple answer is that it’s a projection. There is nothing judgmental in this hypothesis. My reasoning is that for every individual the projection is unique. Everyone who believes in this inner Jesus has their own specific version of him. I don’t knock this, but, as I’ve said before, the Deity someone believes in says more about them than it says about the Deity. If this Deity represents all that is potentially good in humanity, then there is no greater aspiration.

In the beginning of Cupitt’s book, even before the Preface, he presents William Blake’s poem, The Divine Image. In particular, I like the last verse, which could sum up Cupitt’s humanist thesis.

And all must love the human form,
In heathen, Turk, or Jew;
Where Mercy, Love, & Pity dwell
There God is dwelling too.


In other words, God represents the feeling we have for all of humanity, which is not only subjective, but covers every possible attribute. If you believe in a vengeful, judgmental God, then you might not have a high opinion of humanity, but if you believe in a forgiving and loving God, then maybe that’s where your heart lies. As for those who claim God is both, then I can only assume they are as schizoid in their relationships as their Deity is.

Friday, 1 January 2010

Stieg Larsson’s The Girl with the Dragon Tattoo

I don’t normally review novels, but, to be honest, I don’t read a lot of them either, which is an incredible admission for a want-to-be author to make (actually, I’m a real author, just not a very successful one). Most of my reading is non-fiction, at least 90%, and when given a choice between a novel and a non-fiction book, I’ll invariably end up with the latter. There is always a stack of unread books in front of me, all of them non-fiction.

The Girl with the Dragon Tattoo was an exception – this was a book I had to read – simply because I’d heard so much about it. It’s the first in a trilogy by Stieg Larsson, who unfortunately died before they became monstrously successful. The first won a Galaxy British Books award for the ‘Crime Thriller of the Year 2009’. I’m not sure if it has the same status in America as it has in the rest of the world, but, if it hasn’t, I expect that would change if the movie went international.

Larsson was not much younger than me, and was a journalist and editor-in-chief of Expo from 1999. He died in 2004, only 50 years old, just after he delivered all three manuscripts to his Swedish publisher. For a first novel, it’s extraordinarily impressive, and, from my own experience of publishing my first novel at a similar age, I suspect he must have been practicing the art of fiction well beforehand. Very few journalists make the jump from non-fiction to fiction (refer my post on Storytelling, Jul.09), even though their craft of creating easy-to-read yet meaningful and emotively charged prose is well honed. It’s a big leap from writing stories about real people and real events to imaginary scenarios populated by fictional yet believable characters. The craft of writing engaging dialogue looks deceptively simple, yet it can stump the most practiced wordsmith if they’ve never attempted it before.

Larsson has two protagonists, one a middle-aged male and one a mid-twenties female, who are opposites in almost every respect except intelligence. Mikael Blomkvist could easily be an alter-ego for Larsson, as he’s a financial investigative journalist who jointly runs a magazine with his ‘occasional lover’, Erika Berger, but, being an author myself, I don’t necessarily jump to such obvious conclusions. When people see an actor in a role on the screen, they often assume that that is what he or she must be like in real life, yet that’s the actor’s job: to make you believe the screen persona is a real person. Well, we authors have to create the same illusion – we are all magicians in that sense. There’s no doubt that Larsson used his inside knowledge in developing his story and the background for his character, but the personality of Blomkvist may be quite different to Larsson’s. In fact, there is no reason not to assume that his other protagonist, albeit of a different age, sex and occupation, may be closer in personality to her creator. Having said that, Lisbeth Salander (who is the girl with the dragon tattoo) is a dysfunctional personality, possibly with a variant of Asperger’s, which makes one pause. The point is that she’s just as well drawn as Blomkvist, perhaps even better.

My point is that authors, myself included, often create characters who have characteristics that we wish we had but know we haven’t. Blomkvist is an easy-going, tolerant person with liberal views, who charms the pants off women, but has an incisive mind that sees through deception. Larsson may have had these qualities or some of these qualities and added others. We all do this to our protagonists.

One of the strengths of this book is that, whilst it entertains in the way we expect thrillers to, it exhibits a social conscience with a strong feminist subtext. There are four parts, comprising 29 chapters, bookended by a prologue and an epilogue. Each part has its own title page, and they all contain a statistic concerning violence against women in Sweden. I will quote the last one in the book, on the title page for Part 4, Hostile Takeover: “92% of women in Sweden who have been subjected to sexual assault have not reported the most recent violent incident to the police.”

One of the things I personally like about this book is that it challenges our natural tendency to judge people by appearances. Larsson does this with Salander all through the book, where people continually misjudge her, and, all through her fictional life, she’s been undervalued and written off as a ‘retard’ and social misfit.

When I was growing up, two of the most influential people in my life were eccentrics, both women, one my own age and one two generations older. This has made me more tolerant and less judgemental than most people I know. To this extent I can identify with Blomkvist. It’s one of the resonances I felt most strongly in this novel.

Larsson is a very good writer on all fronts. It is halfway through the book (over 500 pages, approximately 150,000 words) before anything truly dramatic happens on the page. Beforehand we have lots of mystery and lots of intrigue, but not a lot of action or real suspense. It’s a great credit to Larsson that he keeps you engaged for that entire time (it’s a genuine page-turner) without resorting to mini-climaxes. Dan Brown could learn a lot from Stieg Larsson. Brown is a master of riddles, but Larsson’s writing has a depth, both in characterisation and subtext, that’s way over Brown’s head.

There’s nothing much else to say – I’m yet to read the other two in the series. I assume they are self-contained stories with the same protagonists. One is naturally interested in how their relationship develops, both professionally and personally. I often believe that what raises one novel above another is the psychological believability of the characters, to coin my own phrase. Last year I reviewed the film Watchmen (Oct. 09), a movie based on a graphic novel, but it was the depth of characterisation, along with its substantial subtext, that lifted it above the expected norm for comic-book movies. The Girl with the Dragon Tattoo is in a similar class of fiction. I don’t judge books or films by their genre; I judge them, primarily, by how well written they are – it’s probably the criterion we all use without being consciously aware of it.

Friday, 25 December 2009

The Origins of Mathematics

Note the plural in the title, because the mathematics we use today comes from a number of different sources, geographically and culturally. We have a very Eurocentric outlook on mathematics that belies its global heritage.

I came across a small, well-presented volume in my local bookshop: The Bedside Book of Algebra by Michael Willers, a Canadian high school teacher of my vintage, going by the pop-cultural references he sprinkles throughout. When I thumbed through it, my first impression was that it contained nothing I didn’t already know, but I liked the presentation and I realised that it gave a history of mathematics as well as an exposition. It would be an excellent book for anyone wanting a grasp of high school mathematics, as it covers most topics, except calculus and matrices. The presentation is excellent, as he delivers his topics in 2 page bites, and provides examples that are easy to follow. I read it from cover to cover, and learnt a few new things as well as reacquainting myself with old friends like Pascal’s triangle. In fact, Willers revealed a few things about Pascal’s triangle that I didn’t know, like its relationship with the Fibonacci sequence and its generation of fractal patterns using a ‘tiling’ algorithm developed by Polish mathematician, Waclaw Sierpinski in 1915.
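As an aside, both properties are easy to verify for oneself. Here is a minimal Python sketch (my own illustration, not from Willers’ book) that builds Pascal’s triangle, recovers the Fibonacci sequence by summing its ‘shallow diagonals’, and shades the odd entries to reveal the Sierpinski pattern:

```python
def pascal_rows(n):
    """Build the first n rows of Pascal's triangle."""
    rows = [[1]]
    for _ in range(n - 1):
        prev = rows[-1]
        rows.append([1] + [prev[i] + prev[i + 1] for i in range(len(prev) - 1)] + [1])
    return rows

def fibonacci_from_pascal(rows):
    """Sum the 'shallow diagonals' of the triangle to get the Fibonacci sequence."""
    return [sum(rows[d - k][k] for k in range(d // 2 + 1)) for d in range(len(rows))]

rows = pascal_rows(10)
print(fibonacci_from_pascal(rows))  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]

# Shading the odd entries of a larger triangle reveals the Sierpinski pattern:
for row in pascal_rows(16):
    print(''.join('#' if x % 2 else '.' for x in row))
```

The fractal falls out because the parity of each entry depends only on the parity of the two entries above it – the same local rule that generates the Sierpinski triangle.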

I already knew that the Chinese had discovered Pascal’s triangle some 500 years before Pascal (11th Century, Jia Xian), but I didn’t know that the earliest known reference was from an Indian mathematician, Varahamihira, in the 6th Century, or that it appeared in 10th Century Persia, thanks to Al-Karaji. (Blaise Pascal lived 1623-1662.)

Fibonacci (1170-1250) is most famously remembered for the number sequence that bears his name and also the ‘Golden Ratio’, which can be generated from the sequence. Both the Fibonacci sequence and the Golden Ratio can be found in nature – for example, flower petals are invariably a Fibonacci number, and the ratio of a person’s height to the height of their navel is supposedly the Golden Ratio, but I’m unsure if that is true or just wishful thinking on the part of renaissance artists. Because each term in the Fibonacci sequence is the sum of the previous two, some natural processes follow the same rule, like the unchecked population growth of rabbits, which is provided as an example in Willers’ book and was apparently the original example that Fibonacci used to introduce it.
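The connection between the sequence and the Golden Ratio is easy to demonstrate: the ratio of successive Fibonacci numbers converges on φ = (1 + √5)/2 ≈ 1.618. A short Python sketch (again, my own illustration):

```python
import math

def fibonacci(n):
    """First n Fibonacci numbers: each term is the sum of the previous two."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq

phi = (1 + math.sqrt(5)) / 2  # the Golden Ratio, approx. 1.6180339887
seq = fibonacci(25)
ratios = [b / a for a, b in zip(seq, seq[1:])]
print(ratios[0], ratios[-1], phi)  # the early ratios are rough; later ones match phi closely
```

By the 25th term the ratio agrees with φ to about a dozen decimal places.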

But we all owe Fibonacci a great debt, because it was he who introduced the Hindu-Arabic numeral system to the Western world in a popular format that has made life so much easier for accountants, engineers, economists, mathematics students and anyone who has ever had to deal with numbers, which is all of us. When I was a child I was told that we used ‘Arabic’ numerals, and I only learned recently that they originated in India. The 7th Century Indian mathematician, Brahmagupta, formulated the first known mathematical concepts that treated zero as a number as well as a place holder (to paraphrase Willers).

Zero and negative numbers were treated with suspicion by the ancient Greeks and Romans, as they preferred geometrical over arithmetical analysis. Because there were no negative areas or negative volumes, the idea of a negative number was considered ‘absurd’. (I have to admit I had the same problem with ‘imaginary’ numbers, when I first encountered them, but I’m getting off the track, and I’ll return to imaginary numbers later.) Likewise, there was no place for a number that represented nothing, but once one introduces negative numbers, zero becomes inevitable, because a negative plus a positive of the same amount must give zero. But zero as a place holder is even more important, because it facilitates all arithmetical computations. As Willers says, imagine trying to do basic arithmetic with Roman numerals, let alone anything esoteric.

Willers quotes Pierre-Simon Laplace (1749-1827): “It is India that gave us the ingenious method of expressing all numbers by means of ten symbols, each symbol receiving a value of position as well as an absolute value; a profound and important idea which appears so simple to us now that we ignore its true merit.”

Willers is the first author I’ve read who makes a genuine attempt to give the Indians and the Persians their due credit for our mathematical heritage. Like the Chinese, the Indians discovered Pythagoras’s theorem before the Pythagoreans, though many people believe Pythagoras actually learnt it from the Babylonians. The Indians also investigated the square root of 2, as well as π (pi), around the same time as the ancient Greeks. In the middle ages, a succession of Indian scholars worked on quadratic equations.

But it is Brahmagupta (598-670), who lived in northwestern India (now Pakistan), to whom Willers devotes one of his 2-page treatises, because he argues that Brahmagupta had the biggest influence on Western mathematics. He lists Brahmagupta’s 14 laws, all dealing with the arithmetic ‘rules’ applicable to zero and negative numbers that, in modern times, we all learn in our childhood.

Willers also gives special attention to two Islamic mathematicians, Al-Khwarizmi (born around 780) and Omar Khayyam (1048-1131). Al-Khwarizmi came from Khwarezm (present-day Uzbekistan) and worked in the ‘House of Wisdom’ (see below). He gave us two of our most common mathematical terms: algebra and algorithm. Algebra came from the title of a book he wrote, Hisab Al-jabr w’Al-Muqabala, derived from the word, ‘Al-jabr’. Significantly, he developed methods for deriving the roots of quadratic equations.
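To give a flavour of what Al-Khwarizmi was doing: his most famous worked example was ‘a square and ten roots equal thirty-nine’, i.e. x² + 10x = 39, which he solved rhetorically and geometrically by completing the square, admitting only positive solutions. A minimal modern sketch using the quadratic formula (my own illustration of the problem, not his method):

```python
import math

def quadratic_roots(a, b, c):
    """Real roots of a*x**2 + b*x + c = 0, via the modern quadratic formula."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return ()  # no real roots
    r = math.sqrt(disc)
    return ((-b - r) / (2 * a), (-b + r) / (2 * a))

# Al-Khwarizmi's example: x^2 + 10x = 39, i.e. x^2 + 10x - 39 = 0
print(quadratic_roots(1, 10, -39))  # (-13.0, 3.0); he would have accepted only x = 3
```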

The word, algorithm, also comes from the title of a book, Algoritmi de Numero Indorum, which is a Latin translation of one of Al-Khwarizmi’s Arabic texts, now lost. And, according to Willers, algorithm ‘means a number of steps or instructions to be followed.’ Of course, this word is now associated with computer programmes (software). This modern incarnation began with Alan Turing’s famous thought experiment, the ‘Universal Turing machine’, the first iconic example of the modern sense of the word: an algorithm as literally a set of instructions, otherwise known as ‘code’. All modern computers are universal Turing machines, by the way, so it’s much more than a thought experiment now, and algorithms are the code, or software, that drives them.

Omar Khayyam is probably better known as a poet, from his authorship of The Rubaiyat, a collection of 600 quatrains (4 line poems). But he also authored a number of books on mathematics, including Treatise on Demonstration of Problems of Algebra (1070), in which he solves cubic equations geometrically, using the intersections of conic sections. (A conic section is the curve described on a plane that cuts a cone: depending on the angle of the plane to the cone, one gets a circle, an ellipse, a parabola, or, using a double cone, the two mirror branches of a hyperbola.) In another text, which Khayyam references but which has since been lost, he writes about Pascal’s triangle, though, obviously, he called it something else.

Omar Khayyam provides the best quote in Willers’ book, taken from the above text:

The majority of people who imitate philosophers confuse the true with the false, and they do nothing but deceive and pretend knowledge, and they do not use what they know of the sciences except for base and material purposes.

As Willers points out, Plato’s academy closed in 529 and Fibonacci came on the scene in Pisa at the end of the 12th Century, and it wasn’t until the renaissance that Western science, art and philosophy really gained the ascendancy again. The interim period is known colloquially as the ‘dark ages’, because knowledge and scientific progress seemed to stagnate. As Willers says: “From that point [the closure of Plato’s Academy] until the thirteenth century the mathematical centre of the world was in the East.”

According to Willers, The House of Wisdom was established in Baghdad by Harun Al-Rashid (763-809) and translated works from Persia, Greece and India. It was a centre for education in humanities and sciences until it was destroyed by the Mongols in 1258. Without this Islamic connection over that period, the Greek and Roman knowledge in the sciences, philosophy, mathematics and literature, which, today, we consider to be our Western heritage, may have been lost.

As well as providing this historical context, in more detail than I can render here, and that most of us don’t even know about, Willers gives us excellent exposition on a number of topics: permutations and combinations, probability theory, logarithms, trigonometry, quadratics, complex algebra, the binomial theorem, and others.

He treats all these exemplarily, but I would like to say something about complex arithmetic and imaginary numbers, because it was a personal stumbling block for me, and, in hindsight, it shouldn’t have been. The set of imaginary numbers contains only one number, i, which is the square root of -1 (some texts say the set contains 2 numbers, i and –i, but, being pedantic, I beg to differ). Now all through my childhood, the square root of -1 was considered an impossibility like dividing by zero. So when someone finally came up with i, I believed I’d been conned – it was a convenience, invented to overcome a conundrum, and, from my perspective, it should have remained an impossibility. Part of the problem, as Willers points out, is that it’s called an ‘imaginary’ number, when it’s just as real as any other number, and I think that’s a very good point.

When one thinks that the Pythagoreans had serious problems accepting irrational numbers, and the Greek and Roman mathematicians who followed them had conceptual issues with zero and negative numbers, the concept of i is no different. It’s a number and it opens up an entirely new world in mathematics that includes fractals, the famous Mandelbrot set and quantum mechanics. If one doesn’t explain complex numbers using the complex plane (or Argand diagram) then it won’t make sense, but if one does, everything falls into place. In particular, multiplying by i rotates any point on the plane through 90 degrees (a right angle), and multiplying by i² (that is, -1) rotates it through 180 degrees. On an ordinary number line, with positive numbers running right and negative numbers running left, multiplying a positive number by -1 rotates the number through the origin (0) by 180 degrees to its negative equivalent. If you have an i axis running vertically through 0, then multiplying a number by i just rotates it by 90 degrees (half way). If you draw the graph it makes perfect sense.
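Python’s built-in complex type makes the rotation easy to verify for oneself (1j is Python’s notation for i):

```python
# Multiplying by i rotates a point on the complex plane by 90 degrees;
# multiplying by i twice (i.e. by -1) rotates it by 180 degrees.
z = 3 + 2j           # the point (3, 2) on the complex plane
print(z * 1j)        # (-2+3j): rotated 90 degrees anticlockwise
print(z * 1j * 1j)   # (-3-2j): rotated 180 degrees, the same as multiplying by -1
print(1j * 1j)       # (-1+0j): i times i really is -1
```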

Imaginary numbers, like multiple dimensions, demonstrate that the mathematical world can go places that the physical world doesn’t necessarily follow, yet these esoteric mathematical entities can have applications in the real world that we don’t anticipate at the time of their discovery. Riemann’s geometry giving us Einstein’s General Theory of Relativity and imaginary numbers giving us the key to quantum mechanics are two cases in point, both barely a century ago.

I’m one of those who sees mathematics as an abstract territory that only an intelligent species can navigate. Personally, I would like to think that we are not the only ones in the universe who can, and maybe there is at least one other species somewhere who can navigate even further than we can. It’s a sobering yet tantalising thought.


Addendum: I've since written an exposition on imaginary numbers and the complex plane, for those who are interested.


Monday, 7 December 2009

Tim Flannery’s The Weather Makers

This is a timely post, considering the Copenhagen Summit on Climate Change is starting tonight (my time). I’ve just read Tim Flannery’s The Weather Makers, published in 2005. Tim Flannery is largely unknown outside of Australia, but he was awarded the Australian of the Year title in 2007. Considering that our Prime Minister of the day, John Howard, was a self-confessed climate-change sceptic, that’s quite an achievement. He was also awarded Australian Humanist of the Year in 2005. I have to admit that I didn’t even know the award existed until I read about it on the back flyleaf of his book.

Tim Flannery is a scientist, and the scope and erudition of his book reflects that. Bill Bryson’s endorsement on the cover says it all and is no exaggeration: “It would be hard to imagine a better or more important book.” But reading Flannery’s book, I see the problem that the scientific community is faced with, when it comes to communicating a message. This book is largely aimed at people like myself, who read scientific magazines like New Scientist and Scientific American, and who like to immerse themselves in the scientific challenges of the day, but vicariously, without having to do the research or know all the esotericism of the subject. But most people, and this includes politicians, really aren’t that interested, despite the fact that writers as good as Flannery can engage readers outside of academia. What most people want, politicians included - some may say, politicians especially - is a neat one-liner that summarises the entire subject into a sound-bite. Of course, as soon as you give them this, all the data and all the arguments and all the research are left behind, and then every armchair-critic in the world can challenge its veracity.

Flannery faces this dilemma himself, because I’ve seen him defend his position in the media when he’s been misquoted or misrepresented for his honesty. Evolutionary biologists face exactly the same problem when they have to defend their honesty: that we don’t know all the answers to all of nature’s mysteries. But I get angry when politicians really believe that they know more than the scientists, or look for scientists on the fringe who will support their position. We’ve seen this with the tobacco industry, the cosmetics industry, the in-vitro fertilisation industry, and, now, with climate-change, the fossil-fuel industry.

Science is different to any other discipline or endeavour. It’s highly dependent on data, research and the work of diverse groups over long periods of time. It suffers from its own rigour for reporting truth. Scientists need to be conservative when extrapolating or speculating into the future, which, in the case of climate-change, is an imperative. This leaves them open to challenges by anyone who is a doubter or believes their immediate interests are in jeopardy. In the case of climate-change, this includes the entire Western and Developing world. Economically, entire nation states are in jeopardy, but so is the very planet if climate-change is a reality. Risk is never avoided or managed by doing nothing. In many cases - the recent economic crisis being a case in point - doing nothing is the greatest risk of all. Unless people take that into account, they are not practising risk-management, they are practising ignorance and denial.

Flannery’s book covers all the bases. He covers the entire living history of the planet throughout all the geological ages, which puts our current, most recent age in perspective. He explains how evidence from ice cores, fossils and other geological and biological sources from all over the world comprehensively builds a picture that is compelling and believable. Flannery provides the science behind Al Gore’s film, An Inconvenient Truth, and has the advantage, as a book, of being able to expound in detail all of his arguments, providing sources and revealing the evidence that has been accumulating over decades. One of the book’s strengths is that Flannery demonstrates how climate-science is not a new invention arising from a perceived threat, but goes back at least half a century to Milutin Milankovich’s Canon of Insolation and the Ice-Age Problem, published in 1941, describing, for the first time, the relationship between the ice ages and the Earth’s inherent precession on its axis (wobble). The apparent relationship between sunspot activity and the Earth’s temperature goes back centuries. Flannery also explains the relationship between climate change and the world’s great extinction events, including the near-loss of our own species “around 100,000 years ago when humans were as rare as gorillas are today.” Few people know that we nearly didn’t make it to the end of our current evolutionary branch – a very sobering thought indeed.

Flannery’s expertise is in zoology, and it’s his detailed exposition on the impact of climate-change on ecosystems, especially in both polar regions (where the impacts are different yet equally catastrophic) and in coral reefs, that I found most compelling and most depressing. Compelling because it’s already evident and depressing because the bulk of humanity is both unaware and uncaring. Yet it will be truly disastrous to the planet if entire food chains disappear in this century, and that’s the alarm bell that Flannery is ringing. At the very least, biodiversity will be decimated and the long term consequences to us are unknown. The fact that we’ve gone from 1 billion to 6 billion in the last century doesn’t bode well for this century, nor the long term health of the planet.

He spends an entire chapter explaining how the extinction rates of frog and toad species in all parts of the world are probably the most accurate harbingers of climate change, and how this has been happening, and been recorded, since the 1980s.

But of all the arguments and evidence that Flannery presents, it’s the ‘time-gates’ of 1976 and 1998 that leave one in no doubt that climate-change is already occurring and we are fools to ignore it. By time-gates he’s referring to events that have become permanent and will not switch back to previous norms. In other words the norms for global climate have already changed. Obviously, it’s future time-gates that we are now attempting to avoid. All of our policies should be based on working backwards from predicted time-gates, and this is the hardest argument to sell. But if more people (politicians in particular) recognised the time-gates we’ve already passed through in the last 3 decades, one would expect the argument to be a very soft sell indeed.

The 1976 ‘climate gate’ relates to the well-known El-Nino effect, and data collected in the central Pacific.

“Between 1945 and 1955 the temperature of the surface of the tropical Pacific commonly dipped below 19.2C, but after the magic gate opened in 1976 it has rarely been below 25.3C.”

The El-Nino La-Nina cycles have since become longer: “one would expect such long cycles only once in several thousand years”.

“The 1998 magic gate is also tied up with the El-Nino La-Nina cycle, a two to eight-year-long cycle that brings extreme climate events to much of the world.”

“The 1997-98 El-Nino year has been immortalised by the World Wide Fund for Nature (now the WWF) as ‘the year the world caught fire’.”

In Australia, we have witnessed the effects of this first-hand. As Flannery once wrote in New Scientist (approximately 2 years ago) Australia is witnessing climate-change in advance of the rest of the world. Despite this, our conservative opposition party is literally split down the middle between climate-change-deniers and climate-change-proponents. As recently as last week, this split resulted in a leadership change with the sceptics now in the ascendant.

But, according to his book, it’s Africa that has possibly suffered most from climate change to date, especially what he calls ‘the Sahelian catastrophe’ in the Darfur region of Western Sudan. And this goes back 4 decades to the 1960s, when Western governments and Western media believed it was all a problem of the local inhabitants’ own making. Flannery argues that, with hindsight and climatology research, it’s Western-induced greenhouse gases that have created the Sahelian catastrophe, starting as early as the 1960s.

“The Sahelian climate shift is emblematic of the situation faced by the world as a whole, for in it we see the West focusing on religion and politics as the problem, rather than the well-documented and evident environmental catastrophe that is its ultimate cause.”

Computer modelling for the future has created the most controversy, as I alluded to in my introduction, but one factor that few climatologists disagree on is that there is a residual effect of 5 decades from CO2. In other words, the full effects of the current level of CO2 in the atmosphere will not be experienced until 2050. This is why climatologists are arguing for immediate action. No one expects us to cut our emissions to zero, yet we can’t remove what we’ve already put in, and we have to wait another 2 generations before the full effects of current levels are known in reality. It’s even more serious when one realises that: “half of the energy generated since the Industrial Revolution has been consumed in just the last twenty years.” Business as usual is not a morally responsible option. I don’t expect corporations to be morally responsible, because they’re not – one only has to look at the way they behave in third world countries – but I expect governments to be.

I imagine a lot of people would avoid this book because it makes depressing reading, but, for a start, it should be compulsory reading for all politicians. Flannery discusses 3 possible tipping points, all of which have occurred in the past: the shutting down of the Gulf Stream, the collapse of the Amazon rain forest and the release of methane from the ocean floor, which has been implicated in the greatest mass extinction ever, around 250 million years ago, when an estimated 90% of the planet’s species (that’s species, not individual plants and animals) became extinct. Of the three, the last is the most unlikely, and the other two would take the rest of this century to become fully evident, yet, once started, possibly in half that time, they could not be reversed.

But it would take less extreme events to create shortages of food, water and energy, which are already being predicted. Flannery discusses this and the logical outcome is genocidal warfare, because that’s what humans do. This is the scenario that we should all be trying to avoid, yet we don’t even contemplate it, let alone imagine the consequences. It’s human nature to be optimistic and ignore worst-case-scenarios, but we’ve all seen the results of this thinking (in America alone in recent years) with Hurricane Katrina and the subprime mortgage debacle. Unless we consider worst-case-scenarios they will overtake us and cause calamity. In the case of climate-change, this will occur on an unprecedented global scale in human-recorded history.

In the last 2 sections of the book, Flannery talks about solutions, both current and future. He starts with a discussion of the Kyoto protocol; to date, a complete failure compared to the Montreal protocol for the banning of CFCs that saved us from ozone depletion. He concentrates on Australia, partly because it’s his home and partly because, like the US, it refused to ratify Kyoto and produced spurious arguments to defend its position.

However: “…documents came to light under Australia’s Freedom of Information Act revealing how it [MEGABARE, Australia’s economic model for negotiation] had been funded, to the tune of $400,000, by the Australian Aluminium Council, Rio Tinto, Mobil and other like-minded groups, all of whom had received a seat on the study’s steering committee.” In other words, our position had been determined by representatives in the energy industry rather than climate scientists, even though CSIRO (Australia’s esteemed scientific research establishment) had done considerable research in this area, especially considering Australia’s extensive history of droughts, fires and floods.

Not surprisingly, however, Flannery saves his most scathing criticism for the United States, in particular, the role of the second Bush administration and the energy industries:

“The fact that in the 1970s the US was a world leader and innovator in energy conservation, photovoltaics and wind technology, yet today is a simple follower is testimony to their success [the energy industries]. It is impossible to overestimate the role these industries have played over the last two decades in preventing the world from taking serious action to combat climate change.”

Flannery meticulously documents the role of coal companies, in particular, both in America and Australia, in fighting and funding propaganda warfare against climate-change policy. In both countries, members of the industry were given prominent positions in energy sector reviews, effectively censoring genuine scientific debate. He also cites the ‘Global Climate Coalition’, whose stated purpose was to ‘cast doubt on the theory of global warming’. After 11 years of lobbying, it finally broke up in 2000, after major players like DuPont and BP realised that they were on the wrong side of the debate and left in 1997, prompting others to follow.

In Australia, a conservative politician recently stated publicly that it was all a ‘hoax’, tacitly referring to a well-circulated conspiracy theory, very popular with climate-change-deniers in this country, that academics, the world over, have created climate-change, or exaggerated its potential impact, for no other reason than to maintain their funding and their careers. This is the most cynical of arguments, but it has a lot of currency amongst the most ignorant and intransigent of my country’s politicians.

On the other hand, Flannery cites the UK as a leader in climate-change reform, going back to the Thatcher years, thanks largely to the lobbying and influence of James Lovelock.

Flannery is critical of carbon geosequestration, seeing it as a waste of public money to allow the coal industry to continue for another 50 years, when the money could be better spent on other alternatives. Most experts agree that coal is the biggest contributor to climate change, yet many countries, including Australia and China, are committed to its continued use for economic reasons.

Flannery discusses all the alternatives, including hydrogen cells and nuclear power but plumps for wind and solar, even arguing a case for self-sufficiency independent of the grid. He also believes that geothermal has been under-explored, especially in Australia, where he contends it could provide all our needs for the next 75 years, carbon free.

Flannery leaves the reader in no doubt that climate-change is already happening. The sceptics argue, considering the extreme climate variations in the geological past, that the real question is whether the current climate change is human-induced or natural. But the correlation between the industrial revolution and consequential global changes in the past century, especially with the 2 significant ‘climate-gate’ changes in the last 3 decades, is compelling evidence.

But if there is any lingering doubt, Flannery added the following postscript to his book:

“As this book was going to press the journal Science published proof positive of global warming. A study by James Hansen and colleagues revealed that Earth is now absorbing more energy, an extra 0.85 watts per square metre, than it’s radiating to space.”

As for the sceptics, it’s an over-eager optimism combined with a reluctance to face a global economic challenge that motivates their opposition. It’s not a coincidence that it’s the political conservatives, in all nations, who are questioning the science. It’s the conservatives who want to maintain the status quo, who believe that change is inherently unwise, yet fail to appreciate that we could well create change on a biblical scale, in this very century, just by doing nothing at all.

Flannery has filled his book with quotations from people as diverse as James Lovelock, William Shakespeare and indigenous people like Aboriginal Elder, Big Bill Neidjie, Gagadju Man. But I thought the best and most relevant quote was from Alfred Russel Wallace, who concurrently discovered the law of natural selection (yes, it’s a law, not a theory) with Charles Darwin.

It is among those nations that claim to be the most civilised, those that profess to be guided by a knowledge of laws of nature, those that most glory in the advance of science, that we find the greatest apathy, the greatest recklessness, in continually rendering impure this all-important necessity of life… (from Man’s Place in the Universe, 1903).

Friday, 20 November 2009

Science, Philosophy, Religion

In a not-so-recent discussion I had on Stephen Law’s blog, I had trouble convincing some of the participants that, not only is there a difference between science and philosophy, but the distinction is an important one.

In a comment on my last post, Timmo made a reference to Richard Feynman’s book, The Character of Physical Law, which got me re-reading it. You may wonder how these 2 issues are related. Well, in my last post I discussed some of Erwin Schrodinger’s philosophy, and Feynman’s aforementioned book is probably his most philosophical. Together, they highlight the fact that Feynman’s philosophical musings probably couldn’t be more different from Schrodinger’s, yet I doubt that they would disagree on the science. The same is true of contemporary physicists. For example, Roger Penrose and Stephen Hawking, even though they have collaborated scientifically and even won a joint prize in physics, are philosophically miles apart on the nature of mind. In his book, Shadows of the Mind, Penrose actually invited Hawking to provide a counter-philosophical point of view, which, of course, he did. Likewise, Albert Einstein and Kurt Godel were very good friends, when they were both fellows at the Institute for Advanced Study in Princeton, but held philosophically divergent views: Godel was a mathematical Platonist and Einstein was not; yet I’m sure they didn’t disagree on the mathematics of each other’s theories.

As a general rule, philosophy deals with questions, the answers for which are not certain, and in many cases, may never be; whereas science deals with questions, where the answers will decide the ultimate truth, and the limits of truth, for a particular theory. Bertrand Russell made the observation that, in philosophy, there may be no right or wrong answers, but the questions, when addressed in the right spirit, are the bulwark against dogmatism and the conservative resistance we find to genuine questing for knowledge. A corollary to this approach is to beware of those who claim they have answers of certainty to questions of profundity.

You may wonder where religion fits into all this. Well, religion is philosophy taken to the metaphysical extreme, but it is often confounded with politics, to the extent that some people don’t delineate one from the other. In fact, religion is often confounded with ideology, because, for many people, religion and ideology are unassailable truths. But truth is arguably the most elusive concept in the human world, and in this context the word is abused.

I have 2 ways of defining science. Firstly, a general definition is that science is the study of the natural world in all its manifestations. So this leaves out many aspects of knowledge that are human-based, or what is generically called the humanities: all the arts, and topics like ethics and justice. Arguably, psychology crosses the boundary, and I discussed this briefly in another post, Is psychology a science? (Nov. 08). But the topic of ‘mind’, that was raised by Schrodinger, certainly falls into a category where science, psychology and philosophy all merge, but I don’t want to get too far off the track, so I will return to ‘mind’ later. Interestingly, philosophy is generally considered a humanities subject.

The other definition, which is effectively a working definition, is that science is a dialectic between theory and experimentation or observation. Questions that can’t be answered by experimental analysis generally remain philosophical until they can. An example is AI (artificial intelligence). Will AI ever be sentient? Provided we can agree on a definition of sentience, this question will probably one day be resolved. Until that day, it will remain a philosophical question. But there are other philosophical questions that may never be decided by science. An example is the so-called multiverse (multiple universes) theory. If other universes exist, we may never find any evidence of them, though one should be careful of never saying never. A metaphysical question like ‘does the universe have a purpose?’ (see my post on this topic, Oct. 07) is an example of a subtly different nature. This is a question that science can’t answer, although almost anyone who gives an answer, one way or the other, uses their scientific knowledge to support it. And this is why the distinction is important. Using science to support a philosophical point of view doesn’t turn philosophy into science, though many people, when lost in their own rhetoric, may infer that it does, whether intentionally or not.

On the subject of the dialectic in science, Feynman, in his book, The Character of Physical Law, gives excellent examples, whilst discussing the evolution of the Universal Theory of Gravitation: specifically, how astronomical observations forced changes to theory and then confirmed theory. In other words, without experimentation and observation, we would have just continued to bark up the wrong tree.

His opening chapter, The Law of Gravitation, an Example of Physical Law, provides one of the best expositions of this dialectic, including descriptions of the experiments that Galileo performed to show gravity’s universality on Earth, and how Tycho Brahe’s unprecedented accuracy in tracking planetary motion gave Johannes Kepler the key to his 3 laws, which ultimately led Newton to the Universal Theory of Gravity we have today. Yes, it’s been modified by Einstein, as Feynman explains, but Newton was able to marry Kepler’s laws to his calculus, which not only clinched the theory but eventually led to the prediction of a then-unseen planet (Neptune) from the perturbations in Uranus’s orbit. The ultimate test of a theory is when it predicts hitherto unobserved events.
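The link between Newton’s inverse square law and Kepler’s third law can be checked numerically. The following is a minimal sketch of my own (not taken from Feynman’s book), using arbitrary units with GM set to 1: for a circular orbit under inverse-square gravity, the ratio T²/r³ comes out the same for every orbit, which is exactly Kepler’s third law.

```python
import math

# Illustrative units only: set GM = 1 (not real solar-system values).
GM = 1.0

def orbital_period(r):
    """Period of a circular orbit of radius r under inverse-square gravity.
    Centripetal balance: v^2/r = GM/r^2, so v = sqrt(GM/r)."""
    v = math.sqrt(GM / r)        # orbital speed
    return 2 * math.pi * r / v   # period T = circumference / speed

# Kepler's third law: T^2 / r^3 is the same constant (4*pi^2/GM) for every orbit.
ratios = [orbital_period(r) ** 2 / r ** 3 for r in (1.0, 2.0, 5.0, 10.0)]
print(ratios)  # all approximately 4*pi^2
```

Brahe’s data gave Kepler the ratio empirically; Newton’s law explains why it is constant.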

String Theory is an example of a theory without the dialectic, so we have innumerable variants, none of which can be validated by reality. String Theory is not exactly philosophy either – it’s a mathematical adventure. I would describe it as a mathematical model looking for an experiment to make it a scientifically valid theory. I’m not an expert on the subject, but I provide a review of Peter Woit’s book, Not Even Wrong, in a post I wrote last year (Nature’s Layers of Reality, May 09).

And this leads to the significance of mathematics. No one who discusses physics and philosophy can avoid discussing the role of mathematics, and this includes Feynman. In the edition of Feynman’s book that I have (1992), Paul Davies has written an Introduction. He not only acknowledges Feynman’s influence, unorthodoxy and brilliance as a communicator, but relates a dialogue he once had with him on the philosophy of mathematics.

“…Feynman had an abiding suspicion of philosophers. I once had occasion to tackle him about the nature of mathematics… whether abstract mathematical laws could be considered to enjoy an independent Platonic existence. He gave a spirited and skilful description of why this indeed appears so but soon backed off when I pressed him to take a specific philosophical position. He was similarly wary when I attempted to draw him out on the subject of reductionism.”

Feynman devotes an entire chapter (lecture) to the topic, The Relation of Mathematics to Physics, describing it as a language with reasoning, and sees it as an intellectual construct based on axioms. He doesn’t address Godel’s Incompleteness Theorem, because it’s not strictly relevant to his topic: mathematics in physics. He refers to Newton’s calculus as an ‘invention’, whereas Platonists would call it a ‘discovery’.

But more relevant to this discussion is that he describes 3 different ways of looking at the Universal Theory of Gravity, even though they are all mathematically equivalent. One is ‘action at a distance’, or force mediated by the inverse square law; two is a ‘potential field’; and three is the ‘least action’ principle, which is Feynman’s personal favourite, and which I discuss in 2 other posts (Nature’s Layers of Reality, May 09 and The Laws of Nature, Mar. 08). The point is that these are philosophical interpretations that would determine how a scientist may investigate a phenomenon further. Feynman prefers the ‘least action’ principle because it applies to the refraction of light as well, and therefore suggests a universal principle.
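The refraction case Feynman has in mind is Fermat’s least-time principle. Here is a hedged sketch of my own (the geometry and speeds are illustrative assumptions): light crosses from one medium to another at whatever boundary point minimises total travel time, and the minimising path automatically satisfies Snell’s law.

```python
import math

# Illustrative setup: light travels from A = (0, 1) in medium 1 (speed v1)
# to B = (1, -1) in medium 2 (speed v2), crossing the boundary y = 0 at (x, 0).
v1, v2 = 1.0, 0.7

def travel_time(x):
    # time = distance/speed in each medium
    return math.hypot(x, 1.0) / v1 + math.hypot(1.0 - x, 1.0) / v2

# 'Least action' (here, least time): brute-force scan for the fastest crossing point.
xs = [i / 100000 for i in range(100001)]
x_best = min(xs, key=travel_time)

# The least-time path obeys Snell's law: sin(theta1)/sin(theta2) = v1/v2.
sin1 = x_best / math.hypot(x_best, 1.0)
sin2 = (1.0 - x_best) / math.hypot(1.0 - x_best, 1.0)
print(sin1 / sin2, v1 / v2)  # the two ratios agree closely
```

The same minimisation principle, applied to a different quantity (action rather than time), reproduces Newtonian mechanics, which is why Feynman saw it as pointing to something universal.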

So there is philosophy within science as well as philosophy outside of science, and, once again, I think the distinction is important. Philosophy within science is more likely to be eventually resolved because it generally leads to new avenues of investigation. Feynman says of this: “…every theoretical physicist who is any good knows six or seven different theoretical representations of exactly the same physics.” By ‘exactly the same physics’ he means the mathematics is equivalent (this will become more evident when I discuss quantum mechanics). In other words, it contributes to the dialectic between theory and empirical evidence. Philosophy outside of science is generally removed from the dialectic, which is why it remains philosophy and not science. Philosophy within science remains philosophy until it can evolve into theory. In quantum mechanics (as I discuss below) theory is effectively deadlocked and has been for many decades. At least, that is the impression I get from what I’ve read on the subject by people who know it.

As an aside, the abovementioned quote was once construed by a philosophical writer (Michael Frayn in The Human Touch) as evidence that theoretical physicists effectively make things up because "nature doesn’t have six or seven different ways to represent itself, or even one." But it’s obvious to me that, even though Feynman referred to theories as ‘guesses’ in his usual cavalier manner, he didn’t doubt the validity of nature’s laws. In the cases he’s referring to, the mathematics is solid, but the philosophical interpretations are not (I elaborate on this below).

Elsewhere in the book, Feynman alludes to a view that we will eventually understand all the laws of physics. This is a philosophical position and one I’ve argued against in the past. My reason is history: we never know what we are going to discover, and every resolution of a mystery in science has only revealed more mysteries. I find it hard to imagine that this will ever stop, but I also admit that I don’t want it to stop. Feynman, on the other hand, argues that we will eventually run out of new laws to find, either because of the limit of our ability to reveal them or the limit of their actual existence. He believes that the 20th Century was a golden age of discovery in physics, and no one can deny that. But each age has uncovered new intellectual territory, and nature appears far from revealing all its secrets.

On a related note, I quote Feynman in my post, Nature’s Layers of Reality, (cited by Peter Woit, Not Even Wrong) where he is scathing about String Theory. I’m not in a position to judge String Theory, but I don’t think it’s the scientific Holy Grail as some commentators do, and it does reveal how much we still don’t know. String Theory is an example of where people hope to find a ‘Theory of Everything’. It’s one of the reasons I’m a sceptic, but I could be proven wrong.

In previous posts (specifically Quantum Mechanical Philosophy, Jul. 09) I describe how the philosophical implications of quantum mechanics are not resolved, yet as a meta-theory it is arguably the most empirically successful ever. Paul Davies makes exactly the same point in The Goldilocks Enigma. Quantum mechanics demonstrates, more strikingly than any other endeavour, the fundamental differences that lie between science and philosophy. Philosophically, there is the Copenhagen interpretation (Niels Bohr), the Many Worlds interpretation (Hugh Everett) and the Hidden Variables interpretation (David Bohm). And there are variations amongst these, which I discuss to some extent in the aforementioned post. These are not just different theories; they all have philosophical implications for how we perceive reality. Epistemologically, it can’t get more serious than that.

The Copenhagen interpretation is generally considered to be the conventional interpretation, but as Feynman says in his book: “…I think I can safely say that nobody understands quantum mechanics”. What he means is that no one can explain quantum phenomena in plain language without creating cognitive or logical contradictions. Schrodinger created a thought experiment, popularly known as Schrodinger’s Cat, that encapsulates this conundrum perfectly, where, theoretically, a cat can be dead and alive at the same time. Ironically, Schrodinger also created (he would say discovered) the mathematical equations that have made quantum mechanics the most successful theory ever.
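The contrast between the verbal conundrum and the tidy mathematics can be made concrete. A minimal sketch of my own (not from any of the books discussed): treat the cat as a two-level quantum system with equal amplitudes for ‘dead’ and ‘alive’. The mathematics is perfectly consistent; only the plain-language description of the state sounds contradictory.

```python
import math

# Equal (real) amplitudes for the two basis states 'dead' and 'alive'.
amp_dead = 1 / math.sqrt(2)
amp_alive = 1 / math.sqrt(2)

# The state is properly normalised: |amp_dead|^2 + |amp_alive|^2 = 1.
norm = abs(amp_dead) ** 2 + abs(amp_alive) ** 2
print(norm)  # 1.0, up to floating-point error

# The Born rule gives well-defined measurement probabilities...
p_dead = abs(amp_dead) ** 2
p_alive = abs(amp_alive) ** 2
print(p_dead, p_alive)  # both approximately 0.5
# ...so mathematically there is no contradiction; the 'dead and alive'
# paradox only appears when we translate the superposition into words.
```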

Mathematically, there are no contradictions or conundrums – Schrodinger’s wave mechanical equations and their derivatives, especially the famous Dirac equation, have not only confirmed existing observed phenomena but predicted new ones. Dirac’s equation not only produced quantum electron ‘spin’ as an inherent feature of the mathematics, but predicted the electron’s anti-particle (the positron) and therefore anti-matter. As Feynman says, the best theories, by far, are those where we get more out than we put in. More relevant to this discussion, quantum mechanics demonstrates explicitly that science deals in answers and philosophy deals in questions, and sometimes one is not resolved by the other as we might expect.

And now I must come to ‘mind’ because it’s the one topic that really does cross boundaries (including religion). Feynman doesn’t discuss it, because it’s not relevant to his lectures on physics, but Schrodinger did (see previous post), and so does Penrose, who has written 3 books on the subject that I have read. I haven’t read Daniel Dennett’s Consciousness Explained but I’ve read John Searle’s Mind, and it’s the most accessible I’ve found on the subject thus far. I’ve discussed this in previous posts (Subjectivity: The Mind’s I, June 09) and of course in my last post on Schrodinger. I think Schrodinger makes a couple of salient points, which I’ve alluded to previously. In particular, that there is a subjective aspect to consciousness that makes it ontological as well as epistemological. Searle makes this point as well, in his aforementioned book, as does the Dalai Lama in his book, The Universe in a Single Atom.

Schrodinger, in particular, explains how phenomena like light and sound can be measured and analysed by instruments, and we can even analyse how they are transcribed into nerve impulses in our bodies, but all the instruments and analysis in the world can’t describe or explain the actual experience we have of light and sound. This is a contentious point, but people forget that this is what consciousness is, first and foremost: an experience. And if each and every one of us didn’t have this experience, science would no doubt tell us that it doesn’t exist, in the same way that science tells us that free will doesn’t exist. It is still the greatest enigma in the universe, and is likely to remain that way, possibly for ever.

And this leads to Schrodinger’s second salient point: without ‘mind’ the universe would be meaningless. In an earlier post (The Existential God, Sep. 09) I reviewed Don Cupitt’s book, Above Us Only Sky; Cupitt goes further and says that without language there would be no meaning and no ‘truth’. I won’t revisit Cupitt, but one should not confuse meaning with reality, nor ontology with epistemology. To quote Einstein: “The most incomprehensible thing about the universe is that it’s comprehensible.” There are various ways one can interpret that statement, but mine is: the greatest mystery of the universe is that it created the ability to understand itself. Paul Davies takes this head-on in The Goldilocks Enigma and elaborates on a philosophical premise proposed by John Wheeler, who effectively argued that the universe exists as the result of a cosmological-scale quantum loop: because we observe it, it exists. I’m not going to argue one way or the other with Wheeler, but I agree with Schrodinger that without ‘mind’ there is no point to the universe’s existence, and Davies makes a similar point. At the end of The Goldilocks Enigma he summarises all the philosophical viewpoints that are in currency (including ID, the multiverse and the ‘absurd universe’, probably better known as the accidental universe), ending with Wheeler’s, which he calls the self-explaining universe. To quote: “I have suggested that only self-consistent loops capable of understanding themselves can create themselves, so that only universes with (at least the potential for) life and mind really exist.”

In a way I’ve returned to a point I alluded to much earlier: does the universe have a purpose? This is a philosophical question, as I said, but it leads into religion and religious belief. Paul Davies obviously believes it does, and says so, but he’s quick to point out that this does not axiomatically lead to a belief in God. Feynman, whom I’m almost certain was an atheist, makes only one reference to God in his book, when he discusses the hierarchical nature of nature. He explains how the laws of physics can have consequences at a higher level that are unforeseeable yet totally necessary for the universe’s existence as we know it. The example he gives is Hoyle’s and Salpeter’s prediction concerning carbon 12: the unlikely fusion of 3 helium nuclei depends on a specific energy level in the carbon 12 nucleus, which in turn allows the rest of the elements in the periodic table to exist. Feynman doesn’t make anything metaphysical of this, but he makes the point that nature’s laws at one level have consequences at a higher level of existence that are not readily apparent.

He invokes God (metaphorically, as he’s quick to point out) as either the progenitor of the laws or the ultimate end result; at opposite ends of reality. In an uncharacteristically poetic moment, in another part of the book, he says: “Nature uses only the longest threads to weave her patterns, so each small piece of her fabric reveals the organization of the entire tapestry.” He’s indirectly invoking the implication in the title of the Dalai Lama’s book on science and religion The Universe in a Single Atom. The laws of nature are the threads and the tapestry is the universe in all its complexity.

There are no objective religious truths, contrary to what fundamentalists tell us, but there are mathematical truths. And the more we learn about the universe, the more mathematics plays a role. Every book I’ve read on nature’s laws illustrates this fundamental premise. Feynman, Einstein and Hawking would suggest that the mathematics is human reason, but others, like Penrose, Schrodinger and Godel, would argue that mathematics is independent of human thought, although we only know it through human thought. Pythagoras and Plato might have argued that God exists in the mathematics, and Schrodinger might have argued that God is the ultimate unity of mind (refer my last post). Like Feynman’s metaphorical attribution, they represent opposite ends of reality. At the end of the day, God becomes a metaphor and a projection for what we don’t know, whichever end of reality we posit that projection.

Religion is mind’s quest to find meaning in its own existence. If we were to accept that simple premise without the urge to create an edifice of mythology and political ideology around it, maybe we could all accept each other’s religion.