Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.

Tuesday, 2 February 2016

Creation Science: a non sequitur

A friend of mine – someone whom I’d go to for help – lent me a ‘Creation’ magazine to prove that there are creationists who are real scientists. And, I have to admit, my friend was right: the magazine was full of contributors with degrees in science, including one who has a PhD and honours and works at a run-of-the-mill university, but who wrote the following howler: ‘Cosmology is unscientific because you can’t do an experiment in cosmology.’ I wonder if said writer would be willing to say that to Australian Nobel Prize winner Brian Schmidt. Only humans can be living contradictions.

Creation science is an epistemological contradiction – there’s no such thing – by definition. Science does not include magic – I can’t imagine anyone who would disagree with that, but I might be wrong. Replacing a scientific theory with supernaturally enhanced magic is anti-science, yet creationists call it science – as the Americans like to say: go figure.

The magazine was enlightening in that the sole criterion these ‘scientists’ applied to the validity of any scientific knowledge was whether or not it agreed with the Bible. If that criterion were applied literally, we would still believe that the Sun goes round the Earth, rather than the other way round. After all, the Book of Joshua tells us how God stopped the Sun moving in the sky. It doesn’t say that God stopped the Earth spinning, which is what he would have had to do.

One contributor to the magazine even allows for ‘evolution’ after ‘creation’, because God programmed ‘subroutines’ into DNA, but was quick to point out that this does ‘not contradict the Bible’. Interesting to note that DNA wouldn’t even have been discovered if all scientists were creationists (like the author).

Why do you think the ‘Dark Ages’ are called the dark ages? Because science, otherwise known as ‘natural philosophy’, was considered pagan, as the Greeks’ neo-Platonist philosophy upon which it was based was pagan. Someone once pointed out that Hypatia’s murder by a Christian mob (in 415 AD) signalled the start of the dark ages, which lasted until around 1200, when Fibonacci introduced the West to the Hindu-Arabic system of numbers. In fact, it was the Muslims who kept that knowledge alive in the interim, otherwise it might well have been lost to us forever.

So science and Christianity have a long history of contention that goes back centuries before Copernicus, Galileo and Darwin. If anything, the gap has grown wider, not narrower; they’ve only managed to co-exist by staying out of each other’s way.

There are many religious texts in the world, part of our collective cultural and literary legacy, but none of them are scientific or mathematical texts, which also boast diverse cultural origins. It is an intellectual conceit (even deceit) to substitute religious teaching for scientifically gained knowledge. Of course, scientifically gained knowledge is always changing, advancing and being overtaken; it is never over. In fact, I would contend that science will never be complete, as history has demonstrated, so there will always be arguments for supernatural intervention, otherwise known as the ‘God-of-the-Gaps’. Gödel’s incompleteness theorem implies that mathematics is a never-ending epistemological mine, and I believe the same goes for science.

Did I hear someone say: what about Intelligent Design (ID)? Well, it’s still supernatural intervention, isn’t it? Same scenario, different description.

Religion is not an epistemology, it’s a way of life. Whichever way you look at it, it’s completely subjective. Religion is part of your inner world, and that includes God. So the idea that the God you’ve found within yourself is also the Creator of the entire Universe is a non sequitur. Because everyone’s idea of God is unique to them.

Tuesday, 19 January 2016

Is this the God equation?

Yes, this is a bit tongue-in-cheek, but like most things tongue-in-cheek it just might contain an element of truth. I’m not a cosmologist or even a physicist, so this is just me being playful yet serious in as much as anyone can be philosophically serious about the origins of Everything, otherwise known as the Universe.

Now I must make a qualification, lest people think I’m leading them down the garden path. When people think of ‘God’s equation’, they most likely think of some succinct equation or set of equations (like Maxwell’s equations) from which everything we know about the Universe can be derived mathematically. For many people this is a desired outcome, founded on the belief that one day we will have a TOE (Theory Of Everything) – itself a misnomer – which will incorporate all the known laws of the Universe in one succinct theory. Specifically, said theory will unite the Electromagnetic force, the so-called Weak force, the so-called Strong force and Gravity as all being derived from a common ‘field’. Personally, I think that’s a chimera, but I’d be happy to be proven wrong. Many physicists believe some version of String Theory or M Theory will eventually give us that goal. I should point out that the Weak force has already been united with the Electromagnetic force.

So what do I mean by the sobriquet, God’s equation? Last week I watched a lecture by Allan Adams, part of MIT OpenCourseWare (8.04, Spring 2013), titled Lecture 6: Time Evolution and the Schrodinger Equation, in which Adams made a number of pertinent points that led me to consider that perhaps Schrodinger’s Equation (SE) deserved such a title. Firstly, I need to point out that Adams himself makes no such claim, and I don’t expect many others would concur.

Many of you may already know that I wrote a post on Schrodinger’s Equation nearly 5 years ago and it has become, by far, the most popular post I’ve written. Of course, Schrodinger’s Equation is not the last word in quantum mechanics – more like a starting point. By incorporating relativity we get Dirac’s equation, which predicted anti-matter – in fact, anti-matter is a direct consequence of combining relativity with SE. Schrodinger himself, followed by Klein and Gordon, also had a go at a relativistic equation and rejected it because it gave answers with negative energy. But Richard Feynman (and independently, Ernst Stuckelberg) pointed out that the negative-energy solutions were mathematically equivalent to ordinary particles travelling backwards in time. Backwards in time is not an impossibility in the quantum world, and Feynman even incorporated it into his famous QED (Quantum Electro-Dynamics), which won him a joint Nobel Prize with Julian Schwinger and Sin-Itiro Tomonaga in 1965. QED, by the way, incorporates SE (just read Feynman’s book on the subject).
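To spell out where those negative energies come from (a standard textbook point, not a claim from Feynman’s book specifically): the relativistic energy-momentum relation is quadratic in E, so it has two roots, and a relativistic wave equation inherits both.

\[ E^2 = p^2c^2 + m^2c^4 \quad\Longrightarrow\quad E = \pm\sqrt{p^2c^2 + m^2c^4} \]

It’s the minus root that Feynman and Stuckelberg reinterpreted as ordinary, positive-energy particles travelling backwards in time.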

This allows me to segue back into Adams’ lecture, which, as the title suggests, discusses the role of time in SE and in quantum mechanics generally. You see, ‘time’ is a bit of an enigma in QM.

Adams’ lecture, in his own words, is intended to provide a ‘grounding’, so he doesn’t go into details (mathematically), and this suited me. Nevertheless, he throws around terms like eigenstates, operators and wave functions, so familiarity with these terms would be essential to following him. Of those terms, the only one I will use is wave function, because it is the key to SE and arguably the key to all of QM.

Right at the start of the lecture (his Point 1), Adams makes the salient point that the wave function, Ψ, contains ‘everything you need to know about the system’. Only a little further into his lecture (his Point 6) he asserts that SE is ‘not derived, it’s posited’. Yet it’s completely ‘deterministic’ and experimentally accurate. Now (as discussed by some of the students in the comments) to say it’s ‘deterministic’ is a touch misleading, given that it only gives us probabilities, which are empirically accurate (more on that later). But it’s remarkable that Schrodinger formulated a mathematical expression based on a hunch that all quantum objects, be they light or matter, should obey a wave equation.
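For reference (my addition – Adams doesn’t need it written out for the points I’m discussing), the time-dependent Schrodinger Equation in its standard general form is:

\[ i\hbar\,\frac{\partial}{\partial t}\Psi(x,t) = \hat{H}\,\Psi(x,t) \]

where ħ is Planck’s constant divided by 2π and Ĥ is the Hamiltonian, the operator for the system’s total energy. Note that the equation is first order in time: given Ψ now, it determines Ψ at any later moment. That is the sense in which the evolution is ‘deterministic’, even though what Ψ ultimately delivers is probabilities.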

But it’s at the 50-55 minute stage (of his 1 hr 22 min lecture) that Adams delivers his most salient point, when he explains so-called ‘stationary states’. Basically, they’re called stationary states because time remains invariant (nothing about them changes with time) in SE, which is what gives us ‘superposition’. As Adams points out, the only thing that changes in time in SE is the phase of the wave function, which allows us to derive the probability of finding the particle in ‘classical’ space and time. Classical space and time is the real physical world that we are all familiar with. Now this is what QM is all about, so I will elaborate.
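To make that concrete (a standard textbook form, not Adams’ own blackboard notation), a stationary state of definite energy E factorises into a spatial part and a rotating phase:

\[ \Psi(x,t) = \psi(x)\,e^{-iEt/\hbar} \quad\Longrightarrow\quad |\Psi(x,t)|^2 = |\psi(x)|^2 \]

The phase factor is the only thing that evolves in time, and it cancels out when we take the modulus squared, so the probability distribution is literally stationary – nothing observable changes, which is what the name means.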

Adams effectively confirmed for me something I had already deduced: superposition (the weird QM property that something can exist simultaneously in various positions prior to being ‘observed’) is a direct consequence of time being invariant or existing ‘outside’ of QM (which is how it’s usually explained). Now Adams makes the specific point that these ‘stationary states’ only exist in QM and never exist in the ‘Real’ world that we all experience. We never experience superposition in ‘classical physics’ (which is the scientific pseudonym for ‘real world’). This highlights for me that QM and the physical world are complementary, not just versions of each other. And this is incorporated in SE, because, as Adams shows on his blackboard, superposition can be derived from SE, and when we make a measurement or observation, superposition and SE both disappear. In other words, the quantum state and the classical state do not co-exist: either you have a wave function in Hilbert space or you have a physical interaction called a ‘wave collapse’ or, as Adams prefers to call it, ‘decoherence’. (Hilbert space is a theoretical space of possibly infinite dimensions where the wave function theoretically exists in its superpositional manifestation.)
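The blackboard step where superposition falls out of SE rests on the equation’s linearity, which is worth spelling out (my gloss, not Adams’ wording): because SE is linear in Ψ, any weighted sum of solutions is itself a solution.

\[ i\hbar\,\partial_t\Psi_1 = \hat{H}\Psi_1 \;\text{ and }\; i\hbar\,\partial_t\Psi_2 = \hat{H}\Psi_2 \;\implies\; i\hbar\,\partial_t(c_1\Psi_1 + c_2\Psi_2) = \hat{H}(c_1\Psi_1 + c_2\Psi_2) \]

The complex coefficients c₁ and c₂ carry the weights of the superposed states, and their squared moduli give the probabilities of the respective outcomes on measurement – which brings us to Max Born, below.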

Adams calls the so-called Copenhagen interpretation of QM the ‘Cop Out’ interpretation, which he wrote on the board and underlined. He prefers ‘decoherence’, which is how he describes the interaction of the QM wave function with the physical world. My own view is that the QM wave function represents all the future possibilities, only one of which will be realised. Therefore the wave function is a description of the future yet to exist, except as probabilities; hence the God equation.

As I’ve expounded in previous posts, the most popular interpretation at present seems to be the so-called ‘many worlds’ interpretation, where all superpositional states exist in parallel universes. Its most vigorous advocate is David Deutsch, who wrote about it in a not-so-recent issue of New Scientist (3 Oct 2015, pp.30-31). I also reviewed his book, The Fabric of Reality, in September 2012. In New Scientist, Deutsch advocated a non-probabilistic version of QM, because he knows that reconciling the many worlds interpretation with probabilities is troublesome, especially if there are an infinite number of worlds. However, without probabilities, SE becomes totally ineffective in making predictions about the real world. It was Max Born who came up with the ingenious innovation of squaring the modulus of the wave function (actually multiplying it by its complex conjugate, as I explain here), which provides the probabilities that make SE relevant to the physical world.
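In symbols (my illustration of Born’s rule, using the notation above): since Ψ is complex, multiplying it by its complex conjugate Ψ* yields a real, non-negative number that can serve as a probability density:

\[ P(x,t) = \Psi^*(x,t)\,\Psi(x,t) = |\Psi(x,t)|^2, \qquad \int_{-\infty}^{\infty} |\Psi(x,t)|^2\,dx = 1 \]

For any complex number a + bi, multiplying by its conjugate a − bi gives a² + b², which is real and non-negative – exactly what a probability requires. The normalisation integral simply says the particle must be found somewhere.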

As I’ve explained elsewhere, the world is fundamentally indeterministic due to asymmetries in time caused by both QM and chaos theory. Events become irreversible after QM decoherence, and also in chaos theory, because the initial conditions are indeterminable. Now Deutsch argues that chaos theory can be explained by his many worlds view of QM, and the mathematician Ian Stewart suggests that maybe QM can be explained by chaos theory, as I expound here. Both these men are intellectual giants compared to me, yet I think they’re both wrong. As I’ve explained above, I think that the quantum world and the classical world are complementary. The logical extension of Deutsch’s view, by his own admission, requires the elimination of probabilities, making SE ineffectual. And Stewart’s circuitous argument to explain QM probabilities with chaos theory eliminates superposition, for which we have indirect empirical evidence (using entanglement, which is well researched). Actually, I think superposition is a consequence of the wave function effectively being everywhere at once – it ‘permeates all of space’ (to quote Richard Elwes in MATHS 1001).

If I’m right in stating that QM and classical physics are complementary (and Adams seems to make the same point, albeit not so explicitly) then a TOE may be impossible. In other words, I don't think classical physics is a special case of QM, which is the current orthodoxy among physicists.


Addendum 1: Since writing this, I've come to the conclusion that QM and, therefore, the wave function describe the future – an idea endorsed by none other than Freeman Dyson, who was instrumental in formulating QED with Richard Feynman.

Addendum 2: I've amended the conclusion in my second-last paragraph, discussing Deutsch's and Stewart's respective 'theories', and mentioning entanglement in passing. Schrodinger once said (in a missive to Einstein, from memory) that entanglement is what QM is all about. Entanglement effectively challenges Einstein's conclusion that simultaneity is a non sequitur according to his special theory of relativity (and he's right, provided there's no causal relationship between the events). I contend that neither Deutsch nor Stewart can resolve entanglement with their respective 'alternative' theories, and neither of them addresses it from what I've read.

Tuesday, 12 January 2016

How to write a story so it reads like a movie in your head

I’ve written about writing a few times now, including Writing’s 3 Essential Skills (Jul. 2013) and How to Create an Imaginary, Believable World (Aug. 2010), the last one being a particularly popular post. Also, I taught a creative writing course in 2009 and have given a couple of talks on the subject, but never intentionally to provide advice on how to make a story read like a movie in your head.

This post has arisen from a conversation I had when I realised I had effectively taught myself how to do this. It’s not something I deliberately set out to do, but I believe I achieved it inadvertently, and comments from some readers appear to confirm this. At YABooksCentral, a teenage reviewer made this point specifically, and many others have said that my book (Elvene) would make a good movie, including a filmmaker. Many have said that they ‘could see everything’ in their mind’s eye.

Very early in my writing career (though it’s never been my day job) I took some screenwriting courses and even wrote a screenplay. I found that this subconsciously influenced my prose writing in ways I never foresaw, which I will now explain. The formatting of a screenplay doesn’t lend itself to fluidity, with separate headings for every scene and dialogue in blocks interspersed with occasional brief descriptive passages. Yet a well-written screenplay lets you see the movie in your mind’s eye, and you should write it as you’d imagine it appearing on a screen. However, contrary to what you might think, this is not the way to write a novel. Do not write a novel as if watching a movie. Have I confused you? Well, bear this in mind and hopefully it will all make sense before the end.

Significantly, a screenplay needs to be written in ‘real time’, which means descriptions are minimalistic and exposition non-existent (although screenwriters routinely smuggle exposition into their dialogue). Also, all the characterisation is in the dialogue and the action – you don’t need physical descriptions of a character, including their attire, unless it’s significant; just gender, generic age and ethnicity (if it’s important). It was this minimalistic approach that I subconsciously imported into my prose fiction.

There is one major difference between writing a screenplay and writing a novel, and the two methods require different states of mind. In writing a screenplay you can only write what is seen and heard on the screen, whereas a novel can be written entirely (though not necessarily) from inside a character’s head. I hope this clarifies the point I made earlier. Now, as someone once pointed out to me (fellow blogger, Eli Horowitz), movies can take you into a character’s head through voiceover, flashbacks and dream sequences. But, even so, the screenplay only records what is seen and heard on the screen, and these are exceptions, not the norm. Whereas, in a novel, getting inside a character’s head is the norm.

To finally address the question implicit in my heading, there are really only two ‘tricks’, for want of a better term: write the story in real time and always from some character’s point of view. Even description can be given through a character’s eyes, so that the reader subconsciously becomes an actor. By inhabiting a character’s mind, the reader becomes fully immersed in the story.

Now I need to say something about scenes, because, contrary to popular belief, scenes are the smallest component of a story, not words or sentences or paragraphs. It’s best to think of the words on the page like the notes on a musical score. When you listen to a piece of music, the written score is irrelevant, and, even if you read the score, you wouldn’t hear the music anyway (unless, perhaps, you’re a musician or a composer). Similarly, the story takes place in the reader’s mind, where the words on the page conjure up images and emotions without conscious effort.

In a screenplay a scene has a specific definition: it begins and ends with a change in location or time. I use the same definition when writing prose. There are subtle methods for expanding and contracting time psychologically in a movie, and these can also be applied to prose fiction. I’ve made the point before that the language of story is the language of dreams, and in dreams, as in stories, sudden changes in location and time are not aberrant. In fact, I would argue that if we didn’t dream, stories wouldn’t work because our minds would continuously and subconsciously struggle with the logic.

Tuesday, 5 January 2016

Free will revisited

I’ve written quite a lot on this in the past, so one may wonder what I could add.

I’ve just read Mark Balaguer’s book, Free Will, which I won when Philosophy Now published my answer to their Question of the Month in their last issue (No 111, December 2015). It’s the fourth time I’ve won a book from them (out of 5 submissions).

It’s a well-written book, not overly long or over-technical in a philosophical sense, so it’s very readable whilst being well argued. Balaguer makes it clear from the outset where he stands on this issue by continually referring to those who argue against free will as ‘the enemies of free will’. Whilst this makes him sound combative, the tone of his arguments is measured and not antagonistic. In his conclusion, he makes the important distinction that in ‘blocking’ arguments against free will, he’s not proving that free will exists.

He makes a distinction between what he calls Hume-style free will and non-predetermined free will (NPD), a term I believe he coined himself. Hume-style free will is otherwise known as ‘compatibilism’, which means it’s compatible with determinism. In other words, even if everything in the world is deterministic from the Big Bang onwards, it doesn’t rule out you having free will. I know it sounds like a contradiction, but I think it’s to do with the fact that a completely deterministic universe doesn’t conflict with the subjective sense we all have of having free will. As I’ve expressed in numerous posts on this blog, I think there is ample evidence that the completely deterministic universe is a furphy, so compatibilism is not relevant as far as I’m concerned.

Balaguer also coins another term, ‘torn decision’, which he effectively uses as a litmus test for free will. In a glossary in the back he gives a definition which I’ve truncated:

A torn decision is a conscious decision in which you have multiple options and you’re torn as to which option is best.

He gives the example of choosing between chocolate or strawberry flavoured ice cream and not making a decision until you’re forced to, so you make it while you’re still ‘torn’. This is the example he keeps coming back to throughout the book.

In recent times, experiments in neuroscience have provided what some people believe are ‘slam-dunk’ arguments against free will, because scientists have been able to predict, with 60% accuracy, what decision a subject will make seconds before they make it, simply by measuring neuron activity in certain parts of the brain. Balaguer provides the most cogent arguments I’ve come across challenging these contentions. In particular, he addresses the Haynes studies, which showed neuron activity up to 10 seconds prior to the conscious decision. Balaguer points out that the neuron activity in these studies occurs in the PC and BA10 areas of the brain, which are associated with the ‘generation of plans’ and the ‘storage of plans’ respectively. He makes the point (in greater elaboration than I do here) that we should not be surprised if we subconsciously use the ‘planning’ areas of the brain whilst trying to make ‘torn decisions’. The other experiments, known as the Libet studies (dating from the 1960s), showed neuron activity half a second prior to conscious decision-making, which was termed the ‘readiness potential’. Balaguer argues that there is ‘no evidence’ that the readiness potential causes the decision. Even so, it could be argued that, as with the Haynes studies, it is subconscious activity happening prior to the conscious decision.

It is well known (as Balaguer explicates) that much of our thinking is subconscious. We all have the experience of solving a problem subconsciously, so that the solution comes to us spontaneously when we least expect it. And anyone who has pursued some artistic endeavour (like writing fiction) knows that a lot of it is subconscious, so that the story and its characters appear on the page with seemingly divine-like spontaneity.

Backtracking to so-called Hume-style free will, it does have relevance if one considers that our ‘wants’ – what we wish to do – are determined by our desires and needs. We assume that most of the animal kingdom behaves on this principle. Few people (including Balaguer) discuss other sentient creatures when they discuss free will, yet I’ve long believed that consciousness and free will go hand in hand. In other words, I really can’t see the point of consciousness without free will. If everything is determined subconsciously, without the need to think, then why have we evolved to think?

But humans take thinking to a new level compared to every other species on the planet, so that we introspect and cogitate and reason and internally debate our way to many a decision.

Back in February 2009, I reviewed Douglas Hofstadter’s Pulitzer Prize-winning book, Gödel, Escher, Bach, where, among other topics, I discussed consciousness, as that’s one of the themes of his book. Hofstadter coins the term ‘strange loop’. This is what I wrote back then:

By strange loop, Hofstadter means that we can effectively look at all the levels of our thinking except the ground level, which is our neurons. In between we have symbols, which is language, which we can discuss and analyse in a dispassionate way, just like I’m doing now. I can talk about my own thoughts and ideas as if they weren’t mine at all. Consciousness, in Hofstadter’s model (for want of a better word) is the top level, and neurons are the hardware level. In between we have the software (symbols) which is effectively language.

I was quick to point out that ‘software’ in this context is a metaphor – I don’t believe that language is really software, even though we ‘download’ it from generation to generation and it is indispensable to human reasoning, which we call thinking.

The point I’d make is that this is a two-way process: the neurons are essential to thoughts, yet our thoughts, I expect, can affect neurons. I believe there is evidence that we can and do rewire our brains simply by exercising our mental faculties, even in later years, and surely exercising them consciously is the very definition of will.

Tuesday, 15 December 2015

The battle for the future of Islam

There are many works of fiction featuring battles between ‘Good’ and ‘Evil’, yet it would not be distorting the truth to say that we are witnessing one now, though I think it is largely misconstrued by those of us who are on the sidelines. We see it as a conflict between Islam and the West, when it’s actually within Islam itself. This came home to me when I recently saw the biographical movie, He Named Me Malala (pronounced Ma-la-li, by the way).

Malala is well known as the 14-year-old Pakistani schoolgirl shot in the head on a school bus by the Taliban for her outspoken views on education for girls in Pakistan. Now 18 years old (at the time the film was made), she has since won the Nobel Peace Prize and spoken at the United Nations, as well as having audiences with world leaders like Barack Obama. In a recent interview with Emma Watson (on Emma’s Facebook page) she appeared much wiser than her years. In the movie, amongst her family, she behaves like an ordinary teenager with ‘crushes’ on famous sports stars. In effect, her personal battle with the Taliban represents in microcosm a much wider battle between past and future that is occurring on the world stage within Islam: a battle for the hearts and minds of Muslims all over the world.

IS or ISIS or Daesh has arisen out of conflicts between Shiites and Sunnis in both Iraq and Syria, but the declaration of a Caliphate carries a much more serious, even sinister, connotation, because its followers believe they are fulfilling a prophecy which will only be resolved with the prophesied end of the world. I’m not an Islamic scholar, so I’m quoting Audrey Borowski, currently doing a PhD at the University of London, who holds an MSt (Masters degree) in Islamic Studies from Oxford University. She asserts: ‘…one of the Prophet Muhammad’s earliest hadith (sayings) locates the fateful showdown between Christians and Muslims that heralds the apocalypse in the city of Dabiq in Syria.’

“The Hour will not be established until the Romans (Christians) land at Dabiq. Then an army from Medina of the best people on the earth at that time… will fight them.”

She wrote an article of some length in Philosophy Now (Issue 111, Dec. 2015/Jan. 2016) titled Al Qaeda and ISIS: From Revolution to Apocalypse.

The point is that if someone believes they are in a fight for the end of the world, then destroying entire populations and cities is not off the table. They could resort to any tactic, like contaminating the water supplies of entire cities or destroying food crops on a large scale. I alluded in the introduction to the idea that this apocalyptic ideology, in a fictional context, would represent a classic contest between good and evil. From where I (and most people reading this blog) stand, anyone intent on destroying civilization as we know it would be considered the ultimate evil.

What is most difficult for us to comprehend is that the perpetrators, the people ‘on the other side’, would see the roles reversed. Earlier this year (April 2015), I wrote a post titled Morality is totally in the eye of the beholder, where I explained how two different cultures in the same country (India) could have completely opposing views concerning a crime against a young woman who was raped and murdered on a bus returning from seeing a movie with her boyfriend. One view was that the girl was the victim of a crime; the other was that she was responsible for her own fate.

Many people have trouble believing that otherwise ordinary people who commit evil acts in the form of atrocities would not see themselves as evil. We have an enormous capacity to justify to ourselves the most heinous acts, and nowhere is this more evident than when people believe they are performing the ‘Will of God’. This is certainly the case with IS and their followers.

Unfortunately, this has led to a backlash in the West against all Muslims. In particular, we see both in social media and mainstream media, and even amongst mainstream politicians, a sentiment that Islam is fundamentally flawed and needs to be reformed. It seems to me that they are unaware that there is already a battle happening within Islam, where militant bodies like IS, Boko Haram and the Taliban represent the worst of it, and a young schoolgirl from Pakistan represents the best.

Ayaan Hirsi Ali (whom I wrote about in March 2011) said, when she was in Australia many years ago, that Islam is not compatible with a secular society, which is certainly true if Islamists wish to establish a religious-based government. There is a group, Hizb ut-Tahrir, which is banned in a number of countries, though not the UK or Australia, whose stated aim is to form a caliphate and whose political agenda, including the introduction of Sharia law, would clearly conflict with Australian law. But the truth is that there are many Muslims living active and productive lives in Australia whilst still practising their religion. A secular society is not an atheistic society, yet by definition it is independent of any religion. In other words, there is room for variety in religious practice, and that is what we see. Extremists of any religious persuasion are generally not well received in a pluralist, multicultural society, yet that is the fear driving the debate in many secular societies.

Over a year ago (Aug 2014) I wrote a post titled Don’t judge all Muslims the same, based on another article I read in Philosophy Now (Issue 104, May/Jun 2014) by Terri Murray (Master of Theology, Heythrop College, London), who made a very salient point differentiating cultural values and ideals from individual ones. In particular, she asserted that an individual’s rights override the so-called rights of a culture or a community. Therefore, misogynistic practices like female genital mutilation, honour killings and child marriage, all of which are illegal in Australia, are abuses of individual rights that may be condoned, even considered normal practice, in some cultures.

Getting back to my original subject matter, like the case of the Indian girl (a medical graduate) who was murdered for going on a date, this really is a battle between past and future. IS and the Taliban and their variant Islamic ideologies represent a desire to regain a past that has no relevance in the 21st Century – it’s mediaeval, not only in concept but also in practice. One of the consequences of the Internet is that it has become a vehicle for both sides. So young women in far-off countries are learning that there is another world where education can lead to a better life. And this is the key: education of women, as Malala has brought to the world’s attention, is the only true way forward. It’s curious that women are what these regimes seem to fear most, including IS, whose fighters’ greatest fear is to be killed by a female Kurdish warrior, because then they won’t get to Paradise.

Tuesday, 1 December 2015

Why narcissists are a danger to themselves and others

I expect everyone has met a narcissist, though, as with all personality disorders, there are degrees of severity: from the generally harmless egotistical know-it-all to the megalomaniac who takes control of an entire nation. In between those extremes is the person who somehow self-destructs while claiming it’s everyone else’s fault. They’re the ones who are captain of the ship and totally in control, even when it runs aground, at which point they suddenly claim it’s no longer their fault. I’m talking metaphorically, but this happened quite literally and spectacularly a couple of years back, as most of you will remember.

The major problem with narcissists is not their self-aggrandisement and over-inflated opinion of their own worth, but their distorted view of reality.

Narcissists have a tendency to self-destruct, not on purpose, but because their view of reality, based on their overblown sense of self-justification, becomes so distorted that they lose perspective and then control, even though everyone around them can see the truth but is generally powerless to intervene.

They are particularly disastrous in politics, and they are most likely to rise to power when things are going badly, because they are charismatic and their self-belief becomes contagious. Someone said (I don’t know who) that when things are going badly society turns on itself – they were referring to the European witch hunts, which coincided with economic and environmental tribulations. The recent GFC created ripe conditions for charismatic leaders to feed a population’s paranoia and promise miracle solutions with no basis in rationality. Look at what happened in Europe following the Great Depression: World War 2. And who started it? Probably the most famous narcissist in recent history. The key element such leaders have in common with the aforementioned witch-hunters is that they can find someone to blame and, frighteningly, they are believed.

Narcissists make excellent villains, as I’ve demonstrated in my own fiction. But we must be careful whom we demonise, lest we become as spiteful and destructive as those we wish not to emulate. Seriously, we should not take them seriously; then all their self-importance and self-aggrandisement becomes comical. Unfortunately, they tend to divide society between those who see themselves as victims and those who see the purported culprits as the victims. In other words, they divide nations when they should be uniting them.

But there are exceptions. Having read Steve Jobs’ biography (by Walter Isaacson) I would say he had narcissistic tendencies, yet he was eminently successful. Many people have commented on his ‘reality-distortion field’, which I’ve already argued is a narcissistic trait, and he could be very egotistical at times, according to anecdotal evidence. Yet he could form deep relationships despite being very contrary in his dealings with his colleagues – building them up one moment and tearing them down the next. But Jobs was driven to strive for perfection, both aesthetically and functionally, and he sought out people who had the same aspiration. He was, of course, extraordinarily charismatic, intelligent and somewhat eccentric. He was a Buddhist, which may have tempered his narcissistic tendencies; but I’m just speculating – I never met him or worked with him – I just used and admired his products like many others. Anyway, I would cite Jobs as an example of a narcissist who broke the mould – he didn’t self-destruct, quite the opposite, in fact.


Addendum: When I wrote this I had recently read Isaacson's biography of Steve Jobs, but I've since seen a documentary suggesting that he came perilously close to self-destruction. He was called before a Senate Committee under charges of fraud: he had been giving his employees backdated shares (I think that was the charge, from memory). Anyway, according to the documentary, he only avoided prison because a conviction would have destroyed the share price of Apple, which was the biggest company on the share market at the time. I don't know how true this is, but it rings true.