Paul P. Mealing


Sunday 11 October 2009

Watching Watchmen

Yes, I know it was released over 6 months ago, but I’ve just seen it. I don’t normally review movies on this blog – in fact, I’ve only done it once before: Man on Wire (refer The philosophy of Philippe Petit, Oct. 08), and that’s a completely different kettle of cinematic and philosophical fish.

But Watchmen is such a good movie on so many levels: it encapsulates so much of the American psyche, especially the not-so-recent paranoia of the Cold War, as well as our universal infatuation with violence. And then there are the cinematic references, Apocalypse Now and Dr. Strangelove being the most obvious, both relevant to the Cold War era. I am an outsider regarding America, so my perspective may be different to those who imbibe and live its culture every day.

In Australia it got great reviews, with one notable exception: Evan Williams panned it for what he saw as its gross glorification of violence. Even the pacifists in this movie seem to thrive on violence. One can’t help but compare it to Sin City, which is an iconic movie for technical reasons rather than story content, not to mention its stellar cast. Sin City was clearly influenced by Quentin Tarantino, whereas Watchmen probably owes more to the Matrix trilogy. But both films explore the moral landscape, and both are based on comic books or graphic novels. It seems this genre is becoming increasingly obsessed with violence, with a particular emphasis on the ‘graphic’ in graphic novel, and this is reflected in the movie renditions of the same.

I think the violence in movies has had one tragic consequence in real life. In the last 5 years, in Australia, there has been an increase in alcohol-fuelled street violence, with people receiving life-long, and sometimes fatal, injuries. Movies create an unrealistic expectation that you can belt the crap out of someone and only inflict superficial injuries. In the case of super-heroes, the story can justify it, but the violence is part of the entertainment in these movies – they don’t really condemn it the way we do in real life. Cinema has replaced the Roman gladiators, without the leftover corpses upsetting our sense of fun.

One of the characters in the movie, The Comedian, is quite literally a psychopath, yet he is clearly tolerated by his brethren because he’s on the side of 'good'. He is an allegory for the darker side of the American psyche, in particular, what Dick Cheney referred to as the ‘dark side’ of foreign operations. There is a scene in a bar in Vietnam where The Comedian shoots a pregnant woman, apparently pregnant with his child, but first he delivers a diatribe on why he hates her country, even though he’s supposed to be there to ‘help’ it.

Back in America, he continues the same psychopathic behaviour towards what he sees as America’s internal enemies. The Comedian pretty well encapsulates the contradictions that the rest of us wrestle with when we observe the dialectic in America between extreme right wing and liberal politics that inevitably overflows onto the global stage.

I’ve said in a previous post on Storytelling (Jul. 09) that comic books are our equivalent of Greek mythology, and, like all mythology, they should have allegory as their core ingredient. In this regard, Watchmen doesn’t disappoint, especially with the character Doctor Manhattan. He is named after the Manhattan project (as the movie reveals for those who don’t make the connection), which effectively ended WWII with the construction and deployment of the atomic bomb.

One of the advantages of reviewing a film so long after its release is that I don’t feel guilty about giving away the ending. Doctor Manhattan effectively becomes an allegory for God, especially when it’s his ability to destroy on a cataclysmic, even biblical, scale that finally achieves world peace. This is a particularly pessimistic view of humanity, exemplified, in my view, by the Bible: we are inherently self-destructive by nature, and only fear of a superhuman (therefore supernatural) force can stop us from achieving our genetically determined destiny (in biblical terms, original sin). So, in a way, it’s a cautionary tale – but the moral of the tale, in my view, is that it’s paranoia, not Divine vengeance, that will lead to our self-annihilation.

There are 2 things that make Watchmen an exceptional movie. Firstly, its cinematic rendering is close to perfect: a combination of film noir and graphic realisation that sets the standard above anything else I’ve seen, including The Matrix and Sin City. Secondly, it’s the rendering of the characters that really sets this movie above the norm for comic book movies. The romance between Nite Owl and Silk Spectre is completely believable. Only Bryan Singer’s Superman Returns compares in the genre, and Singer is a master storyteller. But all the characters, in particular the deeply, psychically wounded Rorschach, have a psychological depth one doesn’t expect in these movies. Again, I would reference Singer’s original X-Men as one of the few comparable movies in the genre, and of course Heath Ledger’s memorable rendition of The Joker in The Dark Knight.

But it’s as an allegory for the American psyche, in all its contradictions, that I feel this movie really delivers. It competes with Apocalypse Now and Dr. Strangelove on that level, both of which it unashamedly honours, and that’s the highest praise I can give it.

Oh, and I almost forgot to mention the soundtrack - from Philip Glass to Leonard Cohen to Bob Dylan to Jimi Hendrix - what more could one ask for?

Sunday 4 October 2009

Quantum Tunneling

In some respects this logically follows on from a post I wrote in July this year, Quantum Mechanical Philosophy, which is one of the more esoteric essays I’ve written on this blog. Hopefully, this essay will be less so, as the source material is well written and aimed at the uninitiated.

But I need to recount the gist of that post to make the relevant connection: specifically, the enigmatic Bell’s Theorem or Bell Inequality. To summarise, Bell’s Theorem arose from a thought experiment created by Einstein in an attempt to prove Bohr’s interpretation of quantum mechanics (the famous ‘Copenhagen Interpretation’) wrong.

The thought experiment was elaborated upon by Podolsky and Rosen, so it became known as the Einstein-Podolsky-Rosen, or EPR, experiment. It examines the purported ‘action-at-a-distance’ phenomenon predicted by quantum physics for certain traits of particles or photons, which Einstein described, quite accurately, as ‘spooky’. If you take 2 particles with a common origin (they could be photons with opposite polarisation, or subatomic particles like electrons with opposite ‘spins’) and separate them over any distance whatsoever, you will not know what the spin or polarity, or whatever quantum mechanical trait you are measuring, is until you take the actual measurement. The ‘spooky’ bit is that, as soon as you make the measurement, the ‘twin’ particle instantaneously becomes the opposite. Before the measurement or observation is made, the particles are in what’s called a ‘superposition’ of states – each can be either one or the other.

Einstein realised that this conjecture contradicted his special theory of relativity, which states that no signal or means of communication between particles of any kind can travel faster than the speed of light, and which had already been confirmed by experiment. John Bell later developed a mathematical inequality, based on correlations of hypothetical results from the thought experiment, that could categorically prove either Einstein or Bohr wrong.
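For the mathematically curious, the most commonly tested form of Bell’s inequality (the CHSH version) boils down to a simple sum of correlations. Here’s a rough numerical sketch of my own (not from my earlier post or the book), assuming the standard quantum prediction for a pair of spin-half particles, E(a,b) = -cos(a - b): any ‘local’ theory of the kind Einstein wanted is bound by |S| ≤ 2, whereas quantum mechanics reaches 2√2.

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for a spin-1/2 singlet pair,
    # measured along detector angles a and b (in radians).
    return -math.cos(a - b)

# Detector settings that maximise the CHSH quantity.
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

# CHSH combination of the four correlations.
S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

print(abs(S))      # ~2.828, i.e. 2*sqrt(2)
print(abs(S) > 2)  # True: the 'local hidden variable' bound of 2 is violated
```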

Alain Aspect later devised a real experiment to test Bell’s Inequality (he made the thought experiment actually happen) and proved Einstein wrong (long after Einstein had died, by the way).

As I point out in that previous post, the upshot of this is that either faster-than-light influences are possible (called non-locality) or there is no objective reality. Non-locality means there is a faster-than-light connection between the particles (even though no usable signal can be sent faster than light), while no objective reality means that the thing doesn’t exist until someone measures it or takes an observation. I discuss this in more detail (lots of detail) in my previous post, but that’s effectively the Copenhagen interpretation of quantum mechanics: at the subatomic scale, particles don’t exist until they are measured or observed. A less extreme and more popular interpretation is that they remain in a superposition of states until they interact with something else. If you want to delve deeper, read my previous post, but you may be none the wiser. The philosophical implications of this have never been truly resolved.

My conclusion was to accept non-locality (faster-than-light connections) in order to keep objective reality, and I made specific reference to David Bohm’s unpopular interpretation, known as the ‘hidden variables’ theory. Bohm believed that there is a hidden set of parameters governing the particles, which we can’t see or detect.

To quote David Deutsch (who doesn’t agree with Bohm at all): ‘A non-local hidden variable theory means, in ordinary language, a theory in which influences propagate across space and time without passing through the space in between.’

And this leads me to quantum tunneling, because that’s exactly what quantum tunneling does, only it happens over short distances, not the distances used in the EPR experiment, which could theoretically include the other side of the universe.

I’ve just read an excellent book on this subject, Zero Time Space, subtitled How Quantum Tunneling Broke the Light Speed Barrier, by Gunter Nimtz and Astrid Haibel. Originally published in German in 2004, it was published in English in 2008. This book could be read by people with only a rudimentary knowledge of physics, as it contains only a few simple equations, among them Planck’s equation, E = hf, where E is energy, f is the frequency of a ‘wave’ and h is Planck’s constant, 6.6 × 10⁻³⁴ Js (joule seconds). The authors also include Snell’s law of refraction and the universal wave equation, λf = v (wavelength times frequency equals velocity). One of the annoyances is that there is a typesetting error in this particular equation (in the book). If someone is going to include equations, especially for people unfamiliar with them, I wish they could at least get them checked during typesetting. The same applies to Richard Feynman’s excellent book on relativity theory, Six Not-So-Easy Pieces, where I found 3 typesetting errors amongst the equations scattered throughout the book. In both cases the books are aimed at people who are not familiar with the material, which means they won’t know the errors are there.
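To show just how little machinery those two equations need, here’s a back-of-the-envelope example of my own (not from the book): the energy of a single photon of green light, using λf = v (with v = c) to get the frequency and E = hf to get the energy.

```python
h = 6.626e-34   # Planck's constant in joule seconds
c = 3.0e8       # speed of light in metres per second

wavelength = 500e-9          # green light, roughly 500 nanometres
frequency = c / wavelength   # universal wave equation: lambda * f = v (here v = c)
energy = h * frequency       # Planck's equation: E = h * f

print(frequency)  # ~6.0e14 Hz
print(energy)     # ~4.0e-19 joules per photon
```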

Putting that one (some may say petty) criticism aside, it’s a very good book on quantum mechanics for people who know very little about physics. It includes a short history of physics leading up to Einstein’s theories of relativity (with particular reference to the Special Theory) as well as quantum mechanics. They do this because the whole point of the book is to highlight how quantum tunneling breaks Einstein’s special theory of relativity, and therefore reinforces non-locality, as I described in my previous post. So the authors go to some pains to give the reader an overview of both Einstein’s theory and quantum mechanics, in conjunction with the historical context. It’s very well done.

Nimtz and Haibel, by the way, make no reference to Bell’s Theorem, as it would probably confuse readers who are unfamiliar with it – I hope I haven’t put people off by referencing it here. Having said that, they do discuss ‘entanglement’ towards the end of their book, which is the state I described above concerning ‘twin’ particles interacting at a distance. In particular, they raise this phenomenon in their lengthy discussion on causality, as they are at pains to explain that ‘tunneling’ does not affect causality as some people might be led to believe. Even so, it still manages to confound common sense, as I explicate below.

In the foreword to the book, they briefly discuss the ‘myth… about the half-life of knowledge… It suggests that our knowledge is being declared invalid every five years by new knowledge.’ They then go on to dispel the most common representation of that myth: Newton’s theory of gravitation is still valid, even in the light of the theory of relativity; Einstein’s theory has extended, rather than disproved, Newton’s theory.

I made the same point in my essay on The Laws of Nature (March 08), explaining that Einstein’s equations reduce to Newton’s when certain parameters become negligible. The authors raise this point because, whilst quantum tunneling appears to contradict Einstein’s special theory of relativity, in their own words: ‘Einstein deals with free space, whereas tunneling is not free space.’ In other words, there are constraints on relativity theory in the same way that there are constraints on Newton’s theory, and there are various aspects of nature where one is more significant than the other. It’s one of the reasons I’m a bit sceptical about a grand unified theory (GUT), a meta-theory of everything. Many people would love to prove me wrong, and a part of me would like to see that, but another part wouldn’t, because I don’t believe there will ever be an end to physics.
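To make the ‘reduces to Newton’ point concrete (my example, not the authors’): if you expand the relativistic kinetic energy for speeds much less than light, the familiar Newtonian ½mv² falls out, plus correction terms that are utterly negligible at everyday speeds.

```python
import sympy as sp

m, v, c = sp.symbols('m v c', positive=True)

# Relativistic kinetic energy: total energy minus rest energy.
gamma = 1 / sp.sqrt(1 - v**2 / c**2)
kinetic = gamma * m * c**2 - m * c**2

# Taylor-expand for v much smaller than c.
print(sp.series(kinetic, v, 0, 6))
# m*v**2/2 + 3*m*v**4/(8*c**2) + O(v**6)
# The leading term is Newton's (1/2)mv^2; the corrections vanish as v/c -> 0.
```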

During this discussion they make another statement relevant to the stability of scientific knowledge: ‘Mathematical proof has been regarded since Pythagoras and Plato as eternal, metaphysical truth.’ It’s a statement I would agree with. For example, Riemann geometry hasn’t displaced Euclidean geometry; it has just extended our knowledge, both of the mathematical world and the physical world (through Einstein’s theory of General Relativity).

I’ve discussed in other posts the relationship between mathematics and the natural world (refer The unreasonable effectiveness of mathematics, March 09), but nowhere is that more significant than in quantum mechanics. QED (Quantum Electrodynamics), for which Richard Feynman, Julian Schwinger and Sin-Itiro Tomonaga jointly won the Nobel Prize, is the most successful theory of all time. Without mathematics, quantum mechanics would be indecipherable, quite literally. Intriguingly, there are imaginary numbers in quantum theory that are directly relevant to quantum tunneling. Without imaginary numbers (built on the square root of -1, called i), quantum mechanics would never have been articulated as a meaningful theory at all.

As Nimtz and Haibel point out, it is the imaginary component of the equation that does the tunneling. When this was first derived, people assumed that these imaginary components were unnecessary remnants of the mathematics, but that’s not the case. When tunneling occurs, there is an interface where part of the signal is reflected and part is transmitted through ‘the tunnel’. The part that is reflected is mathematically ‘real’ and the part that is transmitted is mathematically ‘imaginary’. (I've since been informed this is not correct – refer Addendum 2 below.) A tunnel, by the way, is a barrier through which the particle or wave theoretically can’t travel, because it doesn’t have enough energy. The authors point out that tunneling even occurs in the sun; otherwise the fusion that gives us sunlight would never happen. I should add that quantum tunneling is also exploited in semiconductor electronics, most notably in the tunnel diode that Nimtz and Haibel describe.
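To put a number on this with a standard textbook example (mine, not the book’s): for a particle of energy E meeting a rectangular barrier of height V0 > E and width L, the wave number inside the barrier becomes imaginary, so the wave decays instead of oscillating, and the transmission probability is T = 1 / [1 + V0²·sinh²(κL) / (4E(V0 − E))], where κ = √(2m(V0 − E)) / ħ. Here it is for an electron and some purely illustrative barrier widths.

```python
import math

hbar = 1.0546e-34   # reduced Planck constant, J*s
m_e = 9.109e-31     # electron mass, kg
eV = 1.602e-19      # one electronvolt in joules

def transmission(E, V0, L, m=m_e):
    """Transmission probability through a rectangular barrier, for E < V0."""
    kappa = math.sqrt(2 * m * (V0 - E)) / hbar   # magnitude of the imaginary wave number
    return 1.0 / (1.0 + (V0**2 * math.sinh(kappa * L)**2) / (4 * E * (V0 - E)))

# Illustrative numbers only: a 1 eV electron hitting a 2 eV barrier.
E, V0 = 1.0 * eV, 2.0 * eV
for L in (0.5e-9, 1.0e-9, 2.0e-9):               # barrier widths in metres
    print(L, transmission(E, V0, L))
# Transmission is never zero, but it falls off roughly exponentially with width -
# classically, an electron without enough energy should never get through at all.
```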

Both of the authors have performed experiments, not only to detect quantum tunneling but also to measure the time elapsed. As predicted by Thomas Hartman in 1962, there is a time delay at the ‘entrance’ to the tunnel – the ‘interface’ between the medium and ‘the tunnel’ – but the actual time spent in the tunnel is zero. This is called the Hartman effect. To quote the authors: ‘So the wave packet spreads in the tunnel in zero time and is everywhere from the entrance to the exit. This non-local phenomenon makes one feel eery.’ An understatement, if I’ve ever read one.

One of the authors, Gunter Nimtz, participated in an experiment that tunneled Mozart’s symphony in G minor through a waveguide at superluminal speed: 4.7 times the speed of light. The elapsed time occurred at the entrance to the tunnel, as predicted by Hartman, not in the tunnel itself. In an exposition that I won’t try to repeat here, the authors explain how this quirk of nature (the elapsed time occurring at the entrance to the tunnel) allows superluminal transmission without violating causality. The speed in the tunnel is infinite – as the Americans like to say: go figure. The title of the book, Zero Time Space, is therefore entirely appropriate.
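The arithmetic behind a headline figure like 4.7 times the speed of light is simple once you accept the Hartman effect. Here’s an illustration with made-up numbers (not the published data): if the measured delay is fixed at the entrance and doesn’t grow with the barrier, then the effective speed – barrier length divided by delay – can exceed the speed of light simply by making the barrier longer.

```python
c = 3.0e8   # speed of light, m/s

# Hypothetical, illustrative values only: a fixed 'entrance' delay that,
# per the Hartman effect, does not grow with the barrier length.
entrance_delay = 100e-12   # 100 picoseconds

for barrier_length in (0.03, 0.10, 0.30):       # metres
    effective_speed = barrier_length / entrance_delay
    print(barrier_length, effective_speed / c)  # in multiples of c
# 0.03 m gives 1.0c, 0.10 m gives ~3.3c, 0.30 m gives 10c:
# the longer the barrier, the more 'superluminal' the effective speed looks.
```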

They end the book with a brief description of wormholes and hypothetical warp drives, beloved of sci-fi writers like me, which require exotic negative gravity amongst other improbabilities.

Of all the incredible manifestations of the universe, only consciousness is arguably more inexplicable or more mysterious (but no more weird) than quantum phenomena. If we didn’t observe it, no one would believe it. And if we didn’t have the mathematics to describe it, no one would be able to fathom it, even remotely.

Addendum 1: I came across this - it's very entertaining as well as informative.

Addendum 2: I would like to acknowledge Timmo (refer comments thread below), who has valiantly tried to correct all my mistakes. In particular, he points out that the imaginary component of Schrodinger's equation plays no greater role in tunneling than the real component, if I understand him correctly. He also points out that tunneling and non-locality are independent phenomena, so I may have misled people on that point.

He also corrects some faux pas I made concerning the Lorentz transformation and Godel's Incompleteness Theorem, in comments I've made since the post was published.

I confess I don't know as much as I appear to, and I wish I understood more than I actually do.

And I would like to thank Timmo for reminding me of how much I don't know.