Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.

Sunday 13 July 2014

The Physics of Motorcycling

Since I wrote a post on the Physics of Driving (March 2014), it seems only logical and fair to write one on the physics of motorcycle riding. The physics is more complex and counter-intuitive, but it’s also more intriguing.

In both cases the driving force (excuse the pun) is gyroscopic dynamics, though, in the case of a motorcycle, it’s both more central and more controlling. I can still remember the first time I went round a decent corner (as opposed to a street intersection) on a motorcycle and felt the inherent weightlessness it generates. This is the appeal of riding a bike and what separates the experience viscerally from driving a car.

As I’ve already explained in my previous post on driving, it’s the muscle strain on our necks that tells us how hard we are cornering, whether we are in a car or on a bike, though the effect is reversed from one to the other. In the case of a car we lean our heads into the corner to balance the semi-circular canals in our ears, and our neck muscles subconsciously tell us what the lateral force is in a subjective sensory manner. In the case of a bike we lean our bodies and keep our heads upright - because we feel effectively weightless - but the strain on our neck muscles is exactly the same, even though it is reversed.

So that explains how it feels but it doesn’t explain how it all works. The physics is not easy to grasp, but the effect is relatively easy to explain, even if one doesn’t understand the dynamics behind it, so please persevere with me. There is a second law of angular momentum, which effectively says that if you apply a torque around an axis perpendicular to the rotating axis, you will get a rotation around the third axis, called precession. One usually draws diagrams at this stage to demonstrate this, but I can do better: I will give you an example that you may be able to perform at home.

A surveyor’s plumb bob works best to demonstrate this, but a bicycle wheel can work as well. Take a plumb bob with its string wrapped around it, hold it horizontally so the wound string is vertical, then let it go while holding the end of the string. As it falls, the unwinding string makes the plumb bob spin about its horizontal axis, but when it gets to the end of the string, it doesn’t fall over. It precesses, giving the impression of weightlessness. This YouTube video demonstrates what I’m talking about rather dramatically with a heavy flywheel, and its sequel demonstrates it even better, and explains the so-called weightless effect. And this video explains the physics concerning the 3 axes using an ordinary bicycle wheel on the end of a rope (which you may be able to do yourself).
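
For those who like numbers, the precession rate falls out of the relationship between torque and angular momentum: the spin axis rotates at a rate equal to the applied torque divided by the angular momentum (moment of inertia times spin rate). Here is a minimal sketch in Python; the wheel’s mass, radius and spin rate are illustrative assumptions of mine, not figures from the videos:

```python
import math

# Gyroscopic precession sketch: a spinning wheel hung from one end of its axle.
# Gravity applies a torque about the support point; instead of falling, the
# wheel's spin axis rotates (precesses) at rate Omega = torque / (I * omega).
# All numbers below are illustrative assumptions.

m = 5.0          # wheel mass (kg)
r = 0.3          # wheel radius (m)
d = 0.15         # distance from support point to wheel's centre of mass (m)
g = 9.81         # gravitational acceleration (m/s^2)
spin_rpm = 300.0 # spin rate of the wheel

I = m * r**2                          # moment of inertia (thin-hoop approximation)
omega = spin_rpm * 2 * math.pi / 60   # spin rate in rad/s
torque = m * g * d                    # gravitational torque about the support
Omega = torque / (I * omega)          # precession rate (rad/s)

print(f"precession rate: {Omega:.3f} rad/s "
      f"(one full circle every {2 * math.pi / Omega:.1f} s)")
```

Note that the faster the wheel spins, the slower it precesses, which is why the effect looks so stately in the flywheel videos.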

So what has all this physics got to do with riding a motorcycle? It’s what gets you around a corner – as simple as that – but the way it does it is completely counter-intuitive. To get the bike to lean over we apply a torque, via the handlebars, perpendicular to the rotational axis, only we apply it in the opposite direction to what we might think. Basically, if you push on the bar in the direction you want to turn, the bike will lean over in that direction. By ‘push’ I mean you push on the left bar to lean left and on the right bar to lean right. This is the counter-intuitive part, because we would think that if we pushed on the left bar the wheel would turn right. In fact, I’ve argued about this with people who ride motorbikes, but I know it’s true because not only do I understand the physics behind it, I put it into practice over a decade of riding.

Now, when the bike leans over, it behaves exactly the same as the fly-wheel in the videos, and, under the force of gravity, the bike precesses around the corner, generating a feeling of weightlessness at the same time.

So that’s the core of the physics of riding a motorcycle, but there’s more. In a car you can swerve and brake at the same time, as any advanced driving course will teach you. But on a bike you can do one or the other, not both. If you brake in a corner, the bike will ‘stand up’ and there is nothing you can do about it. This is different to simply closing the throttle, when the bike will tighten its line (turn tighter). Now, while this quirk of physics may seem catastrophic, it’s what allows you to brake in a corner at all. You see, the bike will still follow the same curved trajectory while it’s slowing down, and it does so without any intervention from you except for the application of brakes.

The other laws of physics I explained in my last post, regarding the inverse law of speed versus rate-of-change of direction, and the braking distance following the speed squared law still apply. In other words, it takes twice as long to change direction at double the speed, and it takes 4 times the distance to brake at double the speed.

Monday 26 May 2014

Why consciousness is unique to the animal kingdom

I’ve written a number of posts on consciousness over the last 7 years, or whenever it was I started blogging, so this is a refinement of what’s gone before, and possibly a more substantial argument. It arose from a discussion in New Scientist, 24 May 2014 (Letters), concerning the evolution of consciousness and, in particular, the question: ‘What need is there of actual consciousness?’ (Eric Kvaalen from France).

I’ve argued in a previous post that consciousness evolved early and it arose from emotions, not logic. In particular, early sentient creatures would have relied on fear, pain and desire, as these confer an evolutionary advantage, especially if memory is also involved. In fact, I’ve argued that consciousness without memory is pretty useless, otherwise the organism (including humans) wouldn’t even know it was conscious (see my post on Afterlife, March 2014).

Many philosophers and scientists argue that AI (Artificial Intelligence) will become sentient. The interesting argument is that ‘we will know’ (referencing New Scientist Editorial, 2 April 2011) because we don’t know that anyone else is conscious either. In other words, the argument goes that if an AI behaves like it’s conscious or sentient, then it must be. However, I argue that AI entities don’t have emotions unless they are programmed artificially to behave like they do – i.e. simulated. And this is a major distinction, if one believes, as I do, that sentience arose from emotions (feelings) and not logic or reason.

But in answer to the question posed above, one only has to look at another very prevalent life form on this planet, which is not sentient, and the answer, I would suggest, becomes obvious. I’m talking about vegetation. And what is the fundamental difference? There is no evolutionary advantage to vegetation having sentience, or, more specifically, having feelings. If a plant was to feel pain or fear, how could it respond? Unlike members of the animal kingdom, it cannot escape the source, because it is literally rooted to the spot. And this is why I believe animals evolved consciousness (sentience by another name) and plants didn’t. Now, there may be degrees of consciousness in animals (we don’t know) but, if feelings were the progenitor of consciousness, we can understand why it is a unique attribute of the animal kingdom and not found in vegetation or machines.

Monday 12 May 2014

How should I live?

This is the 'Question of the Month' in the latest issue of Philosophy Now (Issue 101, March/April 2014). Submissions need to be 400 words or less, so mine is 400 words exactly (refer below).

How should I live?

How I should live and how I do live are not necessarily the same, but having aspirations and trying to live up to them is a good starting point. So the following is how I aspire to live, which I don’t always achieve in practice.

The most important point is that no one lives in isolation. The fact that we all not only speak in a language but also think in a language illustrates how significantly dependent we are on our milieu. What’s more, from our earliest cognitive experiences through the remainder of our lives, we interact with others, and the quality of our lives is largely dependent on that interaction.

Everyone seeks happiness and in modern Western societies this universal goal is taken for granted. But how to achieve it? A tendency to narcissism, tacitly encouraged by the relatively recent innovation of social media, can lead to self-obsession, which we are particularly prone to in our youth. Socrates famously said (or so we believe): ‘The unexamined life is not worth living.’ But a thoughtful analysis of that dictum, when applied to one’s own life, reveals that we only examine our lives when we fail. The corollary to this is that a life without failure is a life not worth living. And this is how wisdom evolves over a life’s experiences: not through success or study but through dealing with life’s trials and tribulations. This is reflected in virtually every story that’s been told: how the protagonist deals with adversity, be it physical or psychological or both. And this is why storytelling is universally appealing.

So how should I live my life? By being the opposite to narcissistic and self-obsessed. By realising that every interaction in my life is an opportunity to make my life more rewarding by making someone else’s life more rewarding. In any relationship, familial, work-related, contractual or whatever, either both parties are satisfied or both are dissatisfied. It is very rare that someone achieves happiness at someone else’s expense, unless they are competing in a sporting event or partaking in a reality TV show.

There is an old Chinese saying, possibly Confucian in origin: If you want to know the true worth of a person, observe the effects they have on other people's lives. A true leader knows that their leadership is not about their personal achievements: it’s about enabling others to realise their own achievements.

Sunday 4 May 2014

Pitfalls of a Democracy

The latest issue of Philosophy Now has as its theme, ‘Democracy’, with a number of articles on the subject covering Plato to contemporary politics. Of particular relevance to this post, Anja Steinbauer ‘explains why Plato had problems with democracy.’ I won’t discuss her article at length, but early on she points out that ‘…it all comes to a head with Socrates: Athenian democracy didn’t like Socrates, which is why the troublesome thinker was democratically put to death.’ The reason I quote this is because it has dramatic relevance to current political events in Australia.

On a recent issue of 4 Corners, whistleblowers and video footage tell us what the government was unwilling to reveal regarding not so recent events at a detention centre for refugees on Manus Island, Papua New Guinea, where an asylum-seeker was killed during a riot. The programme reveals the deeply flawed inhumanity of this particular government policy which was originally introduced by the previous (Labor) government and is now being brutally pursued by the incumbent (Liberal) government. Both sides of politics endorse this policy because it’s a vote-winner and, in fact, the last election campaign was dominated by who could be more successful in ‘stopping the boats’ (containing asylum-seekers) as if we are suffering from a wartime invasion.

The relevance to Steinbauer’s insightful commentary on Plato and Socrates is that, with the explicit support of the general public, a government can execute policies that directly contravene the human rights of people who have no political representation in that country. In essence, we are guilty of inflicting both physical and emotional trauma on people; an action we would condemn if it was being done somewhere else. In short, a democratic process does not necessarily provide the most ethical and moral resolution to a dilemma.

The other side to this is the airing of the programme itself. Only a healthy democratic society can foster a journalistic culture that can openly criticise government policies without fear of retribution.

Saturday 15 March 2014

The physics of driving

This is quite a departure, I know, but one of my hobby-horses is how little people know about the physics of driving. Unlike our man-made road rules, the laws of nature are unbreakable, so a rudimentary knowledge can be useful.

But what prompted me to write this post was a road test I read of the new, upmarket Infiniti Q50 in EVO Australia (March 2014). The big-selling feature of the Infiniti Q50 is its so-called ‘direct adaptive steering’; a world first, apparently for a production car (as opposed to a prototype or research vehicle). It’s a totally ‘fly-by-wire’ steering system, so there is no mechanical connection between the helm and the front wheels. Personally, I think this is a dangerous idea, and I was originally going to title this post, rather provocatively, “A dangerous idea”. Not surprisingly, at least to me, the road-tester found the system more than a little disconcerting when he used it in the real world. It was okay until he wanted to push the car a little, when the lack of feedback through the wheel made him feel somewhat insecure.

There are gyroscopic forces on the front wheels, which naturally increase with cornering force and can be felt through the steering wheel. The wheel weights up in direct proportion to this force (and not the amount of lock applied, as some might think). In other words, it’s a linear relationship, and it’s one of the major sources of determining the cornering force being generated.

I should point out that the main source of determining cornering force is your inner ear, which we all use subconsciously, and is why we all lean our heads when cornering, even though we are unaware of it. It’s the muscle strain on our necks, arising from maintaining our inner ear balance, that tells us how much lateral g-force we have generated. On a motorcycle, we do the opposite, keeping our heads straight while we lean our bodies, so the muscle strain is reversed, but the effect is exactly the same.

Therefore, you may think, we don’t need the steering wheel’s feedback, but there is more. The turn-in to a corner is the most critical part of cornering. This was pointed out to me decades ago, when I was a novice, and experience has confirmed it many times over. Yes, the corner can change radius or camber or both, and you might strike something mid-corner, like loose gravel, but, generally, if the front wheels grip on entry then you know they will grip throughout the rest of the corner. This is the case whether you’re under brakes or not, on a wet or dry surface. It’s possible to loosen traction with a heavy right foot, but most cars have traction control these days, so even that is not an issue for most of us.

The point is that, if the front wheels grip on turn-in, we ‘feel’ it through the steering wheel, because of the gyroscopic relationship between cornering force and the weight of the wheel, and cornering force is directly proportional to the amount of grip. Without this critical feedback at turn-in, drivers will be dependent on visual cues to work out whether the car is gripping or not. What’s more, the transition from grip to non-grip and back won’t be felt through the wheel. If this system becomes the engineering norm, it will make bad drivers of us all.

While I’m on the topic, did you know that at twice the speed it takes four times the distance to pull up to a stop? Perhaps you did, but I bet no one told you when you were learning to drive. The relationship between speed and braking distance is not linear – braking distance is proportional to the speed squared, so 3 times as fast takes 9 times the distance to stop. This is independent of road surface, tyres and make of car – it’s a natural law.
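
The speed-squared law falls out of the energy balance: the kinetic energy (half the mass times speed squared) has to be dissipated by tyre friction over the stopping distance, and the mass cancels out. A quick sketch in Python; the dry-road friction coefficient is my assumption, not a figure from any road test:

```python
# Braking distance sketch: kinetic energy (1/2 m v^2) must be dissipated by
# friction (mu * m * g) acting over the stopping distance, which gives
# d = v^2 / (2 * mu * g). Mass cancels, so the speed-squared law holds for
# any car. MU = 0.7 is an assumed dry-road friction coefficient.

MU = 0.7   # assumed tyre-road friction coefficient
G = 9.81   # gravitational acceleration (m/s^2)

def braking_distance(speed_kmh: float) -> float:
    """Ideal stopping distance in metres (reaction time not included)."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v**2 / (2 * MU * G)

for speed in (30, 60, 120):
    print(f"{speed:>3} km/h -> {braking_distance(speed):6.1f} m")

# Doubling the speed quadruples the distance:
assert abs(braking_distance(120) / braking_distance(60) - 4.0) < 1e-9
```

Changing the friction coefficient (wet road, better tyres) scales all the distances up or down, but the four-times-at-double-speed relationship is untouched, which is the point being made above.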

Another one to appreciate is that at twice the speed, changing direction is twice as slow. There is an inverse relationship between speed and rate of change of direction. This is important in the context of driving on multi-lane highways. A car travelling at half the speed of another – the one overtaking it, say – can change direction twice as fast as the faster car. This is also a law of nature, so even allowing for the superior tyres and dynamics of the faster car, the physics is overwhelmingly against it. This is why the safest speed to travel on multi-lane highways is the same speed as everyone else. An atypically slow car, in these circumstances, is just as dangerous (to other motorists and itself) as an atypically fast car.
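
The inverse law can be sketched the same way: tyre grip caps lateral acceleration at roughly the friction coefficient times gravity, regardless of speed, so the achievable rate of change of heading is that fixed acceleration divided by speed. A minimal illustration in Python (the friction coefficient is again an assumption of mine):

```python
import math

# Rate-of-change-of-direction sketch: grip caps lateral acceleration at about
# a_max = mu * g whatever the speed, so the maximum heading-change rate is
# a_max / v. Doubling the speed therefore halves how fast you can turn.
# MU = 0.7 is an assumed friction coefficient.

MU = 0.7
G = 9.81
A_MAX = MU * G  # maximum lateral acceleration (m/s^2)

def max_turn_rate(speed_kmh: float) -> float:
    """Maximum heading-change rate in degrees per second at a given speed."""
    v = speed_kmh / 3.6
    return math.degrees(A_MAX / v)

for speed in (50, 100):
    print(f"{speed:>3} km/h -> {max_turn_rate(speed):5.1f} deg/s")

# Half the speed, twice the turn rate -- the inverse law:
assert abs(max_turn_rate(50) / max_turn_rate(100) - 2.0) < 1e-9
```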

Addendum: I also wrote a post on the physics of riding a motorcycle.

Saturday 8 March 2014

Afterlife belief – a unique human condition

Recently I’ve been dreaming about having philosophical discussions, which is very strange, to say the least. And one of these was on the topic of the afterlife. My particular insight, from my dream, was that humans have the unique capacity to imagine a life and a world beyond death. It’s hard to imagine that any other creature, no matter its cognitive capacity, would be able to make the same leap. This is not a new insight for me; it’s one my dream reminded me of rather than initiated. Nevertheless, it’s a good starting point for a discussion on the afterlife.

It’s also, I believe, the reason humans came up with religion – it’s hard to dissociate one from the other. Humans are more than capable of imagining fictional worlds – I’ve created a few myself as a sometime sci-fi writer. But imagining a life after death is to project oneself into an eternal continuity, a form of immortality. Someone once pointed out that death is the ultimate letting go of the ego, and I believe this is a major reason we find it so difficult to confront. The Buddhists talk about the ‘no-self’ and ‘no attachments’, and I believe this is what they’re referring to. We all form attachments during life, be it material or ideological or aspirational or through personal relationships, and I think that this is natural, even psychologically necessary for the most part. But death requires us to give all these up. In some cases people make sacrifices, where an ideal or another’s life takes precedence over one’s own ego. In effect, we may substitute someone else’s ego for our own.

Do I believe in an afterlife? Actually, I’m agnostic on that point, but I have no expectation, and, from what we know, it seems unlikely. I have no problem with people believing in an afterlife – as I say, it’s part of the human condition – but I have a problem when people place more emphasis on it than the current life they’re actually living. There are numerous stories of people ostracizing their children, on religious grounds, because seeking eternal paradise is more important than familial relationships. I find this perverse, as I do the idea of killing people with the promise of reaching heaven as a reward.

Personally, I think it’s more healthy to have no expectation when one dies. It’s no different to going to sleep or any other form of losing consciousness, only one never regains it. No one knows when they fall asleep or when they lose consciousness, and the same applies when one dies. It leaves no memory, so we don’t know when it happens. There is an oft-asked question: why is there something rather than nothing? Well, consciousness plays a big role in that question, because, without consciousness, there might as well be nothing. ‘Something’ only exists for ‘you’ while you are alive.

Consciousness exists in a continuous present, and, in fact, without consciousness, the concepts of past, present and future would have no meaning. But more than that, without memory, you would not even know you have consciousness. In fact, it is possible to be conscious, or act conscious, whilst believing, in retrospect, that you were unconscious. It can happen when you come out of anaesthetic (it’s happened to me) or when you’re extremely intoxicated with alcohol or when you’ve been knocked unconscious by a blow. In these admittedly rare and unusual circumstances, one can be conscious and behave consciously, yet create no memories, so effectively be unconscious. In other words, without memory (short-term memory) we would all be subjectively unconscious.

So, even if there is the possibility that one’s consciousness can leave behind the body that created it, after corporeal death, it would also leave behind all the memories that give us our sense of self. It’s only our memories that give us our sense of continuity, and hence our sense of self.

Then there is the issue of afterlife and infinity. Only mathematicians and cosmologists truly appreciate what infinity means. The point is that if you have an infinite amount of time and space then anything that can happen once can happen an infinite number of times. This means that, with infinity, in this world or any other, there would be an infinite number of you and me. But, not only am I not interested in an infinite number of me, I don’t believe anyone would want to live for infinity if they really thought about it.

At the start, I mentioned that I believe religion arose from a belief in the afterlife. Having said that, I think religion comes from a natural tendency to look for meaning beyond the life we live. I’ve made the point before, that if there is a purpose beyond the here and now, it’s not ours to know. And, if there is a purpose, we find it in the lives we live and not in an imagined reward beyond the grave.

Saturday 1 February 2014

Quantum mechanics without complex algebra

 
In my last post I made reference to a comment Noson Yanofsky made in his book, The Outer Limits of Reason, whereby he responded to a student’s question on quantum mechanics: specifically, why does quantum mechanics require complex algebra (√-1) to render it mathematically meaningful?

Complex numbers always take the form a + ib, which I explain in detail elsewhere, but it is best understood graphically, whereby a exists on the Real number line and b lies on the ‘imaginary’ axis orthogonal to the Real axis. (i = √-1, in case you’re wondering.)
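
As it happens, Python has complex numbers built in (it writes i as j, engineering-style), which makes the a + ib form and its graphical interpretation easy to play with:

```python
import cmath

# A complex number a + ib: a sits on the real axis, b on the orthogonal
# "imaginary" axis, so the number is naturally pictured as a point in a plane.
z = 3 + 4j             # a = 3 on the real axis, b = 4 on the imaginary axis

print(z.real, z.imag)  # the a and b components
print(abs(z))          # modulus: the distance from the origin (here, 5.0)

# i * i = -1 is the defining property:
assert 1j * 1j == -1

# The same number in polar form: a length and an angle, which is the
# representation that turns up in electrical engineering and quantum mechanics.
r, theta = cmath.polar(z)
print(r, theta)
```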

In last week’s New Scientist (25 January 2014, pp.32-5), freelance science journalist, Matthew Chalmers, discusses the work of theoretical physicist, Bill Wootters of Williams College, Williamstown, Massachusetts, who has attempted to rid quantum mechanics of complex numbers.

Chalmers introduces his topic by explaining how i (√-1) is not a number as we normally understand it – a point I’ve made myself in previous posts. You can’t count an i quantity of anything, and, in fact, I’ve argued that i is best understood as a dimension, not a number per se, which is how it is represented graphically. Chalmers also alludes to the idea that i can be perceived as a dimension, though he doesn’t belabour the point. Chalmers also gives a very brief history lesson, explaining how i has been around since the 16th century at least, when it allowed adventurous mathematicians to solve certain equations. In fact, in its early manifestation it tended to be a temporary device that disappeared before the final solution was reached. But later it became as ‘respectable’ as negative numbers, and now it makes regular appearances in electrical engineering and analyses involving polar co-ordinates, as well as quantum mechanics, where it seems to be a necessary mathematical ingredient. You must realise that there was a time when negative numbers and even zero were treated with suspicion by ancient scholars.

As I’ve explained in detail in another post, quantum mechanics has been rendered mathematically as a wave function, known as Schrödinger’s equation. Schrödinger’s equation would have been stillborn, as it explained nothing in the real world, were it not for Max Born’s ingenious insight to square the modulus (amplitude) of the wave function and use it to give a probability of finding a particle (including photons) in the real world. The point is that once someone takes a measurement or makes an observation of the particle, Schrödinger’s wave function becomes irrelevant. It’s only useful for making probabilistic predictions, albeit very accurate ones. But what’s mathematically significant, as pointed out by Chalmers, is that Born’s Rule (as it’s called) gets rid of the imaginary component of the complex number, and makes it relevant to the real world with Real numbers, albeit as a probability.
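
Born’s Rule is simple enough to demonstrate in a few lines. Here is a toy two-outcome state with made-up amplitudes (not any particular physical system): squaring the modulus of each complex amplitude yields a real probability, and the probabilities sum to 1:

```python
import math

# Born's Rule in miniature: a quantum state assigns a complex amplitude to
# each possible outcome; the probability of an outcome is the squared modulus
# |amplitude|^2, which is always a real number. The amplitudes below are a
# made-up equal-weight example.

amplitudes = {
    "outcome A": complex(1 / math.sqrt(2), 0),  # 1/sqrt(2): purely real
    "outcome B": complex(0, 1 / math.sqrt(2)),  # i/sqrt(2): purely imaginary
}

# Squaring the modulus discards the imaginary component entirely:
probabilities = {k: abs(a) ** 2 for k, a in amplitudes.items()}

for outcome, p in probabilities.items():
    print(f"{outcome}: amplitude {amplitudes[outcome]}, probability {p:.2f}")

# Real numbers that sum to 1 -- a legitimate probability distribution:
assert abs(sum(probabilities.values()) - 1.0) < 1e-12
```

Notice that outcome B’s amplitude is purely imaginary, yet its probability is an ordinary real number, which is exactly the ‘getting rid of the imaginary component’ that Chalmers describes.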

Wootters’ ambition to rid quantum mechanics of imaginary numbers started when he was a PhD student, but later became a definitive goal. Not surprisingly, Chalmers doesn’t go into the mathematical details, but he does explain the ramifications. Wootters has come up with something he calls the ‘u-bit’, and what it tells us is that if we want to give up complex algebra, everything is connected to everything else.

Wootters’ expertise is in quantum information theory, so he’s well placed to explore alternative methodologies. If the u-bit is a real entity, it must rotate very fast, though this is left unexplained. Needless to say, there is some scepticism as to its existence apart from a mathematical one. I’m not a theoretical physicist, more of an interested bystander, but my own view is that quantum mechanics is another level of reality – a substrate, if you like, to the world we interact with. According to Richard Elwes (Maths 1001, pp.383-4): ‘…the wave function Ψ permeates all of space… [and when a measurement or observation is made] the original wave function Ψ is no longer a valid description of the state of the particle.’

Many physicists also believe that Schrödinger’s equation is merely a convenient mathematical device, and therefore the wave function doesn’t represent anything physical. Whether this is true or not, its practical usefulness suggests it can tell us something important about the quantum world. The fact that it ‘disappears’, or becomes irrelevant, once the particle becomes manifest in the physical world, suggests to me that there is a disjunct between the two physical realms. And the fact that the quantum world can only be understood with complex numbers simply underlines this disjunction.

Friday 3 January 2014

The Introspective Cosmos


I haven’t written anything meaty for a while, and I’m worried I might lose my touch. Besides, I feel the need to stimulate my brain and, hopefully, yours in the process.

Just before Christmas, I read an excellent book by Noson S. Yanofsky, titled: The Outer Limits of Reason; What Science, Mathematics, and Logic CANNOT Tell Us. Yanofsky is Professor in the Department of Computer and Information Science at Brooklyn College and the Graduate Center of the City University of New York. He is also co-author of Quantum Computing for Computer Scientists (which I haven’t read).

Yanofsky’s book (the one I read) covers a range of topics, including classical and quantum physics, chaos theory, determinism, free will, Gödel’s Incompleteness Theorem, the P versus NP problem, the anthropic principle and a whole lot more. The point is that he is well versed in all these areas, yet he’s very easy to read. His fundamental point, delineated in the title, is that it is impossible for us to know everything. And there will always be more that we don’t know compared to what we do know. Anyone with a remote interest in epistemology should read this book. He really does explain the limits of our knowledge, both theoretically and practically. At the end of each section he gives a synopsis of ‘further reading’, not just a list. I found the book so compelling, I even read all the ‘Notes’ in the appendix (something I rarely do).

Along the way, he explains things like countable infinities and uncountable infinities and why it is important to make the distinction. He also explains the difference between computing problems that are simply incomputable and computing problems that are computable but would take more time than the Universe allows, even if the computer was a quantum computer.
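
Cantor’s diagonal argument, which underlies the countable/uncountable distinction Yanofsky explains, can be sketched in miniature: given any list of binary sequences, flipping the digits along the diagonal produces a sequence that differs from every entry in the list, so no list, however long, can contain them all. A finite stand-in for the infinite version:

```python
# Cantor's diagonal argument in miniature. Take any list of binary sequences
# (a finite stand-in for an infinite enumeration) and flip the n-th digit of
# the n-th row. The resulting sequence differs from row n at position n, for
# every n, so it cannot appear anywhere in the list. This is why the real
# numbers (uncountable) cannot be listed the way the integers (countable) can.

rows = [
    "0110",
    "1010",
    "0011",
    "1111",
]

# Flip each diagonal digit:
diagonal = "".join("1" if rows[n][n] == "0" else "0" for n in range(len(rows)))
print(diagonal)

# The diagonal sequence disagrees with every row:
for n, row in enumerate(rows):
    assert diagonal[n] != row[n]
```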

He discusses, in depth, philosophical issues like the limitations of mathematical Platonism, and provides compelling arguments that the mathematics we use to describe physical phenomena invariably have limitations that the physical phenomena don’t. In other words, no mathematical equation, no matter its efficacy, can cover all physical contingencies. The physical world is invariably more complex than the mathematics we use to interpret it, and a lot of the mathematical tools we use deal with large scale averages rather than individual entities – like the universal gas equation versus individual molecules.

He points out that there is no ‘fire in the equations’ (as does Lee Smolin in Time Reborn, which I’ve also read recently) meaning mathematics can describe physical phenomena but can’t create them. My own view is that mathematics is a code that only an intelligence like ours can uncover. As anyone who reads my blog knows, I believe mathematics is largely discovered, not invented. Marcus du Sautoy presented a TV programme called The Code, which exemplifies this view. But this code is somehow intrinsic in nature in that the Universe obeys laws and the laws not only require mathematics to quantify them but, without mathematics, we would not know their existence except, possibly, at a very rudimentary and uninformed level.

Yanofsky discusses Eugene Wigner’s famous declaration concerning ‘The Unreasonable Effectiveness of Mathematics’ and concludes that it arises from the fact that we use mathematics to probe the physical world, and that, in fact, leaving physics aside, there is a ‘paucity of mathematics in general science’. But in the next paragraph, Yanofsky says this:

The answers to Wigner’s unreasonable effectiveness leads to much deeper questions. Rather than asking why the laws of physics follow mathematics, ask why there are any laws at all.

In the same vein, Yanofsky gives a personal anecdote of a student asking him why complex numbers work for quantum mechanics. He answers that ‘…the universe does not function using complex numbers, Newton’s formula, or any other law of nature. Rather, the universe works the way it does. It is humans who use the tools they have to understand the world.’ And this is completely true as far as it goes, yet I would say that complex numbers are part of ‘the code’ required to understand one of the deepest and fundamental mysteries of the Universe.

Yanofsky’s fundamental question, quoted above, ‘why are there any laws at all?’ leads him to discuss the very structure of the universe, the emergence of life and, finally, our place in it. In fact he lists this as 3 questions:

1: Why is there any structure at all in the universe?
2: Why is the structure that exists capable of sustaining life?
3: Why did this life-sustaining structure generate a creature with enough intelligence to understand the structure?

I’ve long maintained that the last question represents the universe’s greatest enigma. There is something analogous here between us as individuals and the cosmos itself. We are each an organism with a brain that creates something we call consciousness that allows us to reflect on ourselves, individually. And the Universe created, via an extraordinarily convoluted process, the ability to reflect on itself, its origins and its possible meaning.

Not surprisingly, Yanofsky doesn’t give any religious answers to this but, instead, seems to draw heavily on Paul Davies (whom he acknowledges generously at the end of the chapter) in providing various possible answers to these questions, including John Wheeler’s controversial thesis that the universe, via a cosmic scale quantum loop, has this particular life and intelligence generating structure simply because we’re in it. I’ve discussed these issues before, without coming to any definitive conclusion, so I won’t pursue them any further here.

In his notes on this chapter, Yanofsky makes this point:

Perhaps we can say that the universe is against having intelligent life and that the chances of having intelligent life are, say, 0.0000001 percent. We, therefore, only see intelligent life in 0.0000001 percent of the universe.

This reminds me of John Barrow’s point, in one of his many books, that the reason the universe is so old, and so big, is because that’s how long it takes to create complex life, and, because the universe is uniformly expanding, age and size are commensurate.

So Yanofsky’s is a deep and informative book on many levels, putting in perspective not only our place in the universe but the infinite knowledge we will never know. Towards the end he provides a table that summarises the points he delineates throughout the book in detail:

Solvable computer problems                             Unsolvable computer problems
Describable phenomena                                    Indescribable phenomena
Algebraic numbers                                            Transcendental numbers
(Provable) mathematical statements                 Mathematical facts

Finally, he makes the point that, in our everyday lives, we make decisions based primarily on emotions, not reason. We seem to have transcended our biological and evolutionary requirements when we turned to mathematics and logic to comprehend phenomena hidden from our senses and attempted to understand the origin and structure of the universe itself.

Saturday 7 December 2013

Dr Who 50th Anniversary Special


A bit late, I know, as it was 2 weeks ago, but worthy of a post. Despite my advanced years, I didn’t see Dr Who in my teenage years when it first came to air. I really only became a fan with the resurrection or second coming in 2005, when Russell T Davies rebooted it with Christopher Eccleston as the Doctor. But, personally, I liked David Tennant and then Matt Smith’s renditions and it was a pleasure to see them together in the 50th Anniversary special, The Day of the Doctor, alongside John Hurt, who was an inspirational casting choice. One should also mention Steven Moffat, who, as chief writer, deserves credit for making the show a monumental success. Writers rarely get the credit they deserve.

I recently re-watched episodes involving David Tennant and Matt Smith, and I particularly liked the narrative involving Martha Jones, played by Freema Agyeman, who, as far as I know, is the first non-white ‘companion’. Arguably, as significant as Halle Berry’s appearance as a ‘Bond girl’. My favourite episode was the ‘Weeping Angels’ because it was so cleverly structured from a time-travel perspective.

I saw the 50th Anniversary Special in a cinema in 3D (good 3D as opposed to bad 3D) and I’ve since watched it again on ABC’s iview (expires today). It was also great to see Billie Piper recreate her role as Rose Tyler or Bad Wolf, albeit in a subtly different guise. It was one of many clever elements in this special. At its heart it contains a moral dilemma – a la John Stuart Mill – which was mirrored in one of the subplots. The interaction between John Hurt’s Doctor and Billie Piper’s sentient AI conscience is one of the highlights of the entire story, which was reinforced when I watched it for the second time. I know that some people had trouble following the time jumps and plot machinations, but that wasn’t an issue for me. To create a doomsday device to end all doomsday devices and give it a sentient conscience is a stroke of narrative genius. At 1hr 16 mins it’s not quite movie-length, yet it shows that length is not a criterion for quality. I found it witty, clever and highly entertaining, both in story context and execution; suitably engaging for a 50th Anniversary celebration.

Postscript: I should confess that the Daleks had an influence on ELVENE, which is readily spotted by any fan of popular Sci-Fi culture.

Monday 30 September 2013

Probability and Causality – can they be reconciled in our understanding of the universe?

In last month’s Philosophy Now (July/August 2013) Raymond Tallis wrote an interesting and provocative article (as he often does) on the subject of probability and its relationship to quantum mechanics and causality (or not). He started off by referencing a talk he gave at the Hay Festival in Wales titled, ‘Has Physics Killed Philosophy?’ According to Tallis, no, but that’s neither the subject of his article nor this post.

Afterwards, he had a conversation with Raja Panjwani, who apparently is both a philosopher and a physicist as well as ‘an international chess champion’. They got to talking about how, in quantum mechanics, ‘causation has been replaced by probability’ unless one follows the ‘many-worlds’ interpretation of quantum mechanics, whereby every causal effect is realised in some world somewhere. One of the problems with the many-worlds view (not discussed by Tallis) is that it doesn’t account for the probability of an event occurring in ‘our world’ as dictated by Schrodinger’s equation and Born’s rule. (I’ve written an entire post on that subject if the reader is interested.)

David Deutsch, the best known advocate of the many-worlds interpretation, claims that the probabilities are a consequence of how many worlds there are for each quantum event; but if there are infinite possibilities, as the theory seems to dictate according to Feynman’s path integral method, then there would be an infinite number of worlds and every probability would be one. It has to be said that Deutsch is much cleverer than me, so he probably has an answer to that, which I haven’t seen.

Tallis’s discussion quickly turns to coin-tossing, as did his conversation with Panjwani apparently, to demonstrate to ordinary people (i.e. non-physicists) how probabilities, despite appearances to the contrary, are non-causal. In particular, Tallis makes the point, often lost on gamblers, that a long sequence of ‘Heads’ (for example) has no consequence for the next coin toss, which is still equally likely to come up ‘Heads’ or ‘Tails’. But, assuming that the coin is ‘fair’ (unbiased), we know that the probability of a long sequence of ‘Heads’ (or ‘Tails’) becomes exponentially smaller as the sequence gets longer. So what is the connection? I believe it’s entropy.
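Both of Tallis’s points – the independence of the next toss, and the vanishing probability of long runs – can be made precise in a few lines of Python (a sketch; the function names are mine):

```python
from fractions import Fraction

def prob_run(n):
    """Probability that n independent tosses of a fair coin all land Heads."""
    return Fraction(1, 2) ** n

def prob_next_heads(previous_run_length):
    """The next toss is independent of any previous run: always 1/2."""
    return Fraction(1, 2)

# A run of 10 Heads has probability 1/1024, yet the 11th toss is still 50:50.
```

The gambler’s fallacy, in other words, is conflating the (tiny) probability of the whole run with the (unchanged) probability of the next toss.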

Erwin Schrodinger, in his book What is Life? (a series of lectures, actually), gives the example of shuffling cards to demonstrate entropy, which also involves probabilities, as every poker player knows. In other words, entropy, which is one of the fundamental laws of the universe, is directly related to probability. To take the classic example of perfume diffusing from a bottle into an entire room, what is the probability of all the molecules of the perfume ending up back in the bottle? Infinitesimal. In other words, there is a much, much higher probability of the perfume being evenly distributed throughout the entire room, gusts of wind and air-conditioning notwithstanding. Entropy is also linked to the arrow of time, but that’s another not entirely unrelated topic, which I may return to.
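Schrodinger’s card-shuffling example is easy to quantify (a Python sketch; the variable names are mine). The number of orderings of a deck is so large that any one particular ordered state is effectively unreachable by chance, which is entropy in miniature:

```python
import math

# Distinct orderings of a 52-card deck: 52! (roughly 8 x 10^67)
orderings = math.factorial(52)

# Probability that a fair shuffle lands on one particular ordering
p_one_ordering = 1 / orderings
```

The disordered states vastly outnumber the ordered ones, just as the spread-out perfume states vastly outnumber the back-in-the-bottle state.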

Tallis then goes on to discuss how each coin toss is finely dependent on the initial conditions, which is chaos theory. It seems that Tallis was unaware that he was discussing entropy and chaos theory, or, if he was, he didn’t want to muddy the waters. I’ve discussed this elsewhere in more detail, but chaos is deterministic yet unpredictable and seems to be entailed in everything from galactic formation to biological evolution. In other words, like entropy and quantum mechanics, it seems to be a fundamental factor in the universe’s evolvement.

Towards the end of his article, Tallis starts to talk about time and references physicist, Carlo Rovelli, whom he quotes as saying that there is ‘a possibility that quantum mechanics will become “a theory of the relations between variables, rather than a theory of the evolution of variables in time.”’ Now, I’ve quoted Rovelli previously (albeit second-hand from New Scientist) as claiming that at the basic level of physics, time disappears. The relevance of that assertion to this discussion is that causality doesn’t exist without time. Schrodinger’s time dependent equation is dependent on an ‘external clock’ and can only relate to ‘reality’ through probabilities. These probabilities are found by multiplying components of the complex equation with their conjugates, and, as Schrodinger himself pointed out, that is equivalent to solving the equation both forwards and backwards in time (ref: John Gribbin, Erwin Schrodinger and the Quantum Revolution, 2012).

So it is ‘time’ that is intrinsic to causality as we observe and experience it in everyday life, and time is a factor, both in entropy and chaos theory. But what about quantum mechanics? I think the jury is still out on that to be honest. The many-worlds interpretation says it’s not an issue, but John Wheeler’s ‘backwards in time’ thought experiment for the double-slit experiment (since confirmed, according to Paul Davies) says it is.

When I first read Schrodinger’s provocative and insightful book, What is Life? one of the things that struck me (and still does) is how everything in the universe seems to be dependent on probabilities, especially on a macro scale. Einstein famously said “God does not play dice” in apparent frustration at the non-determinism inherent in quantum mechanics, yet I’d say that ‘God’ plays dice at all levels of nature and evolution. And causality seems to be a consequence, an emergent property at a macro level, without which we would not be able to make sense of the world at all.

Monday 16 September 2013

I support Montreal protesters for right of religious expression

This is such wrong-thinking that I can’t imagine the motivation behind it. I spent 6 weeks in Montreal in 2001 and have very fond memories of it – it has some similarities with my hometown for the past 2 decades, Melbourne, Australia. (Mind you, I was there in July, so I don’t know what it’s like in winter.)

In Melbourne, this legislation would be unthinkable by any major political party and would receive the strongest opposition. We once had a conservative female politician who wanted to ban the hijab in schools, but it never gained any political traction.

At the very least, the legislation proposed by Premier Pauline Marois is discriminatory, and, at worst, it’s Orwellian. In particular, it tacitly encourages prejudice and discrimination against people who hold religious beliefs. In my view, it’s the antithesis of what a secular society stands for.

Sunday 18 August 2013

Moral relativism – a much abused and misconstrued term

The latest issue of Philosophy Now (Issue 97; July/August 2013) has as its theme ‘The Self’, but there are a couple of articles that touch on ethics and morality, including one that looks at moral relativism (Julien Beillard, pp. 23-4). In a nutshell, Beillard claims that moral relativism is ‘unintelligible’ because, to the moral relativist, all moral stances are equally true and equally false, which is patently ‘absurd’. I know it’s unfair to reduce a 2-page argument to a one-liner, but it’s not the direction I wish to take, though I think he has a point.

In another article, Joel Marks (p.52) expounds on 3 books he’s written on the subject of Ethics without Morals (one of the titles) without actually arguing his case, so I can’t comment, let alone analyse his position, without reading at least one of them. The reason I raise it at all is because he briefly discusses the idea of ‘morality [being] independent of religion’. Marks calls himself an ‘amoralist’, but again, this is not the direction I wish to take.

Moral relativism is one of the most abused terms one finds on blogs like mine, especially by religious fundamentalists. It’s a reflex action for many of them when faced with an atheist or a non-theist. (I make the distinction, because non-theists don’t particularly care, whereas atheists tend to take a much harder stance towards religion in general.) The point is that fundamentalists take immediate refuge in the belief that all atheists must be moral relativists, which is just nonsense. To paraphrase Marks (out of context) they believe that ‘secular moralists …are on much less secure ground than traditional theism, because it purports to issue commands… without a commander (God).’ (parentheses in the original.)

The point I’d make, in response to this, is that people project their morality onto their God rather than the other way round. For example, homophobes have a homophobic God, and they will find the relevant text to support this view, disregarding the obvious point that the text was written by a human, just as mortal as themselves. Others of the same faith, but a different disposition towards homosexuality, will find relevant texts to support the exact opposite point of view. This was recently demonstrated on this TV panel discussion, involving opposing theologians from the Catholic Church, Judaism and Islam (some were audience members and one was video-linked from California). And one has to ask the obvious question, given the context of this discussion: is this moral relativism in action?

The point is that most moral attitudes and beliefs are cultural, and that includes all the religious ones as well. And like all cultural phenomena, morality evolves, so what was taboo generations ago, can become the social norm, and gay marriage is a good example of a social norm in transition. It also highlights the point that conservative voices like to keep the status quo (some even want to turn the clock back) while radical voices tend to advocate change, which we all recognise, politically, as being right and left. But over generations the radical becomes the status quo, and eventually conservatives defend what was once considered radical, which is how morality evolves.

I would argue that no one ever practices moral relativism – I’ve never met one and I never expect to. Why? Because everyone has a moral stance on virtually every moral question. In effect, this is exactly the point that Beillard makes, albeit in a more roundabout way. The real question is where do they get that stance? For conservatives, the answer is tradition and often religion. But there are liberal theologians as well, so religion is very flexible, completely dependent on the subjective perspective of its adherents.

In a secular pluralist society, like the one I live in, there are many diverse moral views (on topics like gay marriage) as the abovementioned TV discussion demonstrates. Abortion is another example that can be delineated pretty much between conservatives and liberals. Are these examples of moral relativism? No, they are examples of diverse cultural norms and topics of debate, as they should be. Some of these issues are decided, for the society as a whole, in Parliaments, where democratically elected members can discuss and argue, sometimes being allowed a conscience vote. In other words, they don’t have to follow party lines. Gay marriage is an issue that should be allowed a conscience vote, though one conservative party, in our country, still refuses. As Penny Wong, a gay member of parliament and a mother, says in the above debate: the issue will only be resolved when both major parties allow a conscience vote. This is democracy in action.

So moral relativism has to be looked at in the context of an evolving culture, where once-radical causes of the past, like abolition and women’s right to vote, have become the accepted norm, even for conservatives. The same will eventually occur for gay marriage, as we are seeing the transition occurring all over the world. There really is no such thing as moral relativism, except as a catch-phrase for religious conservatives to attempt to sideline their philosophical opponents. No one is a moral relativist for as long as they hold a philosophical position on a moral issue, and that includes most of us.

Addendum: This is Penny Wong's speech to parliament that effectively demonstrates the evolvement of social norms. It says a lot about Australian politics that she's effectively talking to an empty chamber.


Monday 29 July 2013

Why the economic growth paradigm is past its use-by date

Last week’s New Scientist (20 July 2013, pp.42-5) had an intriguing article on the relationship between demographics and economic health in various countries. It’s not the first time that they’ve featured this little known aspect of political and economic interaction, but this article was better than the previous one, because the interactions they describe are more obviously perceived. Basically, the median age of a country is a determining factor in that country’s economic future.

Economic growth is related to burgeoning population growth, which led to the so-called 'Asian Tigers' in the 1980s and the huge spurt in post-war economic growth in Western countries, as well as Japan. Many of these countries, like Japan and much of Europe, are now economically stagnant due to ageing populations, so you can see the relationship between median age and economic growth. The author, Fred Pearce, claims that even China’s against-the-trend growth will be stymied by their ‘one child’ policy in coming generations.

But stability is also an issue and ageing populations are more politically stable, whereas youthful countries trying to embrace democracy (like Egypt and Afghanistan) are struggling and unlikely to succeed in the near future.

Countries like the US, Canada and Australia depend on immigration to maintain economic growth. In Australia, it is ridiculous that our economic health is always gauged by new home construction, which is obviously dependent on sustained population growth (only yesterday, the flag went up that housing had slumped, and therefore we were in trouble). It’s ridiculous because ‘sustained population growth’ has limits, and those limits are beginning to be experienced in many Western countries, especially Europe.

The problem, which is readily understood in this context, is that economic growth is married to a youthful burgeoning population without limits, which obviously can’t be sustained indefinitely. Yet all our so-called ‘future’ policies ignore this fact of nature. It’s ironic that conservative politics are determined to keep everything the same, yet it’s these very policies that will create the greatest change the planet has ever seen, and not necessarily for the better.

Friday 19 July 2013

Writing’s 3 Essential Skills

I’ve written on this topic before and even given classes in it, as well as talks, but this is a slightly different approach. Basically, I’m looking at the fundamental skills one has to acquire or develop in order to write fiction, as opposed to non-fiction. In a nutshell, they are the ability to create character, the ability to create emotions and the ability to create narrative tension. None of these are required for ordinary writing but they are all requisite skills for fiction. I’ll address them in reverse order.

Some people may prefer the term ‘narrative drive’ to ‘narrative tension’, but the word tension is more appropriate in my view. Tension is antithetical to resolution and plays a comparable, if less obvious, role in music. Narrative tension can be manifest in many forms, but it’s essential to fiction because it’s what motivates the reader to turn the page. A novel without narrative tension won’t be read. You can have tension between characters in many forms: sexual, familial, between colleagues, or between protagonist and antagonist. Tension can be created by jeopardy, which is suspense, or by anticipation, or by knowledge semi-revealed. In a word, this is called drama. And, of course, all these forms can be combined to occur in parallel or in series, and have different spans over the duration of the story. Tension requires resolution, and the resolution is no less important a skill than the tension itself. Ideally, you want tension in some form on every page.

Emotion is what art is all about, and the greatest exemplar is music. Music is arguably the purest art form because it is the most emotive. Nowhere is this more apparent than in cinema, where it is employed so successfully that the audience, for the most part, is unaware of its presence, yet it manipulates you emotionally as much as anything on the screen. In novels, the writer doesn’t have access to this medium, yet he or she must be equally adept at manipulating emotions. And, once again, this is an essential skill, otherwise the reader will find the story lifeless. Novels can make you laugh, make you cry, make you horny, make you scared and make you excited, sometimes all in the same book.

Normally, I start any discussion on writing with character, because it is the most essential skill of all. I can’t tell you how to create characters – it’s one of those skills that comes with practice – I only know that I do it without thinking about it too much. For me, when I’m writing, the characters take on a life of their own, and if they don’t, I know I’m wasting my time. But there is one thing I’ll say about characters, based on other reading I’ve done, and that is if I’m not sympathetic to the protagonist(s) I find the story an ordeal. If the protagonist is depressed, I get depressed; if the protagonist is an angry young man, I find myself avoiding his company; if the protagonist is a pretentious prat, I find myself wishing they’d have an accident. It’s a very skilled writer who can engage you with uninviting characters, and I’m not one of them.

There is a link between character and emotion, because the character is the channel through which you feel emotion. A story is told through its characters, including description and exposition. If you want to describe something or explain something do it through the characters' senses and introspection.

Finally, why is crime the most popular form of fiction? Because crime often involves a mystery or a puzzle and invariably involves suspense, which is a guaranteed form of narrative tension. The best crime fiction (for example, Scandinavian) involves psychologically authentic characters, and that will always separate good fiction from mediocre. We like complex drawn characters, because they feel like people we know, and their evolvement is one of the reasons we return to the page.

Saturday 13 July 2013

Malala Day


A 16 year old girl, shot by the Taliban for going to school, stands defiant and delivers an impassioned and inspirational speech to the United Nations General Assembly. This girl not only represents the face of feminism in Islam but represents the future of women all over the world. Education is the key to humanity's future and, as the Dalai Lama once said, ignorance is one of the major poisons of the mind. Ignorance is the enemy of the 21st Century. May this day go down in history as the representation of a young girl's courage and determination to forge her own future in a society where the idea is condemned.

Tuesday 25 June 2013

Fruits of Corporate Greed


A couple of years ago I wrote a post about global feudalism, but it’s much worse than I thought.

This eye-opening programme is shameful. As Kerry O’Brien says at the end: out of sight, out of mind. This is the so-called level playing field in action. Jobs going overseas because the labour is cheaper. Actually it’s jobs going overseas because it’s virtually slave labour - I’m talking literally not figuratively.

But more revelatory than anything else is that there is no code of ethics for these companies unless it is forced upon them. They really don’t care if the workers, who actually create the products they sell, die or are injured or are abused. When things go wrong they do their best to avoid accountability, and, like all criminals, only own up when incontrovertible evidence is produced.

Algebra - the language of mathematics


I know I’m doing things back-to-front – arse-about, as we say in Oz (and possibly elsewhere) – but, considering all the esoteric mathematics I produce on this blog, I thought I should try and explain some basics.

As I mentioned earlier this year in a post on ‘analogy’, mathematics is a cumulative endeavour and you can’t understand calculus, for example, if you don’t know algebra. I’ve come across more than a few highly intelligent people, of both sexes, who struggle with maths (or math as Americans call it) and the sight of an equation stops them in their tracks.

Mathematics is one of those topics where the gap between what you are expected to know and what you actually learn can grow as you progress through school, mainly because you were stumped by algebra. You know: the day you were suddenly faced with numbers being replaced by letters; and things like counting, adding, subtracting, dividing, multiplying, fractions and even decimals suddenly seemed irrelevant. In other words, everything you’d learned about mathematics, which was firmly grounded in numbers – something you’d learned almost as soon as you could talk – suddenly seemed useless. Even Carl Jung, according to his autobiography, stopped understanding maths the day he had to deal with ‘x’. In fact, his wife, Emma, had a better understanding of physics than Jung did.

But for those who jump this hurdle, seemingly effortlessly, ‘x’ is a liberator in the same way that the imaginary number i is perceived by those who appreciate its multi-purposefulness. In both cases, we can do a lot more than we could before, and that is why algebra is a stepping-stone to higher mathematics.

Fundamentally, mathematics is not so much about numbers as the relationship between numbers, and algebra allows us to see the relationships without the numbers, and that’s the conceptual hurdle one has to overcome.

I’ll give a very simple example that everyone should know: Pythagoras’s triangle.

I don’t even have to draw it, I only have to state it: a² + b² = c²; and you should know what I’m talking about. But a picture is worth innumerable words.

The point is that we can use actual integers, called Pythagorean triples, that obey this relationship; the smallest being 5² = 4² + 3². Do the math, as you Americans like to say.

But the truth is that this relationship applies to all Pythagorean triangles, irrespective of their size, length of sides and units of measurement. The only criteria are that the triangle is ‘flat’, or Euclidean (not on a curved surface), and contains one right angle (90°).

By using letters, we have stated a mathematical truth, a universal law that applies right across the universe. Pythagoras’s triangle was discovered well before Pythagoras (circa 500BC) by the Egyptians, Babylonians and the Chinese, and possibly other cultures as well.

Most of the mathematics that I do involves the manipulation of algebraic equations, including a lot of the stuff I describe on this blog. If you know how to manipulate equations, you can do a lot of mathematics; if you don’t, you can’t do any.

A lot of people are taught BIDMAS, which gives the priority of working out an equation: Brackets, Indices, Division, Multiplication, Addition and Subtraction. To be honest, I’ve never come across a mathematician who uses it.
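For what it’s worth, BIDMAS is simply a statement of operator precedence, and most programming languages bake the same priorities in, so the rules can be checked directly (a quick Python illustration):

```python
# Python's evaluation order follows BIDMAS:
# brackets, then indices (exponents), then division/multiplication,
# then addition/subtraction.
checks = [
    (2 + 3 * 4, 14),        # multiplication before addition
    ((2 + 3) * 4, 20),      # brackets first
    (2 ** 3 * 4, 32),       # indices before multiplication
    (20 / 4 + 1, 6.0),      # division before addition
]
assert all(got == want for got, want in checks)
```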

On the other hand, a lot of maths books talk about the commutative law, the associative law and the distributive law as the fundaments of algebra.

There is a commutative law for addition and a commutative law for multiplication, which are both simple and basic.

A + B = B + A  and  A x B = B x A (that’s it)

Obviously there is no commutative law for subtraction or division.

A – B ≠ B – A  and  A/B ≠ B/A (pretty obvious)

There are some areas of mathematics where this rule doesn’t apply, like matrices, but we won’t go there.

The associative law also applies to addition and multiplication.

So A + (B + C) = (A + B) + C  and  A x (B x C) = (A x B) x C

It effectively says that it doesn’t matter what order you perform these operations you’ll get the same result, and, obviously, you can extend this to any length of numbers, because any addition or multiplication creates a new number that can then be added or multiplied to any other number or string of numbers.

But the most important rule to understand is the distributive law because it combines addition and multiplication and can be extended to include subtraction and division (if you know what you're doing). The distributive law lies at the heart of algebra.

A(B + C) = AB + AC  and  A(B + C) ≠ AB + C (where AB = A x B)

And this is where brackets come in under BIDMAS. In other words, if you do what’s in the brackets first you’ll be okay. But you can also eliminate the brackets and get the same answer if you follow the distributive rule.

But we can extend this: 1/A(B - C) = B/A - C/A (where B/A = B ÷ A)

And  -A(B – C) = CA – BA  because (-1)² = 1, so a minus times a minus equals a plus.

If 1/A(B + C) = B/A + C/A then (B + C)/A = B/A + C/A

And  A/C + B/D = (DA + BC)/DC

To appreciate this do the converse:

(DA + BC)/DC = DA/DC + BC/DC = A/C + B/D
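The fraction rule is easy to verify exactly with Python’s fractions module (a sketch; the function name is mine):

```python
from fractions import Fraction

def add_by_formula(a, c, b, d):
    """A/C + B/D = (D*A + B*C)/(D*C), as derived above."""
    return Fraction(d * a + b * c, d * c)

# Exact check against direct fraction addition for a few sample values
for a, c, b, d in [(1, 2, 1, 3), (5, 7, 2, 9), (-3, 4, 1, 6)]:
    assert add_by_formula(a, c, b, d) == Fraction(a, c) + Fraction(b, d)
```

Using exact fractions rather than floating point means the check confirms the identity itself, not just a numerical approximation.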

But the most important technique one can learn is how to change the subject of an equation. If we go back to Pythagoras’s equation:

a² + b² = c²  what’s b = ?

The very simple rule is that whatever you do to one side of an equation you must do to the other side. So if you take something away from one side you must take it away from the other side and if you multiply or divide one side by something you must do the same on the other side.

So, given the above example, the first thing we want to do is isolate b2. Which means we take a2 from the LHS and also the RHS (left hand side and right hand side).

So b² = c² – a²

And to get b from b² we take the square root of b², which means we take the square root of the RHS.

So b = √(c² – a²)

Note b ≠ c – a  because √(c² – a²) ≠ c – a

In the same way that (a + b)² ≠ a² + b²

In fact (a + b)² = (a + b)(a + b)

And applying the distributive law: (a + b)(a + b) = a(a + b) + b(a + b)

Which expands to  a² + ab + ba + b² = a² + 2ab + b²

But (a + b)(a – b) = a² – b²  (work it out for yourself)
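Both expansions can be checked numerically over a grid of integers (a minimal Python sketch):

```python
# Verify (a + b)^2 = a^2 + 2ab + b^2 and (a + b)(a - b) = a^2 - b^2
for a in range(-6, 7):
    for b in range(-6, 7):
        assert (a + b) ** 2 == a * a + 2 * a * b + b * b
        assert (a + b) * (a - b) == a * a - b * b
```

A numerical spot-check is not a proof, of course, but it will instantly expose a mistaken expansion like (a + b)² = a² + b².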

An equation by definition (and by name) means that something equals something. To maintain the equality whatever you do on one side must be done on the other side, and that’s basically the most important rule of all. So if you take the square root or a logarithm or whatever of a single quantity on one side you must take the square root or logarithm or whatever of everything on the other side. Which means you put brackets around everything first and apply the distributive law if possible, and, if not, leave it in brackets like I did with the example of Pythagoras’s equation.

Final Example:  A/B = C + D    What’s B = ?

Invert both sides:  B/A = 1/(C + D)

Multiply both sides by A:   B = A/(C + D)   (Easy)

Note: A/(C + D) ≠ A/C + A/D
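The final example, and the note after it, can both be checked with exact fractions (a sketch; the function name and sample numbers are mine):

```python
from fractions import Fraction

def solve_b(a, c, d):
    """Rearrange A/B = C + D to B = A/(C + D)."""
    return Fraction(a) / (c + d)

b = solve_b(12, 1, 3)              # A=12, C=1, D=3  ->  B = 12/4 = 3
assert Fraction(12) / b == 1 + 3   # the original equation still holds

# ...and the note: A/(C + D) is not A/C + A/D
assert solve_b(12, 1, 3) != Fraction(12, 1) + Fraction(12, 3)
```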


Sunday 23 June 2013

Time again to talk about time


Last week’s New Scientist’s cover declared SPACE versus TIME; one has to go. But which? (15 June 2013). This served as a rhetorical introduction to physics' most famous conundrum: the irreconcilability of its 2 most successful theories - quantum mechanics and Einstein’s theory of general relativity - both conceived at the dawn of the so-called golden age of physics in the early 20th Century.

The feature article (pp. 35-7) cites a number of theoretical physicists including Joe Polchinski (University of California, Santa Barbara), Sean Carroll (California Institute of Technology, Pasadena), Nathan Seiberg (Institute for Advanced Study, Princeton), Abhay Ashtekar (Pennsylvania State University), Juan Maldacena (no institute cited) and Steve Giddings (also University of California).

Most scientists and science commentators seem to be banking on String Theory to resolve the problem, though both its proponents and critics acknowledge there’s no evidence to separate it from alternative theories like loop quantum gravity (LQG), plus it predicts 10 spatial dimensions and 10⁵⁰⁰ universes. However, physicists are used to theories not gelling with common sense and it’s possible that both the extra dimensions and the multiverse could exist without us knowing about them.

Personally, I was intrigued by Ashtekar’s collaboration with Lee Smolin (a strong proponent of LQG) and Carlo Rovelli where ‘Chunks of space [at the Planck scale] appear first in the theory, while time pops up only later…’ In a much earlier publication of New Scientist on ‘Time’ Rovelli is quoted as claiming that time disappears mathematically: “For me, the solution to the problem is that at the fundamental level of nature, there is no time at all.” Which I discussed in a post on this very subject in Oct. 2011.

In a more recent post (May 2013) I quoted Paul Davies from The Goldilocks Enigma: ‘[The] vanishing of time for the entire universe becomes very explicit in quantum cosmology, where the time variable simply drops out of the quantum description.’ And in the very article I’m discussing now, the author, Anil Ananthaswamy, explains how the wave function of Schrodinger’s equation, whilst it evolves in time, ‘…time is itself not part of the Hilbert space where everything else physical sits, but somehow exists outside of it.’ (Hilbert space is the ‘abstract’ space that Schrodinger’s wave function inhabits.) ‘When we measure the evolution of a quantum state, it is to the beat of an external timepiece of unknown provenance.’

Back in May 2011, I wrote my most popular post ever: an exposition on Schrodinger’s equation, where I deconstructed the famous time dependent equation with a bit of sleight-of-hand. The sleight-of-hand was to introduce the quantum expression for momentum (px = -iħ d/dx) without explaining where it came from (the truth is I didn’t know at the time). However, I recently found a YouTube video that remedies that, because the anonymous author of the video derives Schrodinger’s equation in 2 stages with the time independent version first (effectively the RHS of the time dependent equation). The fundamental difference is that he derives the expression for px = iħ d/dx, which I now demonstrate below.

Basically the wave function, which exploits Euler’s famous equation, using complex algebra (imaginary numbers), is expressed thus:  Ψ = Ae^i(kx−ωt)
If one differentiates this equation wrt x we get ik(Ae^i(kx−ωt)), which is ikΨ. If we differentiate it again we get d²Ψ/dx² = (ik)²Ψ.
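You can check that differentiation claim numerically rather than trusting the calculus: take finite-difference derivatives of the plane wave and compare them against ikΨ and (ik)²Ψ. This is a minimal sketch; the values of A, k, ω, t and x are arbitrary, chosen only for the demonstration.

```python
import cmath

# Plane wave Psi(x) = A * exp(i(kx - wt)) at a fixed time t.
# All values below are arbitrary illustrative numbers, not physical data.
A, k, w, t = 1.0, 3.0, 5.0, 0.2

def psi(x):
    return A * cmath.exp(1j * (k * x - w * t))

x, h = 0.7, 1e-5
d1 = (psi(x + h) - psi(x - h)) / (2 * h)            # central first derivative
d2 = (psi(x + h) - 2 * psi(x) + psi(x - h)) / h**2  # central second derivative

assert abs(d1 - 1j * k * psi(x)) < 1e-6             # dPsi/dx = ik * Psi
assert abs(d2 - (1j * k) ** 2 * psi(x)) < 1e-4      # d2Psi/dx2 = (ik)^2 * Psi
```

Each differentiation just pulls down another factor of ik, which is the whole trick the derivation relies on.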

Now k is related to wavelength (λ) by 2π such that k = 2π/λ.

And from Planck’s equation (E = hf) and the fact that (for light) c = fλ, we can get a relationship between momentum (p) and λ. If p = mc and E = mc², then p = E/c. Therefore p = hf/(fλ), which gives p = h/λ, effectively the momentum version of Planck’s equation. Note that p is related to wavelength (space) and E is related to frequency (time).
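The chain E = hf, c = fλ, p = E/c collapsing to p = h/λ is easy to verify with real numbers. A small sketch, using the standard values of h and c and an arbitrary example wavelength (green light):

```python
# Check p = E/c = h/lambda for a photon
h = 6.62607015e-34   # Planck constant, J.s
c = 2.99792458e8     # speed of light, m/s

lam = 500e-9         # arbitrary example wavelength (green light), m
f = c / lam          # from c = f * lambda
E = h * f            # Planck's equation E = hf
p = E / c            # momentum of the photon

assert abs(p - h / lam) / p < 1e-12   # p = h/lambda, as derived
```

The f cancels out exactly, which is why the relation holds for any wavelength you plug in.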

This then is the quantum equation for momentum based on h (Planck’s constant) and λ. And, of course, according to Louis de Broglie, particles as well as light can have wavelengths.

And if we substitute 2π/k for λ we get p = hk/2π, which can be reformulated as
k = p/ħ, where ħ = h/2π.

And substituting this in (ik)² we get −(p/ħ)²  {since i² = −1}

So d²Ψ/dx² = −(px/ħ)²Ψ

Making px the subject of the equation we get px² = −ħ² d²/dx² (Ψ cancels out on both sides) and I used this expression in my previous post on this topic.

And if I take the square root of px² I get px = iħ d/dx, the quantum term for momentum.

So the quantum version of momentum is a consequence of Schrodinger’s equation and not an input, as I previously implied. Note that √−1 can be i or −i, so px can be negative or positive. It makes no difference when it’s used in Schrodinger’s equation because we use px².
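As a sanity check on the operator itself: applying −iħ d/dx to the plane wave should simply return ħk times the wave, i.e. the momentum p = ħk = h/λ pops out as an eigenvalue. A minimal numerical sketch, in units where ħ = 1 and with an arbitrary wavenumber:

```python
import cmath

hbar = 1.0           # working in units where hbar = 1 (illustrative, not SI)
k = 2.5              # arbitrary wavenumber; eigenvalue should be hbar*k

def psi(x):
    return cmath.exp(1j * k * x)   # plane wave at t = 0

x, h = 0.3, 1e-6
dpsi = (psi(x + h) - psi(x - h)) / (2 * h)   # numerical dPsi/dx

# Apply the momentum operator -i*hbar*d/dx and divide out Psi
p_eigen = (-1j * hbar * dpsi) / psi(x)
assert abs(p_eigen - hbar * k) < 1e-6        # recovers p = hbar * k
```

The sign ambiguity mentioned above just flips the sign of the recovered eigenvalue; squaring it, as Schrodinger’s equation does, erases the difference.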

If you didn’t follow that, don’t worry, I’m just correcting something I wrote a couple of years ago that’s always bothered me. It’s probably easier to follow on the video where I found the solution.

But the relevance to this discussion is that this is probably the way Schrodinger derived it. In other words, he derived the term for momentum first (RHS), then the time dependent factor (LHS), which is the version we always see and is the one inscribed on his grave’s headstone.

This has been a lengthy and esoteric detour but it highlights the complementary roles of space and time (implicit in a wave function) that we find in quantum mechanics.

Going back to the New Scientist article, the author also provides arguments from theorists that support the idea that time is more fundamental than space and others who believe that neither is more fundamental than the other.

But reading the article, I couldn’t help but think that gravity plays a pivotal role regarding time and we already know that time is affected by gravity. The article keeps returning to black holes because that’s where the 2 theories (quantum mechanics and general relativity) collide. From the outside, at the event horizon, time becomes frozen but from the inside time would become infinite (everything would happen at once) (refer Addendum below). Few people seem to consider the possibility that going from quantum mechanics to classical physics is like a phase change in the same way that we have phase changes from ice to water. And in that phase change time itself may be altered.
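To put a number on the “time becomes frozen at the event horizon” claim, here is a small sketch of the standard Schwarzschild time-dilation factor for a static clock - a textbook general-relativity formula, not anything from the article, and the 10-solar-mass figure is just an arbitrary example:

```python
import math

# Gravitational time dilation outside a non-rotating (Schwarzschild)
# black hole: dtau/dt = sqrt(1 - rs/r), where rs = 2GM/c^2.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M = 10 * 1.989e30    # hypothetical 10-solar-mass black hole, kg

rs = 2 * G * M / c**2   # Schwarzschild radius (the event horizon)

def dilation(r):
    """Proper time per unit far-away time for a static clock at radius r."""
    return math.sqrt(1 - rs / r)

# Far from the hole, clocks run at essentially the normal rate;
# approaching the horizon, the factor plunges towards zero:
assert dilation(100 * rs) > 0.99
assert dilation(1.00001 * rs) < 0.01
```

The factor goes to zero exactly at r = rs, which is the mathematical expression of an outside observer seeing time freeze there.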
 
Referring to one of the quotes I cited earlier, it occurs to me that the ‘external timepiece of unknown provenance’ could be a direct consequence of gravity, which determines the rate of time for all objects in free fall.

Addendum: Many accounts of the event horizon, including descriptions in a recent special issue of Scientific American: Extreme Physics (Summer 2013), claim that one can cross an event horizon without even knowing it. However, if time is stopped for ‘you’ according to observers outside the event horizon, then their time must surely appear infinite to ‘you’, to be consistent. The Kiwi Roy Kerr, who solved Einstein’s field equations for a rotating black hole (the most likely scenario), claims that there are 2 event horizons, and after crossing the first one, time becomes space-like and space becomes time-like. This implies, to me, that time becomes static and infinite and space becomes dynamic. Of course, no one really knows, and no one is ever going to cross an event horizon and come back to tell us.