Over the last month I’ve acquired 3 books that are not entirely unrelated. Not surprisingly, they all deal with topics I’ve discussed before.
In order of acquisition they are: Physics and Philosophy by Werner Heisenberg; The Book of Universes by John D. Barrow; and MATHS 1001 by Richard Elwes. Of the three, Heisenberg’s book is probably the least accessible, even though it’s written more for a lay audience than an academic one.
Elwes’ book is subtitled Absolutely everything you need to know about mathematics in 1001 bite-sized explanations. Under the subtitle is a mini-bite-sized blurb presented as an uncredited quote: ‘More helpful than an encyclopaedia, much easier than a textbook’.
Both of these claims seem unrealistic, yet the blurb is probably closer to the end result than the subtitle. I had this book with me during a recent 4-day sojourn in hospital and it ensured that I never got bored.
But Barrow’s book is the most compelling, not least because he’s not just an observer but a participant in the story. Barrow covers the entire Western history of ‘cosmology’ from Stonehenge to String Theories. This is a book that really does attempt to tell you everything you wanted to know about theories of the universe(s). And Barrow’s book is certainly worth writing a post about, because he revealed things to me that I hadn’t known or considered before.
On the back flap of the cover, Barrow’s credentials are impressive: ‘Professor of Mathematical Sciences and Director of the Millennium Mathematics Project at Cambridge University, Fellow of Clare Hall, Cambridge, a Fellow of the Royal Society, and current Gresham Professor of Geometry at Gresham College, London.’ With some understatement, the citation continues: ‘His principal area of scientific research is cosmology…’. It’s rare to find someone so highly respected in an esoteric field who can write so eloquently and incisively for a lay audience. Paul Davies comes to mind, as does Roger Penrose, both of whom get mentioned in its pages.
Not surprisingly, even though Barrow’s narrative goes from Aristotle to Ptolemy to Copernicus then Galileo, Kepler and Newton, it resides mostly in the 20th Century, specifically post Einstein’s theories of relativity. Einstein’s field equations have really dictated all theoretical explorations into cosmology from their inception to the present day, and Barrow continually reminds us of this, despite all the empirical data that has driven our best understanding of the universe to date, like Hubble’s constant and the microwave background radiation.
One of the revelations I found in this text is that Alan Guth’s inflationary hypothesis virtually guarantees that there is a multiverse. Inflation is like a bubble and beyond the bubble, which must always lie beyond the horizon of our expanding universe, are all the anomalies and inconsistencies that we expect to find from a Big Bang universe. The hypothesis contains within it the possibility that there are numerous other inflationary bubbles, many of which could have occurred prior to ours. Barrow also points out that, if there are an infinite number of universes, then any event with probability greater than 0 could occur an infinite number of times. Only mathematicians and cosmologists truly understand just how big infinity is and what its consequences are. Elwes’ book (MATHS 1001) also brings this point home, albeit in a different way. Barrow’s point is that if there are an infinite number of universes then there are an infinite number of you(s) doing exactly what you are doing now, as well as an infinite number living infinitely different lives. The fact that they will never encounter each other means that they can exist without mutual awareness, except as philosophical speculations like the one I’m making now.
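To make the probabilistic step explicit (my own sketch, not Barrow’s, and it assumes the universes are independent and the event has the same fixed probability p > 0 in each): the chance that the event never occurs across n universes is (1−p)ⁿ, which vanishes as n grows, while the expected number of occurrences grows without bound.

$$P(\text{never occurs in } n \text{ universes}) = (1-p)^n \to 0 \quad\text{and}\quad \mathbb{E}[\text{occurrences}] = np \to \infty \quad \text{as } n \to \infty.$$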
For most people the thought of an infinite number of themselves living infinitely variable lives is enough to turn them off the infinite multiverse hypothesis. It should also make one reconsider the idea of an infinite afterlife.
The other philosophical concept that Barrow discusses at length is the anthropic principle and how it is virtually unavoidable in the face of our existence. Another of his revelations (to me) was that we don’t live in one of the most ‘probable’ universes. He demonstrates that, if we were to produce a bell curve of probable universes, our particular universe would sit in the ‘tail’ and not at the peak as one might expect.
As he says: “Universes that don’t produce the possibility of ‘observers’ – and they do not need to be like ourselves – don’t really count when it comes to comparing the theory with the evidence.”
He then goes on to say: “This is most sobering. We are not used to the existence of cosmologists being a significant factor in the evaluation of cosmological theories.”
There is a link between this idea and quantum mechanics, which I’ll return to later. It was explored specifically by John Wheeler and discussed at length by Paul Davies in his book, The Goldilocks Enigma. People are often dismissive of the question of why there is something rather than nothing. Recently, Stephen Law, in a debate with Peter Atkins, said that this was the wrong question, without elaborating on why it was or what the right question might be. The point is that without conscious entities there may as well be nothing, because only conscious entities, like us, give the universe any meaning at all. To dismiss the question is to say that the universe not only has no meaning but should have no meaning. It’s not surprising (to me) that the people who insist our existence has no meaning also insist that we have no free will. I challenge both premises (or conclusions, depending on how they’re framed).
Slightly off track, but only slightly: Barrow immediately follows this revelation with another of equal importance. Life in a universe requires both lots of time and lots of space, so we should not be so surprised that we live in such a vast expanse of space bookended by equally vast amounts of time. It is because life requires enormous complexity that it also requires enormous time to create it.
Again, to quote Barrow: “This is why we should not be surprised to find that our universe is so old. It takes lots of time to produce the chemical building blocks needed for any type of complexity. And because the universe is expanding, if it is old, it must be big – billions of light years in extent.”
Stephen Hawking recently created a minor furore when he claimed the entire universe could have arisen from nothing. People who should know better, or should simply read more, were derisive of the statement, believing he was giving fundamentalists ready-made ammunition by kicking an own goal. Back in the 1980s, Paul Davies, in his book God and the New Physics (which covers much the same material as Dawkins’ The God Delusion, only in more depth), quotes Alan Guth’s remark that “the Universe is the ultimate free lunch”. Barrow also points out that gravity, in the form of potential energy (and therefore negative energy), can exactly balance all the positive energy of mass and radiation (through E=mc²), so that the energy balance for the entire universe can be zero.
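A back-of-the-envelope version of that balance (a heuristic Newtonian estimate of my own, not Barrow’s calculation, ignoring numerical factors of order one): for a mass M spread over a radius R, the positive rest-mass energy and the negative gravitational potential energy are of comparable size, and can cancel, when R is of the order of GM/c².

$$E_{\text{total}} \approx Mc^2 - \frac{GM^2}{R} \approx 0 \quad\text{when}\quad R \approx \frac{GM}{c^2}.$$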
Heisenberg’s uncertainty principle allows that matter (and therefore energy) can be, and is, produced all the time (via ‘quantum fluctuations’), albeit for very short periods of time. The shorter the time, the higher the energy, via the relationship given by Planck’s constant, h. So a quantum mechanism for producing something from nothing does exist. That it can happen on a cosmological scale is not so improbable if all the principal forces of nature – gravity, electromagnetism, and the weak and strong nuclear forces – can meet at equal magnitude in the crucible we call the Big Bang. In his discussion of ‘grand unification’, Barrow leaves gravity out of it. I’ve glossed over this for the sake of brevity, but Barrow discusses it in detail. He also discusses the asymmetry between matter and anti-matter that allows anything to exist at all. (He wrote another book on ‘symmetry-breaking’ with Joseph Silk called The Left Hand of Creation.)
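The time–energy trade-off referred to here is usually written as the uncertainty relation below, where ħ = h/2π is the reduced Planck constant: a fluctuation of energy ΔE can persist only for a time of order ħ/ΔE, so the larger the ‘borrowed’ energy, the shorter it can last.

$$\Delta E \,\Delta t \gtrsim \frac{\hbar}{2} \quad\Rightarrow\quad \Delta t \sim \frac{\hbar}{2\,\Delta E}.$$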
Another revelation I found in Barrow’s book was his discussion of string theories, now collectively called M theory, and the significance of Calabi-Yau spaces or manifolds, of which there are over 10⁵⁰⁰ possibilities (remember 1 billion is only 10⁹). Significantly, all these predict that gravity can be expressed by Einstein’s field equations. So Einstein still dominates the landscape, though what he would make of this development is anyone’s guess.
This means that our quest for a ‘Theory of Everything’ has led to a multitude of universes, of which ours is one in 10⁵⁰⁰. But Barrow goes further when he explains: “There are an infinite number of possible universes. The number is too large to be explored systematically by any computer.”
But Barrow’s best revelation is left to the next to last page when he claims that he and Douglas Shaw have recently postulated that the cosmological constant (which ‘adds an additional equation to those first found by Einstein’) is given by the relationship (t_p/t_u)², where t_p is Planck’s fundamental time, 10⁻⁴³ sec, and t_u is the current age of the universe, 4.3×10¹⁷ sec. t_p is the smallest quantity of time predicted by quantum mechanics, so is effectively the basic unit of time for the whole universe. By postulating the cosmological constant as a squared ratio dependent on the age of the universe, it gives a rational reason, as opposed to a mystical one, why it is the value we observe today of 0.5×10⁻¹²¹. What’s more, their postulate makes a prediction that the curvature of the universe is -0.0056. Current observations give between -0.0133 and +0.0084, but more accurate maps of the microwave background radiation should ‘be able to confirm or refute this very precise prediction’.
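The quoted value can be checked with a one-line calculation. A minimal sketch in Python, using only the figures given in the text (the precise observed value, and Barrow and Shaw’s own published numbers, may differ slightly):

```python
# Barrow & Shaw's postulated relation: cosmological constant ~ (t_p / t_u)^2
t_p = 1e-43    # Planck time in seconds (value quoted in the text)
t_u = 4.3e17   # current age of the universe in seconds (value quoted in the text)

ratio_squared = (t_p / t_u) ** 2
print(f"(t_p/t_u)^2 = {ratio_squared:.2e}")  # ~5.4e-122, i.e. roughly 0.5 x 10^-121
```

which lands on the same order of magnitude as the 0.5×10⁻¹²¹ quoted above.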
There is an intriguing connection between the anthropic principle and quantum mechanics. The Copenhagen interpretation, led by Bohr and given support by Heisenberg, attempts to bridge the gap between the classical world and the quantum world, by stating that something becomes manifest only after we’ve made a ‘measurement’. I think Bohr took this literally and John Wheeler, who was a loyal disciple of Bohr’s, took it even further when he extrapolated it to the cosmos. Paul Davies explores John Wheeler’s thesis in The Goldilocks Enigma, whereby Wheeler proposes a reverse causal relationship, a cosmological quantum loop in effect, between our observation of the universe and its existence. Most people find this too fantastical to entertain, yet it ties quantum mechanics to the anthropic principle in a fundamental way.
Elwes’ book also discusses quantum mechanics, and explains it better than most accounts I’ve read, when he expounds that, after a measurement, the wave function (given by Schrodinger’s equation) ‘is no longer a valid description of the state of the particle. It is difficult to avoid the conclusion that whenever someone (or perhaps something) takes a measurement, the quantum system mysteriously jumps from being smoothly spread out, to crystallizing at a specific position.’ (italics in the original)
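In standard textbook terms (my gloss, not Elwes’ exact formulation): the ‘smoothly spread out’ description is the wave function ψ(x), whose squared magnitude gives the probability of finding the particle near a position x, and the ‘crystallizing’ is the jump, on measurement, to a state sharply peaked at the measured position x₀.

$$P(x \le X \le x+dx) = |\psi(x)|^2\,dx, \qquad \psi(x) \;\xrightarrow{\text{measurement}}\; \text{state peaked at } x_0.$$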
One can’t help but compare Heisenberg’s book (Physics and Philosophy) with Schrodinger’s (What is Life?), which I reviewed in November 2009. Both men made fundamental contributions to quantum theory, for which they were both awarded Nobel prizes, yet they maintained philosophical differences over its ramifications. Schrodinger’s book is a far better read, not least because it’s more accessible. Both impress upon the reader the significance of mathematics in fathoming the universe’s secrets. Schrodinger appealed to Platonism whereas, to my surprise, Heisenberg appealed to the Pythagoreans, who influenced Plato’s Academy and its curriculum of arithmetic, geometry, astronomy and music – Pythagoras’s quadrivium. In particular, Heisenberg quotes Russell on Pythagoras: “I don’t know of any other man who has been as influential as he was in the sphere of thought.”
Quantum phenomena suggest to me that everything is connected. Why do radioactive half-lives follow a totally predictable rule statistically, while individual decays are not predictable at all? It’s as if the decay exists at a holistic level rather than a unit level. Planck’s constant gives an epistemological limit to our ability to predict or know. At the other end of the scale, the universe exists for us at a time when we can make sense of it. Barrow, along with Douglas Shaw, employs the Planck time as a fundamental unit in an equation that suggests we understand the universe only because we are here at this specific time in its history. There is no other explanation, and maybe there is no other explanation required.
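The half-life point can be illustrated with a short simulation (my own sketch, not from Barrow or Elwes): each atom’s decay time is individually random, yet the ensemble tracks the exponential decay law almost exactly.

```python
import math
import random

half_life = 10.0                        # arbitrary time units
decay_const = math.log(2) / half_life   # decay constant (lambda)
n_atoms = 100_000

# Each atom's decay time is individually unpredictable (exponentially distributed)...
decay_times = [random.expovariate(decay_const) for _ in range(n_atoms)]

# ...but the ensemble is statistically predictable: ~half survive each half-life.
for t in (0, 10, 20, 30):
    surviving = sum(1 for d in decay_times if d > t)
    expected = n_atoms * 0.5 ** (t / half_life)
    print(f"t = {t:>2}: surviving = {surviving:>6}, expected ≈ {expected:>8.0f}")
```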
Addendum 1: Scientific American (through Paul J. Steinhardt) has a for-and-against discussion on the merits of Alan Guth's 30-year-old inflationary theory, and includes a reference to Roger Penrose's ideas that I discussed in a post last January.
Addendum 2: Yes, I've changed the title (Sep 2017).
Philosophy, at its best, challenges our long-held views, such that we examine them more deeply than we might otherwise consider.
Paul P. Mealing
Sunday, 15 May 2011
Friday, 6 May 2011
God with no ego
An unusual oxymoron, I know, but, like anything delivered tongue-in-cheek, it contains an element of serious conjecture. Many years ago (a quarter of a century), I read a book on anthropology, which left no great impression on me except that the author said there were 2 types of culture worldwide. One cultural type had a religion based on a ‘creator’ or creation myth, and the other had a religion based on ancestor worship.
I would possibly add a third, which is religion based on the projection of the human psyche. In a historical context, religion has arisen primarily from an attempt to project our imagination beyond the grave. Fascination in the afterlife started early for humans, if ritual burials are anything to go by. By extension, the God of humans, in all the forms that we have, is largely manifest in the afterlife. The only ‘Earthly’ experiences of God or Gods occur in mythology.
Karen Armstrong, in her book A History of God, demonstrates how God has evolved over time as a reflection of the human psyche. I know that Armstrong is criticised on both sides of the religious divide, but A History of God is still one of the best books on religion I have read. It’s one of her earliest publications, written when she was still disillusioned by her experience as a Carmelite nun. A common theme in Armstrong’s writing is the connection between religion and myth.
I’ve referred to Ludwig Feuerbach in previous posts for his famous quote: God is the outward projection of the human psyche (I think he said ‘man’s inner nature’), so I’ve taken a bit of licence; but I think that’s as good a definition of God as you’re going to get. Feuerbach also said that ‘God is in man’s image’ not the other way round. He apparently claimed he wasn’t an atheist, yet I expect most people today would call him an atheist.
For most people who have God as part of their existential belief, it is manifest as an internal mental experience yet is ‘sensed’ as external. Neurologist Andrew Newberg, of the University of Pennsylvania, has demonstrated via brain-imaging experiments that people’s experience of ‘religious feelings [God] do seem to be quite literally self-less’. This is why I claim that God is purely subjective, because everyone’s idea of God is different. I’ve long argued that a person’s idea of God says more about them than it says about God.
I would make an analogy with colour, because colour only occurs in some sentient creature’s mind, even though it is experienced as being external to the observer. There is, of course, an external cause for this experience, which is light reflected off objects. People can equally argue that there is an external cause for one’s experience of God, but I would argue that that experience is unique to that person. Colour can be tested, whereas God cannot.
Contrary to what people might think, I’m not judgemental about people’s belief in God – it’s not a litmus test for anything. But if God is a reflection of an individual’s ideal then judge the person and not their God.
When I was 16, I read Albert Camus’ La Peste (The Plague) and it challenged my idea of God. At the time, I knew nothing about Camus or his philosophy, or even his history with the French resistance during WWII. I also read L’Etranger (The Outsider) and, in both books, Camus, through his protagonists, challenges the Catholic Church. In La Peste, there is a scene where the 2 lead characters take a swim at night (if my memory serves me correctly) and, during a conversation, one of them conjectures that it would possibly be better for God if we didn’t believe in God. Now, this may seem the ultimate cynicism but it actually touched a chord with me at that time and at that age. A God who didn’t want you to believe in God would be a God with no ego. That is my ideal.
Friday, 22 April 2011
Sentience, free will and AI
In the 2 April 2011 edition of New Scientist, the editorial was titled Rights for robots; We will know when it’s time to recognise artificial cognition. Implicit in the header and explicit in the text is the idea that robots will one day have sentience just like us. In fact they highlighted one passage: “We should look to the way people treat machines and have faith in our ability to detect consciousness.”
I am a self-confessed heretic on this subject because I don’t believe machine intelligence will ever be sentient, and I’m happy to stick my neck out in this forum so that one day I can possibly be proven wrong. One of the points of argument that the editorial makes is that ‘there is no agreed definition of consciousness’ and ‘there’s no way to tell that you aren’t the only conscious being in a world of zombies.’ In other words, you really don’t know if the person right next to you is conscious (or in a dream) so you’ll be forced to give a cognitive robot the same benefit of the doubt. I disagree.
Around the same time as reading this, I took part in a discussion on Rust Belt Philosophy about what sentience is. Firstly, I contend that sentience and consciousness are synonymous, and I think sentience is pretty pervasive in the animal kingdom. Does that mean that something that is unconscious is not sentient? Strictly speaking, yes, because I would define sentience as the ability to feel something, either emotionally or physically. Now, we often feel something emotionally when we dream, so arguably that makes one sentient when unconscious. But I see this as the exception that makes my definition more pertinent rather than the exception that proves me wrong.
In First Aid courses you are taught to squeeze someone’s fingers to see if they are conscious. So to feel something is directly correlated with consciousness and that’s also how I would define sentience. Much of the brain’s activity is subconscious even to the extent that problem-solving is often executed subliminally. I expect everyone has had the experience of trying to solve a puzzle, then leaving it for a period of time, only to solve it ‘spontaneously’ when they next encounter it. I believe the creative process often works in exactly the same way, which is why it feels so spontaneous and why we can’t explain it even after we’ve done it. This subconscious problem-solving is a well known cognitive phenomenon, so it’s not just a ‘folk theory’.
This complex subconscious activity observed in humans is, I believe, quite different from the complex instinctive behaviour that we see in animals: birds building nests, bees building hives, spiders building webs, beavers building dams. These activities seem ‘hard-wired’, to borrow from the AI lexicon as we tend to do.
A bee does a complex dance to communicate where the food is. No one believes that the bee cognitively works this out the way we would, so I expect it’s totally subconscious. So if a bee can perform complex behaviours without consciousness, does that mean it doesn’t have consciousness at all? The obvious answer is yes, but let’s look at another scenario. The bee gets caught in a spider’s web and tries desperately to escape. Now I believe that in this situation the bee feels fear and, by my definition, that makes it sentient. This is an important point because it underpins virtually every other point I intend to make. Now, I don’t really know if the bee ‘feels’ anything at all, so it’s an assumption. But my assumption is that sentience, and therefore consciousness, started with feelings and not logic.
In last week’s issue of New Scientist, 16 April 2011, the cover features the topic, Free Will: The illusion we can’t live without. The article, written by freelance writer, Dan Jones, is headed The free will delusion. In effect, science argues quite strongly that free will is an illusion, but one we are reluctant to relinquish. Jones opens with a scenario in 2500 when free will has been scientifically disproved and human behaviour is totally predictable and deterministic. Now, I don’t think there’s really anything in the universe that’s totally predictable, including the remote possibility that Earth could one day be knocked off its orbit, but that’s the subject of another post. What’s more relevant to this discussion is Jones’ opening sentence where he says: ‘…neuroscientists know precisely how the hardware of the brain runs the software of the mind and dictates behaviour.’ Now, this is purely a piece of speculative fiction, so it’s not necessarily what Jones actually believes. But it’s the implicit assumption that the brain’s processes are identical to a computer’s that I find most interesting.
The gist of the article, by the way, is that when people really believe they have no free will, they behave very unempathetically towards others, amongst other aberrational behaviours. In other words, a belief in our ability to direct our own destiny is important to our psychological health. So, if the scientists are right, it’s best not to tell anyone. It’s ironic that telling people they have no free will makes them behave as if they don’t, when allowing them to believe they have free will gives their behaviour intentionality. Apparently, free will is a ‘state-of-mind’.
On a more recent post of Rust Belt Philosophy, I was reminded that, contrary to conventional wisdom, emotions play an important role in rational behaviour. Psychologists now generally believe that, without emotions, our decision-making ability is severely impaired. And, arguably, it’s emotions that play the key role in what we call free will. Certainly, it’s our emotions that are affected if we believe we have no control over our behaviour. Intentions are driven as much by emotion as they are by logic. In fact, most of us make decisions based on gut feelings and rationalise them accordingly. I’m not suggesting that we are all victims of our emotional needs like immature children, but that the interplay between emotions and rational thought is the key to our behaviours. More importantly, it’s our ability to ‘feel’ that not only separates us from machine intelligence in a physical sense, but makes our ‘thinking’ inherently different. It’s also what makes us sentient.
Many people believe that emotion can be programmed into computers to aid them in decision-making as well. I find this an interesting idea and I’ve explored it in my own fiction. If a computer reacted with horror every time we were to switch it off would that make it sentient? Actually, I don’t think it would, but it would certainly be interesting to see how people reacted. My point is that artificially giving AI emotions won’t make them sentient.
I believe feelings came first in the evolution of sentience, not logic, and I still don’t believe that there’s anything analogous to ‘software’ in the brain, except language and that’s specific to humans. We are the only species that ‘downloads’ a language to the next generation, but that doesn’t mean our brains run on algorithms.
So evidence in the animal kingdom, not just humans, suggests that sentience, and therefore consciousness, evolved from emotions, whereas computers have evolved from pure logic. Computers are still best at what we do worst, which is manipulate huge amounts of data – which is why the human genome project actually took less time than predicted. And we still do best at what they do worst, which is make decisions based on a host of parameters, including emotional factors as well as experiential ones.
Sunday, 3 April 2011
Why we shouldn’t take religion too seriously
This arose from an article in last week’s New Scientist titled Thou shalt believe – or not by Jonathan Lanman (26 March 2011, pp.38-9). Lanman lectures at the school of anthropology and Keble College, Oxford University. He’s giving a talk, entitled Atheism Explained, at St. Mary’s University College Twickenham, UK on 5 April (a couple of days away).
Lanman spent 2008 studying atheism in the US, UK, Denmark and online. As a result of his research, Lanman made a distinction between what he calls ‘non-theism’ and ‘strong atheism’, whereby non-theists are effectively agnostic – they don’t really care – and strong atheists vigorously oppose religious belief on moral and political grounds. He found a curious correlation. In countries that are strongly and overtly religious, strong atheism is more prevalent, whereas in countries like Sweden, where religion is not so strong, the converse is true. In his own words, there is a negative correlation between strong atheism and non-theism.
I live in Australia, where there is a pervasive I-don’t-care attitude towards religious belief, so we are closer to the Swedish model than the American one. In fact, when I visited America a decade ago (both pre- and post-9/11, as well as during it) I would say the biggest difference between Australian and American culture is in religion. I spent a lot of time in Texas, where it was almost a culture shock. My experience with the blogosphere has only reinforced that impression.
What is obvious is that where religion takes on a political face then opposition is inevitable. In Australian politics there are all sorts of religious flavours amongst individual politicians, but they rarely become an issue. This wasn’t the case a couple of generations ago when there was a Protestant/Catholic divide through the entire country that started with education and permeated every community, including the small country town where I grew up. That all changed in the 1960s, and, with few exceptions, no one who remembers it wants to revisit it.
Now there is a greater mix of religions than ever, and the philosophy is largely live and let live. Even when I was a child, religion was seen as something deeply personal and intimate that wasn’t invaded or even shared, and that’s an attitude I’ve kept to this day. Religion, to me, is part of someone’s inner world, totally subjective, influenced by culture, yes, but ultimately personal and unique to the individual.
If people can’t joke about religion in the same way we joke about nationality, or if they feel the need to defend their beliefs in blood, then they are taking their religion too seriously. Even some atheists, in my view, take religion too seriously, when they fail, or refuse, to distinguish between secular adherents to a faith and fundamentalists. If we want to live together, then we can’t take religion too seriously no matter what one’s personal beliefs may be.
Sunday, 20 March 2011
Ayaan Hirsi Ali’s story
I’ve just completed reading Ayaan Hirsi Ali’s autobiography, Infidel. It’s the latest book in my book club (refer my blog roll), following on from another autobiography from another refugee, Anh Do, The Happiest Refugee. Do is a stand-up comic and television celebrity in Australia, and his brother, Khoa, is a successful filmmaker and former Young Australian of the Year. They are ‘boat people’, who are stigmatised in this country, and Khoa was actually dangled over the side of a boat by pirates when he was only 2 years old. It has to be said that our major political parties show a clear deficit in moral and political courage on the issue of ‘boat people’.
But I’ve detoured before I’ve even got started. We, in the West, live in a bubble, though, occasionally, through television, films and books, like Hirsi Ali’s, we get a glimpse into another world that the rest of us would call hell. And this hell is not transient or momentary for these people, but relentless, unforgiving and even normal for those who grow up in it. Hirsi Ali is one of the few people who has straddled these 2 worlds, and that makes her book all the more compelling. As Aminata Forna wrote in the Evening Standard: “Hirsi Ali has invited [us] to walk a mile in her shoes. Most wouldn’t last a hundred yards.”
There are many issues touched on in her story, none perhaps more pertinent than identity, but I won’t start there. I will start with the apparent historical gap between some Islamic cultures and the modern Western world – a clash of civilisations, if you like. I remember the years between my teens and mid twenties were the most transformational, conflicted and depressing in my life. Like many of my generation, it was a time when I rejected my parents’ and society’s values, not to mention the religion I had grown up with, and sought a world view that I could call my own. To some extent that’s exactly what Hirsi Ali has done, only she had to jump from a culture still imbued with 6th Century social mores into the birth of the new millennium. I can fully understand what drove her, but, looking back on my own coming-of-age experience, I doubt that I could emulate her. What she achieved is a monumental leap compared to my short jump. For me, it was generational; for her, it was trans-cultural and it spanned millennia.
Much of her book deals with the treatment of women in traditional Muslim societies, treated, in her own words, as ‘minors’ not adults. One should not forget that the emancipation of women from vassals to independent, autonomous beings with their own rights has been a very lengthy process in Western society. Most societies have been historically patriarchal in both the East and West. The perception and treatment of women as second-class citizens is not confined to Islamic societies by any means. But it does appear that many Islamic cultures have the most barbaric treatment of women (enshrined in law in many countries) and are the most tardy in giving women the social status they deserve, which is equality to men.
This attitude, supported by quotes from the Qur’an, demonstrates how dangerous and misguided it is to take one’s morals from God. Because a morality supposedly given by God, in scripture, can’t be challenged and takes no account of individual circumstances, evolution of cultural norms, progress in scientific knowledge or empathy for ‘others’. And this last criterion is possibly the most important, because it is the ability to treat people outside one’s religion as ‘others’ that permits bigotry, violence and genocide, all in the name of one’s God. This is so apparent in the violence that swept through Hirsi Ali’s home country, Somalia, and became the second most salient factor, I believe, in the rejection of her own religion.
When I first saw Hirsi Ali interviewed on TV (7.30 Report, ABC Australia) after she left Holland for America, she made the statement that Islam could never coexist in a Western secular society, logically based on her experience in Holland. In an interview I heard on the radio last year (also in Australia, with Margaret Throsby) I felt she had softened her stance and she argued that Muslims could live in a secular society. She was careful to make a distinction between Islam as a religion and Islam as political ideology (refer my post Dec. 2010). My personal experience of Muslims is that they are as varied in their political views as any other group of people. I know of liberal Muslims possibly because I hold liberal views, so that should not be surprising. But it gives me a different view to those who think that all Muslims are fundamental Islamists, or potentially so. One of Hirsi Ali’s messages is that an over-dependence on tolerance in a secular society can cause its own backlash.
I’ve written elsewhere (The problems with fundamentalism, Jan. 2008) that the limit of tolerance is intolerance of others. In other words, I am intolerant of intolerance. When Muslims, or anyone else of any political persuasion, start to preach intolerance towards another group, the opposition towards that intolerance in a healthy secular society can be immense. Australia experienced that at a national level about a decade ago and it was ugly. Xenophobia is very easily aroused in almost any nation, it would appear. People who preach hatred and bigotry, no matter who they are or which group they represent, and no matter how cleverly they disguise their rhetoric, should all be treated the same – they should be refuted and denounced in the loudest voices at the highest levels of authority.
But, as the events in Somalia demonstrate, it’s not just religion that can inflame or justify violence. Clan differences are enough to justify the most heinous crimes. All through her story, Hirsi Ali describes how everyone could find fault with every other group they came in contact with. Muslims and Africans are not alone in this prejudicial bias – I grew up with it in a Western secular society. The more insular a society is, the more bigoted it is. This is why I agree with Hirsi Ali that children should not be segregated in their education. The more children mix with other ethnicities, the less insular they become in their attitudes towards other groups.
In a post I wrote on Evil (one of my earliest posts, Oct. 2007) I expounded on the idea that most of the atrocities committed in the last century, and every century beforehand for that matter, were based on some form of tribalism or an ingroup-outgroup mentality. This tribalism could be familial, religious, political, ethnic or national, but it revolves around the idea of identity. We underestimate how powerful this is because it’s almost subconscious.
Hirsi Ali’s book is almost entirely about identity and her struggle to overcome its strangulation on her life. All the role models in her young life, both female and male, were imbued with the importance and necessity of identity with her clan and her religion. In her life, religion and culture were inseparable. Her grandmother made her learn her ancestry off by heart because it might one day save her life, and, in fact, it did when she was only 20 years old and a man held a knife to her throat. By reciting her ancestry back far enough she was able to claim she was his ‘sister’ and he let her go.
People often mistakenly believe that their conscience is God whispering in their mind’s ear, when, in fact, it’s almost entirely socially and culturally formed, especially when we are children. It’s only as adults that we begin to question the norms we are brought up with, and then only when we are exposed to other social norms. A way that societies tend to overcome this ‘questioning’ is to imbue a sense of their cultural ‘superiority’ over everyone else’s. This comes across so strongly in Hirsi Ali’s book, and I recognised it as part of my own upbringing. To me, it’s a sign of immaturity that someone can only justify their own position, morally, intellectually or socially, by ridiculing everyone else’s.
One of the strongest influences on Hirsi Ali and her sister, Haweya, were the Western novels that they were exposed to: not just literary standards but pulp fiction romances. It reinforced my view that storytelling, and art in general, is the best medium to transmit ideas. It was this exposure to novels that led them to believe that there were other cultures and other ways of living, especially for women. Stories are what-ifs – they put us in someone else’s shoes and challenge our view of the world. It’s not surprising that some of the world’s greatest writers have been persecuted for their subversiveness.
But this leads to the almost heretical notion that only a society open to new ideas can progress out of ossification. If there is one singular message from Hirsi Ali’s book, it’s that fundamentalism (of any stripe) does not only have to be challenged, but overcome, if societies want to move forward and evolve for the betterment of everyone, and not just for those who want to hold the reins of power.
The real gulf that Hirsi Ali jumped was not religious but educational. I’ve argued many times that ignorance is the greatest enemy facing the 21st Century. Religious fundamentalism is arguably the greatest obstacle to genuine knowledge and rational thinking in the world today. Somewhat surprisingly, this is just as relevant to America as it is to any Islamic nation. The major difference between Islamic fundamentalism and Christian fundamentalism is geography, not beliefs.
Hirsi Ali is foremost a feminist. She once argued that Islam and the West can’t coexist, but she has since softened that stance. Perhaps, like me, she has met Muslim feminists who have found a way to reconcile their religious beliefs with their sense of independence and self-belief. Arguably, self-belief is the most important attribute a human being can foster. The corollary to this is that any culture that erodes that self-belief is toxic to itself.
I’ve written elsewhere (care of Don Cupitt, Sep. 09) that the only religion worth having is the one that you have hammered out for yourself. You don’t have to be an atheist to agree with Hirsi Ali’s basic philosophy of female emancipation, but you may have to challenge some aspects of scripture, both Christian and Islamic, if you want to live what you believe, which is what she has done.
Wednesday, 2 March 2011
A discussion on Wiki-Leaks without Assange
This is, in effect, a follow-up from a previous post on Wiki-Leaks (The forgotten man, last month), though from a different point of view. It’s a truly international discussion with 3 participants from the US, one from Iceland and one from Berlin, chaired in front of a live TV audience in Australia. This discussion is more diverse than the 4 Corners programme I referenced in my earlier post, and, arguably, more balanced as well.
When Assange was first criticised for endangering lives, I admit I considered that to be irresponsible, but events have revealed, whether by good luck or good management, that those concerns have not materialised. This aspect of the debate on Wiki-Leaks is discussed at length in this programme. The other thing that is brought out in this discussion is that you really can’t preach transparency if you can’t practice it.
But I think the most significant aspect of all this is how the internet has changed the way information can be delivered. Closing down Wiki-Leaks will be like trying to put the genie back into the bottle. Whatever happens to Assange, the world’s media will never be the same again. Wiki-Leaks has changed the rules and I don’t think, short of totalitarian measures, they can be reversed.
Addendum: For the latest refer this post (16 August 2012)