Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.

Saturday 13 January 2024

How can we achieve world peace?

 Two posts ago, I published my submission to Philosophy Now's Question of the Month from 2 months ago: What are the limits of knowledge? It was published in Issue 159 (Dec 2023/Jan 2024). Logically, they also inform readers of the next Question of the Month, which is the title of this post. I'm almost certain they never publish 2 submissions by the same author in a row, so I'm publishing this answer now. It's related to my last post, obviously, and to one I wrote some time ago (Humanity's Achilles Heel).


There are many aspects to this question, not least whether one is an optimist or a pessimist. It’s well known that people underestimate the duration and cost of a project, even when it’s their profession, because people are optimists by default. Only realists are pessimistic, and I’m in the latter category, because I estimate the duration of projects professionally.
 
There are a number of factors that militate against world peace, the primary one being that humans are inherently tribal and are quick to form ingroup-outgroup mental partitions, exemplified by politics the world over. In this situation, rational thought and reasoned argument take a back seat to confirmation bias and emotive rhetoric. Add to this dynamic the historically observed and oft-repeated phenomenon that we follow charismatic, cult-propagating leaders, and you have a recipe for self-destruction on a national scale. This is the biggest obstacle to world peace. These leaders thrive on and cultivate division, with its kindred spirits of hatred and demonisation of the ‘other’: the rationale for all of society’s ills becomes an outgroup identified by nationality, race, skin-colour, culture or religion.
 
Wealth, or the lack of it, is a factor as well. Inequality provides a motive and a rationale for conflict. It often goes hand-in-hand with oppression, but even when it doesn’t, the anger and resentment can be exploited and politicised by populist leaders, whose agenda is more focused on their own deluded sense of historical significance than on actually helping the people they purportedly serve.
 
If you have conflict – and it doesn’t have to be military – then as long as you have leaders who refuse to compromise, you’ll never find peace. Only moderates on both sides can broker peace.
 
So, while I’m a pessimist or realist, I do see a ‘how’. If we only elect leaders who seek and find consensus, and remove leaders who sow division, there is a chance. The best leaders, be they corporate, political or on a sporting field, are the ones who bring out the best in others and are not just feeding their own egos. But all this is easier said than done, as we are witnessing in certain parts of the world right now. For as long as we elect leaders who are narcissistic and cult-like, we will continue to sow the seeds of self-destruction.

Saturday 6 January 2024

Bad things happen when good people do nothing

 At present there are 2 conflicts holding the world’s attention – they are different, yet similar. They both involve invasions, one arguably justified, involving a response to a cowardly attack, and the other based on the flimsiest of suppositions. But what they highlight is a double-standard in the policies of Western governments in how they respond to the humanitarian crises that inevitably result from such incursions.
 
I’m talking about the war in Ukraine, following Russia’s invasion 2 years ago next month, and Israel’s war in Gaza, following Hamas’s attack on 7 Oct. 2023, killing around 1200 people and taking an estimated 240 hostages; a reported 120 still in captivity (at the time of writing).
 
According to the UN, 'Gaza faces the "highest ever recorded" levels of food insecurity', as reported on the Guardian website (21 Dec 2023). And it was reported on the news today (6 Jan 2024) that ‘Gaza is uninhabitable’. Discussions within the UN have been going on for over a month, yet have been unable to break a stalemate concerning humanitarian aid that requires a cessation of hostilities, despite the obvious existential need.
 
Noelia Monge, the head of emergencies for Action Against Hunger, said: “Everything we are doing is insufficient to meet the needs of 2 million people. It is difficult to find flour and rice, and people have to wait hours to access latrines and wash themselves. We are experiencing an emergency like I have never seen before.” (Source: Guardian)
 
I don’t think it’s an exaggeration to say that this is a humanitarian crisis of unprecedented proportions in modern times. It’s one thing for Israel to invade a country that harbours a mortal enemy, but it is another to destroy all infrastructure, medical facilities and cut off supplies of food and essential services, without taking any responsibility. And this is the double-standard we are witnessing. Everyone in the West condemns Putin’s attack on Ukrainian civilians, their homes and infrastructure, and calls them out as ‘war crimes’. No one has the courage to level the same accusation at Benjamin Netanyahu, despite the growing, unprecedented humanitarian crisis created by his implacable declaration to ‘destroy Hamas’. Has anyone pointed out that it’s impossible to destroy Hamas without destroying Gaza? Because that’s what he’s demonstrating.
 
The UN’s hunger monitoring system, the Integrated Food Security Phase Classification (IPC), issued a report saying the “most likely scenario” in Gaza is that by 7 February “the entire population in the Gaza Strip [about 2.2 million people] would be at ‘crisis or worse’ levels of hunger”.
(Source: Guardian)
 
In America, you have the perverse situation where many in the Republican Party want to withdraw support from Volodymyr Zelensky while providing military aid to Israel. They are, in effect, supporting both invasions, though they wouldn’t couch it in those terms.
 
Israel has a special status in Western eyes, consequent on the unconscionable genocide that Jews faced under Nazi Germany. It has led to an unspoken tendency to grant Israel special privileges when it comes to defending its State. This current conflict is a test of the West’s conscience. How much moral bankruptcy are we willing to countenance before we say enough is enough, and that humanity needs to win?

Sunday 3 December 2023

Philosophy in practice

 As someone recently pointed out, my posts on this blog invariably arise from things I have read (sometimes watched) and I’ve already written a post based on a column I read in the last issue of Philosophy Now (No 158, Oct/Nov 2023).
 
Well, I’ve since read a few more articles and they have prompted quite a lot of thinking. Firstly, there is an article called What Happened to Philosophy? by Dr Alexander Jeuk, who is, to quote, “an independent researcher writing on philosophy, economics, politics and the institutional structure of science.” He compares classical philosophy (in his own words, the ‘great philosophers’) with the way philosophy is practiced today in academia – that place most of us don’t visit, and whose language we wouldn’t understand if we did.
 
I don’t want to dwell on it, but its relevance to this post is that he laments the specialisation of philosophy, which he blames (if I can use that word) on the specialisation of science. The specialisation of most things is not a surprise to anyone who works in a technical field (I work in engineering). I should point out that I’m not a technical person, so I’m a non-specialist who works in a specialist field. Maybe that puts me in a better position than most to address this. I have a curious mind that started young, and my curiosity shifted as I got older, which means I never really settled into one area of knowledge; and, if I had, I didn’t quite have the intellectual ability to become competent in it. And that’s why this blog is a bit eclectic.
 
In his conclusion, Jeuk suggests that ‘great philosophy’ should be looked for ‘in the classics, and perhaps encourage a re-emergence of great philosophical thought from outside academia.’ He mentions social media and the internet, which is relevant to this blog. I don’t claim to do ‘great philosophy’; I just attempt to disseminate ideas and provoke thought. But I think that’s what philosophy represents to most people outside of academia. Academic philosophy has become lost in its obsession with language, whilst using language that most find abstruse, if not opaque.
 
Another article was titled Does a Just Society Require Just Citizens? by Jimmy Alfonso Licon, Assistant Teaching Professor in Philosophy at Arizona State University. I wouldn’t call the title misleading, but it doesn’t really describe the content of the essay, or even get to the gist of it, in my view. Licon introduces a term, ‘moral mediocrity’, which might have been a better title, if an enigmatic one, as it’s effectively what he discusses for the next, not-quite 3 pages.
 
He makes the point that our moral behaviour stems from social norms – a point I’ve made myself – but he makes it more compellingly. Most of us do ‘moral’ acts because that’s what our peers do, and we are species-destined (my term, not his) to conform. This is what he calls moral mediocrity, because we don’t really think it through or deliberate on whether it’s right or wrong, though we might convince ourselves that we do. He makes the salient point that if we had lived when slavery was the norm, we would have been slave-owners (assuming the reader is white, affluent and male). Likewise, suffrage was once anathema to a lot of women, as well as men. This supports my view that morality changes, and what was once considered radical becomes conservative. And such changes are usually generational, as we are witnessing in the current age with marriage equality.
 
He coins another term when he says ‘we are the recipients of a moral inheritance’ (his italics). In other words, the moral norms we follow today we’ve inherited from our forebears. Towards the end of his essay, he discusses Kant’s ideas on ‘duty’. I won’t go into that, but, if I understand Licon’s argument correctly, he’s saying that a ‘just society’ is one that has norms and laws that allow moral mediocrity, whereby its members don’t have to think about what’s right or wrong; they just follow the rules. This leads to his very last sentence: ‘And this is fundamentally the moral problem with moral mediocrity: it is wrongly motivated.’
 
I’ve written on this before, and, given the title as well as the content, I needed to think on what I consider leads to a ‘just society’. And I keep coming back to the essential need for trust. Societies don’t function without some level of trust, but neither do personal relationships, contractual arrangements or the raising of children.
 
And this leads to the third article in the same issue, Seeing Through Transparency, by Paul Doolan, who ‘teaches philosophy at Zurich International School and is the author of Collective Memory and the Dutch East Indies: Unremembering Decolonization (Amsterdam Univ Press, 2021)’.
 
In effect, he discusses the paradoxical nature of modern societies, whereby we insist on ‘transparency’ yet claim that privacy is sacrosanct – see the contradiction? Is this hypocrisy? And this relates directly to trust. Without transparency, be it corporate or governmental, we have trust issues. My experience is that when it comes to personal relationships, it’s a given, a social norm in fact, that a person reveals as much of their interior life as they want to, and it’s not ours to mine. An example of moral mediocrity perhaps. And yet, as Doolan points out, we give away so much on social media, where our online persona takes on a life of its own, which we cultivate (this blog not being an exception).
 
I think there does need to be transparency about decisions that affect our lives collectively, as opposed to secrets we all keep for the sake of our sanity. I have written dystopian fiction where people are surveilled to the point of monitoring all speech, and explored how it affects personal relationships. This already happens in some parts of the world. I’ve also explored a dystopian scenario where the surveillance is less obvious – every household has an android that monitors all activity. We might already have that with certain devices in our homes. Can you turn them off?  Do you have a device that monitors everyone who comes to your door?
 
The thing is that we become habituated to their presence, and it becomes part of our societal structure. As I said earlier, social norms change and are largely generational. Now they incorporate AI as well, and it’s happening without a lot of oversight or consultation with users. I don’t want to foster paranoia, but the genie has already escaped and I’d suggest it’s a matter of how we use it rather than how we put it back in the bottle.

Leaving that aside, Doolan also asks if you would behave differently if you could be completely invisible, which, of course, has been explored in fiction. We all know that anonymity fosters bad behaviour – just look online. One of my tenets is that honesty starts with honesty to oneself; it determines how we behave towards others.
 
I also know that an extreme environment, like a prison camp, can change one’s moral compass. I’ve never experienced it, but my father did. It brings out the best and worst in people, and I’d contend that you wouldn’t know how you’d be affected if you haven’t experienced it. This is an environment that turns Licon’s question on its head: can you be just in an intrinsically unjust environment?

Saturday 25 November 2023

Are people on the Left more intelligent?

 Now there’s a provocative question, and the short answer is, No. Political leanings are more associated with personality traits than IQ, according to studies I’ve read about, though I’m no expert. Having said that, I raise this subject because I think there’s a perception on both sides that there is such a link, which is why people on the Right love to use the word ‘elites’ to describe what they see as a distortion of reality on subjects like climate change, the COVID pandemic and just about anything they disagree with that involves a level of expertise that most of us don’t have.
 
We live in a world overflowing with information (of which, ironically, I am a contributor) and most, if not all of it, is imbibed through a political filter. On social media we live in echo-chambers, so that confirmation bias is unplugged from all conduits of dissent.
 
To provide a personal example, I watch panel discussions facilitated by The Australia Institute using Zoom, on topics like plastic waste, whistleblower protection, Pacific nations relations and the economics of inflation (all relatively recent topics). The titles alone have a Leftish flavour (though not all), and would be dismissed as ‘woke’ by many on the Right. They are a leftwing think tank, and the panellists are all academics or experts in their field. Whether you agree with them or not, they are well informed.
 
Of course, there are rightwing thinktanks as well; the most obvious in Australia being the Institute of Public Affairs (IPA) with the catchcry, The Voice for Freedom. The Australia Institute has its own catchcry, We Change Minds, which is somewhat optimistic given it appears to be always preaching to the choir. It should be pointed out that the IPA can also provide their own experts and research into individual topics.
 
I’ve never hidden my political leanings, and only have to look at my own family to appreciate that personality traits play a greater role than intelligence. I’m the political black sheep, yet we still socialise and exhibit mutual respect. The same with some of my neighbours, who have strong religious views, yet I count as friends.
 
It’s not just a cliché that people of an artistic bent tend to be leftists. I think this is especially true in theatre, where many an eccentric personality took refuge, not to mention people with a different sexual orientation to the norm. We are generally more open to new ideas and more tolerant of difference. Negative traits include a vulnerability to neurosis, even depression, and a lack of discipline or willingness to abide by rules.
 
One of the contentious points-of-view I hold is that people on the Left have a propensity for being ahead of their time. It’s why they are often called ‘progressives’, but usually only by history. In their own time, they could be called ratbags, radicals or nowadays, ‘elitist’. History tends to bear this out, and it’s why zeitgeist changes are often generational.
 
Recently, I’ve come across a couple of discussions on Russell (including a 1960 interview with him) and was surprised to learn how much we have in common, philosophically. Not only in regard to epistemology and science (which is another topic), but also ethics and morality. To quote from an article in Philosophy Now (Issue 158, Oct/Nov 2023) titled Russell’s Moral Quandary by David Berman (Professor Emeritus Fellow, Philosophy Department, Trinity College Dublin):
 
…our moral judgements [According to Russell] come from a combination of our nurture and education, but primarily from our feelings and their consequences. Hence they do not arise from any timeless non-natural absolutes [like God], for they are different in different times and places.
 

It’s the very last phrase that is relevant to this essay, though it needed to be put in context. Where I possibly depart from Russell is in the role of empathy, but that’s also another discussion.
 
Even more recently, I had a conversation with the mother of a son and daughter, aged 22 and 19 respectively, where she observed that her daughter was living in a different world to the one she grew up in, particularly when it came to gender roles and expectations. I imagine many would dismiss this as a manifestation of wokeism, but I welcome it. I’ve long argued that there should be more cross-generational conversation. I’ve seen this in my professional life (in engineering), where there is a natural synergy between myself and cleverer, younger people, because we are willing to learn from each other. It naturally militates against closed-mindedness.
 
The Right are associated with 2 social phenomena that tend to define them. Firstly, they wish to maintain the status quo, even turn back the clock, to the point that they will find their own ‘evidence’ to counter proposed changes. This is not surprising, as it’s almost the definition of conservatism. But the second trait, for want of a better word, has become more evident and even dangerous in modern politics, both locally and overseas. It’s particularly virulent in America, and I’m talking about the propensity to oppose all alternative views to the point of self-defeatism. I know that extremists on the Left can be guilty as well, but there are personalities on the Right who thrive on division; who both cultivate and exploit it. The end result is often paralysis, as we’ve seen recently in America with the House Speaker debacle, and its close-encounter with a nationwide catastrophe.
 
There is a view held by many, including people who work in my profession, that the best way to achieve the most productive outcome is through competition. In theory, it sounds good, but in practice – and I’ve seen it many times – you end up with 2 parties in constant argument and opposition to each other. Even if there are more than 2, they tend to align into 2. What you get is decision-paralysis, delays, stalemate and a neverending blame-game. On the other hand, when parties co-operate and collaborate, you get the exact opposite. Is this a surprise? No.
 
From my experience, the best leaders in project management are the ones who can negotiate compromises and it’s the same in politics. The qualities are openness, tolerance and persuasive negotiation skills. I’ve seen it in action numerous times.
 
In a post I wrote on Plato, I talked about his philosopher-king idea, which is an ideal that could never work in practice. Nevertheless, one of the problems with democracy, as it’s practiced virtually everywhere, is that the most popular opinion on a particular topic is not necessarily the best informed. I can see a benefit in experts playing a greater role in determining policies. We saw this in Australia during the pandemic and I believe it worked, though not everyone agrees. Some argue that the economy suffered unnecessarily. But this was a worldwide experiment, and we saw that where medical advice was ignored and fatalities rose accordingly, the economy suffered anyway.

Sunday 15 October 2023

What is your philosophy of life and why?

This was a question I answered on Quora, and, without specifically intending to, I brought together 2 apparently unrelated topics. The reason I discuss language is that it’s so intrinsic to our identity, not only as a species, but as an individual within our species. I’ve written an earlier post on language (in response to a Philosophy Now question-of-the-month), which has a different focus, and I deliberately avoided referencing that.
 
A ‘philosophy of life’ can be represented in many ways, but my perspective is within the context of relationships, in all their variety and manifestations. It also includes a recurring theme of mine.



First of all, what does one mean by ‘philosophy of life’? For some people, it means a religious or cultural way-of-life. For others it might mean a category of philosophy, like post-modernism or existentialism or logical positivism.
 
For me, it means a philosophy on how I should live, and on how I both look at and interact with the world. This is not only dependent on my intrinsic beliefs that I might have grown up with, but also on how I conduct myself professionally and socially. So it’s something that has evolved over time.
 
I think that almost all aspects of our lives are dependent on our interactions with others, which start right from when we are born and really only end when we die. And the thing is that everything we do, including all our failures and successes, occurs in this context.
 
Just to underline the significance of this dependence, we all think in a language, and we all gain our language from our milieu at an age before we can rationally and critically think, especially compared to when we mature. In fact, language is analogous to software that gets downloaded from generation to generation, so that knowledge can also be passed on and accumulated over ages, which has given rise to civilizations and disciplines like science, mathematics and art.
 
This all sounds off-topic, but it’s core to who we are and it’s what distinguishes us from other creatures. Language is also key to our relationships with others, both socially and professionally. But I take it further, because I’m a storyteller and language is the medium I use to create a world inside your head, populated by characters who feel like real people and who interact in ways we find believable. More than any other activity, this illustrates how powerful language is.
 
But it’s the necessity of relationships in all their manifestations that determines how one lives one’s life. As a consequence, my philosophy of life centres around one core value and that is trust. Without trust, I believe I am of no value. But, not only that, trust is the foundational value upon which a society either flourishes or devolves into a state of oppression with its antithesis, rebellion.

 

Tuesday 10 October 2023

Oppenheimer and lessons for today

 I watched Chris Nolan’s 3hr movie, Oppenheimer, and then read the 600 page book it was based on, American Prometheus, by Kai Bird and Martin J. Sherwin, which deservedly won a Pulitzer prize. Its subtitle is The Triumph and Tragedy of J. Robert Oppenheimer, which really does sum up his life.
 
I think the movie should win a swag of Oscars, not just because of the leading actors, but the way the story was told. In the movie, the ‘triumph’ and the ‘tragedy’ are more-or-less told in parallel, using the clever device of colour for the ‘bomb’ story and black and white for the political story. From memory, the bomb is detonated at the 2hr mark and the remainder of the film focuses on what I’d call the ‘inquisition’, though ‘kangaroo court’ is possibly a more accurate description and is used at least once in the book by a contemporary commentator.
 
Despite its length, the book is a relatively easy read and is hard to put down, or at least it was for me – it really does read like a thriller in places.
 
It so happened that I followed it up with The Last Days of Socrates by Plato, and I couldn’t help but draw comparisons. Both were public figures who had political influence that wasn’t welcome or even tolerated in some circles.
 
I will talk briefly about Socrates, as I think it’s relevant, even though it was 2400 years ago. Plato, of course, adopts Socrates’ perspective, and though I expect Plato was present at his trial, we don’t know how accurate a transcription it is. Nevertheless, the most interesting and informative part of the text is the section titled The Apology of Socrates (‘Socrates’ Defence’). Basically, Socrates argued that he had been the victim of what we would call a ‘smear campaign’ or even slander, and this is well and truly before social media, but perhaps they had something equivalent in Athens (400-300 BC). Socrates makes the point that he’s a private citizen, not a public figure, and says, …you can be quite sure, men of Athens, that if I’d set about a political career all those years ago, I’d long ago have come to a sticky end… Anyone who is really fighting for justice must live as a private citizen and not a public figure if he’s going to survive even a short time.
 
One of the reasons, if not the main reason, according to Plato, that Socrates accepted his fate was that he refused to change. Practicing philosophy in the way he did was, in effect, his essence.
 
The parallel with Oppenheimer is that he publicly advocated policies that were not favoured by certain politicians, and certainly not by the military. But to appreciate this, one must see it in the political context of its time.
 
Firstly, one must understand that immediately after the second world war, most if not all of the nations that had been involved didn’t really have an appetite for another conflict, especially on that scale, let alone one involving nuclear weapons, which, I believe, is how the cold war came to be.
 
If one looks at warfare through a historical lens, the side with a technological advantage invariably prevails. A good example is the Roman empire, which could build roads, bridges and viaducts, all in the service of its armies.
 
So, there was a common view among the American military, as well as the politicians of the day, that, because they had the atomic bomb, they had supreme technological superiority, and all they had to do was keep the knowledge from the enemy.
 
Oppenheimer knew this was folly and was advocating an arms treaty with Russia decades before it became accepted. Not only Oppenheimer, but most scientists, knew that humanity would not survive a nuclear holocaust, but many politicians believed that the threat of a nuclear war was the only road to peace. For this reason, many viewed Oppenheimer as a very dangerous man. Oppenheimer opposed the hydrogen bomb because it was effectively a super-bomb that would make the atomic bomb look like a comparative non-event.
 
He also knew that the US Air Force had already circled which cities in Russia they would eliminate should another hot war start. Oppenheimer knew this was madness, and today there are few people who would not agree with him. Hindsight is a remarkable facility.
 
On February 17 1953, Oppenheimer gave a speech in New York before an audience comprising a ‘closed meeting of the Council on Foreign Relations’, in which he attempted to relay the precarious state the world was in and the pivotal role that the US was playing, while all the time acknowledging that he was severely limited in what he could actually tell them. Here are some excerpts that give a flavour:

Looking a decade ahead, it is likely to be small comfort that the Soviet Union is four years behind us… the very least we can conclude is that our twenty-thousandth bomb… will not in any deep strategic sense offset their two-thousandth.
 
We have from the first, maintained that we should be free to use these weapons… [and] one ingredient of this plan is a rather rigid commitment to their use in a very massive, initial, unremitting strategic assault on the enemy.
 
Without putting it into actual words, Oppenheimer was spelling out America’s defence policy towards the Soviets at that time. What he couldn’t tell them was that this was the strategy of the Strategic Air Command – to obliterate scores of Russian cities in a genocidal air strike.
 
In his summing up, he said, We may anticipate a state of affairs in which the two Great Powers will each be in a position to put an end to civilization and life of the other, though not without risking its own.
 
He then gave this chilling analogy: We may be likened to two scorpions in a bottle, each capable of killing the other, but only at the risk of its own life.
 
This all happened against the backdrop and hysteria of McCarthyism, which Einstein compared to Nazi Germany. Oppenheimer, his wife and his brother all had links with the Communist party, though Oppenheimer distanced himself when he became aware of the barbaric excesses of Stalin’s Russia. The FBI had him under surveillance for much of his career, both during and after the war, and it was countless files of FBI wiretaps that were used in evidence against him in his so-called hearing. They would have been inadmissible in a proper court of law, and in the hearing, his counsel was not allowed to access them because they were ‘classified’. There were 3 panel members and one of them, a Dr Evans, wrote a dissent, arguing that there was no new evidence, and that if Oppenheimer had been cleared in 1947, he was even less of a security risk in 1954.
 
After the ‘hearing’, the media was divided, just like it would be today, and that’s its relevance to modern America. The schism was between the left and right of politics, and that schism is still there today, possibly even deeper than it was then.
 
If one looks at the downfall of great people – I’m thinking Alan Turing and Galileo Galilei, not to mention Socrates – history judges them differently to how they were judged in their day, and that also goes for Oppenheimer. Hypatia is another who comes to mind, though she lived (and died) around 400 AD. What all these have in common, other than being persecuted, is that they were ahead of their time. People will say the same about advocates for same-sex marriage, not to mention the Cassandras warning about climate change.


Addendum: I recently wrote a post on Quora that’s made me revisit this. Basically, I gave this as an example of when the world was on the brink of madness – specifically, the potential for nuclear Armageddon – and Oppenheimer was almost a lone voice in trying to warn people, while having neither the authority nor the legal right to do so.
 
It made me consider that we are now possibly on the brink of a different madness, that I referenced in my Quora post:
 
But the greatest harbinger of madness on the world stage is that the leading contender for the next POTUS is a twice-impeached, 4-times indicted ex-President. To quote Robert De Niro: “Democracy won’t survive the return of a wannabe dictator.” We are potentially about to enter an era where madness will reign in the most powerful nation in the world. It’s happened before, so we are well aware of the consequences. Trump may not lead us into a world war, but despots will thrive and alliances will deteriorate if not outright crumble.


Thursday 25 May 2023

Philosophy’s 2 disparate strands: what can we know; how can we live

The question I’d like to ask is: is there a philosophical view that encompasses both? Some may argue that Aristotle attempted that, but I’m going to take a different approach.
 
For a start, the first part can arguably be broken into 2 further strands: physics and metaphysics. And even this divide is contentious, with some arguing that metaphysics is an ‘abstract theory with no basis in reality’ (one dictionary definition).
 
I wrote an earlier post arguing that we are ‘metaphysical animals’ after discussing a book of the same name, though it was really a biography of 4 Oxford women in the 20th Century: Elizabeth Anscombe, Mary Midgley, Philippa Foot and Iris Murdoch. But I’ll start with this quote from said book.
 
Poetry, art, religion, history, literature and comedy are all metaphysical tools. They are how metaphysical animals explore, discover and describe what is real (and beautiful and good). (My emphasis.)
 
So, arguably, metaphysics could give us a connection between the 2 ‘strands’ in the title. Now here’s the thing: I contend that mathematics should be part of that list, hence part of metaphysics. And, of course, we all know that mathematics is essential to physics as an epistemology. So physics and metaphysics, in my philosophy, are linked in a rather intimate way.
 
The curious thing about mathematics, or anything metaphysical for that matter, is that, without human consciousness, they don’t really exist, or are certainly not manifest. Everything on that list is a product of human consciousness, notwithstanding that there could be other conscious entities somewhere in the universe with the same capacity.
 
But again, I would argue that mathematics is an exception. I agree with a lot of mathematicians and physicists that while we create the symbols and language of mathematics, we don’t create the intrinsic relationships that said language describes. And furthermore, some of those relationships seem to govern the universe itself.
 
And completely relevant to the first part of this discussion, the limits of our knowledge of mathematics seem to determine the limits of our knowledge of the physical world.
 
I’ve written other posts on how to live, specifically, 3 rules for humans and How should I live? But I’m going to go via metaphysics again, specifically storytelling, because that’s something I do. Storytelling requires an inner and outer world, manifest as character and plot, which is analogous to free will and fate in the real world. Now, even these concepts are contentious, especially free will, because many scientists tell us it’s an illusion. Again, I’ve written about this many times, but its relevance to my approach to fiction is that I try to give my characters free will. An important part of my fiction is that the characters are independent of me. If my characters don’t take on a life of their own, then I know I’m wasting my time, and I’ll ditch that story.
 
Its relevance to ‘how to live’ is authenticity. Artists understand better than most the importance of authenticity in their work, which really means keeping themselves out of it. But authenticity has ramifications, as any existentialist will tell you. To live authentically requires an honesty to oneself that is integral to one’s being. And ‘being’ in this sense is about being human rather than its broader ontological meaning. In other words, it’s a fundamental aspect of our psychology, because it evolves and changes according to our environment and milieu. Also, in the world of fiction, it's a fundamental dynamic.
 
What's more, if you can maintain this authenticity (and it’s genuine), then you gain people’s trust, and that becomes your currency, whether in your professional life or your social life. However, there is nothing more fake than false authenticity; examples abound.
 
I’ll give the last word to Socrates, arguably the first existentialist.
 
To live with honour in this world, actually be what you try to appear to be.


Saturday 29 April 2023

Can philosophy be an antidote to dogma?

 This is similar to another post I wrote recently, both of which are answers to questions I found on Quora. The reason I’m posting this is because I think it’s better than the previous one. Not surprisingly, it also references Socrates and the role of argument in philosophical discourse.
 
What qualities are needed to be a good philosopher?
 
I expect you could ask 100 different philosophers and get 100 different answers. Someone (Gregory Scott), in answer to a similar question, claimed that everyone is a philosopher, but not necessarily a good one.
 
I will suggest 2 traits that I try to cultivate in myself: to be intellectually curious and to be analytical. But I’m getting ahead of myself.
 
For a start, there are many ‘branches’ or categories of philosophy: epistemology and ethics being the best known and most commonly associated with philosophy. Some might include ontology as well, which has a close relationship with epistemology, like 2 sides of the same coin. There are also logic and aesthetics, but then the discussion becomes interminable.
 
But perhaps the best way to answer this question is to look at philosophers you admire and ask yourself, what qualities do they possess that merit your admiration?
 
Before I answer that for myself, I’m going to provide some context. Sandy Grant (philosopher at the University of Cambridge) published an essay titled Dogmas (Philosophy Now, Issue 127, Aug/Sep 2018), in which she points out the pitfalls of accepting points of view on ‘authority’ without affording them critical analysis. And I would argue that philosophy has been an antidote to dogma going back to Socrates, who famously challenged the ‘dogmas’ of his day. Prior to Socrates, philosophy was very prescriptive, where you followed someone’s sayings, be they from the Bible, or Confucius or the Upanishads. Socrates’ revolutionary idea was to introduce argument, and philosophy has been based on argument ever since.
 
Socrates is famously attributed with the saying, The unexamined life is not worth living, which he apparently said before he was forced to take his own life. But there is another saying attributed to Socrates, which is more germane, given the context of his death.
 
To live with honour in this world, actually be what you try to appear to be.
 
Socrates also acquitted himself well in battle, apparently, so he wasn’t afraid of dying for a cause and a principle. Therefore, I would include integrity as the ‘quality’ of a good person, let alone a philosopher.
 
We currently live in an age where the very idea of truth is questioned, whether it be in the realm of science or politics or media. This is why I think that critical thinking is essential, whereby one looks at evidence and the expertise behind that evidence. I’ve spent a working lifetime in engineering, where, out of necessity, one looks to expertise that one doesn’t have oneself. Trust has gone AWOL in our current social media environment, and the ability to analyse without emotion and ideology is paramount. To accept evidence when it goes against your belief system is the mark of a good philosopher. Evidence is the keystone of scientific endeavour and also of administering justice. But perhaps the greatest quality required of a philosopher is to admit, I don’t know, which is also famously attributed to Socrates.

Tuesday 20 December 2022

What grounds morality?

 In the most recent issue of Philosophy Now (No 153, Dec 2022/Jan 2023), they’ve published the answers to the last Question of the Month: What Grounds or Justifies Morality? I submitted an answer that wasn’t included, and having read the 10 selected, I believe I could have done better. In my answer, I said, ‘courage’, based on the fact that it takes courage for someone to take a stand against the tide of demonisation of the ‘other’, which we witness so often in history and even contemporary society.
 
However, that is too specific and doesn’t really answer the question, which arguably is seeking a principle, like the ‘Golden Rule’ or the Utilitarian principle of ‘the greatest happiness to the greatest number’. Many answers cited Kant’s appeal to ‘reason’, and some cited religion and others, some form of relativism. All in all, I thought they were good answers without singling any one out.
 
So what did I come up with? Well, partly based on observations of my own fiction and my own life, I decided that morality needed to be grounded in trust. I’ve written about trust at least twice before, and I think it’s so fundamental because both one-on-one relationships (of all types) and society as a whole can’t function properly without it. If you think about it, how well you trust someone is a good measure of your assessment of their moral character. But it functions at all levels of society. Imagine living in a society where you can’t say what you think, where you have to obey strict rules of secrecy and deception or you will be punished. Such societies exist.
 
I’ve noticed a recurring motif in my stories (not deliberate) of loyalties being tested and of moral dilemmas. Both in my private life and professional life, I think trust is paramount. It’s my currency. I realised a long time ago that if people don’t trust me, I have no worth.

Wednesday 28 September 2022

Humanity’s Achilles’ heel

Good and evil are characteristics that imbue almost every aspect of our nature. It’s why they’re the subject of so many narratives, including mythologies and religions, not to mention actual real-world histories. They effectively define what we are, what we are capable of and what we are destined to be.
 
I’ve discussed evil in one of my earliest posts, and also its recurring motif in fiction. Humanity is unique, at least on this small world we call home, in that we can change it on a biblical scale, both intentionally and unintentionally – climate change being the most obvious and recent example. We are doing this in combination with creating the fastest-growing extinction event in the planet’s history, of which most of us are blissfully ignorant.
 
This post is already going off on tangents, but it’s hard to stay on track when there are so many ramifications; because none of these issues are the Achilles’ heel to which the title refers.
 
We have the incurable disease of following leaders who will unleash the worst of humanity onto itself. I wrote a post back in 2015, a year before Trump was elected POTUS, that was very prescient given the events that have occurred since. There are two traits such leaders have that not only define them but paradoxically explain their success.
 
Firstly, they are narcissistic in the extreme, which means that their self-belief is unassailable, no matter what happens. The entire world can collapse around them and somehow they’re untouchable. Secondly, they always come to power in times of division, which they exploit and then escalate to even greater effect. Humans are most irrational in ingroup-outgroup situations, which could be anything from a family dispute to a nationwide political division. Narcissists thrive in this environment, creating a narrative that only resembles the reality inside their head, but which their followers accept unquestioningly.
 
I’ve talked about leadership in other posts, but only fleetingly, and it’s an inherent and necessary quality in almost all endeavours; be it on a sporting field, on an engineering project, in a theatre or in a ‘house’ of government. There is a Confucian saying (so neither Western nor modern): If you want to know the true worth of a person, observe the effects they have on other people’s lives. I’ve long contended that the best leaders are those who bring out the best in the people they lead, which is the opposite of narcissists, who bring out the worst.
 
I’ve argued elsewhere that we are at a crossroads, which will determine the future of humanity for decades, if not centuries ahead. No one can predict what this century will bring, in the same way that no one predicted all the changes that occurred in the last century. My only prediction is that the changes in this century will be even greater and more impactful than the last. And whether that will be for the better or the worse, I don’t believe anyone can say.
 
Do I have an answer? Of course not, but I will make some observations. Virtually my whole working life was spent on engineering projects, which have invariably involved an ingroup-outgroup dynamic. Many people believe that conflict is healthy because it creates competition and by some social-Darwinian effect, the best ideas succeed and are adopted. Well, I’ve seen the exact opposite, and I witness it in our political environment all the time.
 
In reality, what happens is that one side will look for, and find, something negative about every engineering solution to a problem that is proposed. This means that there is continuous stalemate and the project suffers in every way imaginable – morale is depleted, everything is drawn out and we have time and cost overruns, which feed the blame-game to new levels. At worst, the sides end up in legal dispute, where, I should point out, I’ve had considerable experience.
 
Conversely, when sides work together and collaborate, people compromise and respect the expertise of their counterparts. What happens is that problems and issues are resolved and the project is ultimately successful. A lot of this depends on the temperament and skills of the project leader. Leadership requires good people skills.
 
Someone once did a study in the United States in the last century (I no longer have the reference) where they looked for the traits of individuals who were eminently successful. And what they found was that it was not education or IQ that was the determining factor, though that helped. No, the single most important factor was the ability to form consensus.
 
If one looks at prolonged conflicts, like we’ve witnessed in Ireland or the Middle East, people involved in talks will tell you that the ‘hardliners’ will never find peace, only the moderates will. So, if there is a lesson to be learned, it’s not to follow leaders who sow and reap division, but those who are inclusive. That means giving up our ingroup-outgroup mentality, which appears impossible. But, until we do, the incurable disease will recur and we will self-destruct by simply following the cult that self-destructive narcissists are so masterfully capable of growing.
 

Wednesday 10 August 2022

What is knowledge? And is it true?

 This is the subject of a YouTube video I watched recently by Jade. I like Jade’s and Tibees’ videos, because they are both young Australian women (though Tibees is obviously a Kiwi, going by her accent) who produce science and maths videos, with their own unique slant. I’ve noticed that Jade’s videos have become more philosophical and Tibees’ often have an historical perspective. In this video by Jade, she also provides historical context. Both of them have taught me things I didn’t know, and this video is no exception.
 
The video has a different title to this post: The Gettier Problem or How do you know that you know what you know? The second title gets to the nub of it. Basically, she’s tackling a philosophical problem going back to Plato, which is how do you know that a belief is actually true? As I discussed in an earlier post, some people argue that you never do, but Jade discusses this in the context of AI and machine-learning.
 
She starts off with the example of using Google Translate to translate her English sentences into French, as she was in Paris at the time of making the video (she has a French husband, as she’s revealed in other videos). She points out that the AI system doesn’t actually know the meaning of the words, and it doesn’t translate the way you or I would: by looking up individual words in a dictionary. No, the system is fed massive amounts of internet-generated data and effectively learns statistically from repeated exposure to phrases and sentences, so it doesn’t have to ‘understand’ what anything actually means. Towards the end of the video, she gives the example of a computer being able to ‘compute’ and predict the movements of planets without applying Newton’s mathematical laws, simply based on historical data, albeit large amounts thereof.
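To make the planetary example concrete, here is a minimal sketch of my own (not anything from Jade’s video) of prediction purely from historical data: a straight line fitted to made-up past observations of a planet’s angular position extrapolates a future position without ever encoding Newton’s laws. The numbers and the choice of a linear model are assumptions for illustration only.

    import numpy as np

    # Toy illustration: "predicting" a planet's angular position purely from
    # past observations, with no physical law involved. All values are made up.
    days = np.arange(0, 100)                        # observation times (days)
    true_rate = 2 * np.pi / 365.25                  # radians per day (Earth-like year)
    rng = np.random.default_rng(0)
    observed_angle = true_rate * days + 0.001 * rng.standard_normal(days.size)

    # Fit a straight line to the historical record (pure statistics, no Newton).
    rate_est, offset_est = np.polyfit(days, observed_angle, 1)

    # Extrapolate to a future date the model has never seen.
    future_day = 150
    predicted = rate_est * future_day + offset_est
    print(f"Predicted angle on day {future_day}: {predicted:.4f} rad")
    print(f"Actual angle on day {future_day}:    {true_rate * future_day:.4f} rad")

The fit gets a respectable answer purely by pattern-matching the data; it contains no notion of gravity, mass or force, which is, I take it, the point Jade is making about prediction without understanding.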
 
Jade puts this into context by asking, how do you ‘know’ something is true as opposed to just being a belief? Plato provided a definition: knowledge is true belief with an account or rational explanation. Jade called this ‘Justified True Belief’ and provides examples. But then someone called Edmund Gettier, in the middle of last century, demonstrated how one could hold a belief that is justified and true but still doesn’t count as knowledge, because the assumed causal connection was wrong. Jade gives a few examples, but one was of someone mistaking a cloud of wasps for smoke and assuming there was a fire. In fact, there was a fire, but they didn’t see it and it had no connection with the cloud of wasps. So someone else, Alvin Goldman, suggested that a way out of a ‘Gettier problem’ was to look for a causal connection before claiming an event was true (watch the video).
 
I confess I’d never heard these arguments nor of the people involved, but I felt there was another perspective. And that perspective is an ‘explanation’, which is part of Plato’s definition. We know when we know something (to rephrase her original question) when we can explain it. Of course, that doesn’t mean that we do know it, but it’s what separates us from AI. Even when we get something wrong, we still feel the need to explain it, even if it’s only to ourselves.
 
If one looks at her original example, most of us can explain what a specific word means, and if we can’t, we look it up in a dictionary, and the AI translator can’t do that. Likewise, with the example of predicting planetary orbits, we can give an explanation, involving Newton’s gravitational constant (G) and the inverse square law.
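And by way of contrast with the data-fitting sketch above, here is what the ‘explanation’ looks like when written out as a calculation: a few lines (again my own sketch, using standard approximate values for G, the Sun’s mass and the Earth-Sun distance) recover the orbital period from the inverse square law itself, rather than from any historical record.

    import math

    # The contrasting "explanation": Newton's inverse square law applied directly.
    # Standard approximate values (not derived from any data fit).
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_sun = 1.989e30       # mass of the Sun, kg
    r = 1.496e11           # mean Earth-Sun distance, m

    # Acceleration toward the Sun: a = G*M/r^2 (the inverse square law).
    a = G * M_sun / r**2

    # For a circular orbit a = v^2/r, so v = sqrt(G*M/r) and T = 2*pi*r/v.
    v = math.sqrt(G * M_sun / r)
    T_days = 2 * math.pi * r / v / 86400
    print(f"Acceleration toward the Sun: {a:.2e} m/s^2")
    print(f"Orbital period: {T_days:.1f} days")   # comes out close to 365

The answer is much the same, but here every line is part of an explanation: change the mass or the distance and the result changes for a reason you can state.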
 
Mathematical proofs provide an explanation for mathematical ‘truths’, which is why Gödel’s Incompleteness Theorem upset the apple cart, so to speak. You can actually have mathematical truths without proofs, but, of course, you can’t be sure they’re true. Roger Penrose argues that Gödel’s famous theorem is one of the things that distinguishes human intelligence from machine intelligence (read his Preface to The Emperor’s New Mind), but that is too much of a detour for this post.
 
The criterion that is used, both scientifically and legally, is evidence. Having some experience with legal contractual disputes, I know that documented evidence always wins in a court of law over undocumented evidence, which doesn’t necessarily mean that the person with the most documentation was actually right (nevertheless, I’ve always accepted the umpire’s decision, knowing I provided all the evidence at my disposal).
 
The point I’d make is that humans will always provide an explanation, even if they have it wrong, so it doesn’t necessarily make knowledge ‘true’, but it’s something that AI inherently can’t do. Best examples are scientific theories, which are effectively ‘explanations’ and yet they are never complete, in the same way that mathematics is never complete.
 
While on the topic of ‘truths’, one of my pet peeves is people who conflate moral and religious ‘truths’ with scientific and mathematical ‘truths’ (often on the above-mentioned basis that it’s impossible to know them all). But there is another aspect, and that is that so-called moral truths are dependent on social norms, as I’ve described elsewhere, and they’re also dependent on context, like whether one is living in peace or war.
 
Back to the questions heading this post, I’m not sure I’ve answered them. I’ve long argued that only mathematical truths are truly universal, and to the extent that such ‘truths’ determine the ‘rules’ of the Universe (for want of a better term), they also ultimately determine the limits of what we can know.

Wednesday 20 April 2022

How can I know when I am wrong?

 Simple answer: I can’t. But this goes to the heart of a dilemma that seems to plague the modern world. It’s even been given a name: the post-truth world.  

I’ve just read a book, The Psychology of Stupidity: Explained by Some of the World’s Smartest People, which is a collection of essays by philosophers, psychologists and writers, edited by Jean-François Marmion. It was originally written in French and then translated into English; therefore, most of the contributors are French, but some are American.

 

I grew up constantly being reminded of how stupid I was, so, logically, I withdrew into an inner world, often fuelled by comic-book fiction. I also took refuge in books, which turned me into a know-it-all; a habit I’ve continued to this day.

 

Philosophy is supposed to be about critical thinking, and I’ve argued elsewhere that critical analysis is what separates philosophy from dogma, but accusing people of not thinking critically does not make them wiser. You can’t convince someone that you’re right and they’re wrong: the very best you can do is make them think outside their own box. And be aware that that’s exactly what they’re simultaneously trying to do to you.

 

Where to start? I’m going to start with personal experience – specifically, preparing arguments (called evidence) for lawyers in contractual engineering disputes, in which I’ve had more than a little experience. Basically, I’ve either prepared a claim or defended a claim by analysing data in the form of records – diaries, minutes, photographs – and reached a conclusion that had a trail of logic and evidence to substantiate it. But here’s the thing: I always took the attitude that I’d come up with the same conclusion no matter which side I was on.

 

You’re not supposed to do that, but it has advantages. The client, whom I’m representing, knows I won’t bullshit them and I won’t prepare a case that I know is flawed. And, in some cases, I’ve even won the respect of the opposing side. But you probably won’t be surprised to learn how much pressure you can be put under to present a case based on falsehoods. In the end, it will bite you.

 

The other aspect to all this is that people can get very emotional, and when they get emotional they get irrational. Writing is an art I do well, and when it comes to preparing evidence, my prose is very dispassionate, laying out an argument based on dated documents; better still, if the documents belong to the opposition.

 

But this is doing analysis on mutually recognised data, even if different sides come to different conclusions. And in a legal hearing or mediation, it’s the documentation that wins the argument, not emotive rhetoric. Most debates these days take place on social media platforms, where people on opposing sides have their own sources and their own facts, and each side accuses the other of being brainwashed.

 

And this leads me to the first lesson I’ve learned about the post-truth world. In an ingroup-outgroup environment – like politics – even the most intelligent people can become highly irrational. We see everyone on one side as being righteous and worthy of respect, while everyone on the other side is untrustworthy and deceitful. Many people know about the infamous Robbers Cave experiment in 1954, where 2 groups of teenage boys were manipulated into an ingroup-outgroup situation where tensions quickly escalated, though not violently. I’ve observed this in contractual situations many times over.

 

One of my own personal philosophical principles is that beliefs should be dependent on what you know and not the other way round. It seems to me that we do the opposite: we form a belief and then actively look for evidence that turns that belief into knowledge. And, in the current internet age, it’s possible to find evidence for any belief at all, like the Earth being flat.

 

And this has led to a world of alternate universes, where the exact opposite histories are being played out. The best known example is climate change, but there are others. Most recently, we’ve had a disputed presidential election in the USA and disputes over the efficacy of vaccines in combatting the coronavirus (SARS-CoV-2, the cause of COVID-19). What all these have in common is that each side believes the other side has been duped.

 

You might think that something else these 3 specific examples have in common is left-wing, right-wing politics. But I’ve learned that’s not always the case. One thing I do believe they have in common is open disagreement between purported experts in combination with alleged conspiracy theories. It so happens that I’ve worked with technical experts for most of my working life, plus I read a lot of books and articles by people in scientific disciplines. 

 

I’m well aware that there are a number of people who have expertise that I don’t have, and I admit to getting more than a little annoyed with politicians who criticise or dismiss people with far more expertise than they have in specific fields, like climatology or epidemiology. One only has to look to the US, where the previous POTUS, Donald Trump, was at the centre of all of these issues: everything he disagreed with was called a ‘hoax’, and he was a serial promoter of conspiracy theories, including election fraud. Trump is responsible for one of those alternative universes, in which President-elect Joe Biden stole the election from him, even though there is ample testimony that Trump tried to steal the election from Biden.

 

So, in the end, it comes down to whom you trust. And you probably trust someone who aligns with your ideological position or who reinforces your beliefs. Of course, I also have political views and my own array of beliefs. So how do I navigate my way?

 

Firstly, I have a healthy scepticism about conspiracy theories, because they require a level of global collaboration that would be hard to sustain in the manner they’re reported. They often read or sound like movie scripts, with politicians being blackmailed or having their lives threatened, and health professionals involved in a global conspiracy to help an already highly successful corporate leader take control of all our lives. This last one came from a so-called ‘whistleblower’, previously associated with WHO.

 

The more emotive and sensationalist a point of view, the more traction it has. Media outlets have always known this, and now it’s exploited on social media, where rules about accountability and credibility are a lot less rigorous.

 

Secondly, there are certain trigger words that warn me that someone is talking bullshit – like calling vaccines a ‘bio-weapon’ or the ‘death-jab’ (terms from different sources). By contrast, I trust people who have a long history of credibility in their field; who have made it their life’s work, in fact. But we live in a world where they can be ridiculed by politicians, whom we are supposed to respect and follow.

 

At the end of the day, I go back to the same criterion I used in preparing arguments in contractual disputes, which is evidence. We’ve been living with COVID for 2 years now, and it is easy to find statistical data tracking the disease in a variety of countries and the effect the vaccines have had. Of course, the conspiracy theorists will tell you that the data is fabricated. The same goes for evidence involving climate change. There was a famous encounter between physicist and television presenter Brian Cox and a little-known Australian politician, who claimed that the graphs Cox presented, produced by NASA, had been corrupted.

 

But, in both of these cases, the proof of the pudding is in the eating. I live in a country where we followed the medical advice, underwent lockdowns and got vaccinated, and we’re now effectively living with the virus. When I look overseas, at countries like America, it was a disaster overseen by an incompetent President, who advocated all sorts of ‘crank cures’, the most notorious being bleach, not to mention UV light. At one point, the US accounted for more than 20% of the world’s recorded deaths.

 

And it’s the same with climate change where, again, the country I live in faced record fires in 2019/20 and now floods, though this is happening all over the globe. The evidence is in our face, but people are still in denial. Admitting we’re wrong means overcoming a lot of cognitive dissonance, and that’s part of the problem.

 

Philosophy teaches you that you can have a range of views on a specific topic, and as I keep saying: only future generations know how ignorant the current generation is. That includes me, of course. I write a blog, which will hopefully outlive me, and one day people should be able to tell where I was wrong. I’m quite happy with that, because that’s how knowledge grows and progresses.


Friday 18 March 2022

Our eternal fascination with Light and Dark

Someone on Facebook posted one of those inane questions: If you could delete one thing in the world what would it be? Obvious answers included war, hate, evil, and the like – in other words, negative emotional states and their consequences. My answer was, ‘Be careful what you wish for’.

What I find interesting about this whole issue is the vicarious relationship we have with the ‘dark side’ through the lens of fiction. If one thinks about it, it starts early with fairy tales and Bible stories. Nightmares are common in childhood, when one wakes up too scared to go back to sleep. Fear is an emotion we become familiar with early in our lives; I doubt that I was an exception, but it seems to me that everyone tries to keep children innocent these days. I don’t have children, so I might have it wrong.

 

Light and dark exist in the real world, but we try to confine them to the world of fiction – it’s a universal theme found in operas, mythologies and TV serials. I write fiction and I’m no exception. If there were no dark in my stories, they’d have no appeal. You have to have nemeses, figures of various shades of grey to juxtapose with the figures of light, even if the light shines through flawed, imperfect glass.

 

In life we are tested, and we judge ourselves accordingly. Sometimes we pass and sometimes we fail. The same thing happens with characters in fiction. When we read a story we become actors, which makes us wonder how we’d behave in the same situation. I contend that the same thing happens in dreams. As an adult, I’ve always seen dreams as what-if scenarios and it’s the same with stories. I’ve long argued that the language of stories is the language of dreams and I think the connection is even stronger than that. I’m not surprised that storytellers will tell you that they dream a lot.

 

In the Judaeo-Christian religion I grew up with, good and evil were stark contrasts, like black and white. You have God, Christ and Satan. When I got older, I thought it a bit perverse that one feared God as much as Satan, which led me to the conclusion that they weren’t really that different. It’s Christ who is the good guy, willing to forgive the people who hate him and want him dead. I’m talking about them as fictional characters, not real people. I’m sure Jesus was a real person but we only have the myth by which to judge him.

 

The only reason I bring all this up is that these figures were the template we were given. But fearing someone you are meant to love leads to neurosis, as I learned the hard way. A lot of people of my generation brought up the next generation as atheists, which is not surprising. The idea of a judgemental, schizophrenic father was past its use-by date.

 

There is currently a conflict in Ukraine, which has grabbed the world’s attention in a way that other wars have not. It’s partly because of our Euro-centric perspective, and the fact that the 2 biggest and world-changing conflicts of the 20th Century both started in Europe. And the second one, in particular, has similarities, given it started with a dictator invading a neighbour, when he thought the world would look the other way.

 

There is a fundamental flaw in the human psyche that we’ve seen repeated throughout history. We have a tendency to follow charismatic, narcissistic leaders, when you’d think we should know better. They create an army (not necessarily military) of supporters, for whom they nonetheless have utter contempt. This was true of Hitler, but it’s also true of Trump and Putin.

 

Ukraine’s leader, Volodymyr Zelenskyy, like Trump, became a TV celebrity, but in a different vein. He was a satirical comedian who sent up the country’s then leader, a Russian stooge, before running for office himself and winning with around 70% of the vote. I believe this is the real reason that Putin wants to bring him down. If he’d done the same thing in Russia, he would have been assassinated while still a TV personality. It’s widely reported that Putin has attempted to assassinate him at least twice since the invasion, but then assassinating opponents in a foreign country is a Putin specialty.

 

Zelenskyy and Putin represent, in many Western people’s minds, a modern-day parable of good and evil. And, to me, the difference is stark. Putin, like all narcissists, only cares about himself – not the generals who have died in his war, not the barely-out-of-school conscripts he’s sent into battle, and certainly not the Russian people, who will suffer enormous deprivations if this continues for any length of time. On the other hand, Zelenskyy doesn’t care about his own self-preservation, because he would rather die for a principle than live the rest of his life in shame for deserting his country when it needed him most. Zelenskyy is like the fictional hero we believe in but know we couldn’t emulate.

 

It's when we read or watch fiction that the difference between right and wrong seems obvious. We often find ourselves telling a character, ‘don’t do that, don’t make that decision’, because we can see the consequences, but, in real life, we often seem to lose that compass.

 

My father was in a war and I know from what he told me that he didn’t lose that particular compass, but I also know that he once threatened to kill someone who was stealing from the wounded he was caring for. And I’ve no doubt he would have acted on it. So his compass got a bit bent, because he’d already seen enough killing to last several lifetimes.

 

I’ve noticed a theme in my own writing, which is subconscious, not intentional, and that is my protagonists invariably have their loyalty tested and it ends up defining them. My villains are mostly self-serving autocrats who have a hierarchical view of humanity where they logically belong at the top.

 

This is a meandering post, with no conclusion. Each of us has ambitions and desires and flaws. Few of us are ever really tested, so we make assumptions based on what we like to believe. I like something said by Socrates, who had also been in battle.

 

To live with honour in this world, actually be what you try to appear to be.


Friday 28 January 2022

What is existentialism?

A few years back, I wrote a ‘vanity piece’, My philosophy in 24 dot points, which I admit is a touch pretentious. But I’ve been prompted to write something more substantive, in a similar vein, whilst reading Gary Cox’s How to Be an Existentialist; or How to Get Real, Get a Grip and Stop Making Excuses. I bought this tome (the 10th Anniversary Edition) after reading an article by him on ‘Happiness’ in Philosophy Now (Issue 147, Dec 2021/Jan 2022). Cox is an Honorary Research Fellow at the University of Birmingham, UK. He’s written other books, but this one is written specifically for a general audience, not an academic one. This is revealed in some of the language he uses, like ‘being up shit creek’.

 

I didn’t really learn anything about existentialism until I studied Sartre in an off-campus university course, in my late 40s. I realised that, to all intents and purposes, I was an existentialist, without ever knowing what one was. I did write about existentialism very early in the life of this blog, in the context of my own background. The thing is that one’s philosophical worldview is a product of one’s milieu, upbringing and education, not to mention the age in which one lives. I grew up in a Western culture, post WW2, and I think that made me ripe for existentialist influences without being conscious of it. I lived through the 60s, when there was a worldwide zeitgeist of questioning social mores against a background of a religious divide, the Vietnam war and the rise of feminism.

 

If there is a key word or mantra in existentialism, it’s ‘authenticity’. It’s the key element in my 3 Rules for Humans post, and it’s also the penultimate chapter in Cox’s aforementioned book. The last chapter is on counselling and is like a bookend.

 

As Cox himself points out, existentialism is not a ‘school’ of philosophy in the way ‘analytical philosophy’ or ‘logical positivism’ are. There’s not really a set of rules – it’s more about an attitude and how to live a life without losing your soul or self-respect. It’s not an epistemology, nor an ideology, even though it’s probably associated with a liberal outlook, as I hope will become clear.

 

Many commentators associate existentialism with atheism, the absurd and nihilism. I agree with Cox that it’s actually the opposite of nihilism; if anything, it’s about finding purpose. As I wrote in a post last year:

 

If the Universe has any meaning at all, it’s because it created sentient beings who find meaning against the odds that science tells us are astronomical, both literally and figuratively. Existentialism is about finding purpose in an absurd universe, which is the opposite of nihilism.

 

And that’s the most important lesson of existentialism: if you are to find a purpose, only you can do that; it’s not dependent on anyone else, be they family, a spouse, an employer or a mentor. And logically, one could add, it’s not dependent on God either.

 

Cox doesn’t talk about God at all, but he does talk quite a lot about consciousness and about it being ‘nothing’ (materialistically). He very fleetingly gives mathematics as an example of something else that’s not ‘corporeal’, specifically numbers. Very curious, as I think that both mathematics and consciousness are ‘special’ in that they are distinct, yet intimately connected to the physical world, but that’s another topic.

 

He also talks about consciousness having a special relationship with time. I’ve said that consciousness is the only thing that exists in a constant present, whereas Cox says the opposite, but I believe we mean the same thing. He says consciousness is forever travelling from the past to the future, whereas I say that the future is forever becoming the past while only consciousness exists in the present – the experiential outcome is the same.

 

So how does God enter the picture? God only exists in someone’s consciousness – it’s part of one’s internal state. So, you can be ‘authentic’ and believe in God, but it’s totally an individualistic experience – it can’t be shared. That’s my interpretation, not anyone else’s, I should emphasise.

 

An important, even essential, aspect of all this is a belief in free will. You can’t take control of your life if you don’t have a belief in free will, and I would argue that you can’t be authentic either. And, logically, this has influenced my prejudices in physics and cosmology. To be consistent, I can’t believe we live in a deterministic universe, and have argued strongly on that point, opposing better minds than mine.

 

Existentialism has some things in common with Buddhism, which might explain why Eastern philosophy seemed to have an influence on the 60s zeitgeist. Having said that, I think the commonality is about treating life as a journey that’s transient. Accepting the impermanence and transience of life, I believe, is part of living authentically.

 

And what do I mean by ‘authentic’ in this context? Possibly, I never really appreciated this until I started writing fiction. I think anyone who creates art strives to be authentic, which means leaving your ego out of your work. I try to take the attitude that it’s my characters’ story, not mine. That’s very difficult to explain to anyone who hasn’t experienced it, but I know that actors often say something similar.

 

In my professional life, my integrity was everything to me. I often worked in disputatious environments and it was important to me that people could trust my word and my work. Cox talks about how existentialism intrinsically incorporates our interactions with others. 

 

Freedom is a much-abused, misappropriated term, but in existentialism it has a specific meaning and an interdependent relationship with responsibility – you can’t divorce one from the other. Freedom, in existentialism, means ‘free to choose’, hence the emphasis on free will. It also means, if you invoke the term, that the freedom of others is just as important as your own.

 

One can’t talk about authenticity without talking about its opposite, ‘bad faith’ (mauvaise foi), a term coined by Sartre. Bad faith is something most of us have experienced, be it working in a job we hate, staying in a destructive relationship or not pursuing a desired goal because we’d rather stay in our comfort zone.

 

Of course, sometimes we are in a situation outside our control, so what do we do? Speaking from personal experience, I think one needs to take ownership of one’s response to it; one needs to accept that only YOU can do something about it and not someone else. I’ve never been a prisoner-of-war, but my father was, and he made 3 attempts to escape, because, as he told the Commandant, ‘It’s my job’.

 

I’ve actually explored this in my own fiction. In my last story, two of my characters (independently) find themselves in circumstances of ‘bad faith’. I only analyse this in hindsight – don’t analyse what you write while you’re writing. In fact, one of those characters is attracted to another character who lives authentically, though neither of them ‘think’ in those terms.



Addendum: Someone asked me to come up with a single sentence to describe this. After sleeping on it, I came up with this:


Be responsible for what you are and who you become. That includes being responsible for your failures. (2 sentences)


Sunday 21 November 2021

Cancel culture – the scourge of our time

There are many things that cause me some anguish at the moment, not least that Donald Trump could easily be re-elected POTUS in 2024, despite deliberately undermining and damaging the very institution he wants to lead, which is American democracy. It’s not an exaggeration to say that he’s attacked it at its core.


This may seem a mile away from the topic I’ve alluded to in the title of my post, but they both seem to be symptoms of a divisiveness I haven’t seen since the Vietnam war. 

 

The word, ‘scourge’, is defined as ‘a whip used as an instrument of punishment’; and that’s exactly how cancel culture works, with social media the perfect platform from which to wield it.

 

In this weekend’s Good Weekend magazine (Fairfax Group), the feature article is on this very topic. But I would like to go back to the previous weekend, when another media outlet, Murdoch’s Weekend Australian Magazine, published an article on well-known atheist, Richard Dawkins. It turns out that at the ripe old age of 80, Dawkins has been cancelled. To be precise, he had his 1996 Humanist of the Year award withdrawn by the American Humanist Association (AHA) earlier this year, because, in 2015, he tweeted a defence of Rachel Dolezal (a white chapter president of the NAACP, the National Association for the Advancement of Colored People) who had been vilified for identifying as Black.

 

Of course, I don’t know anything about Rachel Dolezal or the context of that stoush, but I can identify with Dawkins, even though I’ve never suffered the same indignity. Dawkins and I are of a similar mould, though we live in different strata of society. In saying that, I don’t mean that I agree with all his arguments, because I obviously don’t, but we are both argumentative and not shy about expressing our opinions. I really don’t possess the moral superiority to throw stones at Dawkins, even though I have.

 

I remember my father once telling me that if you admired an Australian fast bowler (he had someone in mind) then you also had to admire an English fast bowler (of the same generation), because they had the exact same temperament and wicket-taking abilities. Of course, that also applies to politicians. And it pretty much applies to me and Dawkins.

 

On the subject of identifying as ‘black’, I must tell a story related to me by a friend I knew when I worked in Princeton in 2001/2. She was a similar age to me and originally from Guyana. In fact, she was a niece of West Indies champion cricketer, Lance Gibbs, and told me about attending his wedding when she was 8 years old (I promise, no more cricketing references). But she told me how someone she knew (outside of work) told her that she ‘didn’t know what it was like to be black’. To which she replied, ‘Of course I know I’m black, I only have to look in the mirror every morning.’ Yes, it’s funny, but it goes to a deeper issue about identity. So a black person, who had lived their entire life in the USA, was telling another black person, who had come from outside the US, that they didn’t know what it was like to be ‘black’.

 

Dawkins said that, as a consequence, he’d started to self-censor, which is exactly what his detractors want. If Dawkins has started to self-censor, then none of us are safe or immune. What hurt him, of course, was being attacked by people on the Left, with whom he mostly identifies. And, while this practice occurs on both sides, it’s on the Left that it has become most virulent.

 

“I self-censor. More so in recent years. Why? It’s not a thing I’ve done throughout my life, I’ve always spoken my mind openly. But we’re now in a time when if you do speak your mind openly, you are at risk of being picked up and condemned.”

 

“Every time a lecturer is cancelled from an American university, that’s another God knows how many votes for Trump.”

 

And this is the thing: the Right loves nothing more than the Left turning on itself. It’s insidious, self-destructive and soul-destroying. In the Good Weekend article, they focus on a specific case, while also citing other cases, both in Australia and America. The specific case was actor Hugh Sheridan having a Sydney Festival show cancelled – a show he’d really set his sights on – because he was cast as a transgender person, which created outrage in the LGBTQIA+ community. Like others cited in the article, he contemplated suicide, which prompted close friends to monitor him. This is what it’s come to. It’s a very lengthy article, which I can’t do justice to in this post, but there is a perversion here: all the shows and people being targeted are actually bringing diversity of race and sexuality into the public arena, and they’re being crucified by the very people they represent. The conservatives, wowsers and Bible-bashers must love it.

 

This is a phenomenon that is partly, if not mostly, generational, and amplified by social media. People are being forced to grovel.

 

Emma Dawson, head of the Labor-aligned Per Capita think tank (Labor being an Australian political party, for overseas readers), told the Good Weekend: “[cancel culture is] more worrying to me than just about anything other than far-right extremism. It is pervasive among educated young people; very few are willing to question it.”

 

In 2019, Barack Obama warned a group of young people: “This idea of purity, and you’re never compromised and always politically woke... you should get over that quickly. The world is messy.”

 

And this is the nub of the issue: cancel culture is all about silencing any debate, and, without debate, you have authoritarianism, even though it’s disguised as its opposite.

 

In the same article, the author, James Button, argues that it’s no coincidence this phenomenon emerged alongside the rise of Donald Trump.

 

The election of Donald Trump horrified progressives. Here was a president – elected by ordinary Americans – who was racist, who winked at neo-Nazis and who told bare-faced lies in a brazen assertion of power while claiming that the liars were progressive media. His own strategy adviser, Stephen Bannon, said that the way to win the contest was to overwhelm the media with misinformation, to “flood the zone with shit”.

 

And they succeeded so well that America is more divided than it has been at any time since its Civil War.


To return to Hugh Sheridan: I think he epitomises this situation, at least as it’s being played out in Australia, in that it’s the Arts that are coming under attack, and from the Left, it has to be said. Actors and writers (like myself) often portray characters whose backgrounds are different from our own. To give a recent example from ABC TV, which produces some outstanding free-to-air dramas with internationally renowned casts while everything else is going into subscription streaming services: earlier this year, they produced and broadcast a series called The Newsreader, set in the 1980s when a lot of stuff was happening both locally and overseas. ‘At the 11th AACTA (Australian Academy of Cinema and Television Arts) awards, the show was nominated for more awards than any other program’ (Wikipedia).

 

A key plotline of the show was that the protagonist was gay but not openly so. The point is that I assume the actor was straight, although I don’t really know, but that’s what actors do. God knows, there have been enough gay actors who have played straight characters (Sir Ian McKellen, who played Gandalf, as well as Shakespearean roles). So why crucify someone who is part of the LGBTQIA+ community for playing a transgender role? He was even accused of being homophobic and transphobic. He tweeted back, “you’re insane”, which only resulted in him being trolled for accusing his tormentors of being ‘insane’.

 

Someone recently asked me why I don’t publish what I write anymore. There is more than one reason, but one is fear of being cancelled. I doubt a publisher would publish what I write, anyway. But also, I suffer from impostor syndrome, in that I genuinely feel like an impostor and don’t need someone else to tell me so. The other thing is that I simply don’t care; I don’t feel the need to publish to validate my work.