Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.
Showing posts with label Art.

Thursday, 19 September 2024

Prima Facie; the play

I went and saw a film made of a live performance of this highly rated play, put on by the National Theatre at the Harold Pinter Theatre in London’s West End in 2022. It’s a one-hander, played by Jodie Comer, best known as the quirky assassin with a diabolical sense of humour in the black comedy hit, Killing Eve. I also saw her in Ridley Scott’s riveting and realistically rendered film, The Last Duel, set in mediaeval France, where she played alongside Matt Damon, Adam Driver and an unrecognisable Ben Affleck. The roles Comer played in those 2 screen media couldn’t be more different.
 
Theatre is more unforgiving than cinema, because there are no multiple takes or even a break once the curtain’s raised; a one-hander, even more so. In the case of Prima Facie, Comer is on stage for a full 90 minutes, doing her own costume changes and pushing around her own scenery unaided, without breaking stride. It’s such a tour de force performance, as the Financial Times put it, that I’d go so far as to say it’s the best acting performance I’ve ever witnessed by anyone. It’s such an emotionally draining role, where she cries and even breaks into a sweat in one scene, that I marvel she could do it night after night, as I assume she did.
 
And I’ve yet to broach the subject matter, which is very apt given the MeToo climate, but philosophically it goes deeper than that. The premise for the entire play, which is even spelt out early on in case you’re not paying attention, is the difference between truth and justice, and whether it matters. Comer’s character, Tessa, happens to experience it from both sides, which is what makes this so powerful.
 
She’s a defence barrister who specialises in sexual-assault cases, and very early on she effectively tells us the rules of the game: no one wins or loses; you either come first or second. In other words, the barristers and those involved in the legal profession don’t see the process the same way that you and I do, and I can understand that – to get emotionally involved makes it very stressful.

In fact, I have played a small role in this process in a professional capacity, so I’ve seen this firsthand. But I wasn’t dealing with rape cases or anything involving violence, just contractual disputes where millions of dollars could be at stake. My specific role was to ‘prepare evidence’ for lawyers, whether for a claim, the defence of a claim, or possibly a counter-claim, and I quickly realised that the more dispassionate one is, the more successful one is likely to be. I also realised that the lawyers I was supporting in one case could be on the opposing side in the next one, so you don’t get personal.
 
So, I have a small insight into this world, and can appreciate why they see it as a game, where you ‘win or come second’. But in Prima Facie, Tessa goes through a very visceral and emotionally scarifying transformation where she finds herself on the receiving end, and it’s suddenly very personal indeed.
 
Back in 2015, I wrote a 400-word mini-essay, in answer to one of those Question of the Month topics that Philosophy Now like to throw open to amateur wannabe philosophers, like myself. And in this case, it was one that was selected for publication (among 12 others), from all around the Western world. I bring this up because I made the assertion that ‘justice without truth is injustice’, and I feel that this is really what Prima Facie is all about. At the end of the play, with Tessa now having the perspective of the victim (there is no other word), it does become a matter of winning or losing, because not only her career and future livelihood, but her very dignity, is now at stake.
 
I watched a Q&A programme on Australia’s ABC some years ago, where this issue was discussed. Every woman on the panel, including one from the righteous right (my coinage), had a tale to tell about discrimination or harassment in a workplace situation. But the most damning testimony came from a man who specialised in representing women in sexual assault cases. He said that in every case, their doctors tell them not to proceed because it will destroy their health; and he added: they’re right. I was reminded of this when I watched this play.
 
One needs to give special mention to the writer, Suzie Miller, who is an Aussie as it turns out, and as far as 6 degrees of separation go, I happen to know someone who knows her father. Over 5 decades I’ve seen some very good theatre, some of it very innovative and original. In fact, I think the best theatre I’ve seen has invariably been something completely different, unexpected and, dare I say it, special. I had a small involvement in theatre when I was still very young, and learned that I couldn’t act to save myself. Nevertheless, my very first foray into writing was an attempt to write a play. Now, I’d say it’s the hardest and most unforgiving medium of storytelling to write for. I had a friend who was involved in theatre for some decades and even won awards. She passed away a couple of years ago and I miss her very much. At her funeral, she was given a standing ovation when her coffin was taken out; it was very moving. I can’t go to a play now without thinking about her and wishing I could discuss it with her.

Monday, 22 July 2024

Zen and the art of flow

This was triggered by a newsletter I received from ABC Classic (Australian radio station) with a link to a study done on ‘flow’, a term coined by psychologist, Mihaly Csikszentmihalyi, to describe a specific psychological experience that many (if not all) people have had when totally immersed in some activity that they not only enjoy but have developed some expertise in.
 
The study was performed by Dr John Kounios from Drexel University's Creative Research Lab in Philadelphia, who “examined the 'neural and psychological correlates of flow' in a sample of jazz guitarists.” The article was authored by Jennifer Mills from ABC Classic’s sister station, ABC Jazz. But the experience of ‘flow’ doesn’t just apply to mental or artistic activities; it also applies to sporting activities like playing tennis or cricket. Mills heads her article with the claim that ‘New research helps unlock the secrets of flow, an important tool for creative and problem solving tasks’. She quotes Csikszentmihalyi to provide a working definition:
 
"A state in which people are so involved in an activity that nothing else seems to matter; the experience is so enjoyable that people will continue to do it even at great cost, for the sheer sake of doing it."
 
I believe I’ve experienced ‘flow’ in 2 quite disparate activities: writing fiction and driving a car. Just to clarify, some people think that experiencing flow while driving means that you daydream, whereas I’m talking about the exact opposite. I hardly ever daydream while driving, and if I find myself doing it, I bring myself back to the moment. Of course, cars are designed these days to insulate you from the experience of driving as much as possible, as we evolve towards self-driving cars. Thankfully, there are still cars available that are designed to involve you in the experience and not remove you from it.
 
I was struck by the fact that the study used jazz musicians, as I’ve often compared the ability to play jazz with the ability to write dialogue (even though I’m not a musician). They both require extemporisation. The article references Nat Bartsch, whom I’ve seen perform live and whose music is an unusual style of jazz in that it can be very contemplative. I saw her perform one of her albums with her quartet, augmented with a cello, which made it a one-off, unique performance. (This is a different concert performed in Sydney without the cellist.)
 
The study emphasised the point that the more experienced practitioners taking part were the ones more likely to experience ‘flow’. In other words, to experience ‘flow’ you need to reach a certain skill-level. In emphasising this point, the author quotes jazz legend, Charlie Parker:
 
"You've got to learn your instrument. Then, you practise, practise, practise. And then, when you finally get up there on the bandstand, forget all that and just wail."
 
I can totally identify with this, as when I started writing, it was complete crap, to the extent that I wouldn’t show it to anyone. For some irrational reason, I had the self-belief – some might say, arrogance – that, with enough perseverance and practice, I could break through to the required skill-level. In fact, I now create characters and write dialogue with little conscious effort – it’s become a ‘delegated’ task, so I can concentrate on the more complex tasks of resolving plot points, developing moral dilemmas and formulating plot twists. Notice that these require a completely different set of skills that also had to be learned from scratch. But all this can come together, often in unexpected and surprising ways, when one is in the mental state of ‘flow’. I’ve described this as a feeling like you’re an observer, not the progenitor, so the process occurs as if you’re a medium and you just have to trust it.
 
Dr Steffan Herff, leader of the Sydney Music, Mind and Body Lab at Sydney University, makes a point that supports this experience:
 
"One component that makes flow so interesting from a cognitive neuroscience and psychology perspective, is that it comes with a 'loss of self-consciousness'."
 
And this allows me to segue into Zen Buddhism. Many years ago, I read an excellent book by Daisetz Suzuki titled, Zen and Japanese Culture, where he traces the evolutionary development of Zen, starting with Buddhism in India, then being adopted in China, where it was influenced by Taoism, before reaching Japan, where it was assimilated as a sister-religion (for want of a better term) to Shintoism, which is an animistic religion.
 
Suzuki describes Zen as going inward rather than outward, while acknowledging that the two can’t be disconnected. But I think it’s the loss of ‘self’ that makes it relevant to the experience of flow. When Suzuki described the way Zen is practised in Japan, he talked about being in the moment, whatever the activity, and for me, this is an ideal that we rarely attain. It was only much later that I realised that this is synonymous with flow as described by Csikszentmihalyi and currently being examined in the studies referenced above.
 
I’ve only once before written a post on Zen (not counting a post on Buddhism and Christianity), which arose from reading Douglas Hofstadter’s seminal tome, Godel Escher Bach (which is not about Zen, although it gets a mention), and it’s worth quoting my own summation:
 
My own take on this is that one’s ego is not involved yet one feels totally engaged. It requires one to be completely in the moment, and what I’ve found in this situation is that time disappears. Sportsmen call it being ‘in the zone’ and it’s something that most of us have experienced at some time or another.

Saturday, 29 June 2024

Feeling is fundamental

 I’m not sure I’ve ever had an original idea, but I sometimes raise one that no one else seems to talk about. And this is one of them: I contend that the primary, essential attribute of consciousness is to be able to feel, and the ability to comprehend is a secondary attribute.
 
I don’t even mind if this contentious idea triggers debate, but we tend to always discuss consciousness in the context of human consciousness, where we metaphorically talk about making decisions based on the ‘head’ or the ‘heart’. I’m unsure of the origin of this dichotomy, but the implication is that our emotional and rational ‘centres’ (for want of a better word) have different loci (effectively, different locations). No one believes that, of course, but possibly people once did. The thing is that we are all aware that sometimes our emotional self and rational self can be in conflict. This is already going down a path I didn’t intend, so I may return to it at a later point.
 
There is some debate about whether insects have consciousness, but I believe they do because they demonstrate behaviours associated with fear and desire, be it for sustenance or company. In other respects, I think they behave like automatons. Colonies of ants and bees can build a nest without a blueprint except the one that apparently exists in their DNA. Spiders build webs and birds build nests, but they don’t do it the way we would – it’s all done organically, as if they have a model in their brain that they can follow; we actually don’t know.
 
So I think the original role of consciousness in evolutionary terms was to feel, concordant with abilities to act on those feelings. I don’t believe plants can feel; besides, they’d have very limited ability to act on feelings, even if they could. They can communicate chemically, and generally rely on the animal kingdom to propagate, which is why a global threat to bee populations is very serious indeed.
 
So, in evolutionary terms, I think feeling came before cognitive abilities – a point I’ve made before. It’s one of the reasons that I think AI will never be sentient – a viewpoint not shared by most scientists and philosophers, from what I’ve read. AI is all about cognitive abilities; specifically, the ability to acquire knowledge and then deploy it to solve problems. Some argue that by programming biases into the AI, we will be simulating emotions. I’ve explored this notion in my own sci-fi, where I’ve added so-called ‘attachment programming’ to an AI to simulate loyalty. This is fiction, remember, but it seems plausible.
 
Psychological studies have revealed that we need an emotive component to behave rationally, which seems counter-intuitive. But would we really prefer it if everyone were a zombie or a psychopath, with no ability to empathise or show compassion? We see enough of this already. As I’ve pointed out before, in any ingroup-outgroup scenario, totally rational individuals can become totally irrational. We’ve all observed this, possibly actively participated.
 
An oft-made point (by me) that I feel is not given enough consideration is the fact that without consciousness, the universe might as well not exist. I agree with Paul Davies (who does espouse something similar) that the universe’s ability to be self-aware would seem to be a necessary condition for its existence (my wording, not his). I recently read a stimulating essay in the latest edition of Philosophy Now (Issue 162, June/July 2024) titled, enigmatically, Significance, by Ruben David Azevedo, a ‘Portuguese philosophy and social sciences teacher’. His self-described intent is to ‘Tell us why, in a limitless universe, we’re not insignificant’. In fact, that was the trigger for this post. He makes the point (that I’ve made elsewhere myself) that in both time and space we couldn’t be more insignificant, which leads many scientists and philosophers to see us as a freakish by-product of an otherwise purposeless universe – a perspective that Davies calls ‘the absurd universe’. In light of this, it’s worth reading Azevedo’s conclusion:
 
In sum, humans are neither insignificant nor negligible in this mind-blowing universe. No living being is. Our smallness and apparent peripherality are far from being measures of our insignificance. Instead, it may well be the case that we represent the apex of cosmic evolution, for we have this absolute evident and at the same time mysterious ability called consciousness to know both ourselves and the universe.
 
I’m not averse to the idea that there is a cosmic role for consciousness. I like John Wheeler’s obvious yet pertinent observation:
 
The Universe gave rise to consciousness, and consciousness gives meaning to the Universe.

 
And this is my point: without consciousness, the Universe would have no meaning. And getting back to the title of this essay, we give the Universe feeling. In fact, I’d say that the ability to feel is more significant than the ability to know or comprehend.
 
Think about the role of art in all its manifestations, and how it’s totally dependent on the ability to feel. In some respects, I consider AI-generated art a perversion, because any feeling we have for its products is of our own making, not the AI’s.
 
I’m one of those weird people who can even find beauty in mathematics, while acknowledging only a limited ability to pursue it. It’s extraordinary that I can find beauty in a symphony, or a well-written story, or the relationship between prime numbers and Riemann’s Zeta function.
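
For anyone wondering what that last relationship actually is, it can be written down in one line. This is Euler’s product formula – a standard result, quoted here purely as illustration – which ties the Zeta function directly to the primes (valid for s > 1):

\zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}} = \prod_{p\ \text{prime}} \frac{1}{1 - p^{-s}}

Riemann’s insight was to extend ζ(s) to complex values of s, where the locations of its zeros encode how the primes are distributed – which is where much of the beauty lies.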


Addendum: I realised I can’t leave this topic without briefly discussing the biochemical role in emotional responses and behaviours. I’m thinking of the brain’s drugs-of-choice like serotonin, dopamine, oxytocin and endorphins. Some may argue that these natural ‘messengers’ are all that’s required to explain emotions. However, there are other drugs, like alcohol and caffeine (arguably the most common) that also affect us emotionally, sometimes to our detriment. My point being that the former are nature’s target-specific mechanisms to influence the way we feel, without actually being the genesis of feelings per se.

Saturday, 20 April 2024

Sigmund Freud’s and C.S. Lewis’s fictional encounter

Last week I went and saw the movie, Freud’s Last Session, where Anthony Hopkins plays Freud, when he was in London on the very cusp of WW2 and dying of cancer of the mouth, and Matthew Goode plays the Oxford Don, C.S. Lewis. It’s a fictional account, taken from a play I believe, about their meeting at Freud’s home. Its historical veracity is put into question by a disclaimer after the movie proper finishes, saying that it’s recorded that Freud did, in fact, meet an Oxford Don, but that his identity was never revealed or confirmed.
 
It's the sort of movie that would attract people with a philosophical bent like myself. I thought the cinema better attended than I expected, though it was far from full. Anthony Hopkins’s Freud is playful in the way he challenges Matthew Goode’s Lewis, whilst still being very direct and not pulling any punches. There is an interruption to their conversation by an air-raid siren, and when they go into a bunker, Lewis has a panic attack, because of his experience in the trenches of WW1. Freud helps him to deal with it in the moment.
 
I’ve read works by both of them, though I’m hardly a scholar. I actually studied Freud in a philosophy class, believe it or not. I’m better read in Jung than Freud. I think Lewis is a good essayist, though I disagree with him philosophically on many counts. Having said that, I expect if I’d met him, I’d have a different opinion of him than just his ideas. I have very good friends who hold almost exactly the same views as Lewis, so you don’t judge someone just for what they believe, if you get to know them in the flesh.
 
And that’s what came across in this hypothetical exchange – that you have 2 intellectuals who can find mutual respect despite having antithetical views about God and religion and other things, like homosexuality. On that last point, Sigmund’s daughter, Anna, was in a relationship with a woman, which Freud obviously didn’t approve of. In fact, the father-daughter relationship in the movie was portrayed as very Freudian, where they both seemed to suffer from an unhealthy attachment. Nevertheless, Anna Freud went on to make a name for herself in child psychoanalysis, and there’s a scene where she has to deal with an overbearing and arrogant young man, and her putdown made me want to clap; I just wish I could remember it. Anyway, Anna’s story provides a diversionary, yet not irrelevant, subplot, which makes the movie a bit more than just a two-hander.
 
There are scenes where Matthew Goode’s Lewis has dreams or visions: in one he finds himself in a forest where he comes across a deer; in another he sees a bright, overwhelming light. There was a sense in these scenes that he felt he was in the presence of God, and it made me realise that I couldn’t judge him for that. I’ve long argued that God is a personal experience that can’t be shared, but we overlay it with our cultural norms. It was in these scenes that I felt his character was portrayed most authentically.
 

Tuesday, 2 January 2024

Modes of expression in writing fiction

As I point out in the post, this is a clumsy phrase, but I find it hard to come up with a better one. It’s actually something I wrote on Quora in response to a question. I’ve written on this before, but this post has the benefit of being much more succinct while possibly just as edifying.
 
I use the term ‘introspection’ where others use the word, ‘insight’. It’s the reader’s insight but the character’s introspection, which is why I prefer that term in this context.
 
The questioner is Clxudy Pills, obviously a pseudonym. I address her directly in the answer, partly because, unlike other questioners, she has always acknowledged my answers.
 

Is "show, not tell" actually a good writing tip?

 
Maybe. No one said that to me when I was starting out, so it had no effect on my development. But I did read a book (more than one, actually) on ‘writing’ that delineated 5 categories of writing ‘style’. Style in this context means the mode of expression rather than an author’s individual style or ‘voice’. That’s clumsily stated but it will make sense when I tell you what they are.
 

  1. Dialogue is the most important because it’s virtually unique to fiction; quotes provided in non-fiction notwithstanding. Dialogue, more than any other style, tells you about the characters and their interactions with others.

  2. Introspection is what the character thinks, effectively. This only happens in novels and short stories, not screenplays or stage plays, soliloquies being the exception and certainly not the rule. But introspection is essential to prose, especially when the character is on their own.

  3. Exposition is the ‘telling’, not showing, part. When you’re starting out and learning your craft, you tend to write a lot of exposition – I know I did – which is why we get the admonition in your question. But the exposition can be helpful to you, if not the reader, as it allows you to explore the setting, the context of the story and its characters. Eventually, you’ll learn not to rely on it. Exposition is ‘smuggled’ into movies through dialogue and into novels through introspection.

  4. Description is more difficult than you think, because it’s the part of a novel that readers will skip over to get on with the story. Description can be more boring than exposition, yet it’s necessary. My approach is to always describe a scene from a character’s POV, and keep it minimalist. Readers automatically fill in the details, because we are visual creatures and we do it without thinking.

  5. Action is description in motion. Two rules: stay in one character’s POV and keep it linear – one thing happens after another. It has the dimension of time, though it’s subliminal.

 
 So there: you get 5 topics for the price of one.
 

Saturday, 16 September 2023

Modes of thinking

I’ve written a few posts on creative thinking as well as analytical and critical thinking. But, not that long ago, I read a not-so-recently published book (2015) by 2 psychologists (John Kounios and Mark Beeman) titled, The Eureka Factor: Creative Insights and the Brain. To quote from the back fly-leaf:
 
Dr John Kounios is Professor of Psychology at Drexel University and has published cognitive neuroscience research on insight, creativity, problem solving, memory, knowledge representation and Alzheimer’s disease.
 
Dr Mark Beeman is Professor of Psychology and Neuroscience at Northwestern University, and researches creative problem solving and creative cognition, language comprehension and how the right and left hemispheres process information.

 
They divide people into 2 broad groups: ‘Insightfuls’ and ‘analytical thinkers’. Personally, I think the coined term, ‘insightfuls’, is misleading or too narrow in its definition, and I prefer the term ‘creatives’. More on that below.
 
As the authors themselves say, ‘People often use the terms “insight” and “creativity” interchangeably.’ So that’s obviously what they mean by the term. However, the dictionary definition of ‘insight’ is ‘an accurate and deep understanding’, which I’d argue can also be obtained by analytical thinking. Later in the book, they describe insights obtained by analytical thinking as ‘pseudo-insights’, and the difference can be ‘seen’ with neuro-imaging techniques.
 
All that aside, they do provide compelling arguments that there are 2 distinct modes of thinking that most of us experience. Very early in the book (in the preface, actually), they describe the ‘ah-ha’ experience that we’ve all had at some point, where we’re trying to solve a puzzle and then it comes to us unexpectedly, like a light-bulb going off in our head. They then relate something that I didn’t know, which is that neurological studies show that when we have this ‘insight’ there’s a spike in our brain waves and it comes from a location in the right hemisphere of the brain.
 
Many years ago (decades) I read a book called Drawing on the Right Side of the Brain by Betty Edwards. I thought neuroscientists would disparage this as pop-science, but Kounios and Beeman seem to give it some credence. Later in the book, they describe this in more detail, where there are signs of activity in other parts of the brain, but the ah-ha experience has a unique EEG signature and it’s in the right hemisphere.
 
The authors distinguish this unexpected insightful experience from an insight that is a consequence of expertise. I made this point myself, in another post, where experts make intuitive shortcuts based on experience that the rest of us don’t have in our mental toolkits.
 
They also spend an entire chapter on examples involving a special type of insight, where someone spends a lot of time thinking about a problem or an issue, and then the solution comes to them unexpectedly. A lot of scientific breakthroughs follow this pattern, and the point is that the insight wouldn’t happen at all without all the rumination taking place beforehand, often over a period of weeks or months, sometimes years. I’ve experienced this myself, when writing a story, and I’ll return to that experience later.
 
A lot of what we’ve learned about the brain’s functions has come from studying people with damage to specific areas of the brain. You may have heard of a condition called ‘aphasia’, which is when someone develops a serious disability in language processing following damage to the left hemisphere (possibly from a stroke). What you probably don’t know (I didn’t) is that damage to the right hemisphere, while not directly affecting one’s ability with language, can interfere with its more nuanced interpretations, like sarcasm or even getting a joke. I’ve long believed that when I’m writing fiction, I’m using the right hemisphere as much as the left, but it never occurred to me that readers (or viewers) need the right hemisphere in order to follow a story.
 
According to the authors, the difference between the left and right neo-cortex is one of connections. The left hemisphere has ‘local’ connections, whereas the right hemisphere has more widely spread connections. This seems to correspond to an ‘analytic’ ability in the left hemisphere, and a more ‘creative’ ability in the right hemisphere, where we make conceptual connections that are more wide-ranging. I’ve probably oversimplified that, but it was the gist I got from their exposition.
 
Like most books and videos on ‘creative thinking’ or ‘insights’ (as the authors prefer), this one spends a lot of time giving hints and advice on how to improve your own creativity. It’s not until one is more than halfway through the book, in a chapter titled, The Insightful and the Analyst, that they get to the crux of the issue, and describe how there are effectively 2 different types who think differently, even in a ‘resting state’, and how there is a strong genetic component.
 
I’m not surprised by this, as I saw it in my own family, where the difference is very distinct. In another chapter, they describe the relationship between creativity and mental illness, but they don’t discuss how artists are often moody and neurotic, which is a personality trait. Openness is another personality trait associated with creative people. I would add another point, based on my own experience: if someone is creative and they are not creating, they can suffer depression. This is not discussed by the authors either.
 
Regarding the 2 types they refer to, they acknowledge there is a spectrum, and I can’t help but wonder where I sit on it. I spent a working lifetime in engineering, which is full of analytic types, though I didn’t work in a technical capacity. Instead, I worked with a lot of technical people of all disciplines: from software engineers to civil and structural engineers to architects, not to mention lawyers and accountants, because I worked on disputes as well.
 
The curious thing is that I was aware of 2 modes of thinking, where I was either looking at the ‘big-picture’ or looking at the detail. I worked as a planner, and one of my ‘tricks’ was the ability to distil a large and complex project into a one-page ‘Gantt’ chart (bar chart). For the individual disciplines, I’d provide a multipage detailed ‘program’ just for them.
 
Of course, I also write stories, where the 2 components are plot and character. Creating characters is purely a non-analytic process, which requires a lot of extemporising. I try my best not to interfere, and I do this by treating them as if they are real people, independent of me. Plotting, on the other hand, requires a big-picture approach, but I almost never know the ending until I get there. In the last story I wrote, I was in COVID lockdown when I knew the ending was close, so I wrote some ‘notes’ in an attempt to work out what happens. Then, sometime later (like a month), I had one sleepless night when it all came to me. Afterwards, I went back and looked at my notes, and they were all questions – I didn’t have a clue.

Thursday, 25 May 2023

Philosophy’s 2 disparate strands: what can we know; how can we live

The question I’d like to ask is: is there a philosophical view that encompasses both? Some may argue that Aristotle attempted that, but I’m going to take a different approach.
 
For a start, the first part can arguably be broken into 2 further strands: physics and metaphysics. And even this divide is contentious, with some arguing that metaphysics is an ‘abstract theory with no basis in reality’ (one dictionary definition).
 
I wrote an earlier post arguing that we are ‘metaphysical animals’ after discussing a book of the same name, though it was really a biography of 4 Oxford women in the 20th Century: Elizabeth Anscombe, Mary Midgley, Philippa Foot and Iris Murdoch. But I’ll start with this quote from said book.
 
Poetry, art, religion, history, literature and comedy are all metaphysical tools. They are how metaphysical animals explore, discover and describe what is real (and beautiful and good). (My emphasis.)
 
So, arguably, metaphysics could give us a connection between the 2 ‘strands’ in the title. Now here’s the thing: I contend that mathematics should be part of that list, hence part of metaphysics. And, of course, we all know that mathematics is essential to physics as an epistemology. So physics and metaphysics, in my philosophy, are linked in a rather intimate way.
 
The curious thing about mathematics, or anything metaphysical for that matter, is that, without human consciousness, it doesn’t really exist, or is certainly not manifest. Everything on that list is a product of human consciousness, notwithstanding that there could be other conscious entities somewhere in the universe with the same capacity.
 
But again, I would argue that mathematics is an exception. I agree with a lot of mathematicians and physicists that while we create the symbols and language of mathematics, we don’t create the intrinsic relationships that said language describes. And furthermore, some of those relationships seem to govern the universe itself.
 
And completely relevant to the first part of this discussion, the limits of our knowledge of mathematics seem to determine the limits of our knowledge of the physical world.
 
I’ve written other posts on how to live, specifically, 3 rules for humans and How should I live? But I’m going to go via metaphysics again, specifically storytelling, because that’s something I do. Storytelling requires an inner and outer world, manifest as character and plot, which is analogous to free will and fate in the real world. Now, even these concepts are contentious, especially free will, because many scientists tell us it’s an illusion. Again, I’ve written about this many times, but its relevance to my approach to fiction is that I try to give my characters free will. An important part of my fiction is that the characters are independent of me. If my characters don’t take on a life of their own, then I know I’m wasting my time, and I’ll ditch that story.
 
Its relevance to ‘how to live’ is authenticity. Artists understand better than most the importance of authenticity in their work, which really means keeping themselves out of it. But authenticity has ramifications, as any existentialist will tell you. To live authentically requires an honesty to oneself that is integral to one’s being. And ‘being’ in this sense is about being human rather than its broader ontological meaning. In other words, it’s a fundamental aspect of our psychology, because it evolves and changes according to our environment and milieu. Also, in the world of fiction, it's a fundamental dynamic.
 
What's more, if you can maintain this authenticity (and it’s genuine), then you gain people’s trust, and that becomes your currency, whether in your professional life or your social life. However, there is nothing more fake than false authenticity; examples abound.
 
I’ll give the last word to Socrates, arguably the first existentialist.
 
To live with honour in this world, actually be what you try to appear to be.


Saturday, 14 January 2023

Why do we read?

This is almost the same title as a book I bought recently (Why We Read), which contains 70 short essays on the subject, featuring scholars of all stripes: historians, philosophers and, of course, authors. It even includes scientists: Paul Davies, Richard Dawkins and Carlo Rovelli being 3 I’m familiar with.
 
One really can’t overstate the importance of the written word, because, oral histories aside, it allows us to extend memories across generations and accumulate knowledge over centuries that has led to civilisations and technologies that we all take for granted. By ‘we’, I mean anyone reading this post.
 
Many of the essayists write from their personal experiences and I’ll do the same. The book, edited by Josephine Greywoode and published by Penguin, specifically says on the cover in small print: 70 Writers on Non-Fiction; yet many couldn’t help but discuss fiction as well.
 
And books are generally divided between fiction and non-fiction, and I believe we read them for different reasons; I wouldn’t necessarily consider one less important than the other. I also write fiction and non-fiction, so I have a particular view on this. Basically, I read non-fiction in order to learn and I read fiction for escapism. Both started early for me and I believe the motivation hasn’t changed.
 
I started reading extra-curricular books from about the age of 7 or 8, involving creatures mostly, and I even asked for an encyclopaedia for Christmas at around that time, which I read enthusiastically. I devoured non-fiction books, especially if they dealt with the natural world. But at the same time, I read comics, remembering that we didn’t have TV at that time, which was only just beginning to emerge.
 
I think one of the reasons that boys read less fiction than girls these days is because comics have effectively disappeared, being replaced by video games. And the modern comics that I have seen don’t even contain a complete narrative. Nevertheless, there are graphic novels that I consider brilliant: Neil Gaiman’s Sandman series and Hayao Miyazaki’s Nausicaa of the Valley of the Wind being standouts. Watchmen by Alan Moore also deserves a mention.
 
So the escapism also started early for me, in the world of superhero comics, and I started writing my own scripts and drawing my own characters pre-high school.
 
One of the essayists in the collection, Niall Ferguson (author of Doom) starts off by challenging a modern paradigm (or is it a meme?) that we live in a ‘simulation’, citing Oxford philosopher, Nick Bostrom, writing in the Philosophical Quarterly in 2003. Ferguson makes the point that reading fiction is akin to immersing the mind in a simulation (my phrasing, not his).
 
In fact, a dream is very much like a simulation, and, as I’ve often said, the language of stories is the language of dreams. But here’s the thing: the motivation for writing fiction, for me, is the same as the motivation for reading it: escapism. Whether reading or writing, you enter a world that only exists inside your head. The ultimate solipsism.

And this surely is a miracle of written language: that we can conjure a world with characters who feel real and elicit emotional responses, while we follow their exploits, failures, love life and dilemmas. It takes empathy to read a novel, and tests have shown that people’s empathy increases after they read fiction. You engage with the character and put yourself in their shoes. It’s one of the reasons we read.
 
 
Addendum: I would recommend the book, by the way, which contains better essays than mine, all with disparate, insightful perspectives.
 

Sunday, 10 July 2022

Creative and analytic thinking

I recently completed an online course with a similar title, How to Think Critically and Creatively. It must be the 8th or 9th course I’ve done through New Scientist, on a variety of topics, from cosmology and quantum mechanics to immunology and sustainable living; so quite diverse subjects. I started doing them during COVID, as they helped to pass the time and stimulate the brain at the same time.
 
All these courses rely on experts in their relevant fields from various parts of the globe, so not just UK based, as you might expect. This course was no exception with just 2 experts, both from America. Denise D Cummins is described as a ‘cognitive scientist, author and elected Fellow of the Association for Psychological Science, and she’s held faculty at Yale, UC, University of Illinois and the Centre of Adaptive Behaviours at the Max Planck Institute in Berlin’. Gerard J Puccio is ‘Department Chair and Professor at the International Centre for Studies on Creativity, Buffalo State; a unique academic department that offers the world’s only Master of Science degree in creativity’.
 
I admit to being sceptical that ‘creativity’ can be taught, but that depends on what one means by creativity. If creativity means using your imagination, then yes, I think it can, because imagination is something that we all have, and it’s probably a valid comment that we don’t make enough use of it in our everyday lives. If creativity means artistic endeavour then I think that’s another topic, even though it puts imagination centre stage, so to speak.
 
I grew up in a family where one side was obviously artistic and the other side wasn’t, which strongly suggests there’s a genetic component. The other side excelled at sport, and I was rubbish at sport. However, both sides were obviously intelligent, despite a notable lack of formal education; in my parents’ case, both leaving school in their early teens. In fact, my mother did most of her schooling by correspondence, and my father left school in the midst of the Great Depression, shortly followed by active duty in WW2.
 
Puccio (mentioned above) argues that creativity isn’t taught in our education system because it’s too hard. Instead, he says that we teach by memorising facts and by ‘understanding’ problems. I would suggest that there is a hierarchy, where you need some basics before you can ‘graduate’ to ‘creative thinking’, and I use the term here in the way he intends it. I spent most of my working lifetime on engineering projects, with diverse and often complex elements. I need to point out that I wasn’t one of the technical experts involved, but I worked with them, in all their variety, because my job was to effectively co-ordinate all their activities towards a common goal, by providing a plan and then keeping it on the rails.
 
Engineering is all about problem solving, and I’m not sure one can do that without being creative, as well as analytical. In fact, one could argue that there is a dialectical relationship between them, but maybe I’m getting ahead of myself.
 
Back to Puccio, who introduced 2 terms I hadn’t come across before: ‘divergent’ and ‘convergent’ thinking, arguing they should be done in that order. In a nutshell, divergent thinking is brainstorming where one thinks up as many options as possible, and convergent thinking is where one narrows in on the best solution. He argues that we tend to do the second one without doing the first one. But this is related to something else that was raised in the course, which is ‘Type 1 thinking’ and ‘Type 2 thinking’.
 
Type 1 thinking is what most of us would call ‘intuition’, because basically it’s taking a cognitive shortcut to arrive at an answer to a problem, which we all do all the time, especially when time is at a premium. Type 2 thinking is when we analyse the problem, which is not only time consuming but takes up brain resources that we’d prefer not to use, because we’re basically lazy, and I’m no exception. These 2 cognitive behaviours are clinically established, so it’s not pop-science.
 
However, something that was not discussed in the course, is that type 2 thinking can become type 1 thinking when we develop expertise in something, like learning a musical instrument, or writing a story, or designing a building. In other words, we develop heuristics based on our experience, which is why we sometimes jump to convergent thinking without going through the divergent part.
 
The course also dealt with ‘critical thinking’, as per its title, but I won’t dwell on that, because critical thinking arises from being analytical, and separating true expertise from bogus expertise, which is really a separate topic.
 
How does one teach these skills? I’m not a teacher, so I’m probably not best qualified to say. But I have a lot of experience in a profession that requires analytical thinking and problem-solving as part of its job description. The one thing I’ve learned from my professional life is the more I’m restrained by ‘rules’, the worse job I’ll do. I require the freedom and trust to do things my own way, and I can’t really explain that, but it’s also what I provide to others. And maybe that’s what people mean by ‘creative thinking’; we break the rules.
 
Artistic endeavour is something different again, because it requires spontaneity. But there is ‘divergent thinking’ involved, as Puccio pointed out, giving the example of Hemingway writing countless endings to A Farewell to Arms, before settling on the final version. I’m reminded of the reported difference between Beethoven and Mozart, two of the greatest composers in the history of Western classical music. Beethoven would try many different versions of something (in his head and on paper) before choosing what he considered the best. He worked prodigiously, yet wrote only 9 symphonies and 5 piano concertos plus one violin concerto, because he workshopped them to death. Mozart, on the other hand, apparently wrote down whatever came into his head and hardly revised it. One was very analytical in their approach and the other was almost completely spontaneous.
 
I write stories and the one area where I’ve changed type 2 thinking into type 1 thinking is in creating characters – I hardly give it a thought. A character comes into my head almost fully formed, as if I just met them in the street. Over time I learn more about them and they sometimes surprise me, which is always a good thing. I once compared writing dialogue to playing jazz, because they both require spontaneity and extemporisation. Don Burrows once said you can’t teach someone to play jazz, and I’ve argued that you can’t teach someone to write dialogue.
 
Having said that, I once taught a creative writing class, and I gave the class exercises where they were forced to write dialogue, without telling them that that was the point of the exercise. In other words, I got them to teach themselves.
 
The hard part of storytelling for me is the plot, because it’s a never-ending exercise in problem-solving. How did I get back to here? Analytical thinking is very hard to avoid, at least for me.
 
As I mentioned earlier, I think there is a dialectic between analytical thinking and creativity, and the best examples are not artists but geniuses in physics. To look at just two: Einstein and Schrodinger, because they exemplify both. But what came first: the analysis or the creativity? Well, I’m not sure it matters, because they couldn’t have done one without the other. Einstein had an epiphany (one of many) where he realised that an object in free fall didn’t experience a force, which apparently contradicted Newton. Was that analysis or creativity or both? Anyway, he not only changed how we think about gravity, he changed the way we think about the entire cosmos.
 
Schrodinger borrowed an idea from de Broglie that particles could behave like waves and changed how we think about quantum mechanics. As Richard Feynman once said, ‘No one knows where Schrodinger’s equation comes from. It came out of Schrodinger’s head. You can’t derive it from anything we know.’
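
For anyone who wants to see what ‘came out of Schrodinger’s head’, here is the time-dependent equation in its standard form (quoted as illustration, not as anything original to this post):

i\hbar \, \frac{\partial}{\partial t} \Psi(x,t) = \hat{H} \, \Psi(x,t)

where Ψ is the wave function and Ĥ is the Hamiltonian (energy) operator. The de Broglie idea it builds on is the wave hypothesis: every particle with momentum p has an associated wavelength \lambda = h/p.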
 

Sunday, 22 May 2022

We are metaphysical animals

 I’m reading a book called Metaphysical Animals (How Four Women Brought Philosophy Back To Life). The four women were Mary Midgley, Iris Murdoch, Philippa Foot and Elizabeth Anscombe. The first two I’m acquainted with and the last two, not. They were all at Oxford during the War (WW2) at a time when women were barely tolerated in academia and had to be ‘chaperoned’ to attend lectures. Also a time when some women students ended up marrying their tutors. 

The book is authored by Clare Mac Cumhaill and Rachael Wiseman, both philosophy lecturers who became friends with Mary Midgley in her final years (Mary died in 2018, aged 99). The book is part biography of all 4 women and part discussion of the philosophical ideas they explored.

Bringing ‘philosophy back to life’ is an allusion to the response (backlash is too strong a word) to the empiricism, logical positivism and general rejection of metaphysics that had taken hold of English philosophy, also known as analytical philosophy. Iris spent time in postwar Paris where she was heavily influenced by existentialism and Jean-Paul Sartre, in particular, whom she met and conversed with. 

If I were to categorise myself, I’m a combination of analytical philosopher and existentialist, which I suspect many would see as a contradiction. But this isn’t deliberate on my part – more a consequence of pursuing my interests, which are science on one hand (with a liberal dose of mathematical Platonism) and how to live a ‘good life’ (to paraphrase Aristotle) on the other.

Iris was intellectually seduced by Sartre’s exhortation: “Man is nothing else but that which he makes of himself”. But as her own love life fell apart along with all its inherent dreams and promises, she found Sartre’s implicit doctrine, of standing solitarily and independently of one’s milieu, difficult to put into practice. I’m not sure if Iris was already a budding novelist at this stage of her life, but anyone who writes fiction knows that this is what it’s all about: the protagonist sailing their lone ship on a sea full of icebergs and other vessels, all of which are outside their control. Life, like the best fiction, is an interaction between the individual and everyone else they meet. Your moral compass, in particular, is often tested. Existentialism can be seen as an attempt to rise above this, but most of us don’t.

Not surprisingly, Wittgenstein looms large in many of the pages, and at least one of the women, Elizabeth Anscombe, had significant interaction with him. With Wittgenstein comes an emphasis on language, which has arguably determined the path of philosophy since. I’m not a scholar of Wittgenstein by any stretch of the imagination, but one thing he taught, or that people took from him, was that the meaning we give to words is a consequence of how they are used in ordinary discourse. Language requires a widespread consensus to actually work. It’s something we rarely think about but we all take for granted, otherwise there would be no social discourse or interaction at all. There is an assumption that when I write these words, they have the same meaning for you as they do for me, otherwise I am wasting my time.

But there is a way in which language is truly powerful, and I have done this myself. I can write a passage that creates a scene inside your mind complete with characters who interact and can cause you to laugh or cry, or pretty much any other emotion, as if you were present; as if you were in a dream.

There are a couple of specific examples in the book which illustrate Wittgenstein’s influence on Elizabeth and how she used his methods in debate. They are both topics I have discussed myself without knowing of these previous discourses.

In 1947, so just after the war, Elizabeth presented a paper to the Cambridge Moral Sciences Club, which she began with the following disclosure:

Everywhere in this paper I have imitated Dr Wittgenstein’s ideas and methods of discussion. The best that I have written is a weak copy of some features of the original, and its value depends only on my capacity to understand and use Dr Wittgenstein’s work.

The subject of her talk was whether one can truly talk about the past, which goes back to the pre-Socratic philosopher, Parmenides. In her own words, paraphrasing Parmenides, ‘To speak of something past’ would then be to ‘point our thought’ at ‘something there’, but out of reach. Bringing Wittgenstein into the discussion, she claimed that Parmenides’ specific paradox about the past arose ‘from the way that thought and language connect to the world’.

We apply language to objects by naming them, but, in the case of the past, the objects no longer exist. She attempts to resolve this epistemological dilemma by discussing the nature of time as we experience it, which is like a series of pictures that move on a timeline while we stay in the present. This is analogous to my analysis that everything we observe becomes the past as soon as it happens, which is exemplified every time someone takes a photo, but we remain in the present – the time for us is always ‘now’.

She explains that the past is a collective recollection, recorded in documents and photos, so it’s dependent on a shared memory. I would say that this is what separates our recollection of a real event from a dream, which is solipsistic and not shared with anyone else. But it doesn’t explain why the past appears fixed and the future unknown, which she also attempted to address. I don’t think this can be addressed without discussing physics.

Most physicists will tell you that the asymmetry between the past and future can only be explained by the second law of thermodynamics, but I disagree. I think it is described, if not explained, by quantum mechanics (QM) where the future is probabilistic with an infinitude of possible paths and classical physics is a probability of ONE because it’s already happened and been ‘observed’. In QM, the wave function that gives the probabilities and superpositional states is NEVER observed. The alternative is that all the futures are realised in alternative universes. Of course, Elizabeth Anscombe would know nothing of these conjectures.
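
To make that probabilistic claim concrete, here is the standard Born rule (a textbook statement, included for illustration): the wave function Ψ is never observed directly; what it yields is a probability density for the outcomes we do observe,

P(x,t) = \lvert \Psi(x,t) \rvert^{2}

and once an outcome has actually been observed, its probability is, trivially, ONE – which is the sense in which classical physics is a probability of ONE.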

But I would make the point that language alone does not resolve this. Language can only describe these paradoxes and dilemmas but not explain them.

Of course, there is a psychological perspective to this, which many people claim, including physicists, gives the only sense of time passing. According to them, it’s fixed: past, present and future; and our minds create this distinction. I think our minds create the distinction because only consciousness creates a reference point for the present. Everything non-sentient is in a causal relationship that doesn’t sense time. Photons of light, for example, exist in zero time, yet they determine causality. Only light separates everything in time as well as space. I’ve gone off-topic.

Elizabeth touched on the psychological aspect, possibly unintentionally (I’ve never read her paper, so I could be wrong), that our memories of the past are actually imagined. We use the same part of the brain to imagine the past as we do to imagine the future, but again, Elizabeth wouldn’t have known this. Nevertheless, she understood that our (only) knowledge of the past is a thought that we turn into language in order to describe it.

The other point I wish to discuss is a famous debate she had with C.S. Lewis. This is quite something, because back then, C.S. Lewis was a formidable intellectual figure. Elizabeth’s challenge was all the more remarkable because Lewis’s argument appeared on the surface to be very sound. Lewis argued that the ‘naturalist’ position was self-refuting if it was dependent on ‘reason’, because reason by definition (not his terminology) is based on the premise of cause and effect and human reason has no cause. That’s a simplification, nevertheless it’s the gist of it. Elizabeth’s retort:

What I shall discuss is this argument’s central claim that a belief in the validity of reason is inconsistent with the idea that human thought can be fully explained as the product of non-rational causes.

In effect, she argued that reason is what humans do perfectly naturally, even if the underlying ‘cause’ is unknown. Not knowing the cause does not make the reasoning irrational nor unnatural. Elizabeth specifically cited the language that Lewis used. She accused him of confusing the concepts of “reason”, “cause” and “explanation”.

My argument would be subtly different. For a start, I would contend that by ‘reason’, he meant ‘logic’, because drawing conclusions based on cause and effect is logic, even if the causal relations (under consideration) are assumed or implied rather than observed. And here I contend that logic is not a ‘thing’ – it’s not an entity; it’s an action – something we do. In the modern age, machines perform logic; sometimes better than we do.

Secondly, I would ask Lewis, does he think reason only happens in humans and not other animals? I would contend that animals also use logic, though without language. I imagine they’d visualise their logic rather than express it in vocal calls. The difference with humans is that we can perform logic at a whole different level, but the underpinnings in our brains are surely the same. Elizabeth was right: not knowing its physical origins does not make it irrational; they are separate issues.

Elizabeth had a strong connection to Wittgenstein right up to his death. She worked with him on a translation and edit of Philosophical Investigations, and he bequeathed her a third of his estate and a third of his copyright.

It’s apparent from Iris’s diaries and other sources that Elizabeth and Iris fell in love at one point in their friendship, which caused them both a lot of angst and guilt because of their Catholicism. Despite marrying, Iris later had an affair with Pip (Philippa).

Despite my discussion of just 2 of Elizabeth’s arguments, I don’t have the level of erudition necessary to address most of the topics that these 4 philosophers published in. Just reading the 4-page Afterword, it’s clear that I haven’t even brushed the surface of what they achieved. Nevertheless, I have a philosophical perspective that I think finds some resonance with their mutual ideas.

I’ve consistently contended that the starting point for my philosophy is that for each of us individually, there is an inner and outer world. It even dictates the way I approach fiction. 

 

In the latest issue of Philosophy Now (Issue 149, April/May 2022), Richard Oxenberg, who teaches philosophy at Endicott College in Beverly, Massachusetts, wrote an article titled, What Is Truth? wherein he describes an interaction between 2 people, but only from a purely biological and mechanical perspective, and asks, ‘What is missing?’ Well, even though he doesn’t spell it out, what is missing is the emotional aspect. Our inner world is dominated by emotional content, and one suspects that this is not unique to humans. I’m pretty sure that other creatures feel emotions like fear, affection and attachment. What’s more, I contend that this is what separates, not just us, but the majority of the animal kingdom, from artificial intelligence.

 

But humans are unique, even among other creatures, in our ability to create an inner world every bit as rich as the one we inhabit. And this creates a dichotomy that is reflected in our division of arts and science. There is a passage on page 230, where the authors discuss R.G. Collingwood’s influence on Mary and provide an unexpected definition:

 

Poetry, art, religion, history, literature and comedy are all metaphysical tools. They are how metaphysical animals explore, discover and describe what is real (and beautiful and good). (My emphasis.)

 

I thought this summed up what they mean by their coinage, metaphysical animals, which gives the book its title and arguably describes humanity’s most unique quality. Descriptions of metaphysics vary and elude precise definition, but the word ‘transcendent’ comes to mind. By which I mean it’s knowledge or experience that transcends the physical world, most evident in art, music and storytelling, but also including mathematics in my Platonic worldview.


 

Footnote: I should point out that certain chapters in the book give considerable emphasis to moral philosophy, which I haven’t even touched on, so another reader might well discuss other perspectives.


Saturday, 26 March 2022

Symptoms of living in a post-truth world

 I recently had 2 arguments with different people, who took extreme positions on what we mean by truth. One argued that there is no difference between mathematical truths and truths in fiction – in fact, he described mathematics that is not being ‘applied’ as ‘mathematical fiction’. The other argued that there is no objective truth and that everything we claim to know consists only of ‘beliefs’, including mathematics. When I told her that I know there will always be mathematics that remains unknown, she responded that I ‘believe I know’. I thought that was an oxymoron, but I let it go. The trivial example, that there are an infinite number of primes or an infinite number of digits in pi, should put that to rest, or so one would think.
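
For the record, the infinitude of primes isn’t itself a ‘belief’; it has a short proof going back to Euclid, which I sketch here in standard form.

```latex
% Euclid's proof that there are infinitely many primes.
Suppose, for contradiction, that the primes are finite in number:
$p_1, p_2, \ldots, p_n$. Consider
\[
  N = p_1 p_2 \cdots p_n + 1 .
\]
Dividing $N$ by any $p_i$ leaves remainder $1$, so no $p_i$ divides $N$.
But $N > 1$, so $N$ has at least one prime factor, which therefore cannot
be on the supposedly complete list. Contradiction; hence the primes are
infinite in number. $\blacksquare$
```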

Norman Wildberger, whom I’ve cited before, says that he doesn’t ‘believe’ in Real numbers, and neither does he believe in infinity, and he provides compelling arguments. But I feel that he’s redefining what we mean by mathematics, because his sole criterion is that it can be computed. Meanwhile, Gregory Chaitin contends that there are infinitely more incomputable Real numbers than computable Real numbers. People will say that mathematics is an abstract product of the mind, so who cares? But, as Paul Davies says, ‘mathematics works’, and it works so well that we can comprehend the Universe from the cosmic scale to the infinitesimal.
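
Chaitin’s own route goes via his Omega number, which I won’t attempt to summarise, but the headline claim already follows from a standard counting argument (essentially Turing plus Cantor), sketched below.

```latex
% Why almost all Real numbers are incomputable: a counting sketch.
% (Call a real `computable' if some finite program can output its digits.)
Every program is a finite string over a finite alphabet, so the set of
all programs is countable. Each computable real requires at least one
program that computes it, so the computable reals number at most
$\aleph_0$. By Cantor's diagonal argument the reals are uncountable:
$|\mathbb{R}| = 2^{\aleph_0} > \aleph_0$. Removing a countable set from
an uncountable one leaves an uncountable set, so the incomputable reals
have cardinality $2^{\aleph_0}$; in that precise sense there are
`infinitely more' of them.
```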

 

Both of my interlocutors, I should point out, were highly intelligent, well-educated and very articulate, and I believe that they really believed what they were saying. But, if there is no objective truth, then there are no 'true or false' questions that can be answered. To take the example I’ve already mentioned, it’s either true or false that we can’t know everything in mathematics. And if it’s false, then it must be possible to know everything. But my interlocutor would say that I claimed we’d never know, and I can’t say I know that for sure.

 

Well, putting aside the trivial example of infinity, there are proofs based on logic that say it’s true, and that’s good enough for me. She claimed that logic can be wrong if the inputs are wrong, which is correct. In mathematics, this depends on axioms, and mathematics, like all other sciences, never stands still, so we keep getting new axioms. But it’s the nature of science that it builds on what went before, and, if it’s all ‘belief’, then it’s a house built on sand. And if it's a house built on sand, then all the electronic gadgets we use and the satellite systems we depend on could crash without warning; no one really believes that.
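
Her point about inputs is the standard distinction between the validity of an inference and the soundness of an argument, which a one-line example (mine, not hers) makes clear.

```latex
% Valid logic, false input: validity versus soundness.
\[
  \frac{\forall x\,\bigl(B(x) \rightarrow F(x)\bigr) \qquad B(\text{penguin})}
       {F(\text{penguin})}
\]
% The inference (universal instantiation plus modus ponens) is valid,
% but the first premise, `all birds fly', is false, so the conclusion
% is false too. Logic transmits truth; it cannot manufacture it.
```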

 

So that’s one side of the debate; the other side is that truths in art have the same status as truths in science. There are a couple of arguments one can use to counter this, the most obvious being that a work of art, like Beethoven’s 5th, is unique – no one else created that. But Pythagoras’s theorem could have been discovered by anyone, and in fact, it was discovered by the Chinese some 500 years before Pythagoras. I write fiction, and while I borrow tropes and themes and even plot devices from others, I contend that my stories are unique and so are the characters I create. In fact, my stories are so unique that they don’t even resemble each other, as at least one reader has told me.

 

But there is another argument and that involves memes, which are cultural ideas, for want of a better definition, that persist and propagate. Now, some argue that virtually everything is a meme, including scientific theories and mathematical theorems. But there is a difference. Cultural memes are successful because they outlive their counterparts, but scientific theories and mathematical theorems outlive their counterparts because they are successful. And that’s a fundamental distinction between truth in mathematics and science, and truth in art.



Addendum: I just came across this video (only posted yesterday), and it’s very apposite to this post. It’s about something called Zero Knowledge Proof, which effectively establishes whether someone is lying or not. Its relevance to my essay is that it applies to true or false questions. You can tell if someone is telling the truth without actually knowing what that truth is. Apparently, it’s used algorithmically as part of blockchain for bitcoin transactions.

 

To give the example that Jade provides in her video, if someone claims that they have a proof of Riemann’s hypothesis, you can tell if they’re lying or not without them having to reveal the actual proof. That’s a very powerful tool, and, as a consequence, it virtually guarantees that a mathematical truth exists for a true or false proposition; in this hypothetical case, Riemann’s hypothesis, because it’s either true or false by definition.
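
I don’t know which protocol Jade’s video describes, and I’m not a cryptographer, but to make the idea concrete, here’s a minimal Python sketch of one classic interactive zero-knowledge proof: Schnorr’s identification protocol, where a prover convinces a verifier that she knows a secret without revealing it. The parameters (the prime p, the base g, the challenge size) are toy values chosen purely for illustration.

```python
import secrets

# A minimal sketch of an interactive zero-knowledge proof: Schnorr's
# identification protocol. The prover knows a secret x such that
# y = g^x mod p, and convinces the verifier of this without revealing x.
# Toy parameters for illustration only; real systems use carefully
# chosen large primes and subgroups.

p = 2**127 - 1               # a Mersenne prime (toy modulus)
g = 3                        # base (assumed suitable for a demo)

x = secrets.randbelow(p - 1) # the prover's secret "knowledge"
y = pow(g, x, p)             # public: the claim is "I know x for this y"

def prover_commit():
    """Prover picks a random nonce r and commits to t = g^r mod p."""
    r = secrets.randbelow(p - 1)
    return r, pow(g, r, p)

def prover_respond(r, c):
    """Prover answers challenge c with s = r + c*x mod (p - 1)."""
    return (r + c * x) % (p - 1)

def verifier_check(t, c, s):
    """Verifier accepts iff g^s == t * y^c (mod p)."""
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# One round of the protocol:
r, t = prover_commit()           # 1. prover sends commitment t
c = secrets.randbelow(2**32)     # 2. verifier sends random challenge c
s = prover_respond(r, c)         # 3. prover sends response s
print(verifier_check(t, c, s))   # True, yet x was never revealed
```

The asymmetry is the point: a genuine prover passes every round, while an impostor who doesn’t know x can only guess the response and gets caught with overwhelming probability over repeated rounds, so the verifier becomes convinced the claim is true without ever learning the secret.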






Sunday, 21 November 2021

Cancel culture – the scourge of our time

There are many things that cause me some anguish at the moment, not least that Donald Trump could easily be re-elected POTUS in 2024, despite deliberately undermining and damaging the very institution he wants to lead, which is American democracy. It’s not an exaggeration to say that he’s attacked it at its core.


This may seem a mile away from the topic I’ve alluded to in the title of my post, but they both seem to be symptoms of a divisiveness I haven’t seen since the Vietnam war. 

 

The word, ‘scourge’, is defined as ‘a whip used as an instrument of punishment’; and that’s exactly how cancel culture works, with social media the perfect platform from which to wield it.

 

In this weekend’s Good Weekend magazine (Fairfax Group), the feature article is on this very topic. But I would like to go back to the previous weekend, when another media outlet, Murdoch’s Weekend Australian Magazine, published an article on well-known atheist, Richard Dawkins. It turns out that at the ripe old age of 80, Dawkins has been cancelled. To be precise, he had his 1996 Humanist of the Year award withdrawn by the American Humanist Association (AHA) earlier this year, because he tweeted a defence of Rachel Dolezal (a white chapter president of the NAACP, the National Association for the Advancement of Colored People), who in 2015 had been vilified for identifying as Black.

 

Of course, I don’t know anything about Rachel Dolezal or the context of that stoush, but I can identify with Dawkins, even though I’ve never suffered the same indignity. Dawkins and I are of a similar mould, though we live in different strata of society. In saying that, I don’t mean that I agree with all his arguments, because I obviously don’t, but we are both argumentative and are not shy in expressing our opinions. I really don’t possess the moral superiority to throw stones at Dawkins, even though I have.

 

I remember my father once telling me that if you admired an Australian fast bowler (he had someone in mind) then you also had to admire an English fast bowler (of the same generation), because they had the exact same temperament and wicket-taking abilities. Of course, that also applies to politicians. And it pretty much applies to me and Dawkins.

 

On the subject of identifying as ‘black’, I must tell a story related to me by a friend I knew when I worked in Princeton in 2001/2. She was a similar age to me and originally from Guyana. In fact, she was niece to West Indies champion cricketer, Lance Gibbs, and told me about attending his wedding when she was 8 years old (I promise no more cricketing references). But she told me how someone she knew (outside of work) told her that she ‘didn’t know what it was like to be black’. To which she replied, ‘Of course I know I’m black, I only have to look in the mirror every morning.’  Yes, it’s funny, but it goes to a deeper issue about identity. So a black person, who had lived their entire life in the USA, was telling another black person, who had come from outside of the US, that they didn’t know what it was like to be ‘black’. 

 

Dawkins said that, as a consequence, he’d started to self-censor, which is exactly what his detractors want. If Dawkins has started to self-censor, then none of us are safe or immune. What hurt him, of course, was being attacked by people on the Left, which he mostly identifies with. And, while this practice occurs on both sides, it’s on the Left where it has become most virulent. 

 

“I self-censor. More so in recent years. Why? It’s not a thing I’ve done throughout my life, I’ve always spoken my mind openly. But we’re now in a time when if you do speak your mind openly, you are at risk of being picked up and condemned.”

 

“Every time a lecturer is cancelled from an American university, that’s another God knows how many votes for Trump.”

 

And this is the thing: the Right loves nothing more than the Left turning on itself. It’s insidious, self-destructive and literally soul-destroying. In the Good Weekend article, they focus on a specific case, while also citing others, both in Australia and America. The specific case was actor Hugh Sheridan having a Sydney Festival show he’d really set his sights on cancelled, because he was playing a transgender person, which created outrage in the LGBTQIA+ community. Like others cited in the article, he contemplated suicide, which prompted close friends to monitor him. This is what it’s come to. It’s a very lengthy article, which I can’t do justice to in this post, but there is a perversion here: all the shows and people being targeted are actually bringing diversity of race and sexuality into the public arena, and they’re being crucified by the very people they represent. The conservatives, wowsers and Bible-bashers must love it.

 

This is a phenomenon that is partly, if not mostly, generational, and amplified by social media. People are being forced to grovel.

 

Emma Dawson, head of the Labor-aligned (Australian political party, for overseas readers) Per Capita think tank, told the Good Weekend: “[cancel culture is] more worrying to me than just about anything other than far-right extremism. It is pervasive among educated young people; very few are willing to question it.”

 

In 2019, Barack Obama warned a group of young people: “This idea of purity, and you’re never compromised and always politically woke... you should get over that quickly. The world is messy.”

 

And this is the nub of the issue: cancel culture is all about silencing any debate, and, without debate, you have authoritarianism, even though it’s disguised as its opposite.

 

In the same article, the author, James Button, argues that it’s no coincidence that this phenomenon emerged alongside the rise of Donald Trump.

 

The election of Donald Trump horrified progressives. Here was a president – elected by ordinary Americans – who was racist, who winked at neo-Nazis and who told bare-faced lies in a brazen assertion of power while claiming that the liars were progressive media. His own strategy adviser, Stephen Bannon, said that the way to win the contest was to overwhelm the media with misinformation, to “flood the zone with shit”.

 

And they succeeded so well that America is more divided than it has been since the Civil War.


To return to Hugh Sheridan, who I think epitomises this situation, at least as it’s being played out in Australia: it’s the Arts that are coming under attack, and from the Left, it has to be said. Actors and writers (like myself) often portray characters who have different backgrounds to our own. To give a recent example from ABC TV, which produces some outstanding free-to-air dramas with internationally renowned casts, while everything else is going into subscription streaming services: earlier this year, they produced and broadcast a series called The Newsreader, set in the 1980s when a lot of stuff was happening both locally and overseas. ‘At the 11th AACTA (Australian Academy of Cinema and Television Arts) awards, the show was nominated for more awards than any other program’ (Wikipedia).

 

A key plotline of the show was that the protagonist was gay but not openly so. The point is that I assume the actor was straight, although I don’t really know; but it’s what actors do. God knows, there have been enough gay actors who have played straight characters (Sir Ian McKellen, who played Gandalf as well as Shakespearean roles). So why crucify someone who is part of the LGBTQIA+ community for playing a transgender role? He was even accused of being homophobic and transphobic. He tweeted back, “you’re insane”, which only resulted in him being trolled for accusing his tormentors of being ‘insane’.

 

Someone recently asked me why I don’t publish what I write anymore. There is more than one reason, but one is fear of being cancelled. I doubt a publisher would publish what I write, anyway. But also, I suffer from impostor syndrome in that I genuinely feel like an impostor and I don’t need someone to tell me. The other thing is that I simply don’t care; I don’t feel the need to publish to validate my work.


Wednesday, 6 October 2021

Tips on writing sex scenes

 Yes, this is a bit tongue-in-cheek, or tongue in someone else’s cheek, to borrow a well-worn witticism. This arose from reading an interview by Benjamin Law (Aussie writer) of Pulitzer Prize-winning author Viet Thanh Nguyen, who briefly discussed writing sex scenes. He gives this advice:

Not being utterly male-centred, if you happen to be a man or masculine. Not being too vulgar. Don’t be too florid. And humour always helps.

 

Many years ago (over a decade) I attended a writers’ convention, where there are always a lot of panel discussions, and there was one on ‘How to write sex scenes’, which was appropriately entertaining and unsurprisingly well attended.

 

Even longer ago, when I attempted to write my first novel, with utterly no experience or tuition, just blindly diving in at the deep end, there was the possibility of a sex scene and I chickened out. The entire opus was terrible, but over about 3 years and 3 drafts I taught myself to write. I sent it to a script assessor, who was honest and unflattering. But one of the things I remember is that she criticised me for avoiding the sex scene. I was determined to fix that in subsequent attempts. I still have a hard copy of that manuscript, by the way, to remind myself of how badly I can write.

 

But there are a couple of things I remember from that particular panel discussion (the panel included a husband-and-wife team). Someone asked for a definition of pornography, and someone answered: the difference between erotica and pornography is that one of them you don’t admit to reading (or watching, as is more often the case). So, it’s totally subjective.

 

The first editor (a woman) of ELVENE took offence at the first sex scene. I promptly sent the manuscript to 2 women friends for second and third opinions. Anyway, I think you’ll invariably offend someone, and the only sure way to avoid that is to have all the sex happen off the page. Some writers do that, and sometimes I do it myself. Why? I think it depends on where it sits in the story, and on whether it’s really necessary to describe every sexual encounter between 2 characters who start doing it regularly.

 

The other thing I remember from that panel is someone explaining how important it was to describe it from one of the character’s points of view. If you describe it from the POV of a ‘third party’, you risk throwing the reader out of the story. I contend that the entire story should be told from a character’s POV, though you can change characters, even within the same scene. The obvious analogy is with dialogue. You rarely change POV in dialogue, though it’s not a hard and fast rule. In other words, the reader’s perspective invariably involves speaking and listening from just one POV, as if they were in the conversation. The POV could be male or female – it’s irrelevant – but it’s usually the protagonist. I take the same approach to sex scenes. It’s risky for a man to take a woman’s POV in a sex scene, but I’ve done it many times.

 

I often take the POV of the ‘active’ partner, and the reader learns what the other partner is experiencing second-hand so to speak. It generally means that the orgasm is described from the other partner’s perspective which makes it a lot easier. If they come in unison, I make sure the other one comes fractionally first.

 

I don’t write overlong sex scenes, because they become boring. Mine are generally a page long, including all the carry-on that happens beforehand, which is not intentional, just happenstance. I wrote a review of Cory Doctorow’s sci-fi novel, Walkaway, a novel (full title), which has a number of heavy sex scenes that I did find boring, but that probably says more about me than the author. I’m sure there are readers who find my sex scenes ‘heavy’ and possibly boring as well.

 

I have some rules of my own. Sex scenes are an interlude, yet they should always serve the story. They tell us something about the characters, and they invariably have consequences, which are often positive but not necessarily so. There is always a psychological component, and my approach is that you can’t separate the psychological from the physical. They change the characters and they change the dynamic of a relationship. Some of my characters appear celibate, but you find them in real life too.

 

I take the approach that fiction is a combination of fantasy and reality and the mixture varies from genre to genre and even author to author. So, in this context, the physical is fantasy and the psychological is reality.

 

One should never say ‘never’, but I couldn’t imagine writing a rape scene or someone being tortured, though I’ve seen such scenes in Scandi-noir TV. However, I’ve written scenes involving the sexual exploitation of both men and women, and, in those cases, they were central to the story.

 

Lastly, I often have to tell people that I’m not in the story. I don’t describe my personal sex-life, and I expect that goes for other writers too. 


Saturday, 5 December 2020

Some (personal) Notes on Writing

 This post is more personal, so don’t necessarily do what I’ve done. I struggled to find my way as a writer, and this might help to explain why. Someone recently asked me how to become a writer, and I said, ‘It helps if you start early.’ I started pre-high school, about age 8-9. I can remember writing my own Tarzan scripts and drawing my own superheroes.

 

Composition, as it was called then, was one of my favourite activities. At age 12 (first year high school), when asked to write about what we wanted to do as adults, I wrote that I wanted to write fiction. I used to draw a lot as a kid, as well. But, as I progressed through high school, I stopped drawing altogether and my writing deteriorated to the point that, by the time I left school, I couldn’t write an essay to save my life; I had constant writer’s block.

 

I was in my 30s before I started writing again and, when I started, I knew it was awful, so I didn’t show it to anyone. A couple of screenwriting courses (in my late 30s) were the best thing I ever did. With screenwriting, the character is all in what they say and what they do, not in what they look like. However, in my fiction, I describe mannerisms and body language as part of a character’s demeanour, in conjunction with their dialogue. Also, screenwriting taught me to be lean and economical – you don’t write anything that can’t be seen or heard on the screen. The main difference in writing prose is that you do all your writing from inside a character’s head; in effect, you turn the reader into an actor, subconsciously. Also, you write in real time, so it unfolds like a movie in the reader’s imagination.

 

I break rules, but only because the rules didn’t work for me, and I learned that the hard way. So I don’t recommend that you do what I do, because, from what I’ve heard and read, most writers don’t. I don’t write every day and I don’t do multiple drafts. It took me a long time to accept this, but it was only after I became happy and confident with what I produced. In fact, I can go weeks, even months, without writing anything at all and then pick it up from where I left off.

 

I don’t do rewrites because I learned the hard way that, for me, they are a waste of time. I do revisions and you can edit something forever without changing the story or its characters in any substantial way. I correct for inconsistencies and possible plot holes, but if you’re going to do a rewrite, you might as well write something completely different – that’s how I feel about it. 

 

I recently saw a YouTube discussion between someone and a writer where they talked about the writer’s method. He said he did a lot of drafts, and there are a lot of highly successful writers who do (I’m not highly successful, yet I don’t think that’s the reason why). However, he said that if you pick something up you wrote some time ago, you can usually tell if it’s any good or not. Well, my writing passes that test for me.

 

I’m happiest when my characters surprise me, and, if they don’t, I know I’m wasting my time. I treat it like it’s their story, not mine; that’s the best advice I can give.

 

How to keep the reader engaged? I once wrote in another post that creating narrative tension is an essential writing skill, and there are a number of ways to do this. Even a slow-moving story can keep a reader engaged, if every scene moves the story forward. I found that keeping scenes short, like in a movie, and using logical sequencing so that one scene sets up the next, keeps readers turning the page. Narrative tension can be subliminally created by revealing information to the reader that the characters don’t know themselves; it’s a subtle form of suspense. Also, narrative tension is often manifest in the relationships between characters. I’ve always liked moral dilemmas, both in what I read (or watch) and what I write.

 

Finally, when I start off a new work, it will often take me into territory I didn’t anticipate; I mean psychological territory, as opposed to contextual territory or physical territory. 

 

A story has all these strands, and when you start out, you don’t necessarily know how they are going to come together – in fact, it’s probably better if you don’t. That way, when they do, it’s very satisfying and there is a sense that the story already existed before you wrote it. It’s like you’re the first to read it, not create it, which I think is a requisite perception.