Paul P. Mealing

Check out my book, ELVENE. Available as e-book and as paperback (print on demand, POD). Also this promotional Q&A on-line.

Thursday, 29 May 2025

The role of the arts. Why did it evolve? Will AI kill it?

As I mentioned in an earlier post this month, I’m currently reading Brian Greene’s book, Until the End of Time: Mind, Matter, and Our Search for Meaning in an Evolving Universe, which covers just about everything from cosmology to evolution to consciousness, free will, mythology, religion and creativity. He spends a considerable amount of time on storytelling, compared to other art forms, partly because it allows an easy segue from language to mythology to religion.
 
One of his points of extended discussion was in trying to answer the question: why did our propensity for the arts evolve, when it has no obvious survival value? He cites people like Steven Pinker, Brian Boyd (whom I discuss at length in another post) and even Darwin, among others. I won’t elaborate on these, partly due to space, and partly because I want to put forward my own perspective, as someone who actually indulges in an artistic activity, and who can see clearly how I inherited artistic genes from one side of my family (my mother’s side). No one on my father’s side (including my sister) showed the slightest inclination towards artistic endeavour, but they all excelled in sport (including my sister), and I was rubbish at sport. One can see how sporting prowess could be a side-benefit of physical survival skills like hunting, as well as of success in combat, for which humans have had a propensity going back to antiquity.
 
Yet our artistic skills are evident going back at least 30-40,000 years, in the form of cave art, and one can imagine that other art forms, like music and storytelling, have been active for a similar period. My own view is that it’s sexual selection, which Greene discusses at length, citing Darwin among others, as well as detractors, like Pinker. The thing is that other species also show sexual selection, especially birds, which I’ve discussed before a couple of times. The best known example is the peacock’s tail, but I suspect that birdsong also plays a role, not to mention the bowerbird and the lyrebird. The lyrebird is an interesting one, because the males also have an extravagant tail, which would surely be a hindrance to survival, and they perform a dance and are extraordinary mimics. And the only reason one can think that these attributes might have evolutionary value at all is that their sole purpose is to attract a mate.
 
And one can see how this is analogous to behaviour in humans, where it is the male who tends to attract females with his talents, in music in particular. As Greene points out, along with others, artistic attributes are a by-product of our formidable brains, but I think these talents would be useless if we hadn’t evolved, in unison, a particular liking for the products of these endeavours (also discussed by Greene), which we see even in the modern world. I’m talking about the fact that music and stories both seem to be essential sources of entertainment, evident in the success of streaming services, not to mention a rich history in literature, theatre, ballet and, more recently, cinema.
 
I’ve written before that there are 2 distinct forms of cognitive ability, creative and analytical, and there is neurological evidence to support this. The point is that having an analytical brain is just as important as having a creative one, otherwise scientific theories and engineering feats, which humans seem uniquely equipped to provide, would never have happened, even going back to ancient artefacts like Stonehenge and both the Egyptian and Mayan pyramids. Note that these all happened on different continents.
 
But there are times when the analytical and creative seem to have a synergistic effect, and this is particularly evident when it comes to scientific breakthroughs – a point, unsurprisingly, not lost on Greene, who cites Einstein’s groundbreaking discoveries in relativity theory as a case-in-point.
 
One point that Greene doesn’t make is that there has been a cultural evolution that has effectively overtaken biological evolution in humans, and only in humans I would suggest. And this has been a direct consequence of our formidable brains and everything that goes along with that, but especially language.
 
I’ve made the point before that our special skill – our superpower, if you will – is the ability to nest concepts within concepts, which we do with everything, not just language, but it would have started with language, one would think. And this is significant because we all think in a language, including the ability to manipulate abstract concepts in our minds that don’t even exist in the real world. And nowhere is this more apparent than in the art of storytelling, where we create worlds that only exist in the imagination of someone’s mind.
 
But this cultural evolution has created civilisations and all that they entail, and survival of the fittest no longer has anything to do with eking out an existence in some hostile wilderness environment. These days, virtually everyone reading this has no idea where their food comes from. However, success is measured by different parameters than the ability to produce food, even though food production is essential. These days success is measured by one’s ability to earn money, and activities that require brain-power have a higher status and higher reward than so-called low-skilled jobs. In fact, in Australia, there is a shortage of trades because, for the last 2 generations at least, the emphasis, vocationally, has been on getting kids into university courses, even when it’s not necessarily the best fit for the child. This is why the professional class (including myself) is often called ‘elitist’ in the culture wars, and being a tradie sometimes carries a stigma, even though our society is just as dependent on them as on professionals. I know, because I’ve spent a working lifetime in a specific environment where you need both: engineering/construction.
 
Like all my posts, I’ve gone off-track but it’s all relevant. Like Greene, I can’t be sure how or why evolution in humans was propelled, if not hijacked, by art, but art in all its forms is part of the human condition. A life without music, stories and visual art – often in combination – is unimaginable.
 
And this brings me to the last question in my heading. It so happens that while I was reading about this in Greene’s thought-provoking book, I was also listening to a programme on ABC Classic (an Australian radio station) called Legends, a weekly show in which the presenter, Mairi Nicolson, spends an hour on a legend of the classical music world, providing details about their life as well as broadcasting examples of their work. In this case, she had the legend in the studio (a rare occurrence): Anna Goldsworthy. To quote from Wikipedia: Anna Louise Goldsworthy is an Australian classical pianist, writer, academic, playwright, and librettist, known for her 2009 memoir Piano Lessons.

But the reason I bring this up is because Anna mentioned that she attended a panel discussion on the role of AI in the arts. Anna’s own position is that she sees a role for AI, but in doing the things that humans find boring, which is what we are already seeing in manufacturing. In fact, I’ve witnessed this first-hand. Someone on the panel made the point that AI would effectively democratise art (my term, based on what I gleaned from Anna’s recall) in the sense that anyone would be able to produce a work of art and it would cease to be seen as elitist as it is now. He obviously saw this as a good thing, but I suspect many in the audience, including Anna, would have been somewhat unimpressed if not alarmed. Apparently, someone on the panel challenged that perspective but Anna seemed to think the discussion had somehow veered into a particularly dissonant aberration of the culture wars.
 
I’m one of those who would be alarmed by such a development, because it’s the ultimate portrayal of art as a consumer product, similar to the way we now perceive food. And like food, it would mean that its consumption would be completely disconnected from its production.
 
What worries me is that the person on the panel making this announcement (remember, I’m reporting this second-hand) apparently had no appreciation of the creative process and its importance in a functioning human society going back tens of thousands of years.
 
I like to quote from one of the world’s most successful and best known artists, Paul McCartney, in a talk he gave to schoolchildren (don’t know where):
 
“I don't know how to do this. You would think I do, but it's not one of these things you ever know how to do.” (my emphasis)
 
And that’s the thing: creative people can’t explain the creative process to people who have never experienced it. It feels like we have made contact with some ethereal realm. In another post, I cite Douglas Hofstadter (from his famous Pulitzer Prize-winning tome, Gödel, Escher, Bach: An Eternal Golden Braid) quoting Escher:
 
"While drawing I sometimes feel as if I were a spiritualist medium, controlled by the creatures I am conjuring up."

 
Many people writing a story can identify with this, including myself. But one suspects that this also happens to people exploring the abstract world of mathematics. Humans have developed a sense that there is more to the world than what we see and feel and touch, which we attempt to reveal in all art forms, and this, in turn, has led to religion. Of course, Greene spends another entire chapter on that subject, and he also recognises the connection between mind, art and the seeking of meaning beyond a mortal existence.

Friday, 23 May 2025

Insights into writing fiction

 This is a series of posts I published on Quora recently, virtually in one day, in response to specific questions, so I thought it worth posting them here.
 
 
What made you start writing science fiction?
 
I was late coming to science fiction as a reader, partly because I studied science, and the suspension of disbelief was more difficult as a result. In my teens I read the James Bond and Carter Brown novels that my father had, plus superhero comics, which I’d been addicted to from a young age. I think all of these influenced my later writing. Mind you, I liked innovative TV shows like Star Trek and The Twilight Zone. The British TV show The Avengers, with Emma Peel and Steed, which had sci-fi elements, was a favourite. So the seeds were there.
 
I came to sci-fi novels via fantasy, where the suspension of disbelief was mandatory. I remember 2 books which had a profound effect on me: The Lord of the Rings by JRR Tolkien and Dune by Frank Herbert; and I read them in that order. I then started to read Asimov, Clarke, Heinlein and Le Guin. What I liked about sci-fi was the alternative worlds and societies more than the space-travel and gizmos.
 
The first sci-fi I wrote was a screenplay for a teenage audience, called Kidnapped in Time, and it was liberating. I immediately realised that this was my genre. I combined a real-world scenario, based (very loosely*) on my own childhood with a complete fantasy world set in the future and on another planet, which included alien creatures. To be honest, I’ve never looked back.
 
Elvene was even more liberating, partly because I used a female protagonist. Not sure why that worked, but women love it, so I must have done something right. Since then, all my stories feature female protagonists, as well as males. Villains can be male or female or even robots.
 
Science fiction is essentially what-ifs. What if we genetically engineered humans? What if we had humanoid robots? What if we found life on another world? What if we colonised other worlds? What if we could travel intergalactic distances?
 
 
What unconventional writing techniques have you found most effective in crafting compelling characters?
 
How do you differentiate ‘unconventional’ from ‘conventional’? I don’t know if my techniques are one or the other. Characters come to me, similarly, I imagine, to the way melodies and tunes come to composers and songwriters. That wasn’t always the case. When I started out all the characters were different versions of me and not very believable.
 
It’s like acting, and so, in the beginning, I was a poor actor. Don’t ask me how I changed that, because I don’t know – just practice, I guess. To extend the analogy with composing, I compare writing dialogue to playing jazz, because they both require extemporisation. I don’t overthink it, to be honest. I somehow inhabit the character and they come alive. I imagine it’s the same as acting. I say, ‘imagine’, because I can’t act to save my life – I know, I’ve tried.
 
 
How do you balance originality and familiarity when creating characters and plots in your stories?

 
All fiction is a blend of fantasy and reality, and that blend is dependent on the genre and the author’s own proclivities. I like to cite the example of Ian Fleming’s James Bond novels, where the reality was the locations, but also details like what type of gun Bond used (Walther PPK) and the type of cigarettes he smoked (Turkish blend). The fantasy was in the plots, the larger-than-life villains and the femme-fatales with outlandish names.
 
My fiction is sci-fi, so the worlds and plots are total fantasy and the reality is all in the characters and the relationships they form.
 
 
*When I say ‘loosely’, the time and milieu are pretty much the same, but whereas the protagonist had a happy home life, despite having no memory of his mother (he lived with his father and older brother), I had a mother, a father and an older sister, but my home life was anything but happy. I make it a rule not to base characters on anyone I know.

Tuesday, 29 April 2025

Writing and philosophy

I’ve been watching a lot of YouTube videos of Alan Moore, who’s probably best known for his graphic novels, Watchmen and V for Vendetta, both of which were turned into movies. He also wrote a Batman graphic novel, The Killing Joke, which was turned into an R-rated animated movie (due to Batman having sex with Batgirl), with Mark Hamill voicing the Joker. I’m unsure whether it has any fidelity to Moore’s work, which was critically acclaimed, whereas the movie received mixed reviews. I haven’t read the graphic novel, so I can’t comment.
 
On the other hand, I read Watchmen and saw the movie, which I reviewed on this blog, and thought they were both very good. I also saw V for Vendetta, starring Natalie Portman and Hugo Weaving, without having read Moore’s original. Moore also wrote a novel, Jerusalem, which I haven’t read, but is referenced frequently by Robin Ince in a video I cite below.
 
All that aside, it’s hard to know where to start with Alan Moore’s philosophy on writing, but the 8 Alan Moore quotes video is as good a place as any if you want a quick overview. For a more elaborate dialogue, there is a 3-way interview, obviously done over a video link, between Moore and Brian Catling, hosted by Robin Ince, with the online YouTube channel, How to Academy. They start off talking about imagination, but get into philosophy when all 3 of them start questioning what reality is, or if there is an objective reality at all.
 
My views on this are well known, and it’s a side-issue in the context of writing or creating imaginary worlds. Nevertheless, had I been party to the discussion, I would have simply mentioned Kant, and how he distinguishes between the ‘thing-in-itself’ and our perception of it. Implicit in that concept is the belief that there is a reality independent of our internal model of it, which is mostly created by a visual representation, though other senses, like hearing, touch and smell, also play a role. This is actually important when one gets into a discussion on fiction, but I don’t want to get ahead of myself. I just wish to make the point that we know there is an external objective reality because it can kill you. Note that a dream can’t kill you, which is a fundamental distinction between reality and a dreamscape. I make this point because I think a story, which takes place in your imagination, is like a dreamscape; so that difference carries over into fiction.
 
And on the subject of life-and-death, Moore references something he’d read on how evolution selects for ‘survivability’ not ‘truth’, though he couldn’t remember the source or the authors. However, I can, because I wrote about that too. He’s obviously referring to the joint paper written by Donald Hoffman and Chetan Prakash called Objects of Consciousness (Frontiers in Psychology, 2014). This depends on what one means by ‘truth’. If you’re talking about mathematical truths then yes, it has little to do with survivability (our modern-day dependence on technical infrastructure notwithstanding). On the other hand, if you’re talking about the accuracy with which the internal model in your mind matches the objective reality external to your body, then your survivability is very much dependent on it.
 
Speaking of mathematics, Ince mentions Bertrand Russell giving up on mathematics and embracing philosophy because he failed to find a foundation that ensured its truth (my wording, interpreting his interpretation). Basically, that’s correct, but it was Gödel who put the spanner in the works with his famous Incompleteness Theorem, which effectively tells us that there will always exist mathematical truths that can’t be proven. In other words, he concretely demonstrated (proved, in fact) that there is a distinction between truth and proof in mathematics. Proofs rely on axioms, and all axioms have limitations in what they can prove, so you need to keep finding new axioms, which implies that mathematics is a neverending endeavour. So it’s not the end of mathematics as we know it, but the exact opposite.
 
All of this has nothing to do with writing per se, but since they raised these issues, I felt compelled to deal with them.
 
At the core of this part of their discussion, is the unstated tenet that fiction and non-fiction are distinct, even if the boundary sometimes becomes blurred. A lot of fiction, if not all, contains factual elements. I like to cite Ian Fleming’s James Bond novels containing details like the gun Bond used (a Walther PPK) and the Bentley he drove, which had an Amherst Villiers supercharger. Bizarrely, I remember these trivial facts from a teenage obsession with all things Bond.
 
And this allows me to segue into something that Moore says towards the end of this 3-way discussion, when he talks specifically about fantasy. He says it needs to be rooted in some form of reality (my words), otherwise the reader won’t be able to imagine it at all. I’ve made this point myself, and give the example of my own novel, Elvene, which contains numerous fantasy elements, including both creatures that don’t exist on our world and technology that’s yet to be invented, if ever.
 
I’ve written about imagination before, because I argue it’s essential to free will, which is not limited to humans, though others may disagree. Imagination is a form of time travel, into the past, but more significantly, into the future. Episodic memories and imagination use the same part of the brain (so we are told); but only humans seem to have the facility to time travel into realms that don’t exist anywhere else other than the imagination. And this is why storytelling is a uniquely human activity.
 
I mentioned earlier how we create an internal world that’s effectively a simulation of the external world we interact with. In fact, my entire philosophy is rooted in the idea that each of us has an internal and an external world, which is how I can separate religion from science, because one is completely internal and the other is an epistemology of the physical universe from the cosmic scale to the infinitesimal. Mathematics is a medium that bridges them, and contributes to the Kantian notion that our perception may never completely match the objective reality. Mathematics provides models that increase our understanding while never quite completing it. Gödel’s incompleteness theorem (referenced earlier) effectively limits physics as well. Totally off-topic, but philosophically important.
 
Its relevance to storytelling is that it’s a visual medium even when there are no visuals presented, which is why I contend that if we didn’t dream, stories wouldn’t work. In response to a question, Moore pointed out that, because he worked on graphic novels, he had to think about the story visually. I’ve made the point before that the best thing I ever did for my own writing was to take some screenwriting courses, because one is forced to think visually and imagine the story being projected onto a screen. In a screenplay, you can only write down what is seen and heard. In other words, you can’t write what a character is thinking. On the other hand, you can write an entire novel from inside a character’s head, and usually more than one. But if you tell a story from a character’s POV (point-of-view) you axiomatically feel what they’re feeling and see what they’re witnessing. This is the whole secret to novel-writing. It’s intrinsically visual, because we automatically create images even if the writer doesn’t provide them. So my method is to provide cues, knowing that the reader will fill in the blanks. No one specifically mentions this in the video, so it’s my contribution.
 
Something else that Moore, Catling and Ince discuss is how writing something down effectively changes the way they think. This is something I can identify with, both in fiction and non-fiction, but fiction specifically. It’s hard to explain this if you haven’t experienced it, but they spend a lot of time on it, so it’s obviously significant to them. In fiction, there needs to be a spontaneity – I’ve often compared it to playing jazz, even though I’m not a musician. So most of the time, you don’t know what you’re going to write until it appears on the screen or on paper, depending on which medium you’re using. Moore says it’s like it’s in your hands instead of your head, which is certainly not literally true. But the act of writing, as opposed to speaking, is a different process, at least for Moore, and also for me.
 
I remember many years ago (decades) when I told someone (a dentist, actually) that I was writing a book. He said he assumed that novelists must dictate it, because he couldn’t imagine someone writing down thousands upon thousands of words. At the time, I thought his suggestion just as weird as he thought mine to be. I suspect some writers do. Philip Adams (Australian broadcaster and columnist) once confessed that he dictated everything he wrote. In my professional life, I have written reports for lawyers in contractual disputes, both in Australia and the US, for which I’ve received the odd kudos. In one instance, someone I was working with was using a cassette-like dictaphone and insisted I do the same, believing it would save time. So I did, in spite of my better judgement, and it was just terrible. Based on that one example, you’d be forgiven for thinking that I had no talent or expertise in that role. Of course, I re-wrote the whole thing, and was never asked to do it again.
 
I originally became interested in Moore’s YouTube videos because he talked about how writing affects you as a person and can also affect the world. I think to be a good writer of fiction you need to know yourself very well, and I suspect that is what he meant without actually saying it. The paradox with this is that you are always creating characters who are not you. I’ve said many times that the best fiction you write is where you’re completely detached – in a Zen state – sometimes called ‘flow’. Virtuoso musicians and top sportspersons will often make the same admission.
 
I believe having an existential philosophical approach to life is an important aspect to my writing, because it requires an authenticity that’s hard to explain. To be true to your characters you need to leave yourself out of it. Virtually all writers, including Moore, talk about treating their characters like real people, and you need to extend that to your villains if you want them to be realistic and believable, not stereotypes. Moore talks about giving multiple dimensions to his characters, which I won’t go into. Not because I don’t agree, but because I don’t over-analyse it. Characters just come to me and reveal themselves as the story unfolds; the same as they do for the reader.
 
What I’ve learned from writing fiction (which I’d self-describe as sci-fi/fantasy) – as opposed to what I didn’t know – is that, at the end of the day (or story), it’s all about relationships. Not just intimate relationships, but relationships between family members, between colleagues, between protagonists and AI, and between protagonists and antagonists. This is the fundamental grist for all stories.
 
Philosophy is arguably more closely related to writing than any other art form: there is a crossover and an interdependency, because fiction deals with issues relevant to living and being.

Sunday, 29 December 2024

The role of dissonance in art, not to mention science and mathematics

I was given a book for a birthday present just after the turn of the century, titled A Terrible Beauty: The People and Ideas that Shaped the Modern Mind, by Peter Watson. A couple of things are worth noting: it covers the history of the 20th Century, but not geo-politically as you might expect. Instead, he writes about the scientific discoveries alongside the arts and cultural innovations, and he talks about both with equal erudition, which is unusual.
 
The reason I mention this, is because I remember Watson talking about the human tendency to push something to its limits and then beyond. He gave examples in science, mathematics, art and music. A good example in mathematics is the adoption of √-1 (giving us ‘imaginary numbers’), which we are taught is impossible, then suddenly it isn’t. The thing is that it allows us to solve problems that were previously impossible in the same way that negative numbers give solutions to arithmetical subtractions that were previously unanswerable. There were no negative numbers in ancient Greece because their mathematics was driven by geometry, and the idea of a negative volume or area made no sense.
 
But in both cases, negative numbers and imaginary numbers, there is a cognitive dissonance that we have to overcome before we can gain familiarity and confidence in using them, or even understanding what they mean in the ‘real world’, which is the problem the ancient Greeks had. Most people reading this have no problem, conceptually, dealing with negative numbers, because, for a start, they’re an integral aspect of financial transactions – I suspect everyone reading this above a certain age has had experience with debt and loans.
 
On the other hand, I suspect a number of readers struggle with a conceptual appreciation of imaginary numbers. Some mathematicians will tell you that the term is a misnomer, and its origin would tend to back that up. Apparently, René Descartes coined the term, disparagingly, because, like the ancient Greeks with negative numbers, he believed they had no relevance to the ‘real world’. Yet Descartes would have appreciated their usefulness in solving previously unsolvable problems, so I expect it would have been a real cognitive dissonance for him.
 
I’ve written an entire post on imaginary numbers, so I don’t want to go too far down that rabbit hole, but I think it’s a good example of what I’m trying to explicate. Imaginary numbers gave us something called complex algebra and opened up an entire new world of mathematics that is particularly useful in electrical engineering. But anyone who has studied physics in the last century is aware that, without imaginary numbers, an entire field of physics, quantum mechanics, would remain indescribable, let alone be comprehensible. The thing is that, even though most people have little or no understanding of QM, every electronic device you use depends on it. So, in their own way, imaginary numbers are just as important and essential to our lives as negative numbers are.
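To make the ‘previously impossible’ point concrete, here is a minimal sketch using Python’s built-in complex type (the choice of Python is mine; any language with complex arithmetic would do). Equations with no real solution acquire complex ones:

```python
import cmath

# The square root of -1 is "impossible" among the real numbers,
# but complex arithmetic handles it natively.
i = cmath.sqrt(-1)       # 1j: Python's notation for the imaginary unit
print(i)

# x^2 + 1 = 0 has no real solution, but it has two complex ones:
for x in (1j, -1j):
    print(x, x**2 + 1)   # both give 0j

# Multiplying by i rotates a number 90 degrees in the complex plane,
# the property that makes complex algebra so useful in electrical
# engineering (phase) and in quantum mechanics.
z = 3 + 4j
print(z * 1j)            # (-4+3j)
```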
 
You might wonder how I deal with the cognitive dissonance that imaginary numbers induce. In QM, we have, at its most rudimentary level, something called Schrödinger’s equation, which he proposed in 1926 (“It’s not derived from anything we know,” to quote Richard Feynman), and Schrödinger quickly realised it relied on imaginary numbers – he couldn’t formulate it without them. But here’s the thing: Max Born, a contemporary of Schrödinger, formulated something we now call the Born rule, which mathematically gets rid of the imaginary numbers (for the sake of brevity and clarity, I’ll omit the details) and gives the probability of finding the object (usually an electron) in the real world. In fact, without the Born rule, Schrödinger’s equation is next-to-useless, and would have been consigned to the dustbin of history.
 
And that’s relevant, because prior to observing the particle, it’s in a superposition of states, described by Schrödinger’s equation as a wave function (Ψ), which some claim is a mathematical fiction. In other words, you need to get rid (clumsy phrasing, but accurate) of the imaginary component to make it relevant to the reality we actually see and detect. And the other thing is that once we have done that, the Schrödinger equation no longer applies – there is effectively a dichotomy between QM and classical physics (reality), which is called the ‘measurement problem’. Roger Penrose gives a good account in this video interview. So, even in QM, imaginary numbers are associated with what we cannot observe.
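For what it’s worth, the arithmetic of the Born rule is simple enough to sketch in a few lines of Python. The amplitudes below are invented for illustration, not taken from any real system:

```python
# A toy superposition: complex amplitudes for finding a particle
# at position A or position B (the values are invented).
psi = {"A": 0.6 + 0.0j, "B": 0.0 + 0.8j}

# Born rule: probability = |psi|^2, i.e. the amplitude multiplied by
# its complex conjugate. This cancels the imaginary part, leaving a
# real number: the observable probability.
probs = {pos: (amp.conjugate() * amp).real for pos, amp in psi.items()}

for pos, p in probs.items():
    print(pos, round(p, 10))           # A 0.36, B 0.64

print(round(sum(probs.values()), 10))  # 1.0 for a normalised state
```

Note that the amplitude for B is purely imaginary, yet its probability comes out real, which is the whole point.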
 
That was a much longer detour than I intended, but I think it demonstrates the dissonance that seems necessary in science and mathematics, and arguably necessary for its progress; plus it’s a good example of the synergy between them that has been apparent since Newton.
 
My original intention was to talk about dissonance in music, and the trigger for this post was a YouTube video by the musicologist Rick Beato (pronounced be-arto), dissecting the Beatles song Ticket to Ride, which he called ‘a strange but perfect song’. In fact, he says, “It’s very strange in many ways: it’s rhythmically strange; it’s melodically strange too”. I’ll return to those specific points later. To call Beato a music nerd is an understatement, and he gives a technical breakdown that, quite frankly, I can’t follow. I should point out that I’ve always had a good ‘ear’, which I inherited, and I used to sing, even though I can’t read music (neither could the Beatles). I realised quite young that I can hear things in music that others miss. Not totally relevant, but it might explain some things I will expound upon later.
 
It's a lengthy, in-depth analysis, but if you go to 4.20-5.20, Beato actually introduces the term ‘dissonance’ after he describes how it applies. In effect, there is a dissonance between the notes that John Lennon sings and the chords he plays (on a 12-string guitar). And the thing is that we, the listener, don’t notice – someone (like Beato) has to point it out. Another quote from 15.00: “One of the reasons the Beatles songs are so memorable, is that they use really unusual dissonant notes at key points in the melody.”
 
The one thing that strikes you when you first hear Ticket to Ride is the unusual drum part. Ringo was very inventive and innovative, and became more adventurous, along with his bandmates, on later recordings. The Ticket to Ride drum part has become iconic: everyone knows it and recognises it. There is a good video where Ringo talks about it, along with another equally famous drum part he created. Beato barely mentions it, though right at the beginning, he specifically refers to the song as being ‘rhythmically strange’.
 
A couple of decades ago – I can’t remember exactly when – I went and saw an entire Beatles concert put on by a rock band, augmented by orchestral strings and horn parts. It was in 2 parts with an intermission, and basically the 1st half was pre-Sergeant Pepper and the 2nd half, post. I can still remember that they opened the concert with Magical Mystery Tour and it blew me away. The thing is that they went to a lot of trouble to be faithful to the original recordings, and I realised that it was the first time I’d heard their music live, albeit with a cover band. And what immediately struck me was the unusual harmonics and rhythms they employed. Watching Beato’s detailed technical analysis puts this into context for me.
 
Going from imaginary numbers and quantum mechanics to one of The Beatles’ most popular songs may seem like a giant leap, but it highlights how dissonance is a universal principle for humans, and intrinsic to progression in both art and science.
 
Going back to Watson’s book that I reference in the introduction, another obvious example that he specifically talks about is Picasso’s cubism.
 
In storytelling, it may not be so obvious, and I think modern fiction has been influenced more by cinema than anything else, where the story needs to be more immediate and it needs to flow with minimal description. There is now an expectation that it puts you in the story – what we call immersion.
 
On another level, I’ve noticed a tendency on my part to create cognitive dissonance in my characters and therefore the reader. More than once, I have combined sexual desire with fear, which some may call perverse. I didn’t do this deliberately – a lot of my fiction contains elements I didn’t foresee. Maybe it says something about my own psyche, but I honestly don’t know.

Thursday, 19 September 2024

Prima Facie; the play

I went and saw a film made of a live performance of this highly rated play, put on by the National Theatre at the Harold Pinter Theatre in London’s West End in 2022. It’s a one-hander, played by Jodie Comer, best known as the quirky assassin with a diabolical sense of humour in the black comedy hit, Killing Eve. I also saw her in Ridley Scott’s riveting and realistically rendered film, The Last Duel, set in mediaeval France, where she played alongside Matt Damon, Adam Driver and an unrecognisable Ben Affleck. The roles that Comer played in those 2 screen mediums couldn’t be more different.
 
Theatre is more unforgiving than cinema, because there are no multiple takes or even a break once the curtain’s raised; a one-hander, even more so. In the case of Prima Facie, Comer is on stage for a full 90 minutes, and even does her own costume changes and pushes her own scenery around unaided, without breaking stride. It’s such a tour de force performance, as the Financial Times put it, that I’d go so far as to say it’s the best acting performance I’ve ever witnessed from anyone. It’s such an emotionally draining role, where she cries and even breaks into a sweat in one scene, that I marvel she could do it night after night, as I assume she did.
 
And I’ve yet to broach the subject matter, which is very apt, given the MeToo climate, but philosophically it goes deeper than that. The premise for the entire play, which is even spelt out early on, in case you’re not paying attention, is the difference between truth and justice, and whether it matters. Comer’s character, Tessa, happens to experience it from both sides, which is what makes this so powerful.
 
She’s a defence barrister who specialises in sexual-assault cases, and as she explains very early on, effectively telling us the rules of the game: no one wins or loses; you either come first or second. In other words, the barristers and those involved in the legal profession don’t see the process the same way that you and I do, and I can understand that – to get emotionally involved makes it very stressful.

In fact, I have played a small role in this process in a professional capacity, so I’ve seen this firsthand. But I wasn’t dealing with rape cases or anything involving violence, just contractual disputes where millions of dollars could be at stake. My specific role was to ‘prepare evidence’ for lawyers for either a claim or the defence of a claim or possibly a counter-claim, and I quickly realised the more dispassionate one is, the more successful one is likely to be. I also realised that the lawyers I was supporting in one case could be on the opposing side in the next one, so you don’t get personal.
 
So, I have a small insight into this world, and can appreciate why they see it as a game, where you ‘win or come second’. But in Prima Facie, Tessa goes through a very visceral and emotionally scarifying transformation where she finds herself on the receiving end, and it’s suddenly very personal indeed.
 
Back in 2015, I wrote a mini 400-word essay in answer to one of those Question of the Month topics that Philosophy Now likes to throw open to amateur wannabe philosophers like myself. And in this case, it was one that was selected for publication (among 12 others) from around the Western world. I bring this up because I made the assertion that ‘justice without truth is injustice’, and I feel that this is really what Prima Facie is all about. At the end of the play, with Tessa now having the perspective of the victim (there is no other word), it does become a matter of winning or losing, because not only her career and future livelihood but her very dignity is now up for sacrifice.
 
I watched a Q&A programme on Australia’s ABC some years ago, where this issue was discussed. Every woman on the panel, including one from the righteous right (my coinage), had a tale to tell about discrimination or harassment in a workplace situation. But the most damning testimony came from a man who specialised in representing women in sexual assault cases, and he said that in every case, their doctors tell them not to proceed because it will destroy their health; and he said: they’re right. I was reminded of this when I watched this play.
 
One needs to give special mention to the writer, Suzie Miller, who is an Aussie as it turns out, and as far as 6 degrees of separation go, I happen to know someone who knows her father. Over 5 decades I’ve seen some very good theatre, some of it very innovative and original. In fact, I think the best theatre I’ve seen has invariably been something completely different, unexpected and, dare I say it, special. I had a small involvement in theatre when I was still very young, and learned that I couldn’t act to save myself. Nevertheless, my very first foray into writing was an attempt to write a play. Now, I’d say it’s the hardest and most unforgiving medium of storytelling to write for. I had a friend who was involved in theatre for some decades and even won awards. She passed a couple of years ago and I miss her very much. At her funeral, she was given a standing ovation when her coffin was taken out; it was very moving. I can’t go to a play now without thinking about her and wishing I could discuss it with her.

Monday, 22 July 2024

Zen and the art of flow

This was triggered by a newsletter I received from ABC Classic (Australian radio station) with a link to a study done on ‘flow’, which is a term coined by psychologist Mihaly Csikszentmihalyi to describe a specific psychological experience that many (if not all) people have had when totally immersed in some activity that they not only enjoy but have developed some expertise in.
 
The study was performed by Dr John Kounios from Drexel University's Creativity Research Lab in Philadelphia, who “examined the 'neural and psychological correlates of flow' in a sample of jazz guitarists.” The article was authored by Jennifer Mills from ABC Classic’s sister station, ABC Jazz. But the experience of ‘flow’ doesn’t just apply to mental or artistic activities; it also applies to sporting activities like playing tennis or cricket. Mills heads her article with the claim that ‘New research helps unlock the secrets of flow, an important tool for creative and problem solving tasks’. She quotes Csikszentmihalyi to provide a working definition:
 
"A state in which people are so involved in an activity that nothing else seems to matter; the experience is so enjoyable that people will continue to do it even at great cost, for the sheer sake of doing it."
 
I believe I’ve experienced ‘flow’ in 2 quite disparate activities: writing fiction and driving a car. Just to clarify, some people think that experiencing flow while driving means that you daydream, whereas I’m talking about the exact opposite. I hardly ever daydream while driving, and if I find myself doing it, I bring myself back to the moment. Of course, cars are designed these days to insulate you from the experience of driving as much as possible, as we evolve towards self-driving cars. Thankfully, there are still cars available that are designed to involve you in the experience and not remove you from it.
 
I was struck by the fact that the study used jazz musicians, as I’ve often compared the ability to play jazz with the ability to write dialogue (even though I’m not a musician). They both require extemporisation. The article references Nat Bartsch, whom I’ve seen perform live and whose music is an unusual style of jazz in that it can be very contemplative. I saw her perform one of her albums with her quartet, augmented with a cello, which made it a one-off, unique performance. (This is a different concert performed in Sydney without the cellist.)
 
The study emphasised the point that the more experienced practitioners taking part were the ones more likely to experience ‘flow’. In other words, to experience ‘flow’ you need to reach a certain skill-level. In emphasising this point, the author quotes jazz legend, Charlie Parker:
 
"You've got to learn your instrument. Then, you practise, practise, practise. And then, when you finally get up there on the bandstand, forget all that and just wail."
 
I can totally identify with this, as when I started writing, it was complete crap, to the extent that I wouldn’t show it to anyone. For some irrational reason, I had the self-belief – some might say, arrogance – that, with enough perseverance and practice, I could break through into the required skill-level. In fact, I now create characters and write dialogue with little conscious effort – it’s become a ‘delegated’ task, so I can concentrate on the more complex tasks of resolving plot points, developing moral dilemmas and formulating plot twists. Notice that these require a completely different set of skills that also had to be learned from scratch. But all this can come together, often in unexpected and surprising ways, when one is in the mental state of ‘flow’. I’ve described this as a feeling like you’re an observer, not the progenitor, so the process occurs as if you’re a medium and you just have to trust it.
 
Dr Steffen Herff, leader of the Sydney Music, Mind and Body Lab at Sydney University, makes a point that supports this experience:
 
"One component that makes flow so interesting from a cognitive neuroscience and psychology perspective, is that it comes with a 'loss of self-consciousness'."
 
And this allows me to segue into Zen Buddhism. Many years ago, I read an excellent book by Daisetz Suzuki titled, Zen and Japanese Culture, where he traces the evolutionary development of Zen, starting with Buddhism in India, then being adopted in China, where it was influenced by Taoism, before reaching Japan, where it was assimilated into a sister-religion (for want of a better term) with Shintoism, which is an animistic religion.
 
Suzuki describes Zen as going inward rather than outward, while acknowledging that the two can’t be disconnected. But I think it’s the loss of ‘self’ that makes it relevant to the experience of flow. When Suzuki described the way Zen is practised in Japan, he talked about being in the moment, whatever the activity, and for me, this is an ideal that we rarely attain. It was only much later that I realised that this is synonymous with flow as described by Csikszentmihalyi and currently being examined in the studies referenced above.
 
I’ve only once before written a post on Zen (not counting a post on Buddhism and Christianity), which arose from reading Douglas Hofstadter’s seminal tome, Gödel, Escher, Bach (which is not about Zen, although it gets a mention), and it’s worth quoting my own summation:
 
My own take on this is that one’s ego is not involved yet one feels totally engaged. It requires one to be completely in the moment, and what I’ve found in this situation is that time disappears. Sportsmen call it being ‘in the zone’ and it’s something that most of us have experienced at some time or another.

Saturday, 29 June 2024

Feeling is fundamental

 I’m not sure I’ve ever had an original idea, but I sometimes raise one that no one else seems to talk about. And this is one of them: I contend that the primary, essential attribute of consciousness is to be able to feel, and the ability to comprehend is a secondary attribute.
 
I don’t even mind if this contentious idea triggers debate, but we tend to always discuss consciousness in the context of human consciousness, where we metaphorically talk about making decisions based on the ‘head’ or the ‘heart’. I’m unsure of the origin of this dichotomy, but there is an inference that our emotional and rational ‘centres’ (for want of a better word) have different loci (effectively, different locations). No one believes that, of course, but possibly people once did. The thing is that we are all aware that sometimes our emotional self and rational self can be in conflict. This is already going down a path I didn’t intend, so I may return at a later point.
 
There is some debate about whether insects have consciousness, but I believe they do because they demonstrate behaviours associated with fear and desire, be it for sustenance or company. In other respects, I think they behave like automatons. Colonies of ants and bees can build a nest without a blueprint except the one that apparently exists in their DNA. Spiders build webs and birds build nests, but they don’t do it the way we would – it’s all done organically, as if they have a model in their brain that they can follow; we actually don’t know.
 
So I think the original role of consciousness in evolutionary terms was to feel, concordant with abilities to act on those feelings. I don’t believe plants can feel, and they’d have very limited ability to act on feelings even if they had them. They can communicate chemically, and generally rely on the animal kingdom to propagate, which is why a global threat to bee populations is very serious indeed.
 
So, in evolutionary terms, I think feeling came before cognitive abilities – a point I’ve made before. It’s one of the reasons that I think AI will never be sentient – a viewpoint not shared by most scientists and philosophers, from what I’ve read. AI is all about cognitive abilities; specifically, the ability to acquire knowledge and then deploy it to solve problems. Some argue that by programming biases into the AI, we will be simulating emotions. I’ve explored this notion in my own sci-fi, where I’ve added so-called ‘attachment programming’ to an AI to simulate loyalty. This is fiction, remember, but it seems plausible.
 
Psychological studies have revealed that we need an emotive component to behave rationally, which seems counter-intuitive. But would we really prefer it if everyone were a zombie or a psychopath, with no ability to empathise or show compassion? We see enough of this already. As I’ve pointed out before, in any ingroup-outgroup scenario, totally rational individuals can become totally irrational. We’ve all observed this, and possibly actively participated.
 
An oft-made point (by me) that I feel is not given enough consideration is the fact that without consciousness, the universe might as well not exist. I agree with Paul Davies (who does espouse something similar) that the universe’s ability to be self-aware would seem to be a necessary condition for its existence (my wording, not his). I recently read a stimulating essay in the latest edition of Philosophy Now (Issue 162, June/July 2024) titled, enigmatically, Significance, by Ruben David Azevedo, a ‘Portuguese philosophy and social sciences teacher’. His self-described intent is to ‘Tell us why, in a limitless universe, we’re not insignificant’. In fact, that was the trigger for this post. He makes the point (that I’ve made elsewhere myself) that in both time and space, we couldn’t be more insignificant, which leads many scientists and philosophers to see us as a freakish by-product of an otherwise purposeless universe. It’s a perspective that Davies has dubbed ‘the absurd universe’. In light of this, it’s worth reading Azevedo’s conclusion:
 
In sum, humans are neither insignificant nor negligible in this mind-blowing universe. No living being is. Our smallness and apparent peripherality are far from being measures of our insignificance. Instead, it may well be the case that we represent the apex of cosmic evolution, for we have this absolute evident and at the same time mysterious ability called consciousness to know both ourselves and the universe.
 
I’m not averse to the idea that there is a cosmic role for consciousness. I like John Wheeler’s obvious yet pertinent observation:
 
The Universe gave rise to consciousness, and consciousness gives meaning to the Universe.

 
And this is my point: without consciousness, the Universe would have no meaning. And getting back to the title of this essay, we give the Universe feeling. In fact, I’d say that the ability to feel is more significant than the ability to know or comprehend.
 
Think about the role of art in all its manifestations, and how it’s totally dependent on the ability to feel. In some respects, I consider AI-generated art a perversion, because any feeling we have for its products is of our own making, not the AI’s.
 
I’m one of those weird people who can even find beauty in mathematics, while acknowledging only a limited ability to pursue it. It’s extraordinary that I can find beauty in a symphony, or a well-written story, or the relationship between prime numbers and Riemann’s Zeta function.
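For the curious, that last relationship can even be glimpsed numerically. The following is purely my own illustration of Euler’s product formula – the classic bridge between the primes and the zeta function, and a standard result rather than anything specific this post invokes:

```python
# A minimal numerical sketch (my own illustration) of Euler's product formula,
# the classic link between the primes and the zeta function:
#   zeta(s) = sum_{n>=1} 1/n^s = prod_{p prime} 1/(1 - p^(-s))
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [n for n, is_prime in enumerate(sieve) if is_prime]

def zeta_sum(s, terms=100_000):
    """zeta(s) via the defining series over the integers, truncated."""
    return sum(1 / n ** s for n in range(1, terms + 1))

def zeta_euler(s, limit=10_000):
    """zeta(s) via Euler's product over the primes up to `limit`."""
    product = 1.0
    for p in primes_up_to(limit):
        product *= 1 / (1 - p ** -s)
    return product

# Both routes converge on zeta(2) = pi^2 / 6
print(zeta_sum(2), zeta_euler(2), math.pi ** 2 / 6)
```

Summing over all the integers and multiplying over only the primes land on the same number (π²/6 ≈ 1.6449 for s = 2) – which is, in miniature, the intimate relationship between the primes and the zeta function that I find so beautiful.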


Addendum: I realised I can’t leave this topic without briefly discussing the biochemical role in emotional responses and behaviours. I’m thinking of the brain’s drugs-of-choice like serotonin, dopamine, oxytocin and endorphins. Some may argue that these natural ‘messengers’ are all that’s required to explain emotions. However, there are other drugs, like alcohol and caffeine (arguably the most common) that also affect us emotionally, sometimes to our detriment. My point being that the former are nature’s target-specific mechanisms to influence the way we feel, without actually being the genesis of feelings per se.

Saturday, 20 April 2024

Sigmund Freud’s and C.S. Lewis’s fictional encounter

Last week I went and saw the movie, Freud’s Last Session, where Anthony Hopkins plays Freud, when he was in London on the very cusp of WW2 and dying of cancer of the mouth, and Matthew Goode plays the Oxford Don, C.S. Lewis. It’s a fictional account, taken from a play, I believe, about their meeting at Freud’s home. Its historical veracity is put into question by a disclaimer given after the movie proper finishes, saying that it’s recorded that Freud did, in fact, meet an Oxford Don, whose identity was never revealed or confirmed.
 
It’s the sort of movie that would attract people with a philosophical bent like myself. I thought the cinema better attended than I expected, though it was far from full. Anthony Hopkins’ Freud is playful in the way he challenges Matthew Goode’s Lewis, whilst still being very direct and not pulling any punches. There is an interruption to their conversation by an air-raid siren, and when they go into a bunker, Lewis has a panic attack, because of his experience in the trenches of WW1. Freud helps him to deal with it in the moment.
 
I’ve read works by both of them, though I’m hardly a scholar. I actually studied Freud in a philosophy class, believe it or not. I’m better read in Jung than Freud. I think Lewis is a good essayist, though I disagree with him philosophically on many counts. Having said that, I expect if I’d met him, I’d have a different opinion of him than just of his ideas. I have very good friends who hold almost exactly the same views, so you don’t just judge someone for what they believe once you get to know them in the flesh.
 
And that’s what came across in this hypothetical exchange – that you have 2 intellectuals who can find mutual respect despite having antithetical views about God and religion and other things, like homosexuality. On that last point, Sigmund’s daughter, Anna, was in a relationship with a woman, which Freud obviously didn’t approve of. In fact, the father-daughter relationship in the movie was portrayed as very Freudian, where they both seemed to suffer from an unhealthy attachment. Nevertheless, Anna Freud went on to make a name for herself in child psychoanalysis, and there’s a scene where she has to deal with an overbearing and arrogant young man, and her putdown made me want to clap; I just wish I could remember it. Anyway, Anna’s story provides a diversionary, yet not irrelevant, subplot, which makes the movie a bit more than just a two-hander.
 
There are scenes where Matthew Goode’s Lewis has dreams or visions and finds himself in a forest, where he comes across a deer, and one where he sees a bright, overwhelming light. There was a sense in these scenes that he felt he was in the presence of God, and it made me realise that I couldn’t judge him for that. I’ve long argued that God is a personal experience that can’t be shared, but we overlay it with our cultural norms. It was in these scenes that I felt his character was portrayed most authentically.
 

Tuesday, 2 January 2024

Modes of expression in writing fiction

As I point out in the post, this is a clumsy phrase, but I find it hard to come up with a better one. It’s actually something I wrote on Quora in response to a question. I’ve written on this before, but this post has the benefit of being much more succinct while possibly just as edifying.
 
I use the term ‘introspection’ where others use the word, ‘insight’. It’s the reader’s insight but the character’s introspection, which is why I prefer that term in this context.
 
The questioner is Clxudy Pills, obviously a pseudonym. I address her directly in the answer, partly because, unlike other questioners, she has always acknowledged my answers.
 

Is "show, not tell" actually a good writing tip?

 
Maybe. No one said that to me when I was starting out, so it had no effect on my development. But I did read a book (more than one, actually) on ‘writing’ that delineated 5 categories of writing ‘style’. Style in this context means the mode of expression rather than an author’s individual style or ‘voice’. That’s clumsily stated but it will make sense when I tell you what they are.
 

  1. Dialogue is the most important because it’s virtually unique to fiction; quotes provided in non-fiction notwithstanding. Dialogue, more than any other style, tells you about the characters and their interactions with others.



  2. Introspection is what the character thinks, effectively. This only happens in novels and short stories, not screenplays or stage plays, soliloquies being the exception and certainly not the rule. But introspection is essential to prose, especially when the character is on their own.



  3. Exposition is the ‘telling’, not showing, part. When you’re starting out and learning your craft, you tend to write a lot of exposition – I know I did – which is why we get the admonition in your question. But the exposition can be helpful to you, if not the reader, as it allows you to explore the setting, the context of the story and its characters. Eventually, you’ll learn not to rely on it. Exposition is ‘smuggled’ into movies through dialogue and into novels through introspection.



  4. Description is more difficult than you think, because it’s the part of a novel that readers will skip over to get on with the story. Description can be more boring than exposition, yet it’s necessary. My approach is to always describe a scene from a character’s POV, and keep it minimalist. Readers automatically fill in the details, because we are visual creatures and we do it without thinking.



  5. Action is description in motion. Two rules: stay in one character’s POV and keep it linear – one thing happens after another. It has the dimension of time, though it’s subliminal.

 
 So there: you get 5 topics for the price of one.
 

Saturday, 16 September 2023

Modes of thinking

I’ve written a few posts on creative thinking as well as analytical and critical thinking. But, not that long ago, I read a not-so-recently published book (2015) by 2 psychologists (John Kounios and Mark Beeman) titled, The Eureka Factor: Creative Insights and the Brain. To quote from the back fly-leaf:
 
Dr John Kounios is Professor of Psychology at Drexel University and has published cognitive neuroscience research on insight, creativity, problem solving, memory, knowledge representation and Alzheimer’s disease.
 
Dr Mark Beeman is Professor of Psychology and Neuroscience at Northwestern University, and researches creative problem solving and creative cognition, language comprehension and how the right and left hemispheres process information.

 
They divide people into 2 broad groups: ‘Insightfuls’ and ‘analytical thinkers’. Personally, I think the coined term ‘Insightfuls’ is misleading, or too narrow in its definition, and I prefer the term ‘creatives’. More on that below.
 
As the authors say, themselves, ‘People often use the terms “insight” and “creativity” interchangeably.’ So that’s obviously what they mean by the term. However, the dictionary definition of ‘insight’ is ‘an accurate and deep understanding’, which I’d argue can also be obtained by analytical thinking. Later in the book, they describe insights obtained by analytical thinking as ‘pseudo-insights’, and the difference can be ‘seen’ with neuro-imaging techniques.
 
All that aside, they do provide compelling arguments that there are 2 distinct modes of thinking that most of us experience. Very early in the book (in the preface, actually), they describe the ‘ah-ha’ experience that we’ve all had at some point, where we’re trying to solve a puzzle and then it comes to us unexpectedly, like a light-bulb going off in our head. They then relate something that I didn’t know, which is that neurological studies show that when we have this ‘insight’ there’s a spike in our brain waves and it comes from a location in the right hemisphere of the brain.
 
Many years ago (decades) I read a book called Drawing on the Right Side of the Brain by Betty Edwards. I thought neuroscientists would disparage this as pop-science, but Kounios and Beeman seem to give it some credence. Later in the book, they describe this in more detail, where there are signs of activity in other parts of the brain, but the ah-ha experience has a unique EEG signature and it’s in the right hemisphere.
 
The authors distinguish this unexpected insightful experience from an insight that is a consequence of expertise. I made this point myself, in another post, where experts make intuitive shortcuts based on experience that the rest of us don’t have in our mental toolkits.
 
They also spend an entire chapter on examples involving a special type of insight, where someone spends a lot of time thinking about a problem or an issue, and then the solution comes to them unexpectedly. A lot of scientific breakthroughs follow this pattern, and the point is that the insight wouldn’t happen at all without all the rumination taking place beforehand, often over a period of weeks or months, sometimes years. I’ve experienced this myself, when writing a story, and I’ll return to that experience later.
 
A lot of what we’ve learned about the brain’s functions has come from studying people with damage to specific areas of the brain. You may have heard of a condition called ‘aphasia’, which is when someone develops a serious disability in language processing following damage to the left hemisphere (possibly from a stroke). What you probably don’t know (I didn’t) is that damage to the right hemisphere, while not directly affecting one’s ability with language, can interfere with its more nuanced interpretations, like sarcasm or even getting a joke. I’ve long believed that when I’m writing fiction, I’m using the right hemisphere as much as the left, but it never occurred to me that readers (or viewers) need the right hemisphere in order to follow a story.
 
According to the authors, the difference between the left and right neo-cortex is one of connections. The left hemisphere has ‘local’ connections, whereas the right hemisphere has more widely spread connections. This seems to correspond to an ‘analytic’ ability in the left hemisphere, and a more ‘creative’ ability in the right hemisphere, where we make conceptual connections that are more wide-ranging. I’ve probably oversimplified that, but it was the gist I got from their exposition.
 
Like most books and videos on ‘creative thinking’ or ‘insights’ (as the authors prefer), they spend a lot of time giving hints and advice on how to improve your own creativity. It’s not until one is more than halfway through the book, in a chapter titled, The Insightful and the Analyst, that they get to the crux of the issue, and describe how there are effectively 2 different types who think differently, even in a ‘resting state’, and how there is a strong genetic component.
 
I’m not surprised by this, as I saw it in my own family, where the difference is very distinct. In another chapter, they describe the relationship between creativity and mental illness, but they don’t discuss how artists are often moody and neurotic, which is a personality trait. Openness is another personality trait associated with creative people. I would add another point, based on my own experience: if someone is creative and they are not creating, they can suffer depression. This is not discussed by the authors either.
 
Regarding the 2 types they refer to, they acknowledge there is a spectrum, and I can’t help but wonder where I sit on it. I spent a working lifetime in engineering, which is full of analytic types, though I didn’t work in a technical capacity. Instead, I worked with a lot of technical people of all disciplines: from software engineers to civil and structural engineers to architects, not to mention lawyers and accountants, because I worked on disputes as well.
 
The curious thing is that I was aware of 2 modes of thinking, where I was either looking at the ‘big-picture’ or looking at the detail. I worked as a planner, and one of my ‘tricks’ was the ability to distil a large and complex project into a one-page ‘Gantt’ chart (bar chart). For the individual disciplines, I’d provide a multipage detailed ‘program’ just for them.
 
Of course, I also write stories, where the 2 components are plot and character. Creating characters is purely a non-analytic process, which requires a lot of extemporising. I try my best not to interfere, and I do this by treating them as if they are real people, independent of me. Plotting, on the other hand, requires a big-picture approach, but I almost never know the ending until I get there. In the last story I wrote, I was in COVID lockdown when I knew the ending was close, so I wrote some ‘notes’ in an attempt to work out what happens. Then, sometime later (like a month), I had one sleepless night when it all came to me. Afterwards, I went back and looked at my notes, and they were all questions – I didn’t have a clue.

Thursday, 25 May 2023

Philosophy’s 2 disparate strands: what can we know; how can we live

The question I’d like to ask is: is there a philosophical view that encompasses both? Some may argue that Aristotle attempted that, but I’m going to take a different approach.
 
For a start, the first part can arguably be broken into 2 further strands: physics and metaphysics. And even this divide is contentious, with some arguing that metaphysics is an ‘abstract theory with no basis in reality’ (one dictionary definition).
 
I wrote an earlier post arguing that we are ‘metaphysical animals’ after discussing a book of the same name, though it was really a biography of 4 Oxford women in the 20th Century: Elizabeth Anscombe, Mary Midgley, Philippa Foot and Iris Murdoch. But I’ll start with this quote from said book.
 
Poetry, art, religion, history, literature and comedy are all metaphysical tools. They are how metaphysical animals explore, discover and describe what is real (and beautiful and good). (My emphasis.)
 
So, arguably, metaphysics could give us a connection between the 2 ‘strands’ in the title. Now here’s the thing: I contend that mathematics should be part of that list, hence part of metaphysics. And, of course, we all know that mathematics is essential to physics as an epistemology. So physics and metaphysics, in my philosophy, are linked in a rather intimate way.
 
The curious thing about mathematics, or anything metaphysical for that matter, is that, without human consciousness, they don’t really exist, or are certainly not manifest. Everything on that list is a product of human consciousness, notwithstanding that there could be other conscious entities somewhere in the universe with the same capacity.
 
But again, I would argue that mathematics is an exception. I agree with a lot of mathematicians and physicists that while we create the symbols and language of mathematics, we don’t create the intrinsic relationships that said language describes. And furthermore, some of those relationships seem to govern the universe itself.
 
And completely relevant to the first part of this discussion, the limits of our knowledge of mathematics seem to determine the limits of our knowledge of the physical world.
 
I’ve written other posts on how to live, specifically, 3 rules for humans and How should I live? But I’m going to go via metaphysics again, specifically storytelling, because that’s something I do. Storytelling requires an inner and outer world, manifest as character and plot, which is analogous to free will and fate in the real world. Now, even these concepts are contentious, especially free will, because many scientists tell us it’s an illusion. Again, I’ve written about this many times, but its relevance to my approach to fiction is that I try to give my characters free will. An important part of my fiction is that the characters are independent of me. If my characters don’t take on a life of their own, then I know I’m wasting my time, and I’ll ditch that story.
 
Its relevance to ‘how to live’ is authenticity. Artists understand better than most the importance of authenticity in their work, which really means keeping themselves out of it. But authenticity has ramifications, as any existentialist will tell you. To live authentically requires an honesty to oneself that is integral to one’s being. And ‘being’ in this sense is about being human rather than its broader ontological meaning. In other words, it’s a fundamental aspect of our psychology, because it evolves and changes according to our environment and milieu. Also, in the world of fiction, it's a fundamental dynamic.
 
What's more, if you can maintain this authenticity (and it’s genuine), then you gain people’s trust, and that becomes your currency, whether in your professional life or your social life. However, there is nothing more fake than false authenticity; examples abound.
 
I’ll give the last word to Socrates, arguably the first existentialist.
 
To live with honour in this world, actually be what you try to appear to be.


Saturday, 14 January 2023

Why do we read?

This is almost the same title as a book I bought recently (Why We Read), containing 70 short essays on the subject, featuring scholars of all stripes: historians, philosophers and, of course, authors. It even includes scientists: Paul Davies, Richard Dawkins and Carlo Rovelli, being 3 I’m familiar with.
 
One really can’t overstate the importance of the written word, because, oral histories aside, it allows us to extend memories across generations and accumulate knowledge over centuries that has led to civilisations and technologies that we all take for granted. By ‘we’, I mean anyone reading this post.
 
Many of the essayists write from their personal experiences and I’ll do the same. The book, edited by Josephine Greywoode and published by Penguin, specifically says on the cover in small print: 70 Writers on Non-Fiction; yet many couldn’t help but discuss fiction as well.
 
Books are generally divided between fiction and non-fiction, and I believe we read them for different reasons, though I wouldn’t necessarily consider one less important than the other. I also write fiction and non-fiction, so I have a particular view on this. Basically, I read non-fiction in order to learn and I read fiction for escapism. Both started early for me and I believe the motivation hasn’t changed.
 
I started reading extra-curricular books from about the age of 7 or 8, involving creatures mostly, and I even asked for an encyclopaedia for Christmas at around that time, which I read enthusiastically. I devoured non-fiction books, especially if they dealt with the natural world. But at the same time, I read comics, remembering that we didn’t have TV at that time, which was only just beginning to emerge.
 
I think one of the reasons that boys read less fiction than girls these days is because comics have effectively disappeared, being replaced by video games. And the modern comics that I have seen don’t even contain a complete narrative. Nevertheless, there are graphic novels that I consider brilliant: Neil Gaiman’s Sandman series and Hayao Miyazaki’s Nausicaä of the Valley of the Wind being standouts. Watchmen by Alan Moore also deserves a mention.
 
So the escapism also started early for me, in the world of superhero comics, and I started writing my own scripts and drawing my own characters pre-high school.
 
One of the essayists in the collection, Niall Ferguson (author of Doom) starts off by challenging a modern paradigm (or is it a meme?) that we live in a ‘simulation’, citing Oxford philosopher, Nick Bostrom, writing in the Philosophical Quarterly in 2003. Ferguson makes the point that reading fiction is akin to immersing the mind in a simulation (my phrasing, not his).
 
In fact, a dream is very much like a simulation, and, as I’ve often said, the language of stories is the language of dreams. But here’s the thing: the motivation for writing fiction, for me, is the same as the motivation for reading it: escapism. Whether reading or writing, you enter a world that only exists inside your head. The ultimate solipsism.

And this surely is a miracle of written language: that we can conjure a world with characters who feel real and elicit emotional responses, while we follow their exploits, failures, love life and dilemmas. It takes empathy to read a novel, and tests have shown that people’s empathy increases after they read fiction. You engage with the character and put yourself in their shoes. It’s one of the reasons we read.
 
 
Addendum: I would recommend the book, by the way, which contains better essays than mine, all with disparate, insightful perspectives.
 

Sunday, 10 July 2022

Creative and analytic thinking

I recently completed an online course with a similar title, How to Think Critically and Creatively. It must be the 8th or 9th course I’ve done through New Scientist, on a variety of topics, from cosmology and quantum mechanics to immunology and sustainable living; so quite diverse subjects. I started doing them during COVID, as they helped to pass the time while stimulating the brain.
 
All these courses rely on experts in their relevant fields from various parts of the globe, so not just UK based, as you might expect. This course was no exception with just 2 experts, both from America. Denise D Cummins is described as a ‘cognitive scientist, author and elected Fellow of the Association for Psychological Science, and she’s held faculty at Yale, UC, University of Illinois and the Centre of Adaptive Behaviours at the Max Planck Institute in Berlin’. Gerard J Puccio is ‘Department Chair and Professor at the International Centre for Studies on Creativity, Buffalo State; a unique academic department that offers the world’s only Master of Science degree in creativity’.
 
I admit to being sceptical that ‘creativity’ can be taught, but that depends on what one means by creativity. If creativity means using your imagination, then yes, I think it can, because imagination is something that we all have, and it’s probably a valid comment that we don’t make enough use of it in our everyday lives. If creativity means artistic endeavour then I think that’s another topic, even though it puts imagination centre stage, so to speak.
 
I grew up in a family where one side was obviously artistic and the other side wasn’t, which strongly suggests there’s a genetic component. The other side excelled at sport, and I was rubbish at sport. However, both sides were obviously intelligent, despite a notable lack of formal education; in my parents’ case, both leaving school in their early teens. In fact, my mother did most of her schooling by correspondence, and my father left school in the midst of the Great Depression, shortly followed by active duty in WW2.
 
Puccio (mentioned above) argues that creativity isn’t taught in our education system because it’s too hard. Instead, he says that we teach by memorising facts and by ‘understanding’ problems. I would suggest that there is a hierarchy, where you need some basics before you can ‘graduate’ to ‘creative thinking’, and I use the term here in the way he intends it. I spent most of my working lifetime on engineering projects, with diverse and often complex elements. I need to point out that I wasn’t one of the technical experts involved, but I worked with them, in all their variety, because my job was to effectively co-ordinate all their activities towards a common goal, by providing a plan and then keeping it on the rails.
 
Engineering is all about problem solving, and I’m not sure one can do that without being creative, as well as analytical. In fact, one could argue that there is a dialectical relationship between them, but maybe I’m getting ahead of myself.
 
Back to Puccio, who introduced 2 terms I hadn’t come across before: ‘divergent’ and ‘convergent’ thinking, arguing they should be done in that order. In a nutshell, divergent thinking is brainstorming where one thinks up as many options as possible, and convergent thinking is where one narrows in on the best solution. He argues that we tend to do the second one without doing the first one. But this is related to something else that was raised in the course, which is ‘Type 1 thinking’ and ‘Type 2 thinking’.
 
Type 1 thinking is what most of us would call ‘intuition’, because basically it’s taking a cognitive shortcut to arrive at an answer to a problem, which we all do all the time, especially when time is at a premium. Type 2 thinking is when we analyse the problem, which is not only time consuming but takes up brain resources that we’d prefer not to use, because we’re basically lazy, and I’m no exception. These 2 cognitive behaviours are clinically established, so it’s not pop-science.
 
However, something that was not discussed in the course, is that type 2 thinking can become type 1 thinking when we develop expertise in something, like learning a musical instrument, or writing a story, or designing a building. In other words, we develop heuristics based on our experience, which is why we sometimes jump to convergent thinking without going through the divergent part.
 
The course also dealt with ‘critical thinking’, as per its title, but I won’t dwell on that, because critical thinking arises from being analytical, and separating true expertise from bogus expertise, which is really a separate topic.
 
How does one teach these skills? I’m not a teacher, so I’m probably not best qualified to say. But I have a lot of experience in a profession that requires analytical thinking and problem-solving as part of its job description. The one thing I’ve learned from my professional life is the more I’m restrained by ‘rules’, the worse job I’ll do. I require the freedom and trust to do things my own way, and I can’t really explain that, but it’s also what I provide to others. And maybe that’s what people mean by ‘creative thinking’; we break the rules.
 
Artistic endeavour is something different again, because it requires spontaneity. But there is ‘divergent thinking’ involved, as Puccio pointed out, giving the example of Hemingway writing countless endings to A Farewell to Arms, before settling on the final version. I’m reminded of the reported difference between Beethoven and Mozart, two of the greatest composers in the history of Western classical music. Beethoven would try many different versions of something (in his head and on paper) before choosing what he considered the best. He was extraordinarily prolific, yet he wrote only 9 symphonies and 5 piano concertos plus one violin concerto, because he workshopped them to death. Mozart, on the other hand, apparently wrote down whatever came into his head and hardly revised it. One was very analytical in his approach and the other was almost completely spontaneous.
 
I write stories and the one area where I’ve changed type 2 thinking into type 1 thinking is in creating characters – I hardly give it a thought. A character comes into my head almost fully formed, as if I just met them in the street. Over time I learn more about them and they sometimes surprise me, which is always a good thing. I once compared writing dialogue to playing jazz, because they both require spontaneity and extemporisation. Don Burrows once said you can’t teach someone to play jazz, and I’ve argued that you can’t teach someone to write dialogue.
 
Having said that, I once taught a creative writing class, and I gave the class exercises where they were forced to write dialogue, without telling them that that was the point of the exercise. In other words, I got them to teach themselves.
 
The hard part of storytelling for me is the plot, because it’s a never-ending exercise in problem-solving. How did I get back to here? Analytical thinking is very hard to avoid, at least for me.
 
As I mentioned earlier, I think there is a dialectic between analytical thinking and creativity, and the best examples are not artists but geniuses in physics. To look at just two: Einstein and Schrodinger, because they exemplify both. But what came first: the analysis or the creativity? Well, I’m not sure it matters, because they couldn’t have done one without the other. Einstein had an epiphany (one of many) where he realised that an object in free fall didn’t experience a force, which apparently contradicted Newton. Was that analysis or creativity or both? Anyway, he not only changed how we think about gravity, he changed the way we think about the entire cosmos.
 
Schrodinger borrowed an idea from de Broglie that particles could behave like waves and changed how we think about quantum mechanics. As Richard Feynman once said, ‘No one knows where Schrodinger’s equation comes from. It came out of Schrodinger’s head. You can’t derive it from anything we know.’
 

Sunday, 22 May 2022

We are metaphysical animals

 I’m reading a book called Metaphysical Animals (How Four Women Brought Philosophy Back To Life). The four women were Mary Midgley, Iris Murdoch, Philippa Foot and Elizabeth Anscombe. The first two I’m acquainted with and the last two, not. They were all at Oxford during the War (WW2) at a time when women were barely tolerated in academia and had to be ‘chaperoned’ to attend lectures. Also a time when some women students ended up marrying their tutors. 

The book is authored by Clare Mac Cumhaill and Rachael Wiseman, both philosophy lecturers who became friends with Mary Midgley in her final years (Mary died in 2018, aged 99). The book is part biographical of all 4 women and part discussion of the philosophical ideas they explored.

 

Bringing ‘philosophy back to life’ is an allusion to the response (backlash is too strong a word) to the empiricism, logical positivism and general rejection of metaphysics that had taken hold of English philosophy, also known as analytical philosophy. Iris spent time in postwar Paris where she was heavily influenced by existentialism and Jean-Paul Sartre, in particular, whom she met and conversed with. 

 

If I were to categorise myself, I’m a combination of analytical philosopher and existentialist, which I suspect many would see as a contradiction. But this isn’t deliberate on my part – more a consequence of pursuing my interests, which are science on one hand (with a liberal dose of mathematical Platonism) and how to live a ‘good life’ (to paraphrase Aristotle) on the other.

 

Iris was intellectually seduced by Sartre’s exhortation: “Man is nothing else but that which he makes of himself”. But as her own love life fell apart along with all its inherent dreams and promises, she found putting Sartre’s implicit doctrine, of standing solitarily and independently of one’s milieu, difficult to do in practice. I’m not sure if Iris was already a budding novelist at this stage of her life, but anyone who writes fiction knows that this is what it’s all about: the protagonist sailing their lone ship on a sea full of icebergs and other vessels, all of which are outside their control. Life, like the best fiction, is an interaction between the individual and everyone else they meet. Your moral compass, in particular, is often tested. Existentialism can be seen as an attempt to rise above this, but most of us don’t.

 

Not surprisingly, Wittgenstein looms large in many of the pages, and at least one of the women, Elizabeth Anscombe, had significant interaction with him. With Wittgenstein comes an emphasis on language, which has arguably determined the path of philosophy since. I’m not a scholar of Wittgenstein by any stretch of the imagination, but one thing he taught, or that people took from him, was that the meaning we give to words is a consequence of how they are used in ordinary discourse. Language requires a widespread consensus to actually work. It’s something we rarely think about but we all take for granted, otherwise there would be no social discourse or interaction at all. There is an assumption that when I write these words, they have the same meaning for you as they do for me, otherwise I am wasting my time.

 

But there is a way in which language is truly powerful, and I have done this myself. I can write a passage that creates a scene inside your mind complete with characters who interact and can cause you to laugh or cry, or pretty much any other emotion, as if you were present; as if you were in a dream.

 

There are a couple of specific examples in the book which illustrate Wittgenstein’s influence on Elizabeth and how she used them in debate. They are both topics I have discussed myself without knowing of these previous discourses.

 

In 1947, so just after the war, Elizabeth presented a paper to the Cambridge Moral Sciences Club, which she began with the following disclosure:

 

Everywhere in this paper I have imitated Dr Wittgenstein’s ideas and methods of discussion. The best that I have written is a weak copy of some features of the original, and its value depends only on my capacity to understand and use Dr Wittgenstein’s work.

 

The subject of her talk was whether one can truly talk about the past, which goes back to the pre-Socratic philosopher, Parmenides. In her own words, paraphrasing Parmenides, ‘To speak of something past’ would then be to ‘point our thought’ at ‘something there’, but out of reach. Bringing Wittgenstein into the discussion, she claimed that Parmenides’ specific paradox about the past arose ‘from the way that thought and language connect to the world’.

 

We apply language to objects by naming them, but, in the case of the past, the objects no longer exist. She attempts to resolve this epistemological dilemma by discussing the nature of time as we experience it, which is like a series of pictures that move on a timeline while we stay in the present. This is analogous to my analysis that everything we observe becomes the past as soon as it happens, which is exemplified every time someone takes a photo, but we remain in the present – the time for us is always ‘now’.

 

She explains that the past is a collective recollection, recorded in documents and photos, so it’s dependent on a shared memory. I would say that this is what separates our recollection of a real event from a dream, which is solipsistic and not shared with anyone else. But it doesn’t explain why the past appears fixed and the future unknown, which she also attempted to address. But I don’t think this can be addressed without discussing physics.

 

Most physicists will tell you that the asymmetry between the past and future can only be explained by the second law of thermodynamics, but I disagree. I think it is described, if not explained, by quantum mechanics (QM), where the future is probabilistic with an infinitude of possible paths, while the classical past has a probability of ONE because it has already happened and been ‘observed’. In QM, the wave function that gives the probabilities and superpositional states is NEVER observed. The alternative is that all the futures are realised in alternative universes. Of course, Elizabeth Anscombe would know nothing of these conjectures.

 

But I would make the point that language alone does not resolve this. Language can only describe these paradoxes and dilemmas but not explain them.

 

Of course, there is a psychological perspective to this, which many people claim, including physicists, gives the only sense of time passing. According to them, it’s fixed: past, present and future; and our minds create this distinction. I think our minds create the distinction because only consciousness creates a reference point for the present. Everything non-sentient is in a causal relationship that doesn’t sense time. Photons of light, for example, exist in zero time, yet they determine causality. Only light separates everything in time as well as space. I’ve gone off-topic.

 

Elizabeth touched on the psychological aspect, possibly unintentionally (I’ve never read her paper, so I could be wrong): that our memories of the past are actually imagined. We use the same part of the brain to imagine the past as we do to imagine the future, but again, Elizabeth wouldn’t have known this. Nevertheless, she understood that our (only) knowledge of the past is a thought that we turn into language in order to describe it.

 

The other point I wish to discuss is a famous debate she had with C.S. Lewis. This is quite something, because back then, C.S. Lewis was a formidable intellectual figure. Elizabeth’s challenge was all the more remarkable because Lewis’s argument appeared on the surface to be very sound. Lewis argued that the ‘naturalist’ position was self-refuting if it was dependent on ‘reason’, because reason by definition (not his terminology) is based on the premise of cause and effect and human reason has no cause. That’s a simplification, nevertheless it’s the gist of it. Elizabeth’s retort:

 

What I shall discuss is this argument’s central claim that a belief in the validity of reason is inconsistent with the idea that human thought can be fully explained as the product of non-rational causes.

 

In effect, she argued that reason is what humans do perfectly naturally, even if the underlying ‘cause’ is unknown. Not knowing the cause does not make the reasoning irrational nor unnatural. Elizabeth specifically cited the language that Lewis used. She accused him of confusing the concepts of “reason”, “cause” and “explanation”.

 

My argument would be subtly different. For a start, I would contend that by ‘reason’, he meant ‘logic’, because drawing conclusions based on cause and effect is logic, even if the causal relations (under consideration) are assumed or implied rather than observed. And here I contend that logic is not a ‘thing’; it’s not an entity; it’s an action, something we do. In the modern age, machines perform logic; sometimes better than we do.

 

Secondly, I would ask Lewis, does he think reason only happens in humans and not other animals? I would contend that animals also use logic, though without language. I imagine they’d visualise their logic rather than express it in vocal calls. The difference with humans is that we can perform logic at a whole different level, but the underpinnings in our brains are surely the same. Elizabeth was right: not knowing its physical origins does not make it irrational; they are separate issues.

 

Elizabeth had a strong connection to Wittgenstein right up to his death. She worked with him on a translation and edit of Philosophical Investigations, and he bequeathed her a third of his estate and a third of his copyright.

 

It’s apparent from Iris’s diaries and other sources that Elizabeth and Iris fell in love at one point in their friendship, which caused them both a lot of angst and guilt because of their Catholicism. Despite marrying, Iris later had an affair with Pip (Philippa).

 

Despite my discussion of just 2 of Elizabeth’s arguments, I don’t have the level of erudition necessary to address most of the topics that these 4 philosophers published in. Just reading the 4-page Afterword, it’s clear that I haven’t even brushed the surface of what they achieved. Nevertheless, I have a philosophical perspective that I think finds some resonance with their mutual ideas.

 

I’ve consistently contended that the starting point for my philosophy is that for each of us individually, there is an inner and outer world. It even dictates the way I approach fiction. 

 

In the latest issue of Philosophy Now (Issue 149, April/May 2022), Richard Oxenberg, who teaches philosophy at Endicott College in Beverly, Massachusetts, wrote an article titled What Is Truth?, wherein he describes an interaction between 2 people, but only from a purely biological and mechanical perspective, and asks, ‘What is missing?’ Well, even though he doesn’t spell it out, what is missing is the emotional aspect. Our inner world is dominated by emotional content and one suspects that this is not unique to humans. I’m pretty sure that other creatures feel emotions like fear, affection and attachment. What’s more, I contend that this is what separates, not just us, but the majority of the animal kingdom, from artificial intelligence.

 

But humans are unique, even among other creatures, in our ability to create an inner world every bit as rich as the one we inhabit. And this creates a dichotomy that is reflected in our division of arts and science. There is a passage on page 230 (where the authors discuss R.G. Collingwood’s influence on Mary) that provides an unexpected definition.

 

Poetry, art, religion, history, literature and comedy are all metaphysical tools. They are how metaphysical animals explore, discover and describe what is real (and beautiful and good). (My emphasis.)

 

I thought this summed up what they mean with their coinage, metaphysical animals, which titles the book, and arguably describes humanity’s most unique quality. Descriptions of metaphysics vary and elude precise definition but the word, ‘transcendent’, comes to mind. By which I mean it’s knowledge or experience that transcends the physical world and is most evident in art, music and storytelling, but also includes mathematics in my Platonic worldview.


 

Footnote: I should point out that certain chapters in the book give considerable emphasis to moral philosophy, which I haven’t even touched on, so another reader might well discuss other perspectives.