Philosophy, at its best, challenges our long held views, such that we examine them more deeply than we might otherwise consider.
Paul P. Mealing
Friday, 3 May 2019
What is the third way?
The ‘third way’ referenced in the question is basically a reference to an alternative societal paradigm to capitalism and communism. I expect that most, if not all, responses will be variations on a 'middle way'. But if there is a completely out-of-the-box answer, I’ll be curious to read it. So, maybe the way the question is addressed will be just as important as, if not more important than, the proposed resolution.
I think this is the most difficult question Philosophy Now has thrown at us in the decade or two I’ve been reading it. I think there definitely will be a third way by the end of this century, but I’m not entirely sure what it will be. Is that a copout? No, I’m going to attempt to forecast the future by looking at the past.
Going back before the industrial revolution, no one would have predicted that feudalism wouldn’t continue forever. But the industrial revolution unintentionally spawned two social experiments, communism and capitalism, that spanned the 20th Century. I think one can fairly say that capitalism ultimately prevailed, because all communist-inspired revolutions became State-run oligarchies that led to the worst excesses of totalitarianism.
What’s more, we saw more societal and technological change in the 20th Century than in all previous history. There is no reason to believe that the 21st Century won’t be even more transformative. We are currently going through a technological revolution in every way analogous to the industrial revolution of the 19th Century, and it will be just as socially disruptive and economically challenging.
Capitalism has become so successful globally, especially in the high-tech industries, that corporations are starting to eclipse governments in their influence and power, and, to some extent, now embody the feudal system we thought we’d left behind. I’m referring to third world countries providing exploited labour and resources for the affluent elite, which includes me.
There is an increasing need to stop the wasteful production of goods on the altar of economic growth. It’s not only damaging the environment; it also increases the gap between those who consume and those who produce. So a global economy would give the wealth to those who produce and not just those who are their puppet masters. This would require equitable wealth distribution on a global scale, not just nationally.
Future technologies will become more advanced to the point that there will be a symbiosis between humans and machines, and this will have a dramatic impact on economic drivers. A universal basic income, which is unthinkable now, will become a necessity because so many jobs will be executed by AI.
People and their ideas are only considered progressive in hindsight. But what was radical in the past often becomes the status quo in the present; and voila: no one can imagine it any other way.
Addendum: I changed the last sentence of the third-last paragraph before I sent it off.
Friday, 26 April 2019
What use is philosophy?
Leafing through the latest issue of Philosophy Now, I came across the Letters section and saw my name. I had written a letter that I had forgotten about. It was in response to an article (referenced below), in the previous issue, about whether philosophy had lost its relevance in the modern world. Did it still have a role in the 21st Century of economic paradigms and technological miracles?
There are many aspects to Daniel Kaufman’s discussion on The Decline & Rebirth of Philosophy (Issue 130, Feb/Mar 2019, pp. 34-7), but mine is the perspective of an ‘outsider’, inasmuch as I’m not an academic in any field and I’m not a professional philosopher.
I think the major problem with philosophy, as it’s practiced today as an academic activity, is that it doesn’t fit into the current economic paradigm which specifically or tacitly governs all value judgements of a profession or an activity. In other words, it has no perceived economic value to either corporations or governments.
On the other hand, everyone can see the benefits of science in the form of the technological marvels they use every day, along with all the infrastructure that they quite literally couldn’t live without. Yet I would argue that science and philosophy are joined at the hip. Plato’s Academy was based on Pythagoras’s quadrivium: arithmetic, geometry, astronomy and music. In Western culture, science, mathematics and philosophy have a common origin.
The same people who benefit from the ‘magic’ of modern technology are mostly unaware of the long road from the Enlightenment to the industrial revolution, the formulation of the laws of thermodynamics, followed closely by the laws of electromagnetism, followed by the laws of quantum mechanics, upon which every electronic device depends.
John Wheeler, best known for coining the term ‘black hole’ in cosmology, said:
We live on an island of knowledge surrounded by a sea of ignorance. As our island of knowledge grows, so does the shore of our ignorance.
I contend that the ‘island of knowledge’ is science and the ‘shore of ignorance’ is philosophy. Philosophy is at the frontier of knowledge and because the ‘sea of ignorance’ is infinite, there will always be a role for it. Philosophy is not divorced from science and mathematics; it’s just not obviously in the guise it once was.
The marriage between science and philosophy in the 21st Century is about how we are going to live on a planet with limited resources. We need a philosophy to guide us into a collaborative global society that realises we need Earth more than it needs us.
Thursday, 4 April 2019
Is time a psychological illusion or a parameter of the Universe?
I’ve recently read Paul Davies’ latest book, The Demon in the Machine (released in Feb) and I would highly recommend it.
We have reached a stage in politics and media generally where you are either for or against a person, an idea or an ideology. Anyone who studies philosophy in any depth realises that there are many points of view on a single topic. There are many voices that I admire, yet there is not one that I completely agree with on everything they announce or proclaim or theorise about.
Paul Davies’ new book is a case in point. This book is very intellectually stimulating, even provocative, which is what I expect and is what makes it worth reading. Within its 200-plus pages, there was one short, well-written and erudite passage where I found myself in serious disagreement. It was his discussion on time and its relation to our perceptions.
He starts with Einstein’s well-known quote: ‘The distinction between past, present and future is only a stubbornly persistent illusion.’ It’s important to put this into its proper context. Einstein wrote this in a letter of condolence to the family of his friend, Michele Besso, who had recently died. It was written, of course, not only to console them, but to reveal his own conclusions arising from his theories of relativity and their inherent effect on time.
A consequence of Einstein’s theory was that simultaneity was dependent on the observer, so it was possible that 2 observers could disagree on the sequence of events occurring (depending on their respective frames of reference). Note that this is only true if there is no causal relationship between these events.
Also, Einstein believed in what’s now called a ‘block universe’ whereby the future is as fixed as the past. Some physicists still argue this, in the same way that some (if not many) argue that we live in a computer simulation (Davies, it should be pointed out, definitely does not).
I’m getting off the track, because what Davies argues is that the so-called ‘arrow of time’ is an ‘illusion’, as is the ‘flow of time’. He goes so far as to contentiously claim that time can’t be measured. His argument is simple: if time were to ‘slow down’ or ‘speed up’, everything, from your heart rate to atomic clocks, would do so as well, so there is no way to perceive it or measure it. He argues that you can’t measure time against time: “It has to be ‘One second per second’ – a tautology!” However, as Davies well knows, Einstein’s theory of relativity tells us that you can measure the ‘rate of time’ of one clock against another, and this is done and allowed for in GPS calculations. See my post on the special theory of relativity where I describe this very phenomenon.
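To make the GPS point concrete, here is a back-of-the-envelope Python sketch. The numerical constants and the simple circular-orbit model are my own assumptions (standard published values), not anything from Davies’ book: a GPS satellite’s clock runs slow because of its orbital speed, but fast because it sits in weaker gravity, and the net drift is exactly what the system corrects for.

```python
import math

# Physical constants (SI units) -- standard assumed values
c = 2.998e8          # speed of light, m/s
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_earth = 6.371e6    # Earth's mean radius, m
r_gps = 2.6571e7     # GPS orbital radius (~20,200 km altitude), m
day = 86400          # seconds per day

# Orbital speed for a circular orbit: v = sqrt(GM/r)
v = math.sqrt(GM / r_gps)

# Special relativity: the moving clock runs slow by ~v^2/(2c^2)
sr_loss = (v**2 / (2 * c**2)) * day                  # ~7 microseconds/day

# Gravitational time dilation: the higher clock runs fast
# by (GM/R_earth - GM/r_gps)/c^2
gr_gain = (GM / R_earth - GM / r_gps) / c**2 * day   # ~46 microseconds/day

net = gr_gain - sr_loss                              # ~38 microseconds/day
print(f"SR loss: {sr_loss*1e6:.1f} us/day")
print(f"GR gain: {gr_gain*1e6:.1f} us/day")
print(f"Net:     {net*1e6:.1f} us/day (satellite clock runs fast)")
```

A net drift of roughly 38 microseconds a day sounds tiny, but left uncorrected it would accumulate into positional errors of kilometres per day, which is precisely the sense in which one clock’s ‘rate of time’ is measured against another’s.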
Davies argues that there is no ‘backwards or forwards in time’ and the arrow of time is a ‘misnomer’, a metaphor we use to describe a psychological phenomenon. According to him, it’s our persistent belief in a continuity of self that creates the illusion of ‘time passing’. But I think he has it back-to-front. (I’ll return to this later.)
So, if there is no direction of time and no flow of time, how do we describe it? Well, one way is to talk about whether phenomena are symmetrical or asymmetrical in time. In other words, if you were to reverse a sequence of events would you get back to where you started, or is that even possible? Davies argues that entropy or the second law of thermodynamics accounts for this perception. But here’s the thing: without time, motion would not exist and causation would not exist; both of which we witness all the time. And if time does not ‘pass’ or ‘flow’, then what does it do?
Mathematically, time is a dimension, which even has a smallest unit, called ‘Planck time’. Davies says it’s not measurable, but we do measure it, even to the extent that we derive an age of the Universe. John Barrow, in his The Book of Universes, even provides an estimate in ‘Planck units’. Mathematically, we provide 4 co-ordinates for any event in the Universe – 3 of space and 1 of time. And, obviously, they can all change, but time is unique in that it appears to change continuously.*
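For what it’s worth, that kind of estimate is easy to reproduce. This trivial Python calculation uses standard published values that I’m supplying myself (not Barrow’s own figures) to express the age of the Universe in Planck times:

```python
# Standard assumed values, not taken from Barrow's book
planck_time = 5.39e-44          # seconds: the smallest meaningful unit of time
age_years = 13.8e9              # current estimate of the Universe's age
seconds_per_year = 3.156e7      # ~365.25 days

age_seconds = age_years * seconds_per_year
age_in_planck_units = age_seconds / planck_time
print(f"Age of the Universe: ~{age_in_planck_units:.1e} Planck times")
```

The answer is of the order of 10^60 Planck units, which only makes sense if time is something you can actually count.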
And time is ‘fluid’ for want of a better word. Its ‘rate’ can change in gravity and relativistically because the speed of light is constant. The speed of light is the only thing that stops everything from happening at once, and for a photon, time is zero. A photon traverses the entire universe in zero time (from the photon’s perspective).
But for the rest of us, time is a constraint created by light. Everything you observe has already happened because it always takes a finite amount of time (from your perspective) for the photon to reach you and nothing can travel faster than light (because it travels in zero time). This is the paradox, but it’s the relationship between light and time that governs our understanding of the Universe. If something speeds up relative to something else (you), then the light it emits increases in frequency if it’s coming towards you and decreases if it’s moving away. Obviously, the very fact that you can measure its frequency means you can measure its velocity (relative to you), which is meaningless without the dimension of time.
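That last step, going from a frequency ratio to a velocity, can be sketched explicitly. The snippet below is a minimal Python illustration using the standard relativistic Doppler formula for motion along the line of sight (the formula is my addition; I haven’t stated it above): the ratio of observed to emitted frequency pins down the relative velocity, and neither quantity means anything without the dimension of time.

```python
import math

def beta_from_doppler(f_obs, f_src):
    """Recover v/c from the relativistic Doppler shift (line-of-sight motion).
    Positive result = source receding (redshift), negative = approaching."""
    r = f_obs / f_src
    return (1 - r**2) / (1 + r**2)

# A source receding at 10% of lightspeed emits at frequency f_src;
# the observed frequency is f_src * sqrt((1 - b)/(1 + b))
b = 0.1
f_src = 1.0
f_obs = f_src * math.sqrt((1 - b) / (1 + b))

recovered = beta_from_doppler(f_obs, f_src)
print(f"recovered v/c = {recovered:.3f}")
```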
So note that all observations (involving light) mean that everything you perceive is in the past – it’s impossible to see into the future. So the ‘arrow of time’, that Davies specifically calls a ‘misnomer’, is a pertinent description of this everyday perception – we can only observe time in one direction, which is the past.
Davies explains our perception of time as a neurological effect:
It is incontestable that we possess a very strong psychological impression that our awareness is being swept along on an unstoppable current of time, and it is perfectly legitimate to seek a scientific explanation for the feeling that time passes. The explanation of this familiar psychological flux is, in my view, to be found in neuroscience, not physics. (emphasis in the original.)
I’ve argued previously that perhaps it is only consciousness that exists in a constant present. It is certainly true that only consciousness can perceive time as a dynamic entity. Everything around us becomes instantly the past like we are standing in a river where we can’t see upstream. It is for this reason that the concepts of past, present and future are uniquely perceived by a conscious mind. Davies effectively argues that this is the sole representation of time: that ‘time passing’ only exists in our minds and not in reality. But if our minds exist in a constant present (relative to everything else) then time does pass us by; and past, present and future is not an illusion, but a consequence of consciousness interacting with reality.
There are causal events that occur around us all the time, but, like a photographic image, they become past events as soon as they happen. I believe there is a universal ‘now’, otherwise the idea of the age of the Universe makes no sense. But, possibly, only conscious entities ride this constant now, which is why everything else is dynamically going past us in a literal, not just a psychological, sense. This is where Davies and I disagree.
Meanwhile, the future exists in light beams yet to be seen. Quantum mechanically, a photon is a wave function (ψ) that’s in the future of whatever it interacts with. A photon is only observed in retrospect, along with its path, and that’s true for all quantum events, including the famous double slit experiment. As Freeman Dyson points out, QM gives us probabilities which are in the future. To paraphrase: ‘quantum mechanics describes the future and classical physics describes the past’. Most physicists (including Davies, I suspect) would disagree. The orthodox view is that classical physics is a special case of quantum mechanics and, in quantum cosmology, time mathematically disappears.
Footnote: I should point out that Paul Davies is someone I’ve greatly admired and respected for many years.
*Paradoxically, at the event horizon of a black hole, time stops and we enter the world of quantum gravity. The evidence for black holes is accretion disks, where the matter from a companion star forms a ring around the event horizon and emits high-energy radiation as a result, which can be observed. However, from everything I've read, we need new physics to understand what happens beyond the event horizon of a black hole.
Addendum: I've since resolved this paradox to my satisfaction: it's space that crosses the event horizon at c. Then I learned that Kip Thorne effectively provided the same explanation, demonstrated with graphics, in Scientific American in 1967. He cited David Finkelstein who demonstrated it mathematically in 1958.
Friday, 15 February 2019
3 rules for humans
Someone asked the question: what would the equivalent 3 laws for humans be, analogous to Asimov’s 3 laws for robotics?
The 3 laws of robotics (without looking them up) are about avoiding harm to humans within certain constraints, and then avoiding harm to other robots or itself. It’s hierarchical, with humans’ safety at the top, or the first law (from memory).
I submitted an answer, which I can no longer find; maybe someone took the post down. But it got me thinking, and I found that what I came up with was more like a manifesto than laws per se; so they're nothing like Asimov’s 3 laws for robotics.
In the end, my so-called laws aren't exactly what I submitted but they are succinct and logically consistent, with enough substance to elaborate upon.
1. Don’t try or pretend to be something you’re not
This is a direct attempt at what existentialists call ‘authenticity’, but it’s as plain as one can make it. I originally thought of something Socrates apparently said:
To live with honour in this world, actually be what you try to seem to be.
And my Rule No1 (preferable to law) is really another way of saying the same thing, only it’s more direct, and it has a cultural origin as well. Growing up, ‘having tickets on yourself’, or ‘being up yourself’, to use some local colloquialisms, was considered the greatest sin. So I grew up with a disdain for pretentiousness that became ingrained. But there is more to it than that. I don’t believe in false modesty either.
There is a particular profession where being someone you’re not is an essential skill. I’m talking about acting. Spying also comes to mind, but the secret there, I believe, is to become invisible, which is the opposite of what James Bond does. That’s why John le Carré’s George Smiley seems more like the real thing than 007 does. Going undercover, by the way, is extremely stressful and potentially detrimental to your health – just ask anyone who’s done it.
But actors routinely become someone they’re not. Many years ago, I used to watch a TV programme called Inside the Actors Studio, where well-known actors were interviewed, and I have to say that many of them struck me with their authenticity, which seems like a contradiction. But an Australian actress, Kerry Armstrong, once pointed out that acting requires leaving your ego behind. It struck me that actors know better than anyone else what the difference is between being yourself and being someone else.
I’m not an actor but I create characters in fiction, and I’ve always believed the process is mentally the same. Someone once said that ‘acting requires you to say something as if you’ve just thought of it, and not everyone can do that.’ So it’s spontaneity that matters. Someone else once said that acting requires you to always be in the moment. Writing fiction, I would contend, requires the same attributes. Writing, at least for me, requires you to inhabit the character, and that’s why the dialogue feels spontaneous, because it is. But paradoxically, it also requires authenticity. The secret is to leave yourself out of it.
The Chinese hold modesty in high regard. The I Ching has a lot to say about modesty, but basically we all like and admire people who are what they appear to be, as Socrates himself said.
We all wear masks, but I think those rare people who seem most comfortable without a mask are those we intrinsically admire the most.
2. Honesty starts with honesty to yourself
It’s not hard to see that this is directly related to Rule 1. The truth is that we can’t be honest to others if we are not honest to ourselves. It should be no surprise that sociopathic narcissists are also serial liars. Narcissists, from my experience, and from what I’ve read, create a ‘reality distortion field’ that is often at odds with everyone else except for their most loyal followers.
There is an argument that this should be Rule 1. They are obviously interdependent. But Rule 1 seems to be the logical starting point for me. Rule 2 is a consequence of Rule 1 rather than the other way round.
Hugh Mackay made the observation in his book, Right & Wrong: How to Decide for Yourself, that ‘The most damaging lies are the ones we tell ourselves’. From this, neurosis is born and many of the ills that beleaguer us. Self-honesty can be much harder than we think. Obviously, if we are deceiving ourselves, then, by definition, we are unaware of it. But the real objective of self-honesty is so we can have social intercourse with others and all that entails.
So you can see there is a hierarchy in my rules. It goes from how we perceive ourselves to how others perceive us, and logically to how we interact with them.
But before leaving Rule 2, I would like to mention a movie I saw a few years back called Ali’s Wedding, which was an Australian Muslim rom-com. Yes, it sounds like an oxymoron, but it was a really good film, partly because it was based on real events experienced by the filmmaker. The music by Nigel Westlake was so good, I bought the soundtrack. Its relevance to this discussion is that the movie opens with a quote from the Quran about lying. It effectively says that lies have a habit of snowballing; so you dig yourself deeper the further you go. It’s the premise upon which the entire film is based.
3. Assume all humans have the same rights as you
This is so fundamental, it could be Rule 1, but I would argue that you can’t put this into practice without Rules 1 and 2. It’s the opposite to narcissism, which is what Rules 1 and 2 are attempting to counter.
One can see that a direct consequence is Confucius’s dictum: ‘Don’t do to others what you wouldn’t want done to yourself’; better known in the West as the Golden Rule: ‘Do unto others as you would have others do unto you’; and attributed to Jesus of course.
It’s also the premise behind the United Nations Bill of Human Rights. All these rules are actually hard to live by, and I include myself in that broad statement.
A couple of years back, I wrote a post in response to the question, Is morality objective? where I effectively argued that Rule No3 is the only objective morality.
Friday, 8 February 2019
Some people might be offended by this
I was reminded of the cultural difference between America and Australia when it comes to religion; a difference I was very aware of when I lived and worked in America over a combined period of 9 months, in New Jersey, Texas and California.
It’s hard to imagine any mainstream magazine or newspaper having this discussion in Australia, or, if they did, it would be more academic. I was in the US post 9/11 – in fact, I landed in New York the night before. I remember reading an editorial in a newspaper where people were arguing about whether the victims of the attack would go to heaven or not. I thought: how ridiculous. In the end, someone quoted from the Bible, as if that resolved all arguments – even more ridiculous, from my perspective.
I remember reading in an altogether different context someone criticising a doctor for facilitating prayer meetings in a Jewish hospital because the people weren’t praying to Jesus, so their prayers would be ineffective. This was a cultural shock to me. No one discussed these issues or had these arguments in Australian media. At least, not in mainstream media, be it conservative or liberal.
Reading Cunningham’s article reminded me of all this because he talks about how real hell is for many people. To be fair, he also talks about how hell has been sidelined in secular societies. In Australia, people don’t discuss their religious views that much, so one can’t be sure what people really believe. But I was part of a generation that all but rejected institutionalised religion. I’ve met many people from succeeding generations who have no knowledge of biblical stories, whereas for me, it was simply part of one’s education.
One of the best ‘modern’ examples of hell or the underworld I found was in Neil Gaiman’s Sandman graphic novel series. It’s arguably the best graphic novel series written by anyone, though I’m sure aficionados of the medium may beg to differ. Gaiman borrowed freely from a range of mythologies, including Orpheus, the Bible (in particular the story of Cain and Abel) and even Shakespeare. His hero has to go to Hell and gets out by answering a riddle from its caretaker, the details of which I’ve long forgotten, but I remember thinking it to be one of those gems that writers of fiction (like me) envy.
Gaiman also co-wrote a book with Terry Pratchett called Good Omens: The Nice and Accurate Prophecies of Agnes Nutter (1990), which is a great deal of fun. The premise, as described in Wikipedia: ‘The book is a comedy about the birth of the son of Satan, the coming of the End Times.’ Both authors are English, which possibly allows them a sense of irreverence that many Americans would find hard to manage. I might be wrong, but it seems to me that Americans take their religion way more seriously than the rest of the English-speaking world, and this is reflected in their media.
And this brings me back to Cunningham’s article because it’s written in a cultural context that I simply don’t share. And I feel that’s the crux of this issue. Religion and all its mental constructs are cultural, and hell is nothing if not a mental construct.
My own father, whom I’ve written about before, witnessed hell first hand. He was in the Field Ambulance Corps in WW2, so he retrieved bodies in various states of beyond-repair from both sides of the conflict. He also spent 2.5 years as a POW in Germany. I bring this up because, when I was a teenager, he told me why he didn’t believe in the biblical hell. He said, in effect, he couldn’t believe in a ‘father’ who sent his children to everlasting torment. I immediately saw the sense in his argument and I rejected the biblical god from that day on. This is the same man, I should point out, who believed it was his duty that I should have a Christian education. I thank him for that, otherwise I’d know nothing about it. When I was young I believed everything I was taught, which perversely made it easier to reject when I started questioning things. I know many people who had the same experience. The more they believed, the stronger their rejection.
I recently watched an excellent 3 part series, available on YouTube, called Testing God, which is really a discussion about science and religion. It was made by the UK’s Channel 4 in 2001, and includes some celebrity names in science, like Roger Penrose, Paul Davies and Richard Dawkins, and theologians as well; in particular, theologians who had become, or been, scientists.
In the last episode they interviewed someone who suffered horrendously in the War – he was German, and a victim of the fire-storm bombing. Contrary to many who have had similar experiences he found God, whereas, before, he’d been an atheist. But his idea of God is of someone who is patiently waiting for us.
I’ve long argued that God is subjective not objective. If humans are the only connection between the Universe and God, then, without humans, there is no reason for God to exist. There is no doubt in my mind that God is a projection, otherwise there wouldn’t be so many variants. Xenophanes, who lived in the 5th century BC, famously said:
The Ethiops say that their gods are flat-nosed and black,
While the Thracians say that theirs have blue eyes and red hair.
Yet if cattle or horses or lions had hands and could draw,
And could sculpt like men, then the horses would draw their gods
Like horses, and cattle like cattle; and each they would shape
Bodies of gods in the likeness, each kind, of their own.
At the risk of offending people even further, the idea that the God one finds in oneself is the Creator of the Universe is a non sequitur. My point is that there are two concepts of God which are commonly conflated. God as a Creator and God as a mystic experience, and there is no reason to believe that they are one and the same. In fact, the God as experience is unique to the person who has it, whilst God as Creator is, by definition, outside of space and time. One does not logically follow from the other.
In another YouTube video altogether, I watched an interview with Freeman Dyson on science and religion. He argues that they are quite separate and there is only conflict when people try to adapt religion to science or science to religion. In fact, he is critical of Einstein because Dyson believes that Einstein made science a religion. Einstein was influenced by Spinoza and would have argued, I believe, that the laws of physics are God.
John Barrow, in one of his books (Pi in the Sky), half-seriously suggests that the traditional God could be replaced by mathematics.
This brings me to a joke, which I’ve told elsewhere, but is appropriate, given the context.
What is the difference between a physicist and a mathematician?
A physicist studies the laws that God chose for the Universe to obey.
A mathematician studies the laws that God has to obey.
Einstein, in a letter to a friend, once asked the rhetorical question: Do you think God had a choice in creating the laws of the Universe?
I expect that’s unanswerable, but I would argue that if God created mathematics he had no choice. It’s not difficult to see that God can’t make a prime number non-prime, nor can he change the value of pi. To put it more succinctly, God can’t exist without mathematics, but mathematics can exist without God.
In light of this, I expect Freeman Dyson would accuse me of the same philosophical faux pas as Einstein.
As for hell, it’s a cultural artefact, a mental construct devised to manipulate people on a political scale. An anachronism at best and a perverse psychological contrivance at worst.
Thursday, 24 January 2019
Understanding Einstein’s special theory of relativity
Imagine if a flight to the moon were no different to flying halfway round the world in a contemporary airliner. In my scenario, the ‘shuttle’ would use an anti-gravity drive that allows high accelerations without killing its occupants with inertial forces. In other words, it would accelerate at hyper-speeds without anyone feeling it. I even imagined this when I was in high school, believe it or not.
The craft would still not be able to break the speed of light but it would travel fast enough that relativistic effects would be observable, both by the occupants and anyone remaining on the Earth or at its destination, the Moon.
So what are those relativistic effects? There is a very simple equation for velocity, and this is the only equation I will use to supplement my description:

v = s/t

Where v is the velocity, s is the distance travelled and t is the time or duration it takes. You can’t get much simpler than that. Note that s and t have an inverse relationship: if s gets larger, v increases, but if t gets larger, v decreases.
But it also means that for v to remain constant, if s gets smaller then so must t.
For the occupants of the shuttle, getting to the moon in such a short time means that, for them, the distance has shrunk. It normally takes about 3 days to get to the Moon (using current technology), so let’s say we manage it in 10 hrs instead. I haven’t done the calculations, because it depends on what speeds are attained and I’m trying to provide a qualitative, even intuitive, explanation rather than a technical one. The point is that if the occupants measured the distance using some sort of range finder, they’d find it was measurably less than if they did it using a range finder on Earth or on the Moon. It also means that whatever clocks they were carrying (including their own heartbeats) would show that the duration was less, completely consistent with the equation above.
For the people on the Moon awaiting their arrival, or those on Earth left behind, the duration would be consistent with the distance they would measure independently of the craft, which means the distance would be whatever it was all of the time (allowing for small variances created by any elliptic eccentricity in its orbit). That means they would expect the occupants’ clocks to be the same as theirs. So when they see the discrepancy in the clocks, it can only mean that time elapsed more slowly for the shuttle occupants compared to the moon’s inhabitants.
Now, many of you reading this will see a conundrum, if not a flaw, in my description. Einstein’s special theory of relativity implies that, for the occupants of the shuttle, the clocks of the Moon and Earth occupants should also have slowed down, but when they disembark, they notice that they haven’t. That’s because there is an asymmetry inherent in this scenario. The shuttle occupants had to accelerate and decelerate to make the journey, whereas the so-called stationary observers didn’t. This is the same for the famous twin paradox.
Note that from the shuttle occupants’ perspective, the distance is shorter than the moon and Earth inhabitants’ measurements; therefore so is the time. But from the perspective of the moon and Earth inhabitants, the distance is unchanged but the time duration has shortened for the shuttle occupants compared to their own timekeeping. And that is special relativity theory in a nutshell.
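For the curious, the nutshell can be put into numbers. This Python sketch uses the 10-hour trip time from above; the constant-speed model (ignoring the acceleration and deceleration phases) and the exact figures are my own simplifications:

```python
import math

c = 2.998e8            # speed of light, m/s
d = 3.844e8            # average Earth-Moon distance, m
t = 10 * 3600          # 10-hour trip, in seconds (constant speed assumed)

v = d / t              # required average speed, ~10.7 km/s
beta = v / c
gamma = 1 / math.sqrt(1 - beta**2)   # Lorentz factor

proper_time = t / gamma              # time elapsed on board the shuttle
dilation = t - proper_time           # how much less time the occupants age
contracted = d / gamma               # the distance as measured on board

print(f"v = {v/1000:.1f} km/s, v/c = {beta:.2e}")
print(f"Occupants' clocks lag by ~{dilation*1e6:.0f} microseconds")
print(f"The distance shrinks by ~{d - contracted:.2f} metres")
```

At these speeds the effects are real but minuscule: tens of microseconds and a fraction of a metre, which is consistent with the rough calculation in Addendum 2 below.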
Footnote: If you watch videos explaining the twin paradox, they emphasise that it’s not the acceleration that makes the difference (because it’s not part of the Lorentz transformation). But the acceleration and deceleration are what create the asymmetry whereby one ‘moved’ with respect to another that was ‘stationary’. In the scenario above, the entire solar system doesn’t accelerate and decelerate with respect to the shuttle, which would be absurd. This is my exposition on the twin paradox.
Addendum 1: Here is an attempted explanation of Einstein’s general theory of relativity, which is slightly more esoteric.
Addendum 2: I’ve done a rough calculation and the differences would be negligible, but if I changed the destination to Mars, the difference in distances would be in the order of 70,000 kilometres, but the time difference would be only in the order of 10 seconds. You could, of course, make the journey closer to lightspeed so the effects are more obvious.
Addendum 3: I’ve read the chapter on the twin paradox in Jim Al-Khalili’s book, Paradox: The Nine Greatest Enigmas in Physics. He points out that during the Apollo missions to the moon, the astronauts actually aged more (by nanoseconds), because the time gained by leaving Earth’s gravity was greater than any special relativistic effects experienced over the week-long return trip. Al-Khalili also explains that the twin who makes the journey endures less time because the distance is shorter for them (as I expounded above). But, contrary to the YouTube lectures (that I viewed), he claims that it’s the acceleration and deceleration creating general relativistic effects that creates the asymmetry.