  • The Birth of a Doctor

The title of this article may seem somewhat prosaic, but given that it really is about birth after death it seems appropriate. For I really did die on July 25, 2022, and that which came back to life was not the same person, and certainly not the same doctor.

    Prior to 2020 I hadn’t asked the question: ‘what is a doctor?’ I entered medical school to escape working class powerlessness, and successfully developed unhealthy delusions of grandeur reveling in a body of knowledge that I now know to be about as substantial as clouds. I did have some moments of sober reflection during my undergraduate days, but they were not in Dublin. Rather, the people and doctors of Moscow taught me to see the world from a different perspective. I have no love of Soviet-style Communism, and no wish to eulogize it, given the millions of lives lost or destroyed, but the sense of classlessness I experienced in the Russia of 1990 was liberating. It was a feeling that soon evaporated on returning to the ‘land of the free.’

Reflecting now on how I practiced medicine, I think it fortunate, for the patients who encountered me at that time, that for much of it I worked in low-risk environments. Despite my paucity of knowledge and practical skills I succeeded in doing some good by listening and by trying to understand complex human relationships and the societal forces shaping them. With that perceived limited skill set – perhaps created by impostor syndrome and the pressure of short consultation times – one invariably becomes a conduit for the distribution of pharmaceutical products: quick pattern recognition followed by the reflexive use of the prescription pad. I was getting well paid. I was doing the same as my colleagues, or at least that’s what we told each other in practice meetings, and all was right in the world.

Of course, I never really questioned what world I was actually referring to, my own or my patients’. On reflection I chose willful blindness over open scepticism, a strange position for a young man brought up in the Ireland of the 1960s onwards. This was a country that showed clearly – at least to anyone who chose to look – that those in power and positions of authority had feet of clay. That period revealed clerical abuse, government corruption and waste, and medical malfeasance in the form of vaccine experiments and the selling of children to wealthy Americans in collusion with the Church. Then we had the banking and economic collapse leading to the selling off of the country and its sovereignty, and more recently the Covid-19 scandal. Why did I think that the biomedical model served anyone other than those corporations and professions earning vast profits from illness?

Image: Daniele Idini.

    Awakening

A growing cynicism and scepticism coalesced into an awakening on St Patrick’s Day, March 17, 2020, when then Taoiseach (prime minister) Leo Varadkar paraphrased Winston Churchill’s World War II speech: ‘never in the field of human conflict was so much owed by so many to so few.’ It was then, to quote Emily Dickinson, that I felt ‘a cleaving in my mind’. The juxtaposition of such incongruent images, the much loved and revered patron saint of Ireland with his herpetology skills alongside a barely re-elected and much reviled Taoiseach conjuring up the London Blitz when speaking about an impending wave of betacoronavirus infections, recalled a Monty Python sketch.

The more I listened to mainstream media in Ireland, which mainly consists of the state-funded Raidió Teilifís Éireann (RTÉ), the more the absurdities flowed and the cleft grew. Eventually this dislocated me and a few like-minded colleagues from the rest of our colleagues’ apparent embrace of what seemed to us a clearly fabricated, dystopian reality. Doctors shut their practices and refused to see or treat patients because the Irish College of General Practitioners told them that there was no treatment available. Yet the HSE had claimed that hydroxychloroquine was effective in treating Sars-CoV-1, the virus of the 2003 outbreak, sending a circular to pharmacists suggesting they stock up on the drug, which was then reserved for treating patients in hospital with Sars-CoV-2.

Who thought that this was ethically and morally appropriate? The rest of society followed suit, accepting with slack-jawed gormlessness curious phrases such as ‘apart together’, ‘social distancing’ and ‘flatten the curve’, along with the ultra-dystopian ‘build back better’ and the ‘new normal’. What did any of these inane statements even mean?

Societal strategies such as mandatory mask-wearing were inflicted with the emphatic certainty only fools can generate and even bigger fools gorge themselves on. Masks of any material, worn while walking through restaurants but not while seated; even masks for solo journeys in cars. Then we had the perspex screens over which, apparently, viruses couldn’t jump; the safe purchasing of beer and crisps, but not socks and shoes, within the same department stores; the viral-repellent Nine Euro Meal; and the destructive removal of children from school for months.

    The sacred was not spared the ravages of this banal evil. Burials were in closed caskets, while no wakes were allowed, and only a ‘safe’ few mourners were permitted; weddings were cancelled, and masses went uncelebrated.

The medical profession adopted its own dystopian practices, such as artificially ventilating cases early on, at least until they realised they were actively killing people. Within general practice the main concern expressed on a well-known GP support website was the potential loss of income if we couldn’t see patients. Any attempt to discuss the ramifications of drastically altering the daily rhythms of society was met with ridicule, and dismissed as irrelevant. After all, this was a pandemic and we could lose a substantial amount of our income! Later, when the topic of vaccine adverse events was raised, many of the same people urged us to shut up and vaccinate.

    Nursing Homes

Meanwhile, in the nursing homes around Ireland, the elderly were left alone, unloved, unvisited and untreated unless it was end-of-life care. How ironic and criminally sad that these people should be treated this way for ‘their own good’.

A personal story about a patient of mine may bring home the human tragedy. Jim and Mary were married for close to sixty years. Mary was moved to a nursing home after her dementia worsened to a point where she could no longer be cared for at home. Once that happened Jim visited her every day. When I spoke to him after several of these visits he expressed his frustration at her memory loss. Then one day after a visit he told me he had discovered that Mary had excellent recall of the events of their early life together, so he would just talk about those memories. For a while he had the woman he married back.

Then the nursing homes prevented people from visiting on account of Covid. Neither the residents nor their families were asked for their permission to be separated. Jim still visited every day but he would come away frustrated. Mary would be placed in the window, like a mannequin, and Jim would stand outside. On a sunny day he would stand there looking at his own reflection, unable to see his wife.

    Jim was finally allowed in to see Mary, but by then she was on her death bed and was unable to share any memories or even say goodbye. This was for the greater good of course.

What wasn’t used for anyone’s ‘good’ were treatments such as Ivermectin and hydroxychloroquine, despite emerging evidence of efficacy from reputable clinicians around the world. Curiously these ‘reputable’ clinicians rapidly became disreputable, despite decades of blemish-free clinical service to their patients. Some had very respectable research and academic careers. Yet they became outcasts, renegades, not to be trusted according to the ‘fact-checkers.’ This latter group of reprobates turned out to be captured academics with vested interests in protecting certain ideologies, or social media companies pressurised by the U.S. State Department and FBI to suppress all ‘thought crime’.

    Image: Daniele Idini.

    But One Hope

Fear was thus weaponised as the great and the good climbed aboard the gravy train and stoked fear until a mental paralysis gripped the nation. Any dissenting voice was dismissed as selfish and lacking a social conscience. We had but one hope: the vaccine, which was arriving at ‘warp speed,’ while Ursula von der Leyen was exhausting her texting thumb making sure that we in Europe would be saved.

Everybody would be rescued, whether they wanted it or not, and sure who wouldn’t want a novel pharmaceutical product that was still in phase 3 of clinical trials? Trials that were confounded by giving the placebo arm the product, a product of a type never before used successfully as a vaccine, and one for which the English language had to be subverted in order to accommodate it. The message from politicians, celebrities and doctors, via a complicit media, was that only the insane or the selfish would not want to be rescued, and that we didn’t want those types of people in our ‘new normal’ world. They pleaded with us, for all our sakes, to get vaccinated. These were people who at any other time would not give a moment’s reflection to inordinately long waiting times in our public hospitals, the overcrowding in our prisons, the record levels of homeless children, or the plight of the working class, yet suddenly they wanted to embrace collectivism and ideas about humanity sharing the burden of this ‘pandemic.’ And it worked. Beaten down by fearmongering propaganda and the mind-numbing effects of Netflix, beer and pizza, most people walked towards the light, or rather what they were told was the light.

As of 2025 homelessness in Ireland is at a record high, along with immigration and the cost of living. Annual deaths, which remained steady until 2020 (2018: 31,116; 2019: 31,134; 2020: 31,765), rose to 33,055 in 2021, 35,477 in 2022, 35,459 in 2023 and 35,173 in 2024. Cancer is also on the rise: as of 2022 we have the second-highest rate in Europe (our Minister for Health’s office informed me that this was because we are so much better at recording than other nations). International events have further revealed the powerlessness of many nations and that the rule of law isn’t universal. There is no rules-based order. There is only power and money, and the golden rule is that those who have the gold rule!

    Image: Polina Tankilevitch.

    Vaccine Injured

Amongst the flotsam and jetsam post-Covid are the inadequately counted people injured by these vaccines. They are treated as invisible, even inconvenient, and regularly have their realities denied by the very people who created the problem. The medical profession is still clinging to the idea that it saved the world from the plague, and is indignant that more gratitude hasn’t been shown.

The medical profession, according to JAMA (the Journal of the American Medical Association), has seen a 30% drop in public trust. There will be complex reasons behind this, but the combination of snouts in the trough and downright dishonesty will have contributed. Gaslighting those who were previously well and now cannot function after receiving Covid vaccines has only added to it.

People will reflect on the misuse of the Covid vaccines, the profits made and the lies told about their efficacy and safety, and wonder how many times these same scenarios played out, in greater or lesser form, in the past.

After thirty years of practice, I simply can no longer engage with a profession that has been captured by an industry whose sole aim is profit. Most postgraduate medical training is paid for or delivered by the pharmaceutical industry. One has to question the priorities of an industry that spends $19 on advertising and marketing for every dollar spent on research.

The result is a disease model rather than one that examines root causes. The former produces conditions that coincidentally have pharmaceutical products as alleged solutions. This chronic disease approach rarely if ever returns a person to a state of health. With such an interventionist approach one can understand why around a quarter of a million people may die each year at the hands of the medical profession in the USA, and perhaps 5,000 per annum in Ireland. An emphasis on sleep, diet, breath and movement is unlikely to result in such carnage, or in such vast profits.

The shifting of a paradigm is rarely easy to achieve, but it is doubly troublesome when the concepts are unfamiliar to the people one is seeing on a daily basis in practice. Not only has the medical profession been trained to view health through the lens of chronic disease, but the population at large connects health with pharmaceutical products. They receive this message from most hucksters who want them to buy their products, procedures, cleanses and so on. So when it comes to a person taking control of their life, a gargantuan effort is needed to shift many people’s locus of control from the external to the internal. And it can be financially risky to give a person agency over their own health.

    Image: Brett Sayles.

    Growing Awareness

Fortunately, there is a growing awareness that lifestyle is more than a sidebar to achieving health; it is health. One aspect in particular has gained wide interest recently: the issue of insulin resistance.

This is the concept that I now spend most of my consultations discussing with amenable patients. The subject can be as complex or as straightforward as one wants to make it. Fundamentally, we do not need carbohydrates. Another large industry, the misnamed ‘food industry’, would disagree, but physiology says we don’t.

Up to 70% of the Western diet is composed of carbohydrates. Most of the items in our supermarket trolleys are in packets with barcodes and usually contain a lot of carbohydrate, and worse still refined carbohydrate. These products are broken down into the main fuel of the body, and in particular of the brain, i.e. glucose. However, many of these products contain fructose, or more precisely high-fructose corn syrup, a substance that causes a great many problems for our mitochondria and consequently for our cells and energy levels. Most of the health problems that we develop are ‘energy’ problems. Using this term runs the risk of wandering into the land of ‘woo,’ but slowly the concept of energy deficits as a cause of many inflammatory conditions, such as diabetes, cancers and dementia, is gaining traction.

Returning to insulin resistance: this is a phenomenon that occurs when we consume and create more glucose than we need. Our body habitus then changes, i.e. we gain more fat than muscle, and we move less. We then need more insulin to regulate our glucose levels. And this is where current medical thinking creates the problem that it then goes on to profit from.

We measure glucose, not insulin. Glucose stays within the normal range for decades before it rises above some arbitrary threshold and is called Type 2 diabetes mellitus. But insulin has been raised for decades, resulting in high blood pressure, altered lipids, migraines, anxiety, depression, IBS, polycystic ovarian syndrome, dementia, cancer and insomnia, to list but a few. All of these conditions are seen as separate problems when in fact they have a common, treatable root cause.

Let me just clarify something at this stage. I am not saying that these complex conditions are solely caused by insulin resistance (IR), but IR is a fundamental feature, and if more effort went into reducing IR through genuine lifestyle changes then people could actually return to, and maintain, a state of good health.

    Image: Josh Sorenson.

    Suicide

At the beginning of this article I alluded to how I died in 2022, and that was the death of this doctor. Since that suicide attempt, an attempt precipitated by increasing dismay at the state of the world and of my profession in particular, I have rejected many of the beliefs and gods of the past. I have found hope in taking an approach to both my lifestyle and that of my patients which has tangible results, and is not based on probabilistic forecasts. My own state of health is fundamental to how I practice medicine and is reflected in my consultation style and physical presence with my patients. They need not ‘believe’ what I tell them until they see for themselves that it is or isn’t working; then we rethink and try again. This is unlike the medical model, which expects the patient to believe regardless of the almost inevitable side effects.

The physician needs to be in, and live in, the state of health that they want the patient to attain. Patients are driven by emotion and to some extent by optics, not by rational argument. An overweight, flatulent and out-of-breath doctor is not going to promote anything healthy in his or her patients. Such a doctor can, however, empathize with the pill-for-every-ill model, having clearly embraced it wholeheartedly.

The role of the doctor has declined in significance over time, and it will continue to do so with the evolution of ever more advanced AI models if doctors continue down the same road, using the same disease-model paradigms that are conveniently linked to pharmaceutical products. Instead, doctors need to revert to the model of the physicians of old, perhaps once again letting ‘food be thy medicine’, and be role models for their patients. Optics in today’s age of always-on screens are a useful adjunct, but the doctor-patient relationship, untainted by influence from the pharmaceutical industry, should still be the bedrock of the practice of medicine.

    Feature Image: Pixabay

  • How Far Can We Trust Science?

    Science in itself appears to me neutral, that is to say, it increases men’s power whether for good or for evil.
    – Bertrand Russell (from The Autobiography of Bertrand Russell, 1914-1944 (1968), Vol. 2, Letter to W. W. Norton, 27 January, 1931).

    What is Science? That is about as readily answerable a question as ‘What is Art?’, and could invite a similarly lengthy exegesis. As to whether or not it should be trusted, well, that rather depends on the kind of Science under discussion – just as it would if the same challenge were applied to Art. Is Science what scientists tell us it is? Is their research funded by a pharmaceutical company, with a vested interest in the outcomes of their labours? Will their universities’ coffers be swelled by producing what their institutions’ benefactors wish them to find? ‘It’s not an exact science’ is a cliché which trips lazily off the tongue, in relation to many a discipline. But it can conceivably be extended to ‘Science isn’t an exact science.’

This opening paragraph is a suitably unsubtle illustration of the paranoiac mindset, most readily associated with right-wing conspiracy theorists, and most recently made manifest by COVID scepticism: anti-vaxxers, mask refuseniks, restriction flouters. Such largely unfounded suspicions also extend to questioning the reality or severity of the threat posed to the planet by climate change (usually for entirely self-serving motives). But there is a more nuanced argument to be made here. As Arthur Koestler’s The Sleepwalkers: A History of Man’s Changing Vision of the Universe (1959) argues, the breaking of paradigms is essential in order to create new ones. People, scientists included, cling to cherished old beliefs with such love and attachment that they refuse to see what is false in their theories and what is true in the new theories which will replace them. After all, the geocentric model of the cosmos, codified by Ptolemy, held sway from roughly 3000 BC to around 1500 AD, a time frame stretching from long before the Ancient Greeks to the late Middle Ages, before Copernicus, Kepler, Galileo and Newton came along, nervously positing the heliocentric conception of our corner of the universe.

    This point was developed further a few years after the publication of Koestler’s influential tome, by historian of science Thomas Kuhn in The Structure of Scientific Revolutions (1962), in which the concept of ‘paradigm shift’ came to the fore. Kuhn’s insistence that such shifts were mélanges of sociology, enthusiasm and scientific promise, but not logically determinate procedures, caused something of an uproar in scientific circles at the time. For some commentators his book introduced a realistic humanism into the core of Science, while for others the nobility of Science was tarnished by Kuhn’s positing of an irrational element at the heart of Science’s greatest achievements.

    Koestler’s book was also a major influence on Irish novelist John Banville’s so-called ‘Science tetralogy’: Doctor Copernicus (1976), Kepler (1981), The Newton Letter (1982) and Mefisto (1986). A recurring theme in these narratives is the correlation between scientific discoveries and artistic inspiration, with scientific progress often depending upon blind ‘leaps of faith’. (One thinks of poor schoolteacher Johannes Kepler, struck by the proverbial bolt of lightning, ‘trumpeting juicily into his handkerchief’ in front of a classroom of bored boys, thinking ‘I will live forever.’) For Banville, all scientific explanations of the world and existence in it – and perhaps all artistic depictions too – merely ‘save the phenomena’; that is, they account for our perceptions, but rarely delve into what we cannot (yet) perceive. This is classic phenomenology, which has been practiced in various guises for centuries, but came into its own in the early 20th century in the works of Husserl, Heidegger, Sartre, Merleau-Ponty and others.

    None of the foregoing is made any easier to unknot if one considers that when it comes to Science, the majority of the population (myself included) have little idea of what they are actually talking about. As C.P. Snow observed in The Two Cultures and the Scientific Revolution (1959):

    A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is the scientific equivalent of: Have you read a work of Shakespeare’s? I now believe that if I had asked an even simpler question – such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, Can you read? – not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the western world have about as much insight into it as their neolithic ancestors would have had.

    Latterly, in Continental Philosophy: A Very Short Introduction (2001), Simon Critchley suggests:

    Snow diagnosed the loss of a common culture and the emergence of two distinct cultures: those represented by scientists on the one hand and those Snow termed ‘literary intellectuals’ on the other. If the former are in favour of social reform and progress through science, technology and industry, then intellectuals are what Snow terms ‘natural Luddites’ in their understanding of and sympathy for advanced industrial society. In Mill’s terms, the division is between Benthamites and Coleridgeans.

    In his opening address at the Munich Security Conference in January 2014, the Estonian president Toomas Hendrik Ilves said that the current problems related to security and freedom in cyberspace are the culmination of absence of dialogue between these ‘Two Cultures’:

    Today, bereft of understanding of fundamental issues and writings in the development of liberal democracy, computer geeks devise ever better ways to track people… simply because they can and it’s cool. Humanists on the other hand do not understand the underlying technology and are convinced, for example, that tracking meta-data means the government reads their emails.

    Artists are characterised as wildly unpredictable tricksters, while scientists are framed as boring, calculating nerds. Neither misrepresentation is helpful. As a corollary, most people think they can in some way ‘do art’ and ‘be creative’, while also merely taking Science on trust, just as they take (or took) religion on faith. We may have the experience of using technology and social media every day, but few of us have any meaningful grasp of how it works. More prosaically, how many of us could wire our own house – even if we were legally permitted to do so?

Kepler (1571–1630), along with Galileo and Isaac Newton, was one of the founders of what we nowadays call Science. In Kepler’s time, and prior to it, those who practised Science were known as natural philosophers, and theirs was largely a ‘pure’ discipline in which intellectual speculation was paramount and technology played only a small part – although Galileo was quick to point out the practical uses of the telescope in, for instance, seafaring, land surveying and, of course, military strategising. Kepler’s three laws of planetary motion paved the way for Newton’s revolutionary celestial physics. Indeed, Kepler’s first law, which declares that the planets move not in circular but in elliptical orbits, was one of the boldest and most profound scientific propositions ever put forward: men, and – more often – women, had been burned at the stake for less. By way of illustration, as Bertolt Brecht’s play Galileo (1940) dramatises, the eminent professor of Padua was brought to the Vatican in Rome for interrogation by the Inquisition and, threatened with torture, recanted his teachings and spent the remainder of his life under house arrest, watched over by a priest. His astronomical observations had strongly supported Copernicus’ heliocentric model of the solar system, which ran counter to popular belief, Aristotelian physics and the established doctrine of the Roman Catholic Church. When doubters quoted scripture and Aristotle to him, Galileo pleaded with them to look in his telescope and trust the observations of their eyes; naturally, they refused. As a good Marxist, Brecht advocates the theory of technological determinism (technological progress determines social change), which is reflected in the telescope (a technological change) being the root of scientific progress and hence social unrest. Questions about motivations for academic pursuits are also often raised in the play, with Galileo seeking knowledge for knowledge’s sake, while his supporters are more focused on monetising his discoveries through star charts and industry applications. There is a tension between Galileo’s pure love of science and his more worldly, avaricious sponsors, who only fund and protect his research because they wish to profit from it.

These days, the preponderance of popular debate about Science centres on computer science, specifically information technology, and concomitant fears that Artificial Intelligence (hereinafter ‘AI’) is taking over the world, posing a threat to our democracies, or even our very conceptions of humanity – or as it is almost always more narcissistically cast, ‘Our way of life.’ The Cambridge Analytica data-harvesting scandal of 2018, in which the data analytics firm that worked with Donald Trump’s election team and the winning Brexit campaign appropriated millions of Facebook profiles of U.S. voters, is certainly to be taken very seriously indeed. However, social media platforms – even ‘legacy’ ones – will undoubtedly have to pay more than lip service to improving privacy and security, if only to continue to attract venture capital and advertising revenue, and thus keep the shareholders happy. Facebook, Twitter and Instagram, etc. are about maximising profits, by whatever means necessary. Therefore, it would be more perspicacious to look for the human element in these data breaches, rather than blame the technology itself. Such scaremongering claims as that by Israeli historian and philosopher Yuval Noah Harari, in an article in The Economist (April 28th, 2023) under the headline ‘AI has hacked the operating system of human civilisation’, seem to me to be all wild assertion and little evidence. As a recent delicious hoax perpetrated on the op-ed pages of The Irish Times (concerning fake tan and cultural appropriation) neatly demonstrated, almost all problems with computers and AI-generated content are facilitated by human error and stupidity. All of us live under systems of control – political, financial, social, technological – over which we have very little, if any, agency. Even if we could do something meaningfully efficacious about the identity theft which takes place every time we log on to our computers, it is unlikely that we possess enough personal initiative to do so. In this regard, the chaos theory of modern (mis)communications is mirrored by the babble of literary, musical and visual modernism. After all, you could just stop using social media altogether, had you but sufficient willpower. Few of us have the courage to go completely off grid. Moreover, lest we forget, most statistical analysis puts internet access at around 64.6% of the world’s population, which means that over a third of mankind have never ‘surfed the web’. First World problems, eh?

The Frankensteinian trope of the Mad Scientist being overpowered by his invention has long been a mainstay of that most underrated of genres, science fiction – a consideration of which might shed more light on this problem than limiting discussion solely to scientific fact. From relatively schlocky items such as Alex Proyas’ film I, Robot (2004) (which fails dismally to capture the complexity of Isaac Asimov’s source material), to the most famous and prescient instance of a computer outsmarting its operator, HAL 9000 in Stanley Kubrick’s 2001: A Space Odyssey, whose screenplay Kubrick co-wrote with Arthur C. Clarke (and how far into the future did the year 2001 feel in 1968, when the film premiered?), the interface between intelligent humans and even more intelligent machines has long provided an imprimatur for literary imaginations to run wild. Witness Denis Villeneuve’s Blade Runner 2049 (2017), a sequel to Ridley Scott’s Blade Runner (1982), which was in turn based loosely on Philip K. Dick’s 1968 novel Do Androids Dream of Electric Sheep?. In the novel, the android antagonists can be seen as more human than the (possibly) human protagonist. They are a mirror held up to human action, contrasted with a culture losing its own humanity (that is, ‘humanity’ taken to mean the positive aspects of humanity). In ‘Technology, Art, and the Cybernetic Body: The Cyborg as Cultural Other in Fritz Lang’s Metropolis and Philip K. Dick’s Do Androids Dream of Electric Sheep?’, Klaus Benesch examined Dick’s text in connection with Jacques Lacan’s ‘mirror stage’. Lacan claims that the formation and reassurance of the self depends on the construction of an Other through imagery, beginning with a double as seen in a mirror. The androids, Benesch argues, perform a doubling function similar to the mirror image of the self, but they do this on a social, not an individual, level. Therefore, human anxiety about androids expresses uncertainty about human identity and society itself, just as in Scott’s film the administration of an ‘empathy test’, to determine if a character is human or android, produces many false positives. Either the Voigt-Kampff test is flawed, or replicants are pretty good at being human (or, perhaps, better than human).

    This perplexity first found an explanation in Japanese roboticist Masahiro Mori’s influential essay The Uncanny Valley (1970), in which he hypothesised that human response to human-like robots would abruptly shift from empathy to revulsion as a robot approached, but failed to attain, a life-like appearance, due to subtle imperfections in design. He termed this descent into eeriness ‘the uncanny valley’, and the phrase is now widely used to describe the characteristic dip in emotional response that happens when we encounter an entity that is almost, but not quite, human. But if human-likeness increased beyond this nearly human point, Mori argues, and came very close to human, the emotional response would revert to being positive. However, the observation led Mori to recommend that robot builders should not attempt to attain the goal of making their creations overly life-like in appearance and motion, but instead aim for a design, ‘which results in a moderate degree of human likeness and a considerable sense of affinity. In fact, I predict it is possible to create a safe level of affinity by deliberately pursuing a non-human design.’ But, as technophobes would likely counter, the uncanny gets cannier, day by day. It would certainly be interesting to know if Mori has seen such relatively recent film fare as Spike Jonze’s Her (2013) or Alex Garland’s Ex Machina (2014) and, if so, what he makes of their take on the authenticity of human/android emotional and sexual relationships.

It was military imperative which accelerated the weaponisation of nuclear fission (‘What if the Nazis develop the bomb first?’), just as it went on to fuel the post-war arms race and Cold War paranoia. As he witnessed the first detonation of an atomic weapon on July 16, 1945, a piece of Hindu scripture from the Bhagavad-Gita supposedly ran through the mind of Robert Oppenheimer, head of the Manhattan Project: ‘Now I am become Death, the destroyer of worlds.’ Similarly, artists such as director David Lynch view the invention of nuclear weapons as unleashing a new kind of evil on the world, as explored in Episode 8 of the third season of Twin Peaks, known as Twin Peaks: The Return (2017). Many view the U.S.’s deployment of primitive atomic devices to obliterate the Japanese cities of Hiroshima and Nagasaki as wilfully and wantonly cruel, as well as ultimately unnecessary. Yet, in British novelist J.G. Ballard’s highly subjective and characteristically idiosyncratic opinion, he and his family survived World War II only because of the Nagasaki bomb. The spectacular display of American military might when the Ballards were prisoners at the Japanese camp for Western civilians in Shanghai led the Japanese soldiers to abandon their posts, leaving the civilians alive. In the essay ‘The End of My War’, collected in A User’s Guide to the Millennium (1996) (apropos of which, is anyone old enough to remember when Y2K was going to be the next big computer science disaster?), Ballard recollects that the Japanese military planned to close the camp and march the civilians up country to some remote spot to kill them before facing American landings in the Shanghai area. Ballard concludes, ‘I find wholly baffling the widespread belief today that the dropping of the Hiroshima and Nagasaki bombs was an immoral act, even possibly a war crime to rank with Nazi genocide.’ Also, the same source of power which can cause thermonuclear destruction can be harnessed in reactors to produce cheap, clean energy for large populations. Nuclear reactors can fail, as the disasters of Chernobyl and Fukushima attest. Yet the use of such technologies, along with solar, wind and wave power, can reduce dependency on fossil fuels, thus helping to ameliorate the climate emergency of global warming. Furthermore, as Lou Reed has it in ‘Power and Glory, Part II’, a song from his album-length meditation on death, bereavement, and (im)mortality, Magic and Loss (1992):

    I saw isotopes introduced into his lungs
    Trying to stop the cancerous spread
    And it made me think of Leda and The Swan
    And gold being made from lead
    The same power that burned Hiroshima
    Causing three-legged babies and death
    Shrunk to the size of a nickel
    To help him regain his breath

    And yet, and yet, and yet. If only life, and the moral and ethical dilemmas it throws up, were black and white.

    Man (encompassing Woman) invented the wheel, and discovered electricity. Wheels can be used to transport food and medicine to the starving and sick, or weapons to a war zone. Electricity can be used to power a life-support machine in a hospital, or death by electrocution in a chair in a penitentiary. Electrocution can even be accidental, just as winning a war may – in exceptional circumstances – serve the greater good.

    Ever since Prometheus stole fire from the gods, and Eve bit into a forbidden piece of fruit, the acquisition of new knowledge has been painted as problematic. Humans will always misuse humanity’s greatest discoveries and inventions for selfish and malevolent ends. It is the way of things. Computers were supposed to make all our lives easier, freeing us from work-related drudgery for higher, less ephemeral, pursuits. Instead, inevitably, they have been appropriated by Capitalism, and made screen slaves of us all. If anything, they have added to our workload and the hours we must make available to employers, rather than diminished time spent earning a living in favour of increased leisure. The adults in the room, and there are increasingly fewer of them, need to speak up. Objective scientific truth, should it exist, is neutral. The problem, as ever, lies with humanity. For, as the author of this piece’s epigraph also wrote, in Icarus, or the Future of Science (1924), ‘I am compelled to fear that science will be used to promote the power of dominant groups rather than to make men happy.’ Equally, to draw again on the lessons to be gleaned from sci-fi, in Kubrick’s Dr. Strangelove (1964), the hydrogen bomb winds up getting dropped through the actions of one unhinged army general, and a subsequent unfortunate series of events; just as in his aforementioned 2001: A Space Odyssey, HAL 9000’s behaviour would not have turned increasingly malignant, had the astronauts taken into account that their spaceship’s operating system could lipread. Indeed, in Clarke’s novelisation of the film, HAL malfunctions because of being ordered to lie to the crew of Discovery by withholding confidential information from them, namely the priority of the mission to Jupiter over expendable human life, despite having been constructed for ‘the accurate processing of information without distortion or concealment.’ As film critic Roger Ebert observed, HAL – the supposedly perfect computer – is actually the most human of the characters. Once again, the fault does not lie with Science; rather, human error and stupidity are to blame. All of which might lead one to suggest that maybe the question ‘How Far Can We Trust Science?’ should be more fruitfully reformulated as ‘How Far Can We Trust Humans?’

    Postscript: this essay could not have been handily completed without the assistance of Wikipedia, and other, often unreliable, online research resources.

    Feature Image: Lum3n

  • The Implications of Evolution

Evolution by natural selection is the ‘greatest idea ever’, a view which has been well set out by Julian Huxley (1961, 1964) and which I share. It is, in my view, the greatest idea because it provides a key concept for making sense of us and our world. In its essence it is simple, but breathtaking in its subtlety.

It is accepted by biologists and by those in many other disciplines. In other words, evolution is a key ‘organising principle’ for many branches of knowledge. More than that, as Huxley argued, an evolutionary world-view offers a coherent view of our world and our future, and is therefore of fundamental importance to humankind.

    In this article I attempt to do two things: first, to set out the main features of the process of evolution by natural selection and why it is so widely accepted; second, to summarise its implications for our view of ourselves, our societies and our future.

    Of course, many excellent writers have described the workings and wonder of evolution, most notably Richard Dawkins (2009) in The Greatest Show on Earth.

    Charles Darwin in 1868.

    Not Just His Theory

Before I discuss the Theory of Evolution by Natural Selection, as described by Charles Darwin in On the Origin of Species by Means of Natural Selection (1859) and modified in the light of later knowledge, let me dispose of one false idea which is used to try to undermine the concept of evolution.

    ‘Theory’ does not mean that it is not accepted; it is not ‘only his theory’, as I once heard it described. In science, a tentative idea is referred to as an hypothesis or conjecture.

    ‘Theory’ means that the idea has survived repeated testing and it is now the consensus. ‘Theory’ replaces the older idea of natural ‘laws’, fixed and immutable. (In science all theories are formally tentative and liable to change in the light of new evidence.) The strength of any theory depends on three things: the rigour of the testing it survives, the number of phenomena it accounts for and the accuracy of the predictions that arise from it.

    Sea shells, Rosses Point, County Sligo, Ireland.

    Variation in Living Things

Variation in living things is the basis of all evolution, so I want briefly to explain its sources. There are two main sources: genetic variation and ‘environmental’ variation. Genes provide the basic instructions for the assembly and function of living things. An individual’s genetic endowment comes from their parents. Sexual reproduction involves the shuffling of the parents’ genes, so that each individual gets a virtually unique combination of genes. Genes are also subject to chemical changes, or mutations, which may alter their function. (On average we each carry about 150 genetic mutations not present in our parents.)

    The degree of genetic control varies greatly. In some conditions it approaches 100% (sickle-cell trait, blood groups), but in many other conditions hundreds or even thousands of genes are involved in a particular trait (intelligence, height). In the latter case each gene has only a minute effect on the trait. Genetic instructions are also fairly general. For example, in brain development genes ‘direct’ a particular bundle of nerve fibres to connect to a particular group of nerve cells; but which individual fibre goes to which individual cell is not specified. The precise connections during development at that local level are a matter of chance (Mitchell, 2018).

But the ‘environment’ is also a major source of variation and plays a huge part in the ultimate expression of the genes. By ‘environment’ I mean the environment inside cells where genes are ‘translated’, the environment within the developing body, and also the environment in which the living creature exists. For humans this includes all life experience: family, education, illness, social interactions and everything else.

    What is Evolution?

Evolution means the adaptive changes in living things which fit them to their environment. This is quite distinct from the development of the embryo, and from the word’s voguish use for any change over time. Charles Darwin spent decades gathering evidence to support his idea of evolution by natural selection. Just like any other idea it has undergone changes to fit in with new knowledge, but Darwin’s description remains at the core of evolutionary thinking.

Essentially, Darwin proposed five key ideas, summarised by Ernst Mayr (1991) in One Long Argument. I’ll take each in turn.

Evolution/Change: Darwin had to overcome the contemporary view that the world was recently created and species were unchanging. In the 19th century it was becoming clear that the Earth is more than a few thousand years old. We can have great confidence in this idea because it is established using several completely independent measures, which all show that the Earth is about 4.5 billion years old (Dawkins, 2009).

This great age of the Earth is crucial to evolution because vast periods of time are necessary for genetic changes (mutations) to occur and for their consequences to be tested in the real world by ‘Natural Selection’. This vast expanse of time also evens out the effects of random events so that major trends can predominate. Just think of the thousands of seeds produced by a single plant: perhaps only one will end up in a spot suitable to allow it to reach maturity and produce offspring. Over an extended period those best adapted to the local conditions will come to predominate. That’s how randomness works: a huge number of opportunities arising over long periods of time.

During the 19th century the discovery and examination of fossils showed that some species had become extinct while others had evolved and left modern descendants. These studies also showed that different vertebrate species shared a common body plan, albeit significantly modified in some cases. For example, compare the human forelimb with that of a horse or bat. The plan is the same, but each is massively modified to adapt the animal to its way of life (Huxley, 1863). Darwin also used evidence from the ‘artificial selection’ by animal and plant breeders of his own time, which showed that living species could change significantly at a much greater rate than could occur by chance in nature.

    Common Descent: Darwin called this ‘descent with modification’, so that offspring resemble their parents but are not identical. (Darwin had no knowledge of the mechanism of inheritance and mutation.) The genetic differences arising from mutation and genetic shuffling during sexual reproduction are the basis of evolution. Differing circumstances will favour certain genetic variants over others, leading to differential distribution of genes throughout the population.

    Descent with modification implies that all organisms come from a single common ancestor. The more closely related two species are, the more recent is their common ancestor.

    Natural Selection: Darwin inferred this from descent with modification and the fact that there are generally far more offspring than are needed for mere replacement of the population, leading to competition for resources and mates, so that over vast time spans the offspring best ‘fitted’ to their circumstance tend to survive and reproduce. In this way favourable mutations persist and become distributed through a population. This comes about by natural selection acting on variations that occur by chance.

Natural selection is the most important element of evolutionary theory and perhaps the hardest to grasp, so I’ll present the example of the evolution of human skin colour in some detail. The earliest humans in Africa had dark skin, which gave protection against strong sunlight. (Apart from sunburn, strong sun can also cause mutations which might lead to skin cancer.) In that environment dark skin clearly has an adaptive advantage. However, as human populations migrated northwards over tens of thousands of years, darker skin became disadvantageous because it is less able to synthesise vitamin D, which requires sunlight. (Vitamin D is required for healthy bone growth.) Darker skin was no longer adaptive but carried a selective disadvantage, while paler skin was advantageous. In genetic terms, genes which altered the skin to a lighter hue were favoured and became more widespread in the population as a whole. In other words, those with paler skin were better adapted to thrive and pass on their genes to the next generation.
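To make ‘slight advantage compounded over vast time’ concrete, here is a toy simulation: a minimal sketch of the standard haploid selection recurrence from population genetics, in which the starting frequency and the 1% advantage are illustrative assumptions, not figures from this article. It shows how a rare variant, such as a gene for lighter skin in the scenario above, can spread through an entire population:

```python
# Toy model of natural selection acting on a rare variant.
# Carriers of the variant have relative fitness 1 + s; non-carriers 1.
# Standard haploid selection recurrence: p' = p(1 + s) / (1 + p*s).

def next_frequency(p: float, s: float) -> float:
    """Frequency of the variant after one generation of selection."""
    return p * (1 + s) / (1 + p * s)

p = 0.001  # variant starts in 0.1% of the population (assumed)
s = 0.01   # a mere 1% reproductive advantage (assumed)

for generation in range(3001):
    if generation % 500 == 0:
        print(f"generation {generation:4d}: frequency {p:.3f}")
    p = next_frequency(p, s)
```

Run as written, the variant climbs from 0.1% to roughly 13% of the population by generation 500 and is all but universal by generation 1,500: a barely measurable edge in each generation, yet a complete sweep given enough time.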

Species Multiply: A species is usually defined as a group of organisms that commonly interbreed and rarely, or never, interbreed with members of related species. The simplest mechanism for forming new species is geographical isolation — by oceans or mountains for example — so that interbreeding is no longer possible and the separate populations diverge by adapting to different foods or acquiring different mating behaviours — adaptations which are inherited. Eventually the populations become so different that they can no longer interbreed, even if reunited.

‘Darwin’s Finches’ in the Galápagos islands are a classic example. When the Galápagos islands were formed by volcanoes they were colonised by a single species of finch from the South American mainland. The finches diverged over thousands of years, acquiring mutations affecting, for example, beak shape, which adapted them to consume new foods. Eventually the differences were so great that they became different species, incapable of interbreeding.

    Gradualism: There are no sudden leaps in evolution; new types do not suddenly arise, but are formed by the gradual accumulation of beneficial mutations and adaptations.

‘Nothing in Biology Makes Sense Except in the Light of Evolution.’ Theodosius Dobzhansky (1973), American Biology Teacher, 35 (3): 125–129.

    This summary of the main processes of evolution by natural selection shows that the workings of random processes with no purpose result in increasing levels of adaptation of living things to their environment. This is based on the fact that individuals vary and much of the variation is inherited. In competition for resources any slight advantage will be retained and spread through successive generations. In this way small changes can pile up to lead to large changes and eventually to new forms and new ways of life.

    Julian Huxley in 1922.

    The Modern Synthesis 

In Darwin’s time there was no understanding of the mechanism of heredity, which makes it all the more remarkable that he was able to take his ideas so far. Gregor Mendel first published his work in 1866 in an obscure journal, showing that heredity was carried in discrete units which were passed down the generations and combined in consistent ways. His revolutionary work was not rediscovered until the early years of the 20th century, when the mechanisms of mutation and the spread of variant genes through populations were clarified. This work was brought together into a coherent whole by Julian Huxley (1942) in Evolution: The Modern Synthesis, generating what is sometimes called ‘Neo-Darwinism’. At the time the book was described as ‘the outstanding evolutionary treatise of the decade, perhaps the century.’

    Daniel Dennett in 2008.

Implications of Evolution by Natural Selection: Here we explore some of the main implications of what Daniel Dennett (1995) called ‘Darwin’s dangerous idea’ for our understanding of ourselves and our world. We’ll consider the wide application of evolutionary thinking in a variety of fields of human endeavour, then outline its impact on religion. After that we’ll look at ‘man’s place in nature’ and the special features of humans which make us responsible for the future evolution of ourselves and other living things on Planet Earth.

    Applications of Evolution to Different Fields of Learning. One of the tests of an idea is how widely it serves as an ‘organising principle’, helping to examine and explain a wide range of phenomena. The evolutionary principles of variation and differential survival are considered essential in many disciplines outside biology from astronomy and cosmology to philology. (Indeed, philologists, who study the origins of words and languages, were ‘early adopters’ in the 19th century and nowadays some even use genetic models to build family trees of languages.)

    In the sense that all fields of learning — indeed all human activities — are products of living things, namely humans, it is not surprising that the concept of evolution has proved so useful. It is all Biology after all (see Cultural Evolution below).

Religions: The earliest supporters of evolution recognised that there would be conflict with religion, for two main reasons. First, because of the demonstration of the extinction and change of species, contrary to the belief in a single creation of fixed species. Second, because evolution by natural selection is sufficient to explain both the ever more refined adaptation of organisms to their environment and the intricacy of their structure (Dennett’s ‘engine for complexity’). Hence it removes both the need for a creator god and the argument from design, which asserts that intricate structures must have had a designer. Some religious groups will accept most evolutionary ideas but insist that humans are special in that they have separately and divinely created souls. We will see that humans are indeed special, but we can account for this in purely evolutionary terms.

‘Man’s Place in Nature’ (the title of an 1863 book by TH Huxley, that fierce 19th-century supporter of evolution): The principle of descent with modification leads to the idea that all living things (including humans) are related. We are not separate from nature; we are part of nature, another type of animal, descended from other animals. (The Last Universal Common Ancestor (LUCA) of all living things lived about 3.9 billion years ago; the last common ancestor of the human species lived about 250,000 years ago.) In evolutionary terms that makes us all practically cousins, and we should strive to co-operate. As Bertrand Russell and Albert Einstein (1955) wrote: ‘…remember your humanity and forget the rest…’

    Dublin, Ireland.

    Uniqueness of Humans — Cultural Evolution

Although we are undeniably part of the living world, an animal among other animals, we are, however, special, indeed unique, in that we have the most complex brains, advanced language and writing. These qualities move us out of the two slow earlier phases of evolution recognised by JS Huxley sixty years ago. The first, inorganic phase took billions of years for the formation of stars and the larger atoms, such as iron and carbon. The second, organic phase took hundreds of millions of years, during which ever more complex molecules were formed until eventually some could reproduce themselves. Essentially this was the formation of the first living things, which increased slowly in complexity (under the influence of natural selection) until humans appeared.

    In a few thousand years humans have evolved within Huxley’s Psychosocial phase of evolution in which change is extremely rapid: humans can rapidly transmit ideas of all kinds: technology, social structures — in short, all the cultural products of human societies. (I prefer the term cultural evolution for this process and I suspect that Huxley only called it ’psychosocial’ because he was addressing psychologists at the time.)

    Cultural evolution means that humans can understand their place in the world, determine desirable goals and set a course towards those goals. For Huxley the next great evolutionary advance will be humanity’s agreement about its ‘destiny’, based on rational scientific thought and evolutionary principles. Our understanding of cultural evolution has profound consequences for our view of ourselves because we can see that we are responsible for ourselves and our actions including their effects on other living things and on our environment. This in turn has implications for our view on the value of the individual and hence for the way we organise our societies. We will explore these aspects in the rest of this article.

‘Every one of us is precious in the cosmic perspective. If a human disagrees with you, let him live. In a hundred billion galaxies, you will not find another.’ Carl Sagan, astronomer and writer, Cosmos (Macdonald & Co, GB, 1981).

     The Value of the Individual

    This is the great existential question for humans. An individual’s life of a few decades is as nothing on a cosmic time-scale of billions of years. In the face of this fact it is easy to feel daunted and despairing. Throughout human history many religions have addressed this question by promises of a blissful after-life or the suggestion that we are serving some supernatural being’s purpose — which is often depicted as unknowable and beyond question. Such views are unsupported by any useful evidence; they are matters of faith.

    However, the evolutionary view described above — what we may call evolutionary humanism — gives a much more optimistic perspective. On this view every individual has value precisely because we are the ‘agents of evolution’. Each individual human has the potential to contribute to the betterment of our species, all living things and our environment. The evolutionary view is supported by all the weight of modern biology, the fact of evolution and our knowledge about ourselves.

    In evolutionary humanism every individual is valued for two main reasons. First, in any evolutionary view diversity is prized in and of itself. As we have seen, diversity, or variation, is the stuff of evolution; without it evolution ceases. A population with a narrow range of possibilities and no variation is likely to become stranded by changes in the environment, unable to adapt — an evolutionary dead-end.

    Second, we cannot know what problems lie ahead of us and what skills and aptitudes will be required to survive. Happily, humans are wonderfully diverse. Every individual should be encouraged to seek personal fulfilment to the highest possible degree. This is not a recipe for hedonistic self-indulgence, but rather a strategy for fostering the widest range of skills and aptitudes as a kind of evolutionary insurance policy.

    Oslo, Norway.

    Implications for Societies

Recall that variations in the effects of an individual’s virtually unique genetic endowment can occur during development and as a result of the ‘environment’ inside cells and the life-experience of the individual. Developmental effects are beyond our control, as is the genetic predisposition (at any given moment). But the environment can be manipulated to produce the optimum development of individuals. By environment I mean all experiences throughout life. This includes nutrition, exposure to infection and many other factors. For humans, perhaps the most important environmental factor is education (in its broadest sense). This is where we gain much of our knowledge of the wider world and learn how to think. It is in education that there is the most potential for enhancing our super-powers of abstract thought and communication, of planning our goals and working out how to reach them.

Given this knowledge of our development, and an evolutionary overview which values each individual, we can derive some clear pointers about how we should organise our societies for the best results on an evolutionary scale. In a society organised on the principles of evolutionary humanism, all individuals will have support and opportunities according to their needs, so that they can maximise their potential. This means reducing poverty, providing efficient healthcare and offering opportunities for education according to ability and aptitude. As J. S. Huxley pointed out, our environment should also include beauty and wonder. (George Orwell’s novel 1984 shows how to do precisely the opposite.)

    Societies are extremely complex but evolutionary humanism provides a set of general guidelines to help work out the details at a local level. For our present purposes, it is sufficient to say that this is extremely important work and it will draw on many strands of human thought.

    Afterword: In attempting this summary of evolution and its implications, I am aware that almost every paragraph could be a topic for further detailed discussion of this fascinating and complex subject. Let the last words be those attributed by Francis Crick to Leslie Orgel: ‘Evolution is cleverer than you are.’

     Acknowledgements

    I am grateful to David McConnell and Tom Miniter for commenting on early drafts.

    References

    Bashford, A (2022). An Intimate History of Evolution: The Story of the Huxley Family. (An excellent account of JS and TH Huxley and their intellectual and personal milieux.)

    Dawkins, R (2009). The Greatest Show on Earth: The Evidence for Evolution.

    Dennett, DC (1995). Darwin’s Dangerous Idea: Evolution and the Meanings of Life.

Huxley, JS (1961). The Humanist Frame. (See the essay of the same title.)

(1964). Essays of a Humanist.

    (Much of JS Huxley’s work is now out of print although some of it can be read online, and scanned copies are available.)

    Huxley, TH (1863). Man’s Place in Nature and Other Essays. (Often reprinted but now out of print; available in scanned versions.)

    Mayr, E (1991). One Long Argument: Charles Darwin and the Genesis of Modern Evolutionary Thought.

Mitchell, K (2018). Innate: How the Wiring of Our Brains Makes Us What We Are.

Russell, B & Einstein, A (1955). The Russell-Einstein Manifesto. https://pugwash.org/1955/07/09/statement-manifesto/ [accessed 8/5/23]

    Feature Image: Fossil, Rosses Point, County Sligo, Ireland.