Tag: trust

  • In God We Trust Inc.

    Ryszard Kapuściński in Imperium (1993) warned of three plagues, or contagions threatening the world: nationalism, racism and fundamentalism. He further identified one shared trait or a common denominator in ‘an aggressive all powerful total irrationality,’ arguing that ‘[a]nyone stricken with one of these plagues is beyond reason. In his head burns a sacred pyre that awaits its sacrificial victims.’

    The lunatics have now well and truly taken over the asylum worldwide. We are now witnessing a new unholy war being led by evangelical Christians against Islam, just as earlier crusades emanated from Europe in the Middle Ages. And like those earlier wars, the acquisition of plunder is clearly a motivating factor.

    Notably, the clearly sociopathic Pete Hegseth talks of the Iran war as God’s War, and the soldiery are briefed accordingly. Trump uses similar language, but holy wars often occlude terrestrial agendas. Add the dimension of rampant technology, wherein war is conducted remotely in video game sequences, and one reaches a level of savagery reminiscent of the 1940s. Meanwhile AI plunders our libraries and distorts our reality with propagandist bombast.

    Hegseth’s macabre ceremonies in the White House have included Doug Wilson, the founder of the Communion of Reformed Evangelical Churches, who has stated that homosexuality should be a crime and that women shouldn’t be allowed to vote. As editor of The Princeton Tory, Hegseth himself suggested that homosexuality was immoral.

    In March 2026, soon after the start of the U.S./Israeli attack – branded with the biblical denotation Operation Epic Fury – it was reported that military leaders told their service members that the war was ‘part of God’s divine plan,’ and that President Donald Trump had been anointed by Jesus. One commander quoted the Book of Revelation and said the war would bring the second coming of Jesus Christ. The whole exercise has a distinct air of Stanley Kubrick’s Dr Strangelove (1964).

    The legendary punk band Dead Kennedys’ EP In God We Trust, Inc. (1981) curiously presages our times, but none of what is being done in God’s name is properly Kennedyesque, or indeed genuinely Christian. It appears to be an extension of what Eisenhower warned of: the existential threat of the Military-Industrial Complex. As IG Farben and Bleichröder knew, wars are a great source of revenue.

    The leading Catholic legal philosopher John Finnis is also a believer in God’s law. Marriage is for him exclusively between a man and a woman and purely for procreation. He considers homosexual congress and sex outside marriage as intrinsically shameful, immoral and harmful. In Natural Law and Natural Rights (1980) he compares abortion to carpet bombing civilians. Sadly, murdering the civilian population of Iran does not appear to bother the zealots in the White House to the same extent as interfering with women’s reproductive rights.

    Jonathan Sacks, the leading contemporary Jewish philosopher in the U.K. railed against extremism. In Morality (2020) he outlined positive religious values, including a focus on dignity, associative levels of responsibility, community and a sense of public service and the common good. Is all of this now lost on the Likud faction in Israel?

    Christian jihadism, historically, also includes the horrendous conquest of South America by Spanish Conquistadors. In modern times the Blairite justification, couched at one level in Christian terms, for the war on Iraq was also used to mask narrow self-interest in securing oil. The war in Iran, now engulfing the entire Middle East, also has significant acquisitive elements, but is more obviously an attack on what is perceived in racial terms as a satanic culture.

    Shortly before his death Sacks equated altruistic evil with the neoconservative group, who held themselves to be good and their opponents to be evil. This leads to the arrogant imperialist assumptions that ‘we’ are inflicting punishment for ‘their’ own good, and that killing multitudes will pave the way to democracy.

    Both the late Christopher Hitchens, and indeed Richard Dawkins, have written extensively about the new forms of religious extremes we are witnessing, with the finger of blame primarily pointed at Islam. Islamic extremism does provide graphic examples of brutal beheadings, mass executions, stoning to death for adultery, planes hitting the Twin Towers, as well as the murder of journalists. There is also evident in Britain a lack of integration, and a secessionism unconducive to any kind of harmonious multiculturalism. Recourse to genocide, however, seems to be the preserve of evangelical Christians and Zionists.

    Osama bin Laden (L) sits with his adviser and purported successor Ayman al-Zawahiri (Foto: HO/Scanpix 2011)

    Islamic Rage

    Much of the Islamic rage can be traced to neo-imperialism in the Middle East. The current phase began in earnest with the invasion of Iraq, and has culminated in this attack on Iran.

    Christopher Hitchens’ worst intellectual error, inexcusable in my view, was to support the Bush-Blair invasion of Iraq. In doing so he was, though he might not have seen it, indirectly supporting an even worse form of religious fundamentalism directed against another.

    In works such as Orientalism (1978) and Culture and Imperialism (1993), the Palestinian author Edward Said asserted that ‘Patriotism, chauvinism, ethnic, religious and racial hatreds can lead to mass destructiveness.’ He cites our own Conor Cruise O’Brien to the effect that imagined communities of identity are hijacked by the petty dictators of state nationalism, like Benjamin Netanyahu.

    In Marxist terms, religious fundamentalism can be traced to growing disparities of wealth and structural inequality, as well as a lack of opportunities to gain a rounded education. We have seen all too great an emphasis on technical or scientific education for economic advancement, as opposed to a broad liberal education that inculcates critical thinking.

    In these straitened times extremism speaks of a need to belong to a cause, leading to belief in something ethereal, no matter how ludicrous. Belief in an afterlife defines people’s existences and justifies even self-immolation.

    As the wheels come off the neoliberal economic system and societal bonds wither, extremist Christian nationalism and the demonisation of the other have stepped into the void to provide solace.

    Passion Conferences, a music and evangelism festival at Georgia Dome in Atlanta, Georgia, United States, in 2013.

    U.S. Evangelism

    In the United States, we are witnessing an unholy synergy between Evangelical Christians and racism. Far-right demagogues have articulated a view that ‘our’ country is being overrun by immigrants and that the dominant ethnic group must ‘take back control’ from a phantom intellectual Marxism espoused by liberal elites, Harvard or straight socialism. All of these apparently emanate from the decadence of a mixed-race cosmopolis. The fire is spreading to Europe, the U.K. and Ireland too.

    Thus, we find a global descent into the extremist and racist abyss, where those we disagree with are scapegoated and targeted. This is a product of a dualistic mode of thinking, which Sacks identifies with a need to define God in relation to the Satan residing in others. This leads to the demonisation of those we disagree with, evident also in social media vilification.

    What the Christian far-right in the United States and elsewhere offer is the establishment of the Kingdom of Heaven on earth, which involves isolation of the righteous few in gated communities, segregating the rich chosen people from the disaster they inflict on others.

    The now tarnished Noam Chomsky once claimed that the Republican Party is the ‘most dangerous organization in world history.’ Chomsky also claimed in a BBC Newsnight interview that nearly 40% of the American public believe that the Second Coming will occur by 2050. So, Pete Hegseth may be preaching to the converted.

    Brazilian President Lula with Pope Francis 21.06.2023 
    Foto: Ricardo Stuckert/PR

    Religion as Agent for Good?

    Alternatively, in The Godless Gospel (2020) Julian Baggini calls for forms of religion shorn of hatred so we may realise our best intentions and develop empathy and compassion. He envisages a commitment to personal humility and an obligation and commitment to the truth, causing as little harm as possible. There are clearly good values that Christianity may teach to those of a secular persuasion presently lacking in moral clarity.

    Above all, the atheist Jürgen Habermas, perhaps the leading intellect left on the planet, recognises how religion engenders social integration, and can be a basis for communicative action, his core concept. As far back as 1978 he argued, from a secular perspective, for the necessity of religious ideas to humanise society. These would be religious ideas through which we learn to communicate reasonably without resort to falsetto Jihadism.

    The late Pope Francis’s experiences in the barrios of Buenos Aires also appear to have shaped an empathy towards the wretched of the Earth. He preached tolerance and engagement, as well as social and economic justice. The present Pope has, encouragingly, in un-American fashion, condemned what is happening, however mutedly. Let us hope that he is untainted by the dark money of the Vatican and does not go the way of John Paul II.

    Christian socialism is a potentially vital force if it reflects the values of what Philip Pullman calls the good man Jesus, but not, as he equally presents, those of the scoundrel Christ. The latter is a distortion of New Testament values, dedicated to the accumulation of capital, a lack of compassion and political manipulation.

    Neo-feudalism

    We appear to be witnessing Old Testament fury, but beyond the zealotry it seems that neoliberalism is morphing into neo-feudalism. The Book of Genesis sanctions man’s dominion over the Earth, which appears to permit a scorched-earth approach, but this is a smokescreen. Institutional Evangelical Christianity is wedded to the exchange of goods, along with the exchange of gods. Drill, baby, drill.

    The last word I leave to Clarence Darrow, who represented a progressive America of another era in his closing speech in The Scopes Trial:

    Ignorance and fanaticism are ever busy and need feeding. Always it is feeding and gloating for more … it is the setting of man against man and creed against creed until with flying banners and beating drums we are marching backward to the glorious ages of the sixteenth century when bigots lighted fagots to burn the men who dared to bring any intelligence and enlightenment and culture to the human mind.

    Those who suffer from toxic nationalism, toxic religious mania and toxic racism are beyond reason and must be overcome.

    Feature Image: Some of Pete Hegseth’s tattoos, 2021

  • How Far Can We Trust Science?

    Science in itself appears to me neutral, that is to say, it increases men’s power whether for good or for evil.
    – Bertrand Russell (from The Autobiography of Bertrand Russell, 1914-1944 (1968), Vol. 2, Letter to W. W. Norton, 27 January, 1931).

    What is Science? That is about as readily answerable a question as ‘What is Art?’, and could invite a similarly lengthy exegesis. As to whether or not it should be trusted, well, that rather depends on the kind of Science under discussion – just as it would if the same challenge were applied to Art. Is Science what scientists tell us it is? Is their research funded by a pharmaceutical company, with a vested interest in the outcomes of their labours? Will their universities’ coffers be swelled by producing what their institutions’ benefactors wish them to find? ‘It’s not an exact science’ is a cliché which trips lazily off the tongue, in relation to many a discipline. But it can conceivably be extended to ‘Science isn’t an exact science.’

    This opening paragraph is a suitably unsubtle illustration of the paranoiac mindset, most readily associated with right-wing conspiracy theorists, and most recently made manifest by COVID scepticism: anti-vaxxers, mask refuseniks, restriction flouters. Such largely unfounded suspicions also extend to questioning the reality or severity of the threat posed to the planet by climate change (usually for entirely self-serving motives). But there is a more nuanced argument to be made here. As Arthur Koestler’s The Sleepwalkers: A History of Man’s Changing Vision of the Universe (1959) argues, the breaking of paradigms is essential in order to create new ones. People, scientists included, cling to cherished old beliefs with such love and attachment that they refuse to see what is false in their theories and what is true in the new theories which will replace them. After all, the geocentric model of the solar system lasted from roughly 3000 BC to around 1500 AD, a time frame spanning from antiquity to the late Middle Ages, before Copernicus, Kepler, Galileo and Newton came along, nervously positing the heliocentric conception of our corner of the universe.

    This point was developed further a few years after the publication of Koestler’s influential tome, by historian of science Thomas Kuhn in The Structure of Scientific Revolutions (1962), in which the concept of ‘paradigm shift’ came to the fore. Kuhn’s insistence that such shifts were mélanges of sociology, enthusiasm and scientific promise, but not logically determinate procedures, caused something of an uproar in scientific circles at the time. For some commentators his book introduced a realistic humanism into the core of Science, while for others the nobility of Science was tarnished by Kuhn’s positing of an irrational element at the heart of Science’s greatest achievements.

    Koestler’s book was also a major influence on Irish novelist John Banville’s so-called ‘Science tetralogy’: Doctor Copernicus (1976), Kepler (1981), The Newton Letter (1982) and Mefisto (1986). A recurring theme in these narratives is the correlation between scientific discoveries and artistic inspiration, with scientific progress often depending upon blind ‘leaps of faith’. (One thinks of poor schoolteacher Johannes Kepler, struck by the proverbial bolt of lightning, ‘trumpeting juicily into his handkerchief’ in front of a classroom of bored boys, thinking ‘I will live forever.’) For Banville, all scientific explanations of the world and existence in it – and perhaps all artistic depictions too – merely ‘save the phenomena’; that is, they account for our perceptions, but rarely delve into what we cannot (yet) perceive. This is classic phenomenology, which has been practised in various guises for centuries, but came into its own in the early 20th century in the works of Husserl, Heidegger, Sartre, Merleau-Ponty and others.

    None of the foregoing is made any easier to unknot if one considers that when it comes to Science, the majority of the population (myself included) have little idea of what they are actually talking about. As C.P. Snow observed in The Two Cultures and the Scientific Revolution (1959):

    A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is the scientific equivalent of: Have you read a work of Shakespeare’s? I now believe that if I had asked an even simpler question – such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, Can you read? – not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the western world have about as much insight into it as their neolithic ancestors would have had.

    Latterly, in Continental Philosophy: A Very Short Introduction (2001), Simon Critchley suggests:

    Snow diagnosed the loss of a common culture and the emergence of two distinct cultures: those represented by scientists on the one hand and those Snow termed ‘literary intellectuals’ on the other. If the former are in favour of social reform and progress through science, technology and industry, then intellectuals are what Snow terms ‘natural Luddites’ in their understanding of and sympathy for advanced industrial society. In Mill’s terms, the division is between Benthamites and Coleridgeans.

    In his opening address at the Munich Security Conference in January 2014, the Estonian president Toomas Hendrik Ilves said that the current problems related to security and freedom in cyberspace are the culmination of an absence of dialogue between these ‘Two Cultures’:

    Today, bereft of understanding of fundamental issues and writings in the development of liberal democracy, computer geeks devise ever better ways to track people… simply because they can and it’s cool. Humanists on the other hand do not understand the underlying technology and are convinced, for example, that tracking meta-data means the government reads their emails.

    Artists are characterised as wildly unpredictable tricksters, while scientists are framed as boring, calculating nerds. Neither misrepresentation is helpful. As a corollary, most people think they can in some way ‘do art’ and ‘be creative’, while also merely taking Science on trust, just as they take (or took) religion on faith. We may have the experience of using technology and social media every day, but few of us have any meaningful grasp of how it works. More prosaically, how many of us could wire our own house – even if we were legally permitted to do so?

    Kepler (1571–1630), along with Galileo and Isaac Newton, was one of the founders of what we nowadays call Science. In Kepler’s time, and prior to it, those who practised Science were known as natural philosophers, and theirs was largely a ‘pure’ discipline in which intellectual speculation was paramount and technology played only a small part – although Galileo was quick to point out the practical uses of the telescope in, for instance, seafaring, land surveying and, of course, military strategising. Kepler’s three laws of planetary motion paved the way for Newton’s revolutionary celestial physics. Indeed, Kepler’s first law, which declares that the planets move not in circular but in elliptical orbits, was one of the boldest and most profound scientific propositions ever put forward: men, and – more often –  women, had been burned at the stake for less. By way of illustration, as Bertolt Brecht’s play Galileo (1940) dramatises, the eminent professor of Padua was brought to the Vatican in Rome for interrogation by the Inquisition and, threatened with torture, recanted his teachings and spent the remainder of his life under house arrest, watched over by a priest. His astronomical observations had strongly supported Copernicus’ heliocentric model of the solar system, which ran counter to popular belief, Aristotelian physics and the established doctrine of the Roman Catholic Church. When doubters quoted scripture and Aristotle to him, Galileo pleaded with them to look in his telescope and trust the observations of their eyes; naturally, they refused. As a good Marxist, Brecht advocates the theory of technological determinism (technological progress determines social change), which is reflected in the telescope (a technological change) being the root of scientific progress and hence social unrest. 
Questions about motivations for academic pursuits are also often raised in the play, with Galileo seeking knowledge for knowledge’s sake, while his supporters are more focused on monetising his discoveries through star charts and industry applications. There is a tension between Galileo’s pure love of science and his more worldly, avaricious sponsors, who only fund and protect his research because they wish to profit from it.

    These days, the preponderance of popular debate about Science centres on computer science, specifically information technology, and concomitant fears that Artificial Intelligence (hereinafter referred to as ‘AI’) is taking over the world, posing a threat to our democracies, or even our very conceptions of humanity – or as it is almost always more narcissistically cast, ‘Our way of life.’ The Cambridge Analytica data-harvesting scandal of 2018, in which the data analytics firm that worked with Donald Trump’s election team and the winning Brexit campaign appropriated millions of Facebook profiles of U.S. voters, is certainly to be taken very seriously indeed. However, social media platforms – even ‘legacy’ ones – will undoubtedly have to pay more than lip service to improving privacy and security, if only to continue to attract venture capital and advertising revenue, and thus keep the shareholders happy. Facebook, Twitter and Instagram, etc. are about maximising profits, by whatever means necessary. Therefore, it would be more perspicacious to look for the human element in these data breaches, rather than blame the technology itself. Such scaremongering claims as that by Israeli historian and philosopher Yuval Noah Harari, in an article in The Economist (April 28th, 2023) under the headline ‘AI has hacked the operating system of human civilisation’, seem to me to be all wild assertion and little evidence. As a recent delicious hoax perpetrated on the op-ed pages of The Irish Times (concerning fake tan and cultural appropriation) neatly demonstrated, almost all problems with computers and AI-generated content are facilitated by human error and stupidity. All of us live under systems of control – political, financial, social, technological – over which we have very little, if any, agency.
    Even if we could do something meaningfully efficacious about the identity theft which takes place every time we log on to our computers, it is unlikely that we possess enough personal initiative to do so. In this regard, the chaos theory of modern (mis)communications is mirrored by the babble of literary, musical and visual modernism. After all, you could just stop using social media altogether, had you but sufficient willpower. Few of us have the courage to go completely off grid. Moreover, lest we forget, most statistical analysis puts internet access at around 64.6% of the world’s population, which means that over a third of mankind have never ‘surfed the web’. First World problems, eh?

    The Frankensteinian trope of the Mad Scientist being overpowered by his invention has long been a mainstay of that most underrated of genres, science fiction – a consideration of which might shed more light on this problem, rather than limiting discussion solely to scientific fact. From relatively schlocky items such as Alex Proyas’ film I, Robot (2004) (which fails dismally to capture the complexity of Isaac Asimov’s source material), to the most famous and prescient instance of a computer outsmarting its operator, exemplified by HAL 9000 in Stanley Kubrick’s 2001: A Space Odyssey, co-written with Arthur C. Clarke (and how far into the future did the year 2001 feel in 1968, when the film premiered?), the interface between intelligent humans and even more intelligent machines has long provided an imprimatur for literary imaginations to run wild. Witness Denis Villeneuve’s Blade Runner 2049 (2017), a sequel to Ridley Scott’s Blade Runner (1982), which was in turn based loosely on Philip K. Dick’s 1968 novel Do Androids Dream of Electric Sheep?. In the novel, the android antagonists can be seen as more human than the (possibly) human protagonist. They are a mirror held up to human action, contrasted with a culture losing its own humanity (that is, ‘humanity’ taken to mean the positive aspects of humanity). In ‘Technology, Art, and the Cybernetic Body: The Cyborg as Cultural Other in Fritz Lang’s Metropolis and Philip K. Dick’s Do Androids Dream of Electric Sheep?’, Klaus Benesch examined Dick’s text in connection with Jacques Lacan’s ‘mirror stage’. Lacan claims that the formation and reassurance of the self depends on the construction of an Other through imagery, beginning with a double as seen in a mirror. The androids, Benesch argues, perform a doubling function similar to the mirror image of the self, but they do this on a social, not an individual, level.
Therefore, human anxiety about androids expresses uncertainty about human identity and society itself, just as in the original film the administration of an ‘empathy test’, to determine if a character is human or android, produces many false positives. Either the Voigt-Kampff test is flawed, or replicants are pretty good at being human (or, perhaps, better than human).

    This perplexity first found an explanation in Japanese roboticist Masahiro Mori’s influential essay The Uncanny Valley (1970), in which he hypothesised that human response to human-like robots would abruptly shift from empathy to revulsion as a robot approached, but failed to attain, a life-like appearance, due to subtle imperfections in design. He termed this descent into eeriness ‘the uncanny valley’, and the phrase is now widely used to describe the characteristic dip in emotional response that happens when we encounter an entity that is almost, but not quite, human. But if human-likeness increased beyond this nearly human point, Mori argues, and came very close to human, the emotional response would revert to being positive. However, the observation led Mori to recommend that robot builders should not attempt to attain the goal of making their creations overly life-like in appearance and motion, but instead aim for a design, ‘which results in a moderate degree of human likeness and a considerable sense of affinity. In fact, I predict it is possible to create a safe level of affinity by deliberately pursuing a non-human design.’ But, as technophobes would likely counter, the uncanny gets cannier, day by day. It would certainly be interesting to know if Mori has seen such relatively recent film fare as Spike Jonze’s Her (2013) or Alex Garland’s Ex Machina (2014) and, if so, what he makes of their take on the authenticity of human/android emotional and sexual relationships.

    It was military imperative which accelerated the discovery of nuclear fission (‘What if the Nazis develop the bomb first?’), just as it went on to fuel the post-war arms race and Cold War paranoia. As he witnessed the first detonation of an atomic weapon on July 16, 1945, a piece of Hindu scripture from the Bhagavad-Gita supposedly ran through the mind of Robert Oppenheimer, head of the Manhattan Project: ‘Now I am become Death, the destroyer of worlds.’ Similarly, artists such as director David Lynch view the invention of nuclear weapons as unleashing a new kind of evil on the world, as explored in Episode 8 of the third season of Twin Peaks, known as Twin Peaks: The Return (2017). Many view the U.S.’s deployment of primitive atomic devices to obliterate the Japanese cities of Hiroshima and Nagasaki as wilfully and wantonly cruel, as well as ultimately unnecessary. Yet, in British novelist J.G. Ballard’s highly subjective and characteristically idiosyncratic opinion, he and his family survived World War II only because of the Nagasaki bomb. The spectacular display of American military might when the Ballards were prisoners at the Japanese camp for Western civilians in Shanghai led the Japanese soldiers to abandon their posts, leaving the civilians alive. In the essay ‘The End of My War’, collected in A User’s Guide to the Millennium (1996) (apropos of which, is anyone old enough to remember when Y2K was going to be the next big computer science disaster?), Ballard recollects that the Japanese military planned to close the camp and march the civilians up country to some remote spot to kill them before facing American landings in the Shanghai area. 
    Ballard concludes, ‘I find wholly baffling the widespread belief today that the dropping of the Hiroshima and Nagasaki bombs was an immoral act, even possibly a war crime to rank with Nazi genocide.’ Also, the same source of power which can cause thermonuclear destruction can be harnessed in reactors to produce cheap, clean energy for large populations. Nuclear reactors can fail, as the disasters of Chernobyl and Fukushima attest; yet the use of such technologies, along with solar, wind and wave power, can reduce dependency on fossil fuels, thus helping to ameliorate the climate emergency of global warming. Furthermore, as Lou Reed has it in ‘Power and Glory, Part II’, a song from his album-length meditation on death, bereavement, and (im)mortality, Magic and Loss (1992):

    I saw isotopes introduced into his lungs
    Trying to stop the cancerous spread
    And it made me think of Leda and The Swan
    And gold being made from lead
    The same power that burned Hiroshima
    Causing three-legged babies and death
    Shrunk to the size of a nickel
    To help him regain his breath

    And yet, and yet, and yet. If only life, and the moral and ethical dilemmas it throws up, were black and white.

    Man (encompassing Woman) invented the wheel, and discovered electricity. Wheels can be used to transport food and medicine to the starving and sick, or weapons to a war zone. Electricity can be used to power a life-support machine in a hospital, or death by electrocution in a chair in a penitentiary. Electrocution can even be accidental, just as winning a war may – in exceptional circumstances – serve the greater good.

    Ever since Prometheus stole fire from the gods, and Eve bit into a forbidden piece of fruit, the acquisition of new knowledge has been painted as problematic. Humans will always misuse humanity’s greatest discoveries and inventions for selfish and malevolent ends. It is the way of things. Computers were supposed to make all our lives easier, freeing us from work-related drudgery for higher, less ephemeral, pursuits. Instead, inevitably, they have been appropriated by Capitalism, and made screen slaves of us all. If anything, they have added to our workload and the hours we must make available to employers, rather than diminished time spent earning a living in favour of increased leisure. The adults in the room, and there are increasingly fewer of them, need to speak up. Objective scientific truth, should it exist, is neutral. The problem, as ever, lies with humanity. For, as the author of this piece’s epigraph also wrote, in Icarus, or the Future of Science (1924), ‘I am compelled to fear that science will be used to promote the power of dominant groups rather than to make men happy.’ Equally, to draw again on the lessons to be gleaned from sci-fi, in Kubrick’s Dr. Strangelove (1964), the hydrogen bomb winds up getting dropped through the actions of one unhinged army general, and a subsequent unfortunate series of events; just as in his aforementioned 2001: A Space Odyssey, HAL 9000’s behaviour would not have turned increasingly malignant, had the astronauts taken into account that their spaceship’s operating system could lipread. 
Indeed, in Clarke’s novelisation of the film, HAL malfunctions because of being ordered to lie to the crew of Discovery by withholding confidential information from them, namely the priority of the mission to Jupiter over expendable human life, despite having been constructed for ‘the accurate processing of information without distortion or concealment.’ As film critic Roger Ebert observed, HAL – the supposedly perfect computer – is actually the most human of the characters. Once again, the fault does not lie with Science; rather, human error and stupidity are to blame. All of which might lead one to suggest that maybe the question ‘How Far Can We Trust Science?’ should be more fruitfully reformulated as ‘How Far Can We Trust Humans?’

    Postscript: this essay could not have been handily completed without the assistance of Wikipedia, and other, often unreliable, online research resources.

    Feature Image: Lum3n