It’s been difficult finding the words to express my worsening mood and deepening depression. I’m referring specifically to my subconscious responses to altered public behaviour, and to the marks left by the social reaction to Covid-19. For the first time in my life I’m noticing increasing anxiety and, with the stress, a direct link to declining health. I’ve been struggling with this worsening dynamic over the last month or two, trying to get to grips with it and to better understand its cause. I’m sure I’m not alone in this.
President Reagan with William F. Buckley in the White House Residence during a private birthday party in honor of President Reagan’s 75th birthday, February 7th, 1986
Neoconservatism, on the other hand, is hardly even capitalist in outlook. It is really an offshoot of a more authoritarian leftism, combined with a fundamentalist, morally self-righteous neocolonialism informed by ‘Christian’ values. It is associated in particular with the administrations of George W. Bush, with Paul Wolfowitz and Richard Perle its most prominent ideologues.
From Spain, Connor Blennerhassett brought a report on the ordeal suffered by vegan activist Juan Manuel Bustamante, who spent sixteen months in jail on trumped-up terrorism charges: ‘a Kafkaesque nightmare that saw him pass through five of Spain’s most notorious prisons, often locked up in solitary confinement and denied a vegan diet by his captors, who also beat him. It ruined his family’s finances and led him to attempt to take his life after his release.’
Icaria, Greece
Over in Greece, Frank Armstrong found a hardening of borders, and attitudes, in the wake of the pandemic, and drew wisdom from the writings of Albert Camus:
Albert Camus, in The Rebel (1951), identified an enduring tension between a Caesarian Marxist project that permits all manner of atrocity on the journey to earthly paradise, and an approach he identified with Ancient Greece, characterised by moderation, incrementalism and respect for tradition. He suggests:
The profound conflict of this century is, perhaps, not so much between the German ideologies of history and Christian political concepts, which in a certain way are accomplices, as between German dreams and Mediterranean traditions … in other words, between history and nature.
Vietnam. Image © Hectic Fish
Also for the first time since his arrival, Hectic Fish was able to travel around Vietnam, proceeding to the territory of the Mnong accompanied by a copy of Rachel Carson’s ‘The Marginal World’, ‘the otherworldly essay that opens The Edge of the Sea’:
The shore is an ancient world, for as long as there has been an earth and sea there has been this place of the meeting of land and water. Yet it is a world that keeps alive the sense of continuing creation and of the relentless drive of life. Each time that I enter it, I gain some new awareness of its beauty and its deeper meanings, sensing that intricate fabric of life by which one creature is linked with another, and each with its surroundings.
There was also fiction from Sarah Johnson, with ‘The Candidate for the Roberts Prize’, where ‘The significance of discovery lies exactly in the degree to which it can be appreciated and put to use by the human community’, and Glenda Miller’s ‘The Club’, in which an experience of cancer prepares her for the agonies of the birthing process.
Next election onwards, there’ll be a second vote for those who turn up with, under their arm, a print copy of one of the larger newspapers and answer a few unobtrusive questions to prove they’ve consumed it correctly.
A third for those who also present receipts that show they’ve dined sufficiently in restaurants with at least four stars, and a note from the maître d’ confirming they know their way around the cutlery.
A fourth for the lucky few in possession – to boot – of a ticket for one of those pampering spas at which one temporarily discards worldly things to have one’s darker parts irrigated of all subversive thoughts.
So when all’s said and counted, people who shouldn’t matter can go back to not mattering.
Tick Yes or No: ‘I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to post-traumatic stress disorder (PTSD).’[i]
Last year, a sixteen-year-old Malaysian girl posted a poll on Instagram asking her followers whether she should live or die.[ii] 69% voted for death, and she took her own life. The followers who voted that she should die neither acted to protect their ‘friend’ nor expressed any empathy or concern.
‘People are awful. This is what my job has taught me’,[iii] says a former Facebook content moderator who recently sued the social media giant after experiencing psychological trauma as a direct consequence of his work. The Wall Street Journal recently described the content moderator’s role as ‘the worst job in the US’,[iv] and, as this article shows, the same applies in other countries.
Very little is known about the role, the mental health toll it exacts, or the wider working experiences of content moderators, who may work for YouTube, Facebook, Google and the other platforms to which we are all pretty much ‘addicted’.
A few studies are now looking into the working conditions of the people[v] who determine what ‘material’ or ‘content’ can be posted to Facebook, Twitter or YouTube. Their job is to decide whether content adheres to the ‘community guidelines’ of online platforms. They work day and night so that we, the users, are spared exposure to videos of graphic violence or child abuse, as well as hate speech, amid the constant stream of user-generated material uploaded to social media feeds.
There are thousands of content moderators, paid to view objectionable posts and decide which need to be removed from digital platforms. Many are severely traumatized by the images of hate, abuse and violence they see on a daily basis, so that we, our families and our children see only: ‘WARNING: The following post or content may be disturbing to some viewers.’
The heavy mental health toll on content moderators, who are hired on a ‘freelance’ or ‘gig’ basis, cannot be overestimated.
Never-ending Uploads and Ever-Expanding Platforms
A staggering three hundred hours of video content is uploaded to YouTube every minute, while over ninety-five million photos[vi] are uploaded to Instagram each day, along with over five hundred million tweets sent out on Twitter (or 6,000 tweets per second). Simply watching YouTube’s intake in real time would occupy eighteen thousand people around the clock, so it is virtually impossible for human moderators to vet every piece of content before it is uploaded and goes live (with some potentially going ‘viral’). Popular platforms such as these serve user-generated content uploaded by a global community of contributors.
The uploaded content is just as diverse as the user base, which inevitably means that a significant amount of it is offensive to most users and, by extension, to the platforms. Users routinely upload (or attempt to upload) child abuse, animal torture, and disturbing, hate-filled messages.
Facebook outsources the hiring of content moderators and provides office space. Its sites are largely outside the United States – mainly in south, south-east and east Asia – but operations have expanded to the US, specifically to California, Arizona, Texas and Florida.[vii] Content moderators work at computer workstations where they review a steady stream of text posts, images and videos. These range from random personal musings to information with ramifications for international politics. Some of it may seem rather benign – just words on a screen that someone didn’t like – while the worst is incredibly disturbing: on a regular basis moderators have to witness beheadings, murders, animal abuse and child exploitation. One might well wonder what toll this takes on their mental health.
One previously unreported aspect of a moderator’s job is the numerical quotas that these subcontracted workers[viii] are forced to meet: each moderator is required to screen thousands of images or videos per day in order to maintain their employment.
Facebook alone has an army of about 15,000 people in 20 locations[ix] around the world who decide what content should be allowed to stay on Facebook and what should be marked as ‘disturbing’ – whether execution videos from terrorist groups, murders, beatings, child exploitation or the torture of animals. In addition to the stress of exposure to disturbing images and videos, there is the pressure to make the right call about how to mark the content. A wrong decision taken under stress carries penalties – financial ones for the worker – and may also have consequences for the mental health of other human lives.
Platforms, as we know them, reserve the right to police user-generated content through a clause in their Terms of Service (which none of us read – or do we? Should we?), usually by incorporating their Community Guidelines as a reference. For example, YouTube’s Community Guidelines prohibit ‘nudity or sexual content’, ‘harmful or dangerous content’, ‘hateful content’, ‘violent or graphic content’, ‘harassment and cyberbullying’, ‘spam, misleading metadata’, ‘scams’, ‘threats’, videos that would violate someone else’s copyright, ‘impersonation’ and ‘child endangerment’.
‘Now you see me’
The Cleaners, a recent documentary, features interviews with several former moderators employed by a subcontractor in the Philippines, who described their experiences of filtering the very worst images and video the internet has to offer. In the Philippines, workers operate out of jam-packed malls, where they spend over nine hours a day moderating content for as little as $480 a month.[x] With few workday breaks and no access to counselling, many of these individuals end up suffering from insomnia, depression and post-traumatic stress disorder.
Records also show the average pay of a full-time online content moderator in the US is around $28,000, but globally the great majority of hiring is done through outsourcing and on a temporary basis. In Ireland, research shows that a content moderator working on the Facebook contract would typically be paid a basic rate of €12.98 per hour,[xi] with a 25% bonus after 8pm, plus a travel allowance of €12 per night – the equivalent of about €25,000 to €32,000 per year. Yet the average Facebook employee in Ireland earned €154,000 in 2017.
The workload involves moderating about 300 to 400 pieces of content[xii] – called ‘tickets’ – on an average night. On a busy night, the queue might hold 800 to 1,000 tickets. The average handling time is 20 to 30 seconds – longer if it’s a particularly difficult decision.
‘We are trash to them, just a body in a seat’, says one content moderator. Every working minute is strictly accounted for.[xiii] Harsh conditions, characterised by timed bathroom breaks and a meagre nine minutes of ‘wellness time’, engender a stress that is exacerbated by employers’ downplaying of the importance of mental health care.
The continuum of content in those quotas ranges from tone-deaf jokes, kids dressed up as history’s great dictators (which may constitute hate speech), nude images and images of domestic violence, to the really graphic and inhumane material that inevitably surfaces. Content moderators have about twenty-four hours[xiv] within which to classify each post as bullying, hate speech or another appropriate category.
As with other forms of gig work, digital reputation and future work orders depend on high ratings. Several former moderators felt pressurised to achieve a 98% quality rating – meaning that an auditor would agree with 98% of their decisions on a random sample of tickets. Moderators are therefore scrutinised for the smallest mistakes. An unending stream of extremism, violence, child sexual abuse imagery and revenge porn does not give moderators time to consider the more subtle implications of particular posts.
Artificial Intelligence (AI) cannot nail this one… just yet!
Moderators are human beings, so mistakes are inevitable. However, to shatter one misconception on this front: Artificial Intelligence (AI) cannot help much in this field. Automated systems currently act as triage: they push suspect content to human moderators and weed out some unwanted material on their own, but AI cannot solve the online content moderation problem without human help. Typically, AI either uses visual recognition to identify broad categories of objectionable content, or matches uploads against an index of banned items (illicit material, child abuse imagery, terrorist content and so on): each known item is allocated a ‘hash’, or ID, so that if it is detected again the upload can be blocked. But guess who needs to set the parameters before any of that automation can work!?
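To make the division of labour concrete, here is a minimal sketch of such a triage pipeline in Python. Everything in it is a hypothetical illustration rather than any platform’s actual system: the `banned_hashes` index, the `triage_upload` routine, the `classifier_score` input and the 0.7 threshold are all assumptions, and real services use perceptual hashes that survive re-encoding rather than the plain SHA-256 used here.

```python
import hashlib

# Hypothetical index of hashes of previously confirmed banned items.
# Real platforms use perceptual fingerprints that survive re-encoding;
# a plain SHA-256 stands in here purely for illustration.
banned_hashes = set()

def register_banned(content: bytes) -> None:
    """A human reviewer has confirmed this item is banned: index its hash."""
    banned_hashes.add(hashlib.sha256(content).hexdigest())

def triage_upload(content: bytes, classifier_score: float) -> str:
    """Route an upload before it goes live.

    classifier_score is a stand-in for a visual-recognition model's output
    (0.0 = clearly benign, 1.0 = clearly objectionable).
    """
    if hashlib.sha256(content).hexdigest() in banned_hashes:
        return "blocked"          # exact re-upload of known banned material
    if classifier_score >= 0.7:   # threshold a human operator must choose
        return "human_review"     # lands in a moderator's ticket queue
    return "published"

# A known banned video is blocked automatically on re-upload, while a new,
# borderline image still ends up on a human moderator's desk.
register_banned(b"bytes of a previously banned video")
print(triage_upload(b"bytes of a previously banned video", 0.1))  # blocked
print(triage_upload(b"bytes of a new, borderline image", 0.85))   # human_review
```

Even in this toy version, notice how often a human appears: someone must confirm each banned item before it enters the index, choose the review threshold, and work through everything the classifier is unsure about – which is precisely where the moderators described above come in.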
Automated systems using AI and machine learning still have a long way to go before they can carry out content moderation independently – that is, free of human help. We are surely not there yet.
Content moderation is arguably one of the most important tasks that business process outsourcing (BPO) firms perform today, fulfilling outsourced contracts for social media giants ranging from Facebook and TikTok to Live, among many others. This has led to a process-driven BPO[xv] industry that has become the refuge for quick-fix content moderation based on subjective criteria. Add to that the fact that many moderators are young people – their average age is under thirty, and some join before finishing their college degrees – and the problems begin to add up.
The Need for (Content Upload) Speed and…Training!
One might have assumed that the US companies that hire moderators would have a good understanding of these issues, but it turns out they really don’t. It has been reported, for instance, that Facebook doesn’t provide ongoing cultural education to bring these moderators up to speed. The one exception is when a particular issue goes viral on Facebook and there is a sudden need to bring everybody up to speed in real time. Given this laissez-faire approach, it is unsurprising how many court, Senate and Congressional hearings Mark Zuckerberg has had to attend over the past four years (and not just over the Cambridge Analytica scandal).
One former moderator shared with me how he witnessed images of child sexual abuse[xvi] and bestiality while weeding out content that was unsuitable for the platform. He suffered psychological trauma as a result of these working conditions and a lack of proper training.
Accenture is one of the companies that hires contract workers to review content for big networks like Google, Facebook and Twitter. There is a well-documented history[xvii] of content moderators reviewing graphic and disturbing imagery, with the job taking a significant mental health toll and leading to psychological trauma.
In order to share more of what goes on during content moderation, the freelancers first have to break their non-disclosure agreements, and this is an area where journalistic investigation and research remain pending. One of the burning questions is whether these companies have anything to say about the psychological and emotional impact of the brutality, pornography and hate that moderators have to look at on a daily basis.
Some Debt Cannot be Repaid
Facebook has already agreed a $52 million settlement with content moderators suffering from mental health problems such as post-traumatic stress disorder (PTSD).[xviii] In light of repeated allegations and the seriousness of the situation, the company will compensate American content moderators and provide extra counselling during their tenure, paying a minimum of $1,000 to each moderator.[xix] The settlement covers 11,250 moderators – an average of roughly $4,600 each – which offers a glimpse of the colossal number of moderators, in the hundreds of thousands, involved in this work globally.
‘I know it’s not normal, but now everything is normalized,’[xx] said a moderator who declined to share his name and other details because of the confidentiality clause he signed when he took the job. Non-disclosure agreements are non-negotiable for moderators, and are imposed by the platforms as a condition of employment. YouTube content moderators, for example, are reportedly told they could be fired if they don’t sign ‘voluntary’ statements acknowledging that their jobs could give them PTSD.
Reports also show that Accenture managers repeatedly pressured site counsellors into breaking patient confidentiality.[xxi] Although these allegations were refuted by Accenture, such fault lines between workers and management are bound to affect organisational morale.
It remains unclear – further study is needed – whether companies such as Accenture are shifting the responsibility for mental health care onto individual employees, thereby avoiding liability in the face of increasing lawsuits from former moderators. In response to growing allegations, certain social media giants have restated their commitment to safeguarding their employees’ mental health and keep clinical psychologists on call.
The Valley of Uploads
While some of the specifics remain intentionally obscured, content moderation is done by tens of thousands of online content moderators, mostly employed by subcontractors in India and the Philippines, who are paid wages well below what the average Silicon Valley tech employee earns. As our hunger for ever more ‘tailor-made’ media feeds continues to grow, we need more studies and investigations into this work.
The general assumption is that the large tech companies can easily hide the worst parts of humanity, otherwise freely available on the internet. There is no easy solution. With billions of users and unending uploads, there will never be enough moderators to check everything before it is shared with the world.[xxii]
Legal challenges and new methods of reporting abuse help to narrow the risks, but the task is nonetheless Sisyphean. The complexities are ongoing, ever-growing and multi-faceted, and any ‘quick fix’ for these myriad issues would still create a dispersed range of unintended externalities for the stakeholders involved: the users, content moderators, companies, lawmakers and legal systems monitoring these behemoth digital platforms.
[i] Madhumita Murgia, ‘Facebook content moderators required to sign PTSD forms’, Financial Times, January 26th, 2020, https://www.ft.com/content/98aad2f0-3ec9-11ea-a01a-bae547046735
[ii] Jamie Fullerton, ‘Teenage girl kills herself ‘after Instagram poll’ in Malaysia’, The Guardian, May 15th, 2019, https://www.theguardian.com/world/2019/may/15/teenage-girl-kills-herself-after-instagram-poll-in-malaysia
[iii] Marie Boran, ‘Life as a Facebook moderator: “People are awful. This is what my job has taught me”’, Irish Times, February 27th, 2020, https://www.irishtimes.com/business/technology/life-as-a-facebook-moderator-people-are-awful-this-is-what-my-job-has-taught-me-1.4184711
[vi] Daisy Soderberg-Rivkin, ‘Five myths about online content moderation, from a former content moderator’, R Street Institute, October 30th, 2019, https://www.rstreet.org/2019/10/30/five-myths-about-online-content-moderation-from-a-former-content-moderator/
[vii] ‘Inside Facebook, the second-class workers who do the hardest job are waging a quiet battle’, Washington Post, May 8th, 2019, https://www.washingtonpost.com/technology/2019/05/08/inside-facebook-second-class-workers-who-do-hardest-job-are-waging-quiet-battle/
[viii] Terry Gross, ‘For Facebook Content Moderators, Traumatizing Material Is A Job Hazard’, NPR, July 1st, 2019.
[xiii] Prithvi Iyer, Suyash Barve, ‘Humanising digital labour: The toll of content moderation on mental health,’ Digital Frontiers, April 2nd, 2020, https://www.orfonline.org/expert-speak/humanising-digital-labour-the-toll-of-content-moderation-on-mental-health-64005/
[xvi] Kelly Earley, ‘Irish content moderators prepare lawsuit against Facebook and CPL’, Silicon Republic, December 4th, 2019, https://www.siliconrepublic.com/companies/irish-content-moderators-facebook-cpl-recruitment
[xx] Elizabeth Dwoskin et al, ‘Content moderators at YouTube, Facebook and Twitter see the worst of the web — and suffer silently’, Washington Post, July 25th, 2019, https://www.washingtonpost.com/technology/2019/07/25/social-media-companies-are-outsourcing-their-dirty-work-philippines-generation-workers-is-paying-price/