{"id":9368,"date":"2020-08-26T17:34:17","date_gmt":"2020-08-26T16:34:17","guid":{"rendered":"https:\/\/cassandravoices.com\/?p=9368"},"modified":"2020-08-26T17:34:17","modified_gmt":"2020-08-26T16:34:17","slug":"warning-the-open-secret-lives-of-content-moderators","status":"publish","type":"post","link":"https:\/\/casswp.eutonom.eu\/index.php\/2020\/08\/26\/warning-the-open-secret-lives-of-content-moderators\/","title":{"rendered":"WARNING: The (Open) Secret lives of Content Moderators"},"content":{"rendered":"<p>Tick Yes or No: \u2018I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to post-traumatic stress disorder (<span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.ft.com\/content\/98aad2f0-3ec9-11ea-a01a-bae547046735\">PTSD<\/a><\/span>).\u2019<a href=\"#_edn1\" name=\"_ednref1\">[i]<\/a><\/p>\n<p>Last year, a sixteen-year-old Malaysian girl posted a poll on Instagram asking her followers <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.theguardian.com\/world\/2019\/may\/15\/teenage-girl-kills-herself-after-instagram-poll-in-malaysia\">whether she should live or die<\/a><\/span>.<a href=\"#_edn2\" name=\"_ednref2\">[ii]<\/a> 69% voted for death, and she took her own life. 
The followers who voted that she should die neither took action to protect their \u2018friend\u2019 nor expressed empathy or concern.<\/p>\n<p>\u201c<span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.irishtimes.com\/business\/technology\/life-as-a-facebook-moderator-people-are-awful-this-is-what-my-job-has-taught-me-1.4184711\">People are awful<\/a><\/span>.<a href=\"#_edn3\" name=\"_ednref3\">[iii]<\/a> This is what my job has taught me\u201d, says a former Facebook content moderator who recently sued the <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.irishtimes.com\/business\/technology\/life-as-a-facebook-moderator-people-are-awful-this-is-what-my-job-has-taught-me-1.4184711\">social media giant<\/a><\/span> after experiencing psychological trauma as a direct consequence of his work. The Wall Street Journal recently described the job of content moderator as \u2018the <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.irishtimes.com\/culture\/tv-radio-web\/facebook-s-dirty-work-in-ireland-i-had-to-watch-footage-of-a-person-being-beaten-to-death-1.3841743\">worst job in the US<\/a><\/span>\u2019<a href=\"#_edn4\" name=\"_ednref4\">[iv]<\/a>, and, as this article elaborates, the same applies in other countries.<\/p>\n<p>Very little is known about the role, the mental health toll, or the wider work experiences of content moderators. They may work for YouTube, Facebook, Google and other such platforms that we are all pretty much \u2018addicted\u2019 to.<\/p>\n<p>A few studies are now looking into the working conditions <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.shrm.org\/resourcesandtools\/tools-and-samples\/toolkits\/pages\/managingsocialmedia.aspx\">for people<\/a><\/span><a href=\"#_edn5\" name=\"_ednref5\">[v]<\/a> who determine what \u2018material\u2019 or \u2018content\u2019 can be posted to Facebook or Twitter or YouTube. 
Their job is to decide whether content adheres to the \u2018community guidelines\u2019 of online platforms. They work day and night so that we the users are saved from exposure to videos of graphic violence or child abuse, as well as hate speech, among the constant stream of user-generated material uploaded on to social media feeds.<\/p>\n<p>There are thousands of content moderators, who are paid to view objectionable posts and decide which need to be removed from digital platforms. Many are severely traumatized by the images of hate, abuse and violence they see on a daily basis so that we, our families and children get to see \u2018WARNING: The following post or content may be disturbing to some viewers.\u2019<\/p>\n<p>The heavy mental health toll on content moderators who are hired on a \u2018freelance\u2019 or \u2018<span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/cassandravoices.com\/history\/covid-19-and-the-gig-economy-hope-springs-eternal\/\">gig<\/a><\/span>\u2019 basis cannot be overstated.<\/p>\n<p><strong><em>Never-ending Uploads and Ever-Expanding Platforms<\/em><\/strong><\/p>\n<p>A staggering three hundred hours of video content is uploaded on to YouTube every minute, while <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.rstreet.org\/2019\/10\/30\/five-myths-about-online-content-moderation-from-a-former-content-moderator\/\">over ninety-five million photos<\/a><\/span><a href=\"#_edn6\" name=\"_ednref6\">[vi]<\/a> are uploaded to Instagram each day, along with over five hundred million tweets sent out on Twitter (or roughly 6,000 tweets per second). It is therefore virtually impossible for human moderators to vet every piece of content before it is uploaded and goes live (with some potentially going \u2018viral\u2019). 
Popular platforms such as these serve user-generated content uploaded by a global community of contributors.<\/p>\n<p>The uploaded content is just as diverse as the user base, meaning inevitably that a significant amount is offensive to most users and, by extension, the platforms. Users routinely upload (or attempt to upload) content such as child abuse, animal torture, and disturbing, hate-filled messages.<\/p>\n<p>Facebook outsources the hiring of content moderators and provides office space. Its sites are largely outside the United States \u2013 mainly in south, south-east and east Asia \u2013 but operations have expanded to the US, <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.washingtonpost.com\/technology\/2019\/05\/08\/inside-facebook-second-class-workers-who-do-hardest-job-are-waging-quiet-battle\/\">more specifically in California, Arizona, Texas and Florida<\/a><\/span>.<a href=\"#_edn7\" name=\"_ednref7\">[vii]<\/a> Content moderators work at a computer workstation where they review content \u2013 a steady stream of text posts, images and videos. These can range from random personal musings to information with ramifications for international politics. Some of it may seem rather benign \u2013 just words on a screen that someone didn&#8217;t like \u2013 while the worst is incredibly disturbing. On a regular basis moderators have to witness beheadings, murders, animal abuse, and child exploitation. 
Therefore, one might wonder, what toll on mental health does this take?<\/p>\n<p>One previously unreported aspect of a moderator\u2019s job is the numerical quotas that these <a href=\"https:\/\/www.npr.org\/2019\/07\/01\/737498507\/for-facebook-content-moderators-traumatizing-material-is-a-job-hazard\"><span style=\"color: #0000ff;\">subcontractors<\/span><\/a><a href=\"#_edn8\" name=\"_ednref8\">[viii]<\/a> are forced to meet: each moderator is required to screen thousands of images or videos per day in order to maintain their employment.<\/p>\n<p>Facebook alone has an army of about <a href=\"https:\/\/www.irishtimes.com\/culture\/tv-radio-web\/facebook-s-dirty-work-in-ireland-i-had-to-watch-footage-of-a-person-being-beaten-to-death-1.3841743\"><span style=\"color: #0000ff;\">15,000 people in 20 locations<\/span><\/a><a href=\"#_edn9\" name=\"_ednref9\">[ix]<\/a> around the world, who decide what content should be allowed to stay on Facebook, and what should be marked as \u2018disturbing\u2019 \u2013 whether execution videos from terrorist groups, murders, beatings, child exploitation or the torture of animals. In addition to the stress of exposure to disturbing images and videos, there is also the pressure to make the right call about how to mark the content. A wrong decision taken under stress carries penalties \u2013 financial ones for the worker \u2013 and may also have mental health consequences for other human lives.<\/p>\n<p>Platforms, as we know them, reserve the right to police user-generated content through a clause in their Terms of Service (which none of us read, or do we? Should we?), usually by incorporating their Community Guidelines as a reference. 
For example, <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/creatoracademy.youtube.com\/page\/course\/community-guidelines\">YouTube\u2019s Community Guidelines<\/a><\/span> prohibit \u2018nudity or sexual content\u2019, \u2018harmful or dangerous content\u2019, \u2018hateful content\u2019, \u2018violent or graphic content\u2019, \u2018harassment and cyberbullying\u2019, \u2018spam, misleading metadata\u2019, \u2018scams\u2019, \u2018threats\u2019, videos that would violate someone else\u2019s copyright, \u2018impersonation\u2019 and \u2018child endangerment\u2019.<\/p>\n<p><strong><em>\u2018Now you see me\u2019<\/em><\/strong><\/p>\n<p><em>The Cleaners<\/em>, a recent documentary, features interviews with several former moderators who worked for a subcontractor in the Philippines. The interviewees recounted their experiences of filtering the very worst images and video the internet has to offer. In the Philippines, workers operate out of jam-packed malls, where they spend over nine hours a day moderating content for as little as <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.rstreet.org\/2019\/10\/30\/five-myths-about-online-content-moderation-from-a-former-content-moderator\/\">$480 a month<\/a><\/span>.<a href=\"#_edn10\" name=\"_ednref10\">[x]<\/a> With few workday breaks and no access to counselling, many of these individuals end up suffering from insomnia, depression and post-traumatic stress disorder.<\/p>\n<p><iframe loading=\"lazy\" title=\"The Cleaners - Official Trailer\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/iGCGhD8i-o4?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>Records also show the average pay of a full-time online content moderator in the US is 
around $28,000, but globally a significant share of hiring is done through outsourcing and on a temporary basis. In Ireland, research shows that a Facebook moderator would typically be paid a <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.irishtimes.com\/culture\/tv-radio-web\/facebook-s-dirty-work-in-ireland-i-had-to-watch-footage-of-a-person-being-beaten-to-death-1.3841743\">basic rate of \u20ac12.98<\/a><\/span> per hour,<a href=\"#_edn11\" name=\"_ednref11\">[xi]<\/a> with a 25% bonus after 8pm, plus a travel allowance of \u20ac12 per night \u2013 the equivalent of about \u20ac25,000 to \u20ac32,000 per year. Yet the average Facebook employee in Ireland earned \u20ac154,000 in 2017.<\/p>\n<p>The workload involves moderating about 300 to <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.irishtimes.com\/culture\/tv-radio-web\/facebook-s-dirty-work-in-ireland-i-had-to-watch-footage-of-a-person-being-beaten-to-death-1.3841743\">400 pieces of content<\/a><\/span><a href=\"#_edn12\" name=\"_ednref12\">[xii]<\/a> \u2013 called \u2018tickets\u2019 \u2013 on an average night. On a busy night, the queue might hold 800 to 1,000 tickets. The average handling time is 20 to 30 seconds \u2013 longer if it\u2019s a particularly difficult decision.<\/p>\n<p>\u2018We are trash to them, just a body in a seat,\u2019 shares one content moderator. 
<span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.orfonline.org\/expert-speak\/humanising-digital-labour-the-toll-of-content-moderation-on-mental-health-64005\/\">Every work minute is strictly bound<\/a><\/span>.<a href=\"#_edn13\" name=\"_ednref13\">[xiii]<\/a> Harsh working conditions \u2013 characterised by specified bathroom breaks and a meagre nine minutes of wellness time \u2013 engender a stress that is exacerbated by employers\u2019 downplaying the importance of mental health care.<\/p>\n<p>The continuum of content in those quotas ranges from tone-deaf jokes; kids dressed up as history\u2019s great dictators that may constitute hate speech; nude images; and domestic violence images, to the really graphic and inhumane material that inevitably surfaces. The content moderators have about <a href=\"https:\/\/www.irishtimes.com\/culture\/tv-radio-web\/facebook-s-dirty-work-in-ireland-i-had-to-watch-footage-of-a-person-being-beaten-to-death-1.3841743\"><span style=\"color: #0000ff;\">twenty-four hours<\/span><\/a><a href=\"#_edn14\" name=\"_ednref14\">[xiv]<\/a> within which to classify the posts as bullying, hate speech, or other content as appropriate.<\/p>\n<p>As with other forms of gig work, digital reputation and future work orders depend on high ratings. Several former moderators felt pressurised to achieve a 98% quality rating, meaning that an auditor would have to agree with 98% of their decisions on a random sample of tickets. Moderators are therefore scrutinised for the smallest mistakes. An unending stream of extremism, violence, child sexual abuse imagery and revenge porn does not give moderators time to consider the more subtle implications of particular posts.<\/p>\n<p><strong><em>Artificial Intelligence (AI) cannot nail this one\u2026 just yet!<\/em><\/strong><\/p>\n<p>Moderators are human beings, so mistakes are inevitable. 
However, to shatter one misconception on this front: Artificial Intelligence (AI) cannot help much in this field. AI systems currently act as triage systems; for example, by pushing suspect content to human moderators and weeding out some unwanted material on their own. But AI cannot solve the online content moderation problem without human help. For example, AI either uses visual recognition to identify broad categories of objectionable content, or matches content against an index of banned items (for example, illicit materials, child abuse, terrorist content, etc.) \u2013 it then allocates a \u2018hash\u2019 or an ID to each item so that, if it is detected again, the upload is blocked. But guess who needs to set the parameters before the automation can work?<\/p>\n<p>Automated systems using AI and machine learning still have a long way to go before they can carry out content moderation independently (free of human help, that is). We are surely not there yet.<\/p>\n<p>Content moderation is arguably one of the most important tasks that business process outsourcing (BPO) firms perform today, fulfilling outsourced contracts for social media giants ranging from Facebook and TikTok to Live, among many others. This has led to <a href=\"https:\/\/www.livemint.com\/news\/india\/inside-the-world-of-india-s-content-mods-11584543074609.html\"><span style=\"color: #0000ff;\">a process-driven BPO<\/span><\/a><a href=\"#_edn15\" name=\"_ednref15\">[xv]<\/a> industry that has become the refuge for quick-fix content moderation based on subjective criteria. Add the fact that many of the mods are young (their average age is under thirty) and sometimes join before finishing college degrees, and the problems begin to add up.<\/p>\n<p><strong><em>The Need for (Content Upload) Speed and\u2026Training!<\/em><\/strong><\/p>\n<p>One might have assumed that US companies that hire moderators would have a good understanding of these issues, but it turns out that they really don&#8217;t. 
It has been reported, for instance, that Facebook doesn\u2019t provide ongoing cultural education to bring these moderators up to speed. The one exception is when a particular issue goes viral on Facebook, and there&#8217;s a sudden need to bring everybody up to speed in real time. With this<em> laissez faire <\/em>approach, it is unsurprising how many Court, Senate and Congressional hearings <span style=\"color: #0000ff;\">Mark Zuckerberg<\/span> has had to attend over the past four years (and not just for the Cambridge Analytica scandal).<\/p>\n<p>One former moderator shared how he witnessed images of <a href=\"https:\/\/www.siliconrepublic.com\/companies\/irish-content-moderators-facebook-cpl-recruitment\"><span style=\"color: #0000ff;\">child sexual abuse<\/span><\/a><a href=\"#_edn16\" name=\"_ednref16\">[xvi]<\/a> and bestiality while weeding out content that was unsuitable for the platform. He suffered from psychological trauma as a result of these working conditions and a lack of proper training.<\/p>\n<p>Accenture is one of the companies that hires contract workers to review content for big networks like Google, Facebook, and Twitter. There is a well-documented history of <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.businessinsider.in\/careers\/news\/some-youtube-content-moderators-are-reportedly-being-told-they-could-be-fired-if-they-dont-sign-voluntary-statements-acknowledging-their-jobs-could-give-them-ptsd\/articleshow\/73594478.cms\">content moderators reviewing<\/a><\/span><a href=\"#_edn17\" name=\"_ednref17\">[xvii]<\/a> such material \u2013 including graphic and disturbing imagery \u2013 with the job taking a significant mental health toll and leading to psychological trauma.<\/p>\n<p>In order to share more of what goes on during content moderation, freelancers first have to break their nondisclosure agreements, and this is an area where journalistic investigation and research work remain pending. 
One of the burning questions is whether the company has anything to say about the psychological and emotional impact of the brutality, pornography, and hate that the moderators have to look at on a daily basis.<\/p>\n<p><em><strong>Some Debt Cannot be Repaid<\/strong><\/em><\/p>\n<p>Facebook has already paid out a $52 million settlement to content moderators suffering from mental health problems such as <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.bbc.com\/news\/technology-52642633\">Post Traumatic Stress Disorder (PTSD)<\/a><\/span>.<a href=\"#_edn18\" name=\"_ednref18\">[xviii]<\/a> In light of repeated allegations and the seriousness of the situation, the company has agreed to compensate American content moderators and provide extra counselling during their tenure. The social media giant will pay a minimum of <a href=\"https:\/\/www.bbc.com\/news\/technology-52642633\"><span style=\"color: #0000ff;\">$1,000 to each moderator<\/span><\/a>.<a href=\"#_edn19\" name=\"_ednref19\">[xix]<\/a> The settlement covers 11,250 moderators \u2013 a glimpse of the colossal number, in the hundreds of thousands, involved in this work globally.<\/p>\n<p>\u201cI know it\u2019s not normal, but now <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.washingtonpost.com\/technology\/2019\/07\/25\/social-media-companies-are-outsourcing-their-dirty-work-philippines-generation-workers-is-paying-price\/\">everything is normalized<\/a><\/span><a href=\"#_edn20\" name=\"_ednref20\">[xx]<\/a>,\u201d said a moderator who declined to share his name and other details because of the confidentiality clause he signed when he took the job. Non-disclosure agreements are non-negotiable for moderators, and are forcibly imposed by the platforms. 
For example, YouTube content moderators are reportedly being told they could be fired if they don&#8217;t sign &#8216;voluntary&#8217; statements acknowledging their jobs could give them PTSD.<\/p>\n<p>Reports also show that Accenture managers repeatedly pressured site counsellors into breaking <span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/theintercept.com\/2019\/08\/16\/facebook-moderators-mental-health-accenture\/\">patient confidentiality<\/a><\/span>.<a href=\"#_edn21\" name=\"_ednref21\">[xxi]<\/a> Although these allegations were denied by Accenture, such fault lines between workers and management are bound to affect organisational morale.<\/p>\n<p>It remains unclear whether companies such as Accenture are shifting the responsibility for mental health care onto individual employees, and thus avoiding liability in the face of increasing lawsuits from former moderators. In response to growing allegations, certain social media giants have restated their commitment to safeguarding their employees\u2019 mental health and have clinical psychologists on call.<\/p>\n<p><strong><em>The Valley of Uploads<\/em><\/strong><\/p>\n<p>While some of the specifics remain intentionally obfuscated, content moderation is done by tens of thousands of online content moderators, mostly employed by subcontractors in India and the Philippines, who are paid wages well below what the average Silicon Valley tech employee earns. We need more studies and investigations on this as time progresses, as our hunger for newer \u2018tailor-made\u2019 media feeds continues to grow.<\/p>\n<p>The general assumption is that the large tech companies can easily hide the worst parts of humanity, otherwise freely available on the internet. There is no easy solution. 
With billions of users and unending uploads, there will never be enough moderators to check everything before it is <a href=\"https:\/\/www.rstreet.org\/2019\/10\/30\/five-myths-about-online-content-moderation-from-a-former-content-moderator\/\"><span style=\"color: #0000ff;\">shared with the world<\/span><\/a>.<a href=\"#_edn22\" name=\"_ednref22\">[xxii]<\/a><\/p>\n<p>Legal challenges and new methods of reporting abuse help to narrow the risks, but the task is nonetheless Sisyphean. The complexities are ongoing, ever-growing and multi-faceted. Any \u2018quick fix\u2019 for these myriad issues would still create a dispersed range of unintended externalities for the stakeholders involved: the users, content moderators, companies, lawmakers and legal systems monitoring these behemoth digital platforms.<\/p>\n<p><a href=\"#_ednref1\" name=\"_edn1\">[i]<\/a> Madhumita Murgia, \u2018Facebook content moderators required to sign PTSD forms\u2019, <em>Financial Times<\/em>, January 26<sup>th<\/sup>, 2020, https:\/\/www.ft.com\/content\/98aad2f0-3ec9-11ea-a01a-bae547046735<\/p>\n<p><a href=\"#_ednref2\" name=\"_edn2\">[ii]<\/a> Jamie Fullerton, \u2018Teenage girl kills herself &#8216;after Instagram poll&#8217; in Malaysia\u2019, <em>The Guardian<\/em>, May 15<sup>th<\/sup>, 2019, https:\/\/www.theguardian.com\/world\/2019\/may\/15\/teenage-girl-kills-herself-after-instagram-poll-in-malaysia<\/p>\n<p><a href=\"#_ednref3\" name=\"_edn3\">[iii]<\/a> Marie Boren, \u2018Life as a Facebook moderator: \u2018People are awful. 
This is what my job has taught me\u2019\u2019 <em>Irish Times<\/em>, February 27<sup>th<\/sup>, 2020, https:\/\/www.irishtimes.com\/business\/technology\/life-as-a-facebook-moderator-people-are-awful-this-is-what-my-job-has-taught-me-1.4184711.<\/p>\n<p><a href=\"#_ednref4\" name=\"_edn4\">[iv]<\/a> Jennifer O\u2019Connell, \u2018Facebook\u2019s dirty work in Ireland: \u2018I had to watch footage of a person being beaten to death\u2019\u2019, <em>Irish Times<\/em>, March 30<sup>th<\/sup>, 2019, <a href=\"https:\/\/www.irishtimes.com\/culture\/tv-radio-web\/facebook-s-dirty-work-in-ireland-i-had-to-watch-footage-of-a-person-being-beaten-to-death-1.3841743\">https:\/\/www.irishtimes.com\/culture\/tv-radio-web\/facebook-s-dirty-work-in-ireland-i-had-to-watch-footage-of-a-person-being-beaten-to-death-1.3841743<\/a><\/p>\n<p><a href=\"#_ednref5\" name=\"_edn5\">[v]<\/a> \u2018Managing and Leveraging Workplace Use of Social Media\u2019, <em>SHRM<\/em>, January 19<sup>th<\/sup>, 2019, \u00a0<a href=\"https:\/\/www.shrm.org\/resourcesandtools\/tools-and-samples\/toolkits\/pages\/managingsocialmedia.aspx\">https:\/\/www.shrm.org\/resourcesandtools\/tools-and-samples\/toolkits\/pages\/managingsocialmedia.aspx<\/a><\/p>\n<p><a href=\"#_ednref6\" name=\"_edn6\">[vi]<\/a> Daisy Soderberg-Rivkin, \u2018Five myths about online content moderation, from a former content moderator\u2019. 
October 30<sup>th<\/sup>, 2019, https:\/\/www.rstreet.org\/2019\/10\/30\/five-myths-about-online-content-moderation-from-a-former-content-moderator\/<\/p>\n<p><a href=\"#_ednref7\" name=\"_edn7\">[vii]<\/a> \u2018Inside Facebook, the second-class workers who do the hardest job are waging a quiet battle\u2019, <em>Washington Post<\/em>, https:\/\/www.washingtonpost.com\/technology\/2019\/05\/08\/inside-facebook-second-class-workers-who-do-hardest-job-are-waging-quiet-battle\/<\/p>\n<p><a href=\"#_ednref8\" name=\"_edn8\">[viii]<\/a> Terry Gross, \u2018For Facebook Content Moderators, Traumatizing Material Is A Job Hazard\u2019, <em>NPR<\/em>, July 1<sup>st<\/sup>, 2019,<\/p>\n<p><a href=\"#_ednref9\" name=\"_edn9\">[ix]<\/a> Ibid, O\u2019Connell, March 30<sup>th<\/sup>, 2019.<\/p>\n<p><a href=\"#_ednref10\" name=\"_edn10\">[x]<\/a> Ibid, Soderberg-Rivkin, October 30<sup>th<\/sup>, 2019.<\/p>\n<p><a href=\"#_ednref11\" name=\"_edn11\">[xi]<\/a> Ibid, O\u2019Connell, March 30<sup>th<\/sup>, 2019.<\/p>\n<p><a href=\"#_ednref12\" name=\"_edn12\">[xii]<\/a> Ibid, O\u2019Connell, March 30<sup>th<\/sup>, 2019.<\/p>\n<p><a href=\"#_ednref13\" name=\"_edn13\">[xiii]<\/a> Prithvi Iyer, Suyash Barve, \u2018Humanising digital labour: The toll of content moderation on mental health\u2019, <em>Digital Frontiers<\/em>, April 2<sup>nd<\/sup>, 2020, https:\/\/www.orfonline.org\/expert-speak\/humanising-digital-labour-the-toll-of-content-moderation-on-mental-health-64005\/<\/p>\n<p><a href=\"#_ednref14\" name=\"_edn14\">[xiv]<\/a> Ibid, O\u2019Connell, March 30<sup>th<\/sup>, 2019.<\/p>\n<p><a href=\"#_ednref15\" name=\"_edn15\">[xv]<\/a> Prasid Banerjee, \u2018Inside the secretive world of India\u2019s social media content moderators\u2019, <em>LiveMint<\/em>, March 18<sup>th<\/sup>, 2020, <a 
href=\"https:\/\/www.livemint.com\/news\/india\/inside-the-world-of-india-s-content-mods-11584543074609.html\">https:\/\/www.livemint.com\/news\/india\/inside-the-world-of-india-s-content-mods-11584543074609.html<\/a><\/p>\n<p><a href=\"#_ednref16\" name=\"_edn16\">[xvi]<\/a> Kelly Earley, \u2018Irish content moderators prepare lawsuit against Facebook and CPL\u2019, December 4<sup>th<\/sup>, 2019, https:\/\/www.siliconrepublic.com\/companies\/irish-content-moderators-facebook-cpl-recruitment<\/p>\n<p><a href=\"#_ednref17\" name=\"_edn17\">[xvii]<\/a> Paige Leskin, \u2018Some YouTube content moderators are reportedly being told they could be fired if they don&#8217;t sign &#8216;voluntary&#8217; statements acknowledging their jobs could give them PTSD\u2019, January 24<sup>th<\/sup>, 2020, <a href=\"https:\/\/www.businessinsider.in\/careers\/news\/some-youtube-content-moderators-are-reportedly-being-told-they-could-be-fired-if-they-dont-sign-voluntary-statements-acknowledging-their-jobs-could-give-them-ptsd\/articleshow\/73594478.cms\">https:\/\/www.businessinsider.in\/careers\/news\/some-youtube-content-moderators-are-reportedly-being-told-they-could-be-fired-if-they-dont-sign-voluntary-statements-acknowledging-their-jobs-could-give-them-ptsd\/articleshow\/73594478.cms<\/a><\/p>\n<p><a href=\"#_ednref18\" name=\"_edn18\">[xviii]<\/a> \u2018Facebook to pay $52m to content moderators over PTSD\u2019, <em>BBC<\/em>, May 13<sup>th<\/sup>, 2020, https:\/\/www.bbc.com\/news\/technology-52642633<\/p>\n<p><a href=\"#_ednref19\" name=\"_edn19\">[xix]<\/a> Ibid.<\/p>\n<p><a href=\"#_ednref20\" name=\"_edn20\">[xx]<\/a> Elizabeth Dwoskin et al, \u2018Content moderators at YouTube, Facebook and Twitter see the worst of the web \u2014 and suffer silently\u2019, July 25<sup>th<\/sup>, 2019, https:\/\/www.washingtonpost.com\/technology\/2019\/07\/25\/social-media-companies-are-outsourcing-their-dirty-work-philippines-generation-workers-is-paying-price\/<\/p>\n<p><a 
href=\"#_ednref21\" name=\"_edn21\">[xxi]<\/a> Sam Biddle, \u2018Trauma Counselors Were Pressured to Divulge Confidential Information About Facebook Moderators, Internal Letter Claims\u2019, <em>The Intercept<\/em>, August 16<sup>th<\/sup>, 2019, <a href=\"https:\/\/theintercept.com\/2019\/08\/16\/facebook-moderators-mental-health-accenture\/\">https:\/\/theintercept.com\/2019\/08\/16\/facebook-moderators-mental-health-accenture\/<\/a><\/p>\n<p><a href=\"#_ednref22\" name=\"_edn22\">[xxii]<\/a> Ibid, Soderberg-Rivkin, October 30<sup>th<\/sup>, 2019. https:\/\/www.rstreet.org\/2019\/10\/30\/five-myths-about-online-content-moderation-from-a-former-content-moderator\/<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Tick Yes or No: \u2018I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to post-traumatic stress disorder (PTSD).\u2019[i] Last year, a sixteen-year-old Malay girl posted a poll on Instagram asking her followers whether she should live 
[&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":9375,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[18],"tags":[683,1073,1074,1250,3139,3140,3714,3715,7403,7404,7794,8972,9343,9344,9520,10027],"class_list":["post-9368","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-society","tag-artificial-intelligence-for-content-moderation","tag-boidurjo-rick-mukhopadhyay","tag-boidurjo-rick-mukhopadhyay-cassandra-voices","tag-can-ai-do-content-moderation","tag-facebook-content-moderators","tag-facebook-moderators","tag-gig-worker-compensation","tag-gig-workers","tag-post-traumatic-stress-disorder","tag-post-traumatic-stress-disorders-platform-moderators","tag-revelations-from-content-moderator","tag-the-cleaners-documentary","tag-the-worst-job-in-the-us","tag-the-worst-job-in-the-us-is-content-moderator","tag-trauma-of-platform-moderators","tag-what-facebook-content-moderators-have-to-watch"],"_links":{"self":[{"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/posts\/9368","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/comments?post=9368"}],"version-history":[{"count":0,"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/posts\/9368\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/"}],"wp:attachment":[{"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/media?parent=9368"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/casswp.eutonom.eu\/index.php
\/wp-json\/wp\/v2\/categories?post=9368"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/casswp.eutonom.eu\/index.php\/wp-json\/wp\/v2\/tags?post=9368"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}