The Guardian’s recent story on Facebook moderators brings to light the plight of those performing a critical function for modern civil society.
Facebook appoints content moderators to uphold its community standards and to examine and vet its online content for instances of abuse, exploitation, self-harm, racial prejudice and hate speech.
The report details how the task of moderating Facebook has left psychological scars on the company’s employees. Being exposed to large amounts of extreme content on a daily basis evidently has a detrimental impact on their mental health.
Facebook hired a large number of content moderators in the face of criticism over violent content on the platform. The UN, for example, noted that the platform was complicit in spreading hate speech during the Rohingya genocide in Myanmar. In response, Facebook has hired more than 30,000 employees to work on safety and security, relying mostly on cheap contract labour to do this job. The hiring of most of these content moderators is outsourced to professional vendors such as Cognizant, which euphemistically refers to them as ‘process executives’.
Facebook introduced a slew of measures and guidelines to boost its efforts to review and monitor content. It no longer allows graphic images of self-harm on its platform, a tightening of its policies that followed criticism of the moderation of violent and potentially dangerous content on social media.
The platform also released a public version of its guidelines for what is and is not allowed on the site, and for the first time created a process for individuals to appeal censorship decisions.
Despite hiring more staff, content moderation remains a challenge.
Reuters has reported on the difficulties that Facebook faces in regulating content in different languages. It reports that Facebook’s 15,000-strong content moderation workforce speaks about 50 languages, though the company said it hires professional translators when needed.
Very little is known about the work of these moderators, who vet Facebook’s content for racism, abuse, hate speech and pornographic content, as they are made to sign non-disclosure agreements with the company which, in some cases, run as long as 14 pages.
A little-known aspect of a moderator’s job is checking private conversations between adults and minors for sexual content or exploitation, apart from content that can include images of sexual abuse and murder, terrorist videos, illegal pornography and even live broadcasts of people committing suicide.
The American publication The Verge was one of the first to produce a report on Facebook moderators. According to the report, a content moderator named Keith Utley died because of stress at Facebook’s content moderation site in Tampa, Florida.
American content moderators have reported that “the conspiracy videos and memes that they see each day gradually led them to embrace fringe views,” and that a former moderator “now sleeps with a gun at his side” after he was traumatised by a video of a stabbing. The Verge story also reported that “after regular exposure to graphic violence and child-exploitation, many workers are subsequently diagnosed with post-traumatic stress disorder and related conditions.”
Contractors employed by Facebook to moderate its content have found themselves becoming addicted to graphic content. Reading large amounts of fake news and hate speech has led some moderators to develop far-right leanings. Constant exposure to hate speech and fake news can, it appears, turn even very liberal individuals into people with rabid views on issues such as immigration.
Most content moderators are overworked, exposed to graphic content at large scale on a daily basis. They work on as many as 1,000 pieces of content a day – more than one every 30 seconds over an eight-hour shift, with only a 30-minute lunch break and a nine-minute wellness break. Working under tremendous pressure to meet accuracy targets, most workers leave within two years – either fired over low accuracy rates or quitting because of poor working conditions.
The hiring of these content moderators is mostly outsourced: Facebook works with five outsourcing vendors, such as Cognizant, in at least eight countries on content review. Outsourcing allows the company to save costs.
There are several instances of content moderators suffering from post-traumatic stress disorder (PTSD) because of their jobs. Selena Scola, a former Facebook moderator, sued the company for failing to protect her from the trauma she suffered while combing through thousands of disturbing images and videos.
The suit alleged that Facebook ignored guidelines for handling traumatic content drawn up in 2015 by the Technology Coalition. The guidelines recommend extensive psychological screening for new employees and mandatory counselling on the job, as well as technical measures to reduce the impact of the content.
Apart from PTSD and panic attacks, The Verge’s Casey Newton reported that moderators also experienced symptoms of secondary traumatic stress – a disorder that can result from observing first-hand trauma experienced by others. The disorder, whose symptoms can be identical to those of post-traumatic stress disorder, is often seen in physicians, psychotherapists and social workers.
Most content moderators deal with the trauma of encountering graphic content on a daily basis by self-medicating with drugs or alcohol. Facebook content moderators are allowed only nine minutes of wellness time if they are feeling traumatised.
There is no proper counselling support available to them; most on-site counsellors remain largely passive.
Facebook has responded by disabling the saving of any data that is reviewed. Other measures taken by the company include building a ‘global resilience team’ to improve the well-being of employees, offering post-employment counselling, and developing tools to moderate the effects of harmful videos.
Arun Chandra, the vice-president of scaled support and the person responsible for managing Facebook’s growing contractor workforce, has initiated some reforms as well. In May, Facebook announced that it will raise contractor wages by $3 an hour, make on-site counsellors available during all hours of operation, and develop further programmes for its contractor workforce. That being said, the pay raises are not due to take effect until the middle of 2020.
It is important, however, to open up a larger debate about content moderators, their working conditions and the mental health problems they face. As the virtual space expands, there are those who perform the job of its guardians, conducting safety checks to protect users from unwanted and inappropriate content. But the guardians themselves need guarding from the violent online content they face day in and day out.