Chris Gray’s work for Facebook entailed watching videos of stonings, torture and executions 
 
“Whenever I talk about the content, I just get more upset. I’ve really had to pull back on talking about it. I don’t sleep for a week before one of these interviews, and I can’t sleep afterwards,” says Chris Gray. 
The former content moderator has issued proceedings in the Irish High Court against Facebook and CPL Solutions, over psychological trauma he claims he suffered while working for the social media giant between 2017 and 2018. 
 
Facebook indirectly employs some 15,000 content moderators in 20 locations worldwide including Ireland, through agencies such as CPL. Their job is to decide what content should be allowed to stay on Facebook, what should be left up but marked as “disturbing”, and what should be deleted. 
 
Legal documents filed this week by Gray claim the material he was required to view included a video of a woman wearing an abaya being stoned to death; executions of individuals “shot at point blank range with machine guns”; the abuse and murder of the Rohingya people; videos of what appear to be migrants in Libya being tortured with molten metal; whippings, beatings and animal torture. 
 
He was not psychologically screened before he took the job with CPL. Over time, he says he began to suffer symptoms of psychological distress, becoming “numb and desensitised” to the content, and increasingly irritable, argumentative and aggressive. He says that after a time he noticed “a slow creep” whereby his own views began to change as a result of the material to which he was exposed. 
 
‘Tickets’ 
His official title was “community operations analyst”. The job involved moderating some 600 pieces of content – called “tickets” – a night. Documents filed in court by lawyers for Gray claim moderators aimed to spend an average of no more than 30 seconds on a ticket and had to maintain 98 per cent accuracy in decision-making, which meant no more than four errors a month – a target he claims was “never reasonably achievable”. 
 
Gray claims he began to obsess about distressing individual videos, after having to repeatedly review them. He has been diagnosed with “chronic adjustment reaction to traumatisation, secondary to his overall experiences at Facebook”. 
 
Facebook previously said that content reviewers were not given targets for either the time a job took or the amount of content processed, and that it instructed partners not to “put this sort of pressure on reviewers”. 
 
In a statement this week Facebook said it was committed to providing support for content reviewers and recognised that reviewing certain content can be difficult. It said all those who review content go through an in-depth training programme on community standards and have “access to extensive psychological support”. 
 
Upsetting content 
The writ filed by Gray claims that much of his training focused on the obligations of privacy and confidentiality, and that much less was offered on how to deal with graphic, violent or upsetting content. 
 
However, a spokeswoman for Facebook said everyone reviewing its content goes through “extensive training” including mandatory resiliency training, wellness breaks and access to psychological support. It is also introducing technical solutions to help moderators, such as tools to blur graphic images. 
 
The Dublin-based law firm Coleman Legal, which is representing Gray, is also acting for about a dozen other content moderators in Ireland, including one direct employee. CPL was not available for comment but previously said it takes any concerns employees raise seriously and provides extensive training. 
 