The secret lives of Facebook moderators

Casey Newton, writing for The Verge:

She spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a “process executive.”

For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it’s her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world’s largest social network. None of the trainees have seen it before, Chloe included. She presses play.

The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.

This is the 21st-century coal mine, except this one destroys your mind, not your lungs. I don't want to imagine the mental health ramifications of a job like this. Nothing good, for sure.

The question that must be asked is this: if your business relies on exposing human beings to this level of psychological abuse, is it truly adding value to the world?

I was disappointed to miss the Independent Lens documentary on a similar subject, The Cleaners. Hopefully it will be available online soon. Here's a behind-the-scenes piece on it.