From the dark side to the light: The changing face of moderation

Until recently, many people were unaware that moderators even existed. In some ways this was unsurprising. Too often, offensive and inappropriate comments stayed in the public domain even after the platform hosting them was challenged. The usual reply was that the content “didn’t break the guidelines” or was “free speech”, protected under the First Amendment if in the US. So on the few occasions when content was removed, there was a tacit acceptance that it had happened by magic, the work of the ‘AI fairy’: a modern-day equivalent of The Elves and the Shoemaker, only not as efficient.

The dark side of moderation

Nowadays it’s different. Not only do we know that moderators exist, but enough information is circulating about what they do and the conditions they work under that the question has become: ‘Why would anyone want that job?’ Especially if you’ve seen Riesewieck and Block’s documentary The Cleaners, or read any of the recent press articles with titles like ‘The Trauma Floor’, ‘The Impossible Job’ and ‘Underpaid and Overburdened’. Together they shine a light on what it’s like to work as a moderator for the world’s biggest platform, Facebook. And it’s not pretty.

From the outset, Facebook’s moderators review only reported content - in other words, content that has broken, or potentially broken, the platform’s guidelines. That means they spend all day looking at the worst material the internet can throw at them. And we mean the worst: terrorist beheadings, live suicides, child abuse. In short, if you can imagine it, someone somewhere has probably done it, filmed it and uploaded it to Facebook or YouTube.

The challenge of moderator support

Then there are the complaints that they feel inadequately supported. The counsellors and Employee Assistance programs are so ineffective that some moderators de-stress by taking drugs or having sex at work. A number of ex-moderators have sued Facebook, claiming the work gave them PTSD. On top of all that, the moderation guidelines are often contradictory, yet moderators who make more than a handful of errors a week can be fired. One Quality Assurance worker, whose job it was to evaluate their work, felt so physically threatened by moderators desperate to keep their jobs that he started carrying a gun to work.

And these aren’t conditions in some outpost in Asia, where, to be fair to Facebook, many companies have taken an ‘out of sight, out of mind’ approach to employee welfare when it saves money. This is happening in Phoenix, Arizona, where moderators currently earn the princely sum of $15 an hour.

So why would anyone want to be a moderator when they’re clearly getting the short end of a very ‘sticky’ stick? Some do it because they want to say they work at Facebook; others because they genuinely want to make a difference; but mostly it’s because they need the job.

Moderation making a real difference

But it doesn’t have to be this way. Facebook’s problems are ultimately the result of its scale: it has imposed a single global set of rules across countries with very different cultural, political and sociological needs. Yet moderators lucky enough to work on smaller, more independent platforms have a completely different experience.

Less offensive material

Smaller platforms receive significantly less offensive material, so their moderators suffer less emotional stress and anxiety, and are less likely to ‘burn out’ after a year, as many Facebook moderators do.

Greater support & feeling valued

Moderators on smaller platforms tend to feel more supported, less rushed, and have greater access to the people who actually make the decisions, which means they receive less contradictory and conflicting guidance than those working for the larger networks. Most importantly, they feel part of the team - not just the moderation team, but the wider company too. They feel valued by their employers, knowing they’re an integral part of a business that would fail without them.

Making a positive difference

And finally, on a personal level, they tend to feel that what they do makes a real difference and that their input positively shapes the platform’s user experience - even to the point of getting to know individual users’ likes and dislikes and how they interact with others.

In other words, moderation doesn’t have to mean wading through human sewage for eight hours a day. On smaller platforms, with the appropriate support, comprehensive training and a genuine appreciation of the moderator’s worth, it can be so much better: the 2.0 version of a valued, happy and efficient human ‘elf’.