How Moderation Can Create A Safe Online Space

Last week I wrote a post on my personal blog about the practicalities of how a one-click report abuse button for Twitter might work. As a professional moderator I thought I might be able to offer a counterpoint to several media comment articles that had been published, pooh-poohing the very idea. To, you know, the dozen or so people who'd probably read it. I'll leave you to imagine my surprise when it racked up 4,000 page views in 24 hours.

Most of the feedback was positive, but some of the comments on the post and on Twitter made me realise that there's still some confusion about the separation between site guidelines, or rules, and the moderators whose job it is to apply those guidelines.

Guidelines are set by the site owners. Platforms like Facebook and Twitter set their own minimum baseline rules to assess reported content against (the volume of posts is far too great to check every item). Page owners on Facebook can then impose more stringent rules on top of those, in the same way that other sites and forums set their own standards, e.g. the acceptable level of swearing and so on.

(As an aside: one thing I think needs to change, and fast, is for all sites and Facebook pages to list comprehensive site rules in plain English. When content is deleted, users tend to complain that the site or moderator is censoring them. In reality they'll have used a banned swearword, gone beyond what I'm going to call 'spirited' debate into pure abuse, or accidentally broken a law. A clear list of rules, easy to find, could go a long way to restoring trust online.)

So the moderators really don't make it up on the fly or delete your comment because we hate One Direction and that's your profile picture. We're working to guidelines. We can see them and you can't, and that can be confusing. But something I was asked on Twitter goes deeper: can the moderators be trusted to assess content fairly? It's our job to implement site guidelines without fear or favour, but everyone has internal biases. Can a white person moderate a discussion about race? Can a man moderate a discussion about feminism? Can a non-transgender person moderate a discussion about transgender issues?

There are people who've had bad experiences with moderation, and they say no: moderation can't be done fairly, because the person with their finger hovering over the delete button can't possibly make a neutral judgement about what is or isn't acceptable.

Thing is, if we follow that argument to its logical conclusion, we end up either with nobody moderating anything or with the whole process automated (which isn't a good idea, since machines can't yet gauge context or nuance). And given the recent rape and bomb threats on Twitter, and the suicides linked to online bullying, that's just not an option any more. No model is 100% foolproof, but here's how Tempero tries to make sure everyone gets dealt with equally:

  1. Training: new employees go through a four-day induction course to make sure they're fully versed in the practical and theoretical sides of moderation. (I don't think the inductions from all my previous jobs added together come to four days.) We also go through client-specific training whenever we start working on a new project.
  2. Oversight: managers keep an eye on what we're removing (and not removing) and give us regular feedback. A moderator with an axe to grind will be quickly picked up.
  3. Distance: we're not active participants on the sites we moderate, so we have nothing to gain from steering a conversation in a particular direction or silencing a line of argument. It also means we're not susceptible to users clamouring for a comment to be deleted or a user to be banned: we have no personal loyalties to maintain, so we can make a judgement based purely on the rules.

I'm aware that what I've written here is basically an advert for my job and my employer, but I'm OK with that. I'm proud of what we do. It can be a tough slog sometimes but I believe in moderation that creates a safe space for everyone. And the internet definitely needs to be safer than it is right now.

Rachel has been a Tempero moderator for the past 2 years. During the rare times she's not online she's at the theatre or comedy gigs, making parkin and fussing over her neighbours' cats.