Community Moderation

What is community moderation?

Moderation means different things to different people, and a large part of an organisation's approach to moderation is contingent on its understanding of what moderation is and what it is trying to achieve.

For most digital platforms, moderation refers to the practice of identifying and deleting content by applying a pre-determined set of rules and guidelines. RNW Media, for example, implements a community moderation approach which aims to encourage young people to engage in respectful dialogue, allowing everyone to have a voice. Careful, strategic moderation of online conversations helps build trust among community members, who then feel safe to express themselves freely. This, in turn, nurtures diverse and resilient communities with strong relationships among members.

With RNW Media as an example, below are some possible objectives of community moderation:

  • RNW Media wants to bring people together to engage with one another and with different points of view
  • RNW Media wants young people to think critically about the challenges facing their society and, in the case of Love Matters, in relation to their SRHR
  • RNW Media wants to increase users’ knowledge, and in some cases, change their attitudes as a result of exposure to our platforms and communities; we therefore encourage a diversity of viewpoints
  • RNW Media wants to create a safe space where women are encouraged and feel safe to participate

Moderation is a tool to build a vibrant, respectful and inclusive digital community. It is an important means of communicating with users and allows users to be heard, creating knowledge-exchange circles among moderators, experts and users, as well as among users themselves. Moderation encourages users to actively participate in the conversation, and it creates a safe space where young people not only feel able to take part in discussions but benefit from doing so.

From a more operational perspective, moderation offers us the chance to set the tone of discussion on certain content, bring users back on topic if they go off on a tangent, defuse tense conversations, provide additional information or answer users' questions, and promote a healthy, safe and respectful dialogue.

Online abuse and harassment can discourage people from discussing important issues or expressing themselves, and can mean they give up on seeking out different opinions. Therefore, discussions on our platforms are moderated with the aim of ensuring they remain a safe place for discussion and debate.

Example of Moderation Guidelines: Love Matters Global Network

MODERATION GUIDELINES

  • Moderation responsibilities
  1. Moderators should monitor social media regularly – at least once or twice a day. Engagement keeps a community enthused; if left unmonitored, sites and pages can quickly become filled with spam and offensive or negative comments that ultimately drive users away.
  2. It is important that moderators are available to manage the distribution of new content – particularly on norm topics – to set the tone, keep the discussion on topic and, ultimately, help the conversation go further. They should work closely with community managers and marketeers and know when content will be published.
  3. Moderators enforce the community guidelines and ensure the community is safe and that users feel able to join the discussion.
  4. Moderators teach by example, challenging unsavoury viewpoints and demonstrating how to have robust yet respectful conversations.
  • Categorising user comments

Categorising users’ comments and intentions ultimately relies on moderators’ judgement. Moderators don’t need to record and categorise every user comment, but when deciding how to respond this can be a useful tool. Below is a brief description of what to expect.

  1. Supportive/constructive: a user responding to the page or another user with the intention of having a respectful discussion.
  2. Inquisitive: users asking a question, clarifying content, or requesting more information on the topic.
  3. Negative/unconstructive: a negative response from a user to another user or the page, not really seeking deeper conversation. In the polarisation context, this would be monologuing.
  4. Antagonistic: a user participating in the conversation but not with good intentions, really aiming to stir up discord.
  5. Abusive/offensive: this can be anything from threats of violence to hate speech.
  • How to deal with negative or antagonistic comments

If a user is posting negative comments, these should be treated with caution but not removed. A forum with only positive comments is not what LMGN partners are trying to achieve – genuine discussion is!

Moderators should monitor negative/unconstructive comments but not act unless such commenting threatens to dominate the entire conversation, in which case they need to review why people are being so critical.

If negative commenting is inaccurate, it is important to add content or additional information that resolves the inaccuracies or offers an alternative view. Alternatively, the moderator can ask a negative commenter to elaborate on their point.

Antagonistic comments are usually the work of trolls, who purposefully aim to derail the conversation and prevent a meaningful discussion from taking place.

With negative or antagonistic comments, moderators have essentially three options:

  • Challenge: This could take the form of questioning the user, providing information or research to dispute their claims, or directing the user to the community guidelines.
  • Ignore comments: This is a way of managing the conversation, since unconstructive comments don’t get extra visibility.
  • Hide comments: This should be applied to repeat offenders, at the moderator’s discretion, and should be accompanied by a private message. If a user becomes a nuisance on the page and is unwilling to follow the community guidelines after warnings, consider banning them.
  • How to deal with abusive or offensive comments

Abusive comments include offensive, obscene or discriminatory comments, personal attacks and incitements to violence. They should not be tolerated under any circumstances. If abusive or offensive comments are made, the moderator should hide the comments as soon as they are seen.

Depending on the moderator’s judgement, they should message the user and either inform them that this is not that kind of community or deliver a yellow card. If a user offends repeatedly, or it’s obvious they are a spammer, consider blocking them – but be transparent and consistent. Never ban someone just for being critical or having a controversial opinion; if in doubt, assess the comments against your community guidelines. After banning a user, log the circumstances in a report and send it to the relevant team members.

  • How to acknowledge and reinforce respectful user practices

It is equally important to acknowledge users respectfully participating in the conversation and abiding by the community guidelines. Moderators can do this by thanking users for their contributions, liking their comments (or replies) and thus giving prominence to these comments in the thread, or replying to users in a positive manner.

  • How to defuse polarisation

When it comes to dealing with polarised discussions online, we’re already quite advanced, and a couple of the techniques for defusing polarisation are already ingrained in our approach – for example, focusing on aspirations and moderating conversations in a non-judgemental way.

It’s important to recognise polarisation when it is encountered. Polarisation in its simplest form is: “we are right, they are wrong”. It is an artificial construction of identities: people are targeted by narrow identity communication and pushed to choose sides. Pushers – those at the extreme poles who fuel the us-versus-them dynamic – try to lure them into polarisation. In polarised situations, the definition of the problem, and who owns it, is often unclear.

  • How to deal with polarised situations
  1. Change the target audience. Pushers portray the other pusher as the enemy and target the middle ground. Countering this could take the form of ignoring extreme, polarised positions and instead seeking out and highlighting the opinions of the middle.
  2. Change the topic. Move away from the identity construct chosen by the pushers and start a conversation on the common concerns and interests of those in the middle ground. Apply the aspirational approach: you already apply it in your content, so bring the conversation back to the issues that all young people are experiencing – the price of rice, for example.
  3. Change position. Don’t act above the parties or in between the poles, but move towards the middle ground. Stop trying to build bridges from a position above the poles and instead take a position in the middle (connected and mediating).
  4. Change the tone. This is not about right or wrong, or about facts. Use mediating speech and try to engage and connect with the diverse middle ground, applying a non-judgemental approach to moderation. Moderators should not moralise or ask who is guilty, but should focus on developing mediating speech and behaviour. (Source: Understanding the Dynamics of Us Versus Them by Bart Brandsma)