Moderation means different things to different people and a large part of an organisation’s approach to moderation is contingent on their understanding of what it is and what they are trying to achieve.
For most digital platforms, moderation refers to the practice of identifying and deleting content by applying a pre-determined set of rules and guidelines. RNW Media, for example, implements a community moderation approach that aims to encourage young people to engage in respectful dialogue, allowing everyone to have a voice. Careful, strategic moderation of online conversations helps build trust among community members, who then feel safe to express themselves freely. This, in turn, nurtures diverse and resilient communities with strong relationships among members.
With RNW Media as an example, below are some possible objectives of community moderation:
Moderation is a tool to build a vibrant, respectful and inclusive digital community. It is an important tool for communicating with users and allows them to be heard, creating knowledge-exchange circles among moderators, experts and users, as well as among users themselves. Moderation encourages users to actively participate in the conversation and creates a safe space where young people feel not only able to take part in discussions but also benefit from doing so.
From a more operational perspective, moderation offers us the chance to set the tone of discussion around certain content, bring users back on topic if they go off on a tangent, defuse tense conversations, provide additional information or answer users’ questions, and foster a healthy, safe and respectful dialogue.
Online abuse and harassment can discourage people from discussing important issues or expressing themselves, and can lead them to give up on seeking out different opinions. Therefore, discussions on our platforms are moderated with the aim of ensuring they remain a safe place for discussion and debate.
MODERATION GUIDELINES
Categorising users’ comments and intentions ultimately relies on moderators’ judgement. Moderators do not need to record and categorise every user comment, but doing so can be a useful tool when deciding how to respond. Below is a brief description of what to expect.
If a user is posting negative comments, these should be treated with caution, but not removed. A forum with only positive comments is not what LMGN partners are trying to achieve; genuine discussion is.
The moderator should monitor negative or unconstructive comments but not act unless they threaten to dominate the entire conversation, in which case the moderator should review why people are being so critical.
If negative commenting is inaccurate, it is important to add content or additional information that resolves the inaccuracies or offers an alternative view. Alternatively, the moderator can ask the negative commenter to elaborate on their point.
Antagonistic comments are usually the domain of trolls, who purposefully aim to derail the conversation and prevent a meaningful discussion from taking place.
With negative or antagonistic comments, moderators essentially have three options:
Abusive comments include offensive, obscene or discriminatory comments, personal attacks and incitements to violence. They should not be tolerated under any circumstances. If abusive or offensive comments are made, the moderator should hide them as soon as they are seen.
Depending on the moderator’s judgement, they should message the user and either inform them that this is not that kind of community or deliver a yellow card. If a user offends repeatedly, or it is obvious they are a spammer, consider blocking them, but be transparent and consistent. Never ban someone simply for being critical or for holding a controversial opinion. If in doubt, assess the comments against your community guidelines. After banning a user, log the circumstances in a report and send it to the relevant team members.
It is equally important to acknowledge users respectfully participating in the conversation and abiding by the community guidelines. Moderators can do this by thanking users for their contributions, liking their comments (or replies) and thus giving prominence to these comments in the thread, or replying to users in a positive manner.
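The guidance above is, of course, a human judgement process rather than an automated one. Purely as an illustration, the comment categories and first responses described in this section can be sketched as a simple triage function; the function name, category labels and action strings below are hypothetical and not part of any real moderation tool.

```python
# Illustrative sketch only: maps the comment categories named in the
# guidelines above to the suggested first response. All names here are
# hypothetical; real moderation decisions rest on the moderator's judgement.

def triage(comment_type: str, repeat_offender: bool = False) -> str:
    """Return a suggested first moderator action for a comment category."""
    if comment_type == "positive":
        return "acknowledge"          # thank the user or like the comment
    if comment_type == "negative":
        return "monitor"              # keep visible; watch the thread
    if comment_type == "inaccurate":
        return "add information"      # correct it or offer an alternative view
    if comment_type == "antagonistic":
        return "do not engage"        # likely trolling; avoid derailment
    if comment_type == "abusive":
        # Hide immediately; block only repeat offenders or obvious spammers,
        # and log the circumstances if a ban is applied.
        return "hide and block" if repeat_offender else "hide and warn"
    return "review against community guidelines"
```

For example, `triage("negative")` returns `"monitor"`, while `triage("abusive", repeat_offender=True)` returns `"hide and block"`.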
When it comes to dealing with polarised discussions online, we are already quite advanced in what we do, and a couple of the techniques for defusing polarisation are already ingrained in our approach, such as focusing on aspirations and moderating conversations in a non-judgemental way.
It is important to recognise when polarisation is occurring. In its simplest form, polarisation is “we are right, they are wrong”. It is an artificial construction of identities: people are targeted with narrow identity-based communication that pushes them to choose sides, and “pushers” try to lure them into polarisation. In such situations, the definition of the problem, and who owns it, is often unclear.