Moderation is a tool for building a vibrant, respectful and inclusive digital community. It helps you communicate with users and allows you to bring marginalised groups into the conversation. Moderation can encourage users to participate actively and create a safe space where young people not only feel able to take part but also benefit from doing so.
Careful, strategic moderation of online conversations helps build trust among community members, who then feel safe to express themselves freely. This, in turn, nurtures diverse and resilient communities with strong relationships among members. From a more operational perspective, moderation gives you the chance to set the tone of discussion around certain content, bring users back on topic when they go off on a tangent, defuse tense conversations, provide additional information or answer users’ questions, and promote a healthy, respectful dialogue. Online abuse and harassment can discourage people from discussing important issues or expressing themselves, and can lead them to give up on seeking out different opinions. Discussions should therefore be moderated with the aim of keeping them a safe place for conversation and debate.
Users’ comments and intentions can be categorised based on the moderator’s judgement. Moderators don’t have to record and categorise every comment, but the descriptions below can guide them when deciding how to respond:
If you want to read more about moderation strategies and how to deal with the different types of user responses, click on the file below.
It can be very useful to set up online community guidelines. In addition, you can create policies so that all your staff know how to respond to online attempts to undermine your campaign.
If you run a successful campaign, you should anticipate some negative comments. But don’t freak out! It means you have reached someone on an emotional level, which, in itself, could be a success indicator. Trolls and hateful comments are still annoying, so here are some tips on how to deal with trolls: