As we learned in the previous unit, community moderation aims to create a safe environment in which people are encouraged to think critically and engage in constructive discussion. Applying moderation techniques can support online engagement. But how do we know whether there really is more engagement when we moderate our communities? Can we measure this?
One way to measure the effectiveness of moderation, and thus find out whether it increases engagement on your platform, is to conduct A/B tests. An A/B test is a randomised experiment that compares two groups, each exposed to a different version of the same content.
In the case of moderation, you can expose one group to a moderated post and the other to an unmoderated post, while keeping all other factors the same. Differences in outcomes between groups A and B can then be attributed to the treatment.
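To make the design concrete, here is a minimal sketch of how the outcomes of such a test could be compared, assuming the number of comments per user has already been collected for each group; the counts below are hypothetical, not taken from the study.

```python
# A minimal sketch of analysing an A/B test outcome. The comment counts
# below are hypothetical; in practice they would come from the platform.
from scipy.stats import mannwhitneyu

comments_a = [4, 0, 2, 7, 3, 5, 1, 6]  # group A: moderated post
comments_b = [1, 0, 0, 2, 1, 0, 3, 1]  # group B: unmoderated post

# A non-parametric test is a safe default, since comment counts
# are rarely normally distributed.
stat, p_value = mannwhitneyu(comments_a, comments_b, alternative="greater")
print(f"U = {stat}, p = {p_value:.3f}")
```

A small p-value would indicate that the moderated post attracted more comments than chance alone would explain.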
These A/B tests were conducted on different platforms of RNW Media’s Citizens’ Voice programme that use Facebook, as follows:
Facebook provides the (paid) option to advertise certain content to a selected group of people, for example based on age and location. This allows the managers of a Facebook page to promote their content to a preferred target audience.
Facebook also offers the option to run A/B tests, to check whether a certain treatment affects users differently. For each A/B test in our research, we created two identical posts, each containing a link to the article on the website followed by a question or discussion point related to the article, aiming to start a discussion among the readers of the Facebook page.
The only difference between version A and version B was that the first was moderated and the second was left unmoderated. Facebook randomly assigned people to group A or B and ensured that people who saw version A in their Facebook timeline would never see version B, and vice versa.
In addition, people who were already followers of the Facebook page would not see the post in their timeline, except in the rare case that someone tagged them into the conversation.
To analyse the discussions and find out whether moderation contributes to more engagement online, the number of words per comment can be calculated for both the moderated and the unmoderated conversation, as in the sketch below.
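A minimal sketch of that calculation, assuming the comments have been exported as plain strings; the example comments are hypothetical.

```python
def mean_words_per_comment(comments):
    """Average number of whitespace-separated words per comment."""
    counts = [len(comment.split()) for comment in comments]
    return sum(counts) / len(counts) if counts else 0.0

# Hypothetical example comments for each condition.
moderated = ["I think the article raises a fair point about education.", "Nice"]
unmoderated = ["Top", "Hello"]

print(mean_words_per_comment(moderated))    # 5.5
print(mean_words_per_comment(unmoderated))  # 1.0
```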
The quality of the comments can be determined by manually categorising and coding the comments into, for example:
- thoughtful comments that respond to the content of the article;
- feedback comments, such as “Nice” or “Top”;
- phatic comments, such as “Hello”, aimed at contact rather than content;
- comments consisting only of an emoji or GIF.
(A rough, rule-based sketch of such coding follows the note below.)
Note: Although moderators encourage users to respond with thoughtful comments, we should avoid diminishing the value of phatic or feedback comments. For some people, expressing themselves and giving a fully reasoned opinion can be a big step, especially when the content addresses a sensitive topic. Commenting with an emoticon, or simply stating “Top” or “Nice”, is already a first step towards self-expression. Likewise, people who comment “Hello” without responding to the article are most likely looking for interaction with other community members, not trying to disrupt the conversation.
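The sketch below shows how such a coding scheme could be automated as a rough first pass; the word lists and rules are illustrative assumptions, since in the research described here the coding was done manually.

```python
# A rough, rule-based first pass at coding comments into categories.
# The word lists and thresholds are illustrative assumptions only.
import re

FEEDBACK_WORDS = {"nice", "top", "great", "good"}
PHATIC_WORDS = {"hello", "hi", "hey"}
NO_LETTERS = re.compile(r"^[^a-zA-Z]+$")  # e.g. emoji-only comments

def categorise(comment):
    text = comment.strip().lower()
    if not text or NO_LETTERS.match(text):
        return "emoji/GIF"
    words = text.split()
    if len(words) <= 2 and words[0] in PHATIC_WORDS:
        return "phatic"
    if len(words) <= 2 and words[0] in FEEDBACK_WORDS:
        return "feedback"
    return "thoughtful"

for c in ["Top", "Hello", "😂", "I disagree, the article ignores rural areas."]:
    print(f"{c!r} -> {categorise(c)}")
```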
In general, the moderated posts had more comments than the unmoderated posts. But while the quantitative side of the discussions provides interesting insights, it is even more interesting to look at the quality of the comments. The number of comments consisting of an emoji or GIF is much higher in the unmoderated versions, and there is a significant relationship between moderation and the number of thoughtful comments. This supports our hypothesis that with community moderation, young people are more willing to share their views and opinions on the platform.
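One common way to check such a relationship is a chi-squared test on a contingency table of comment types per condition; the sketch below uses hypothetical counts, not the study’s actual figures.

```python
# Testing whether comment type depends on moderation, using a chi-squared
# test on hypothetical counts (not the study's actual figures).
from scipy.stats import chi2_contingency

#               thoughtful  feedback/phatic  emoji/GIF
contingency = [[45,         20,              10],   # moderated
               [15,         25,              35]]   # unmoderated

chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value suggests the distribution of comment types differs
# between the moderated and unmoderated versions.
```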
These were initial tests carried out in RNW Media’s Citizens’ Voice programme, and such tests can be conducted for a wide range of articles, comments and other types of engagement research.
Go to Assignment 7.2.3: Identify type of comments.
Can you think of other ways to measure if online moderation leads to more engagement?