What Is Content Moderation, and Why Outsource Content Moderation Services?
On Internet websites that invite users to post comments, links, images, videos and just about any other form of user input, moderation is a big concern. Content moderation is the method the site provider or webmaster uses to separate contributions that are irrelevant, obscene, illegal or insulting from those that are useful or informative.
Various types of Internet sites permit user input, such as Internet forums, blogs, social networking sites and news sites powered by scripts such as phpBB, a Wiki, or PHP-Nuke. Depending on the site’s content and intended audience, the host site will decide what kinds of user comments are appropriate and then delegate the responsibility of sifting through site contents to moderators. These moderators will attempt to eliminate trolling, spamming, or flaming, although this varies widely from site to site.
That is where content moderation as a service comes into play. With the huge amount of data being transmitted, posted and viewed on websites, content regulation is a definite need. Content moderation is a hands-on regulation and review system for uploads, posts, and links that ensures protection against unethical, inflammatory and illegal content on a serviced website. Moderation services cover media uploads, text posts, and miscellaneous files, attachments and links. This is where the need for outsourced content moderation services arises.
A lot of big names use outsourced content moderation services to maintain a specific level of dignity and keep their sites as inviting as possible. Facebook uses moderation to filter photos, posts, links, applications and other content to ensure that no problems arise from user-to-user conflicts or service-to-user issues. Another big name on the Internet is YouTube, which uses moderation to filter inappropriate video and audio content.
Recognizing the need for data security, website reputation, and user-related concerns, New Media Services offers “WWW Moderation” – a moderation system geared towards clients who want professional service for their website needs. This service covers Images, Videos, Wallpapers, Ringtones, and Music – Blogs, Comments, and Descriptions – Tags, Status Updates, Usernames, and Avatars.
Moderation is generally defined as staying within reasonable limits and avoiding what is excessive or extreme. In the context of online communities, content moderation refers to the practice of monitoring submissions and applying a set of rules that define what is acceptable and what is not. Unacceptable content is then removed.
There are 8 common types of moderation which must be considered when deciding how to maintain some sense of order within the community.
1. Pre-moderation
When someone submits content to the website and it is placed in a queue to be checked by a moderator before it becomes visible to all, this is pre-moderation. Pre-moderation has the benefit of ensuring that content deemed undesirable, particularly libelous content, is kept off the visible community sections of the site. It is also a popular moderation choice for online communities targeted at children.
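The queue mechanism described above can be sketched in a few lines. This is a minimal illustration, not a real moderation system; all class and field names here are made up for the example.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Submission:
    author: str
    text: str
    visible: bool = False  # hidden until a moderator approves it

class PreModerationQueue:
    """Holds submissions until a moderator approves or rejects them."""
    def __init__(self):
        self._pending = deque()
        self.published = []

    def submit(self, submission):
        # Nothing goes live immediately: every submission waits for review.
        self._pending.append(submission)

    def review_next(self, approve):
        # A moderator pulls the oldest submission and decides its fate.
        submission = self._pending.popleft()
        if approve:
            submission.visible = True
            self.published.append(submission)
        return submission

queue = PreModerationQueue()
queue.submit(Submission("alice", "Great article!"))
queue.submit(Submission("bob", "spam spam spam"))
queue.review_next(approve=True)   # alice's comment goes live
queue.review_next(approve=False)  # bob's comment is discarded
```

The delay the article warns about lives in that queue: until `review_next` runs, the participant's submission is invisible.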
While pre-moderation provides a high level of control over what community content ends up being displayed on the site, it has many downsides. Commonly blamed for the death of online communities, it creates a lack of instant gratification on the part of the participant, who is left waiting for their submission to be cleared by a moderator. In turn, conversational content becomes stilted and judders to a halt if the time delay between submission and display is too long. The other disincentive to pre-moderation is the high cost involved if and when your community grows and submissions exceed the volume your team of moderators can manage.
It is most suited to communities with a high level of legal risk, such as celebrity-based ones, or communities where child protection is vital. If content is not conversational or time-sensitive, such as reviews or photos, it can also be deployed without affecting the community’s dynamic too much; in this case, outsourced content moderation services can approve the good content and reject the unwanted content.
2. Post-moderation
In an environment where active moderation must take place, post-moderation is a better alternative to pre-moderation from a user-experience perspective: all content is displayed on the site immediately after submission, but replicated in a queue for a moderator to pass or remove afterward.
The main benefit of this type of moderation is that conversations take place in real time, which makes for a faster-paced community. People expect a level of immediacy when interacting on the web, and post-moderation allows for this while also letting moderators ensure that security, behavioral and legal problems are identified and acted upon in a timely manner.
Unfortunately, as the community grows, the cost can become prohibitive. In addition, because each piece of content is viewed and approved or rejected, the website operator legally becomes the publisher of the content, which can prove too great a risk for certain communities, such as gossip sites that attract salacious and potentially defamatory submissions. Given that the number of times content is viewed directly affects the size of damages awarded should a court case result from publication of a submission, a short time frame for reviewing content is advisable; in this case, outsourced content moderation services can keep the good content and remove the unwanted content.
3. Reactive moderation
Reactive moderation relies on community members to flag content that is either in breach of the House Rules or that members deem undesirable. It can be used alongside pre- and post-moderation as a ‘safety net’ in case anything gets past the moderators, or, more commonly, as the sole moderation method.
The members themselves essentially become responsible for reporting content they feel is inappropriate as they encounter it on the site or community platform. The usual approach is to include a reporting button on each piece of user-generated content which, when clicked, files an alert with the administrators or moderator team so that the content can be reviewed and, if it breaches the site’s rules of use, removed.
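The report-button flow might be sketched like this. The `ReportTracker` class and the three-report escalation threshold are assumptions made for the example, not part of any real platform.

```python
from collections import defaultdict

# Hypothetical threshold: content reported by this many distinct
# members is escalated to the moderator queue.
REVIEW_THRESHOLD = 3

class ReportTracker:
    """Collects user reports and escalates flagged content."""
    def __init__(self):
        self.reports = defaultdict(set)  # content_id -> set of reporters
        self.escalated = []              # content awaiting moderator review

    def report(self, content_id, reporter):
        # Count each member at most once per piece of content,
        # so one user clicking repeatedly cannot force an escalation.
        self.reports[content_id].add(reporter)
        if (len(self.reports[content_id]) >= REVIEW_THRESHOLD
                and content_id not in self.escalated):
            self.escalated.append(content_id)

tracker = ReportTracker()
for user in ("ann", "ben", "cara"):
    tracker.report("post-42", user)
```

Nothing is removed automatically here; escalation only puts the item in front of a moderator, matching the flag-then-review flow described above.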
The main advantage of this method of moderation is that it can scale with community growth without putting extra strain on moderation resources or costs. It can also, in theory, shield the site from responsibility for defamatory or illegal content uploaded by its users, as long as a process for removing such content upon notification within an acceptable timeframe is in place.
However, if the company is particularly concerned about how its brand is perceived, it might not be willing to take the risk that some undesirable content will be visible on the website for any period of time, since the model relies entirely on community members actually seeing, and bothering to report, content that should not be posted, viewed or shared. Moreover, reactive moderation is thought to provide legal protection.
4. Distributed moderation
Distributed moderation is still a somewhat rare method of moderating user-generated content. It usually relies on a rating system through which members of the community vote on whether submissions are in line with community expectations or within the rules of use. Control of comments or forum posts thus resides mostly within the community, usually with guidance from experienced senior moderators.
Expecting the community to self-moderate is a direction very few companies are willing to take, for legal and branding reasons. For this reason, a distributed moderation system can also be applied within an organization, using several members of staff to process contributions and aggregating an average score to determine whether content should remain public or be reviewed. Distributed moderation comes in two types: user moderation and spontaneous moderation.
User moderation
User moderation allows any user to moderate any other user’s contributions. On a large site with a sufficiently large active population, this usually works well, since relatively small numbers of troublemakers are screened out by the votes of the rest of the community. Strictly speaking, wikis such as Wikipedia are the ultimate in user moderation, but in the context of Internet forums, the definitive example of a user moderation system is Slashdot.
For example, each moderator is given a limited number of “mod points,” each of which can be used to moderate an individual comment up or down by one point. Comments thus accumulate a score, which is bounded to the range of −1 to +5 points. When viewing the site, a threshold can be chosen from the same scale, and only posts meeting or exceeding that threshold will be displayed. This system is further refined by the concept of karma: the ratings assigned to a user’s previous contributions can bias the initial rating of the contributions he or she makes.
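The mod-point, threshold and karma scheme above can be sketched as follows. This is a minimal illustration of the mechanism as described, not Slashdot’s actual implementation; the class and function names are made up.

```python
class Comment:
    def __init__(self, initial_score=1):
        # Karma can bias the starting score, e.g. a poster with good
        # karma might start at 2 instead of the default 1.
        self.score = initial_score

    def moderate(self, delta):
        # Each mod point moves the score up or down by one point,
        # clamped to the -1..+5 range described in the text.
        self.score = max(-1, min(5, self.score + delta))

def visible_comments(comments, threshold):
    """Return only comments at or above the reader's chosen threshold."""
    return [c for c in comments if c.score >= threshold]

good = Comment(initial_score=2)   # poster with positive karma
troll = Comment()
for _ in range(4):
    troll.moderate(-1)            # repeatedly modded down, floor at -1
good.moderate(+1)
```

A reader browsing at threshold 1 would see only the `good` comment; the clamped floor of −1 stops a pile-on from driving a score arbitrarily low.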
On sufficiently specialized websites, user moderation will often lead to groupthink, in which any opinion that is in disagreement with the website’s established principles (no matter how sound or well-phrased) will very likely be “modded down” and censored, leading to the perpetuation of the groupthink mentality.
Spontaneous moderation
Spontaneous moderation is what occurs when no official moderation scheme exists. Without any ability to moderate comments, users will spontaneously moderate their peers through posting their own comments about others’ comments. Because spontaneous moderation exists, no system that allows users to submit their own content can ever go completely without any kind of moderation.
5. Automated moderation
In addition to all of the above human-powered moderation systems, automated moderation is a valuable weapon in the moderator’s arsenal. It consists of deploying various technical tools to process UGC (user-generated content) and apply defined rules to reject or approve submissions. The most typical tool is the word filter, in which a list of banned words is entered and the tool either stars out the word, replaces it with a defined alternative, or blocks or rejects the message altogether. A similar tool is the IP ban list. There are also a number of more recent and sophisticated tools being developed, such as those supplied by Crisp Thinking. These include engines that allow for automated conversational pattern analytics and relationship analytics.
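A minimal combination of the two classic tools named above, a word filter and an IP ban list, could look like this. The banned-word list, the IP addresses and the function name are illustrative only.

```python
import re

BANNED_WORDS = {"badword", "slur"}   # illustrative banned-word list
BANNED_IPS = {"203.0.113.7"}         # illustrative IP ban list

def moderate_message(text, sender_ip):
    """Apply the IP ban list, then star out any banned words."""
    if sender_ip in BANNED_IPS:
        return None  # banned sender: reject the message outright

    def star_out(match):
        # Replace each banned word with asterisks of the same length.
        return "*" * len(match.group())

    pattern = re.compile(
        "|".join(re.escape(word) for word in BANNED_WORDS),
        re.IGNORECASE,
    )
    return pattern.sub(star_out, text)
```

For example, `moderate_message("that badword again", "198.51.100.1")` returns `"that ******* again"`, while any message from the banned IP is dropped before filtering even runs.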
6. Supervisor moderation
Also known as unilateral moderation, this kind of moderation system is often seen on Internet forums. A group of people is chosen by the webmaster (usually on a long-term basis) to act as delegates, enforcing the community rules on the webmaster’s behalf. These moderators are given special privileges to delete or edit others’ contributions and/or exclude people based on their e-mail address or IP address, and generally attempt to remove negative contributions throughout the community.
7. Commercial content moderation (CCM)
Commercial content moderation is a term applied to describe the practice of “monitoring and vetting user-generated content (UGC) for social media platforms of all types, in order to ensure that the content complies with legal and regulatory exigencies, site/community guidelines, user agreements, and that it falls within norms of taste and acceptability for that site and its cultural context.” While at one time this work may have been done by volunteers within the online community, for commercial websites it is largely achieved by outsourcing the task to specialized companies, often in low-wage areas. Employees work by viewing, assessing and deleting disturbing content.
8. No moderation
Operating without moderation is a risky business, but there are all sorts of reasons why a company might choose not to regulate the content submitted by its members in any way. One reason is that the company lacks the resources or finances; another is simply that the CEO doesn’t believe in any form of molding or control of content. Without moderation, however, the community will quickly descend into anarchy, and the atmosphere will probably become so unpleasant that it turns off potential new members. Basically, without moderation you have no control over your community, which leaves you wide open to all sorts of abuse, both anti-social and illegal.
What are the advantages of content moderation for the brand?
Firstly, it helps to control the brand’s reputation. Reputations are easily formed online, and user-generated content, or UGC, in the form of comments, social media posts and other content does show up in search engine results. Moderation makes it possible to project the positive image the brand needs to prosper.
Secondly, it provides real-time moderation around the clock, avoiding delays in pulling inappropriate content and allowing beneficial content to post, naturally giving the brand’s content marketing a boost.
Finally, a website with moderation holds more credibility in the eyes of consumers, because UGC is controlled to remove whatever is inappropriate or spam. Content moderation, whichever type is preferred, leads to a trustworthy brand name and reputation, attracting readers and contributors.
Why Outsource Content Moderation Services?
Outsourcing will save you money
Outsourcing content moderation services will save you money. Recruiting professionals to do these tasks in-house might cost you ten times more, since you would need to hire several professionals: a developer, a web and graphic designer, a marketer, a copywriter/content writer, and an SEO expert.
Outsourcing will save you time
Outsourcing content moderation services will save you time: thousands of hours on content creation, content distribution, website monitoring, social marketing, web development and, of course, recruiting professionals.
Outsourcing will improve quality
Outsourcing content moderation services will improve quality: a certified team of experts following best practices in marketing and web development has gained broad experience promoting many services and products, and can apply that experience to promoting yours.
Outsourcing will keep you focused
Outsourcing content moderation services lets you focus on your core business, and this is definitely the most important reason to outsource.