What is content moderation – an overview
We may not always have the same name for it, but all of us understand what is meant by content. A dictionary may define it summarily, and somewhat confusingly, as something that is contained within something, but that does not come close to the context in which the term is used here. Of course, dictionaries are upping their game as well, as dictionary.com's definition illustrates: "something that is to be expressed through some medium, as speech, writing, or any of various arts."
If you are reading these lines and words, what you are reading is known as the content of the article with a title as given at the top of this page. You would, perhaps, already know that.
And moderation? What is moderation?
The Cambridge dictionary defines moderation as the "quality of doing something within defined limits." In other words, it is the act of being moderate and avoiding extremes.
What is content moderation?
What does one get when one combines content with moderation?
We know that a humongous amount of content is produced and consumed every day, every hour, every minute, and every second. Each time we react on social media, upload a picture to Facebook, send money through our internet banking account, send an email, or forward a WhatsApp message, content is being created. And consumed. Someone is seeing our social media reaction, someone is receiving the WhatsApp message, and someone is receiving the money we have sent.
How much data is being produced and consumed?
Domo has been publishing its annual 'Data Never Sleeps' infographic for several years; it summarizes the content being produced (and consumed) every minute on the world wide web. The eighth version was published in Q3 of 2020. According to it, these are some of the things that happen on the web IN ONE MINUTE:
- 347,222 stories are posted by Instagram users
- 4,497,420 searches are run on Google
- 41,666,667 WhatsApp messages are shared
- 69,444 jobs are applied for by LinkedIn users
- 208,333 participants are hosted by Zoom in various meetings
- 147,000 photos are uploaded by Facebook users
All this in ONE MINUTE. And the rate of sharing, posting, and uploading is only increasing.
Since the web is open for use, or abuse, by anyone and everyone, a mechanism is needed to ensure that the content being posted, that can be consumed by any of the seven billion people in the world, is suitable. While each platform has its own set of policies and guidelines, they are, in general, also guided by the many unwritten rules of civil society pertaining to interaction between humans.
Content moderation is the process through which this is done. It is the process of screening and monitoring user-generated or open content, based on platform rules as well as the rules of civil society, with a view to determining whether the content is suitable for publishing. All platforms that rely on user-generated content (referred to as UGC), such as dating sites, ecommerce platforms, social media, and marketplaces, will have some form of content moderation as a control process.
oWorkers has over eight years of experience in moderating web content for clients from all over the world. As traffic on the internet grows, so does the volume of content and the need to moderate it. oWorkers is completely focused on data-based projects such as moderation, and has been repeatedly recognized as one of the top three providers of data-based BPO services in the world. It is a provider that knows the meaning of content moderation.
The value of User Generated Content (UGC)
If UGC is such a nuisance, why don't we just disallow it? After all, the keys to the website are with its owners. They can decide the level of access to be given to visitors, and whether or not visitors should be allowed to leave their footprint on the web property.
The simple reason is that UGC creates great value for businesses. How?
- Marketing gurus say that a satisfied customer is the best advertisement for a business. A company is expected to talk up its products and services and say things that induce people to buy. In a globally connected world, consumers have access to competitive information and know this. They don't necessarily trust what you have to say about your brand. But when they see other real people, like them, talking about a product they are interested in, that holds value for them, because it is unbiased feedback. Hence businesses want UGC that is complimentary.
- It is free. While companies spend loads of money on promoting and advertising their products, they don't need to spend a penny on the feedback that organically accumulates on their web property. They just need to create and maintain the space for it. Of course, they need to ensure that the products and services they are selling deliver what they promise, else the UGC could misfire.
- The brand gets exposure. Social media users like to talk about themselves. If they like the company's products and are invested enough, the products are likely to feature in their social media messaging, spreading the word across their networks. Again, brand promotion at no cost. Some people who do this regularly and are seen as trustworthy and reliable acquire the tag of influencers, who can influence the purchasing decisions of other consumers. Some companies invest in influencers, who then need to balance their own brand value of trustworthiness with the sponsoring brand's objectives.
- Organically created UGC gives the brand good SEO karma as well, improving its search engine rankings. The more content on a website, the more opportunity search engines have to locate matches. In addition, user-generated content provides greater variety than a small group of content creators may be able to come up with. As users, contributors have more flexibility in language and are better able to mirror the terms other users are likely to search for. As a result, you get a keyword-rich website.
With such rich pickings from UGC, most self-respecting, confident brands will want to benefit from it, which makes content moderation a relevant subject for companies to know and understand.
With its deep roots in the communities where it has located its delivery centers, oWorkers attracts the best local talent, which it then trains with the help of a dedicated training team whose job is to make hired resources 'fit for purpose' for client projects, several of them content moderation projects. The trained resources become adept at handling different types of moderation requirements. Under the guidance of a leadership team with over 20 years of hands-on experience in the business, oWorkers delivers customised solutions for clients. With its ability to attract talent, oWorkers can also handle temporary spikes in volumes with aplomb, ramping up by almost a hundred people in 48 hours.
Common methods of content moderation
So, we now have answers to the ‘what is content moderation’ question. But that is not enough. It needs to be done. How is it done?
Pre-publishing
As the name suggests, in this method the content is not allowed to go live until it has passed through the process of moderation instituted by the property owner. This is the safest method of moderation and poses no risk of harmful content being published under the brand's name.
The downside of this method is that it requires a large moderation capacity, which could slow down the approval process. Content creators, who normally like to see their content become visible as soon as they create it, might lose interest in interacting with the brand. It could also call into question the spontaneity and authenticity of the content.
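The pre-publishing workflow described above can be sketched as a simple hold-and-review queue. This is a minimal illustration, not any platform's actual implementation; the review function is a stand-in for whatever human or automated check the site owner institutes.

```python
from collections import deque

def run_queue(submissions, review):
    """Hold each submission until reviewed; publish only approved items.

    `review` is a hypothetical callable representing the moderator's
    decision (human or automated). Nothing goes live until it approves.
    """
    pending = deque(submissions)
    published = []
    while pending:
        item = pending.popleft()
        if review(item):
            published.append(item)
    return published
```

The queue makes the trade-off visible: every item waits for a review pass before publication, which is exactly where the approval latency comes from.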
Post-publishing
Letting content go live and then monitoring what has been published is also a popular method of moderation. Its pros and cons are the opposite of those of pre-publishing moderation. The content gets published immediately, which creates a more vibrant and interactive community. A user posting content is perhaps most engaged at the time she has created it and is willing to do more with it. Immediate publication also raises the perception of honesty about the content.
The downside is that malicious content could be posted and do some damage before it is caught. Even if it is removed after a review, with modern technologies easily accessible, images and screenshots taken by visitors cannot be erased.
Reactive
This method relies on visitors to the website to report content that they deem to be harmful. It is then reviewed by site administrators and acted upon as they deem fit.
This method relies not only on a user's ability to identify inappropriate content but also on her engagement with the brand being of a magnitude where she will report it to the site administration rather than just move on to the next website. Hence, there are obvious gaps in this strategy. However, if you do have a set of engaged users, it can save the company a lot of money in moderation expenses.
Crowdsourcing
Visitors are asked to rate the content visible on the website, including UGC, on its suitability and appropriateness. The theory is that suitable content will get better ratings and rise to the top of the listings, becoming more visible to users. Inappropriate content, being voted down, will slip to the bottom of the lists and gradually disappear from view.
This method is slow but inexpensive to implement, and it suffers from all the shortcomings of depending on a group of users who may not have a deep commitment to the brand's objectives, even if they understand what content moderation is.
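The vote-up, vote-down mechanism amounts to ranking by net score and hiding anything below a threshold. The field names and the cutoff value below are assumptions chosen for illustration; real sites tune these however they see fit.

```python
def rank_content(items, hide_below=-3):
    """Sort items by net vote score; drop anything voted below the threshold.

    Each item is a dict with hypothetical "upvotes" and "downvotes" counts.
    Well-rated content rises to the top; heavily downvoted content vanishes.
    """
    def score(item):
        return item["upvotes"] - item["downvotes"]

    visible = [item for item in items if score(item) >= hide_below]
    return sorted(visible, key=score, reverse=True)
```

Note how the "gradually disappear" effect is just a consequence of the threshold: a piece of content needs enough downvotes, accumulated over time, before it drops out of the visible list.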
Automated
There are many automated tools that can be deployed for content moderation.
There are simple ones that rely on finding matches against a set of unacceptable words and phrases, and can either replace them with alternates or block the entire piece of content from being published.
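A minimal sketch of such a wordlist-based tool follows. The blocklist entries here are placeholder examples, and real tools maintain far larger, curated lists; this only illustrates the two behaviours described above, masking versus outright rejection.

```python
import re

# Assumed placeholder blocklist; real moderation lists are much larger.
BLOCKLIST = {"badword", "slur"}
REPLACEMENT = "***"

def moderate(text, strict=False):
    """Return (approved, cleaned_text).

    In strict mode, any match rejects the whole piece of content;
    otherwise, matched words are masked with a replacement string.
    """
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(w) for w in BLOCKLIST) + r")\b",
        re.IGNORECASE,
    )
    if strict:
        if pattern.search(text):
            return False, text  # block the entire piece of content
        return True, text
    return True, pattern.sub(REPLACEMENT, text)
```

For example, `moderate("a badword here")` masks the match and approves the text, while `moderate("a badword here", strict=True)` rejects the submission entirely.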
There are also more advanced ones that rely on technologies like Natural Language Processing (NLP) that have some ability to understand the context in which a certain word or phrase has been used and then mete out appropriate treatment to it.
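The value of context can be illustrated with a toy, hand-written rule; this is only a sketch of the idea, since production systems rely on trained NLP models rather than hard-coded word lists. The example word and its "benign neighbour" words are assumptions for illustration.

```python
# Hypothetical example: "shoot" is fine in a photography context
# but may need review elsewhere. Real NLP models learn such
# distinctions from data instead of using fixed lists like this.
AMBIGUOUS = {
    "shoot": {"photo", "film", "video"},
}

def flag_with_context(text):
    """Flag ambiguous words unless a benign neighbouring word is present."""
    words = text.lower().split()
    flags = []
    for i, word in enumerate(words):
        if word in AMBIGUOUS:
            # Look at up to two words on either side for benign context.
            neighbours = set(words[max(0, i - 2):i] + words[i + 1:i + 3])
            if not neighbours & AMBIGUOUS[word]:
                flags.append(word)
    return flags
```

Here "great photo shoot today" passes because of the neighbouring "photo", while the same word without that context gets flagged for review, which is the kind of judgement a plain wordlist filter cannot make.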
Artificial Intelligence (AI) based solutions are also fast gaining ground and could well become the go-to solutions; site owners may not even need to know what content moderation is.
In addition to its hiring and training capability, oWorkers has forged partnerships with leading technology providers, creating for itself the ability to access the right technology when needed. These technologies also benefit clients, as they are eventually deployed on their projects.
Add to this the advantages of operating from super secure facilities with protocols that further enhance the security of client data, and you have a winning combination.
oWorkers is GDPR compliant as well as ISO (27001:2013 & 9001:2015) certified. It is able to operate either from office or from home, given the constraints imposed on account of the pandemic.
The oWorkers advantage
If more reasons are needed to engage oWorkers in your content moderation journey, they operate out of three global delivery locations.
So what, you might ask? How does that help me?
This gives them access to a multicultural talent pool, through which they are able to support client moderation requirements in 22 languages. Content is created in all corners of the world and can be in any language; you can't keep adding vendors each time someone posts in a new language from a far corner of the world.
Thanks to their management practices, costs are kept in check, which is why they offer some of the best pricing amongst competitors. Clients, especially those from the US and Western Europe, report savings of almost 80% after engaging them on their projects, without any compromise on delivery quality.
You will not even need to know what content moderation means.