Types of Content Moderation: How to Choose the Right Method
Content may still be king, but in the times we live in, the king also needs to be moderated: toned down, regulated, or even rejected altogether.
While the world’s population of more than 7 billion is usually seen as consumers, which is probably true for most products and services produced by companies, businesses and professionals, when it comes to content they are all producers as well.
The graduation photographs shared on Instagram, the book reviews on Amazon and the greetings on Facebook are all examples of content created by these 7 billion, or at least the 4 to 5 billion who access the internet. This is known as User Generated Content (UGC): content generated by users or visitors to websites, as opposed to the ‘official’ content created by the owners of websites, blogs and other web properties.
The internet has brought about many changes in people’s lives, the power to produce content on the fly being one of them. But “with great power comes great responsibility,” as we have heard so often courtesy of the Spider-Man comic books; the saying has even come to be known as the Peter Parker principle, though it predates Spider-Man by several hundred years and was used by Prime Minister Winston Churchill in one of his addresses to the British Parliament.
Unfortunately, it appears that human beings, or at least some of us, are unable to exercise this power with responsibility. Hence, as a measure of collective responsibility, all of us pay the price. This is one of the main reasons content moderation is needed.
There are others, too.
As a new-age BPO, oWorkers has developed its expertise and grown by fulfilling present-day client needs, such as content moderation, that were not even remotely conceivable when many of the large BPO companies were established. Hence, oWorkers is uniquely positioned to manage all types of content moderation requirements of organizations.
Need for content moderation
It seems that the real culprit is content. If there were no content, there would be no need for any type of content moderation. Because there is content, it needs to be moderated.
So, the question really is, what is the need for user generated content (UGC)?
UGC on platforms
Over the last 20 years, social media platforms have come to occupy an important place in our lives. They promote interaction between people, each in its own unique way, and enable people to share content.
The content shared on these platforms becomes UGC. While most of humanity is responsible and adheres to defined and accepted rules and norms, a small percentage does not. The content this minority creates may not be wholesome and could violate the rules of the platform, the norms of accepted social behavior, or both. If left unchecked, it could poison impressionable minds with content that is extreme in violence or sexuality, cause rifts between people and groups along religious, national or other lines, or damage the social fabric in other ways.
So, why don’t we disallow UGC on these platforms?
The whole reason for the existence of these platforms is UGC. They promote communication, interaction and sharing between people and organizations and groups. If there is no UGC, these platforms would be bare shells, not vibrant communities and marketplaces. They must permit UGC to remain relevant. And if they must permit UGC, they must moderate it so that content on their platform remains wholesome.
UGC on spaces owned by companies and organizations
An online presence has become a necessity for organizations of all types; a website is akin to being ‘proof of life’ of an organization. If you don’t have a website you don’t exist.
Equally important nowadays is the need to create, develop and maintain thriving online communities which have the brand as a central theme. Many of these communities are built on different social media platforms, leveraging their tools for reaching out to and engaging people.
These communities are designed to work for the brand or organization and help generate a substantial volume of content without much effort on the part of the host. Through the easy sharing, communication and outreach facilities available on most such platforms, the brand also seeks to reach newer audiences and customer segments.
Newer customers often seek unbiased opinions about the company or brand that they can get from these communities. Given a choice between taking an opinion about a product from the seller of the product or a buyer, which one do you think would be more reliable? The buyer’s review of course. And that is the value of UGC for companies.
It seems customers have more power than they are given credit for.
Brands also need to watch out for competitors getting through their defences and subtly disparaging the brand that owns the web space while talking up their own.
The cost to companies comes in the form of moderation. For the communities and spaces they create on a platform, they need to ensure the content meets basic criteria of acceptability. Secondly, as part of moderation, they will also want to ensure that the interaction on a forum is not inimical to the interests of their brand.
Regardless of the need and where it stems from, oWorkers has the expertise to deliver the goods. Being active, contributing members of the local communities we work in positions us as a favored employer, generating walk-in traffic of candidates seeking employment. Our hiring team selects people based on the requirements of the different projects we are executing, and our dedicated training teams then polish the hired candidates and make them job-ready in a short period of time. Because our resources are employees, and not the contractors or freelancers some of our competitors seem to prefer, they have a long-term commitment to the company, which in turn invests in tracking their performance and facilitating long-term career planning.
Types of content moderation
Organizations now need a defined process for managing their online communities. Such a process helps promote the brand’s messages while keeping users as well as the brand safe, and expanding the user network.
Many people argue that ‘No Moderation’ should be given pride of place in any discussion of the types of content moderation. After all, no decision is also a decision, as it forces the continuance of the status quo. It is a decision in support of the present state of affairs; a decision for ‘no change.’
And that is perhaps a valid argument. However, ‘no moderation’ is a choice that belongs to the stage when one is still debating whether or not to moderate the UGC on one’s online community. Our present discussion assumes that choice has been exercised in favor of ‘moderate,’ and moves on to how that is to be done.
With our ready supply of resources, we can also offer clients the benefit of contingent hiring, which takes care of unexpected, or even expected, volume spikes without burdening them with the cost of those resources for the remaining period. oWorkers can hire almost a hundred additional people in as short a period as 48 hours.
Let us take the example of the process of entry of visitors to a country, variously known as passport control or visa regimen in different places.
The authorities could construct a barrier at the entry point so that credentials of all visitors could be checked and only the ones that meet the entry criteria be allowed to enter while the others are turned back.
This way the authorities are certain that only authorised, acceptable personnel are entering, limiting, even eliminating the damage that could be caused by the entry of unwelcome people. However, this entails a cost for the host country as they will need to hire a battery of resources to ensure that each visitor’s credentials are checked.
Visitors will be unhappy at being interrupted by a barrier and having to stop and submit themselves to scrutiny, possibly after a long journey. If the pre-entry checks are draconian, they could eventually reduce the flow of visitors to the nation.
Pre-moderation works in the same manner. Every piece of content posted by users is checked and authorized for publishing, and only then does it become visible to others. This ensures protection from malicious content of any kind, as it is screened out. Users, however, are less than happy, as they are subject to a censorship process and there are delays before their UGC shows up. Hence, pre-moderation can leave online communities feeling stilted and sluggish.
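The hold-and-approve flow of pre-moderation can be sketched as a simple queue. This is a minimal illustration in Python; the class and field names are invented for the example, not taken from any real platform.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    approved: bool = False  # nothing is visible until a moderator approves it

class PreModeratedFeed:
    """Every submission waits in a queue; only approved posts are published."""
    def __init__(self):
        self.pending = []
        self.published = []

    def submit(self, author, text):
        post = Post(author, text)
        self.pending.append(post)   # held back: not visible to anyone yet
        return post

    def review(self, post, ok):
        self.pending.remove(post)
        if ok:
            post.approved = True
            self.published.append(post)  # now visible to the community

feed = PreModeratedFeed()
p = feed.submit("alice", "My graduation photo!")
assert feed.published == []      # the submission is invisible until reviewed
feed.review(p, ok=True)
assert p in feed.published       # visible only after approval
```

The cost of this guarantee is exactly the delay users complain about: nothing appears until a reviewer gets to it.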
Continuing with our analogy, let us say the immigration authorities decide to dispense with the border control system and implement a different one. They decide to let everyone in, provided each visitor leaves some defined information and credentials at an appointed desk in the arrival hall. The authorities would gradually review the information as resources permit, identify the visitors who should not have been admitted, and attempt to locate and deport them.
This would no doubt ensure a smooth passage for all comers, without the hassle of queuing up and being scrutinized. The nation might be viewed as a welcoming place which might encourage more arrivals.
The authorities will save the time and money spent on real-time checks. However, if unacceptable visitors do get through, they may well have done some damage by the time they are located and sent back.
This is how post-moderation works. UGC goes live as soon as it is posted, giving life to the community and satisfaction to participants. However, by the time offensive content is identified and removed, it has been viewed by many and has left its mark. With today’s easy access to technology, screenshots of the content may also have been taken, and these can keep circulating long after the content itself is removed.
Now let us introduce one more innovation into the immigration system, to illustrate another of the types of content moderation available to us. The immigration authorities are short of funds to support checking even the information left behind at official entry points, as they are busy battling unauthorized entries across a weak border to the south.
Since aliens still need to be verified, the authorities ask each one to provide the details of one local contact person instead of the detailed information asked for earlier. Each time an alien arrives, they request this local contact to report any suspicious activities, so that an investigation can be carried out and action taken if required.
However, this system is based on the sincerity of a distributed population. Not all may be keen to participate, and some may even be complicit in the activities of the alien. This will be easy and cheap for the authorities, but could leave some illegal arrivals undetected.
In reactive moderation, community members are encouraged to provide input on submissions through buttons and tools made available to them while viewing the content, eliciting their views on whether it is relevant and should be permitted to remain. It is not a fool-proof method, but it does achieve something and can be an inexpensive option for brands with a low risk of being affected by malicious UGC.
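A reactive ‘report’ button of the kind described above can be sketched as follows. The threshold of three flags is an assumed policy value, purely illustrative.

```python
class ReactiveModeration:
    """Content goes live immediately; a 'report' button lets viewers flag it.
    Once flags cross a threshold, the post is hidden pending human review."""
    FLAG_THRESHOLD = 3  # assumed policy value, purely illustrative

    def __init__(self):
        self.flags = {}      # post_id -> number of reports received
        self.hidden = set()  # posts pulled for moderator review

    def report(self, post_id):
        self.flags[post_id] = self.flags.get(post_id, 0) + 1
        if self.flags[post_id] >= self.FLAG_THRESHOLD:
            self.hidden.add(post_id)  # escalate: hide until a moderator decides

mod = ReactiveModeration()
for _ in range(3):
    mod.report("post-17")
assert "post-17" in mod.hidden  # three reports trip the threshold
```

Note that the content stays visible until enough viewers act, which is precisely the weakness the analogy points out: the system relies on the sincerity of a distributed population.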
Now, resources and budgets get crunched further, and the immigration authorities cannot provide much support to this issue. They are not in a position to take any action even if cases are identified and highlighted by others.
In a way, they leave it to the regular law-enforcement agencies of the nation to handle the situations that might emerge as a result of permitting entry to everyone. The nation does have a budget and resources to tackle law and order issues. These resources can be deployed whether these issues emanate from wrongful acts of residents or of aliens, could they not?
This is how distributed moderation works. Community members are provided with rating and/or voting mechanisms through which they provide their input. If a piece of content is consistently rated low, it gradually gets pushed so far down the sequence that, for all practical purposes, it ceases to exist. Again, this is not fool-proof, as it depends on the commitment of community members, who need to feel ownership of the community, but it does achieve something.
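The sink-to-the-bottom effect of such rating mechanisms can be shown in a few lines. The post structure and scores here are invented for illustration.

```python
def ranked_feed(posts):
    """Sort posts by community score; heavily downvoted content sinks
    so far down that it effectively disappears, without being deleted."""
    return sorted(posts, key=lambda p: p["score"], reverse=True)

posts = [
    {"text": "Helpful review", "score": 42},
    {"text": "Spam link", "score": -17},
    {"text": "Nice photo", "score": 8},
]
print([p["text"] for p in ranked_feed(posts)])
# → ['Helpful review', 'Nice photo', 'Spam link']
```

A UI that shows only the top N entries of this ranking never surfaces the downvoted item, which is the ‘ceases to exist’ effect described above.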
As we can perhaps make out, all the types of content moderation discussed so far are manual methods. It is also fairly apparent that for companies sensitive to inappropriate content being posted, pre-moderation is the method that would work best. But it has limitations, such as obstructing the free flow of interaction within the group or community, and it is expensive because manual resources need to be deployed.
This is where automated moderation plays an important role. As automated tools can handle large volumes of data in the ‘blink of an eye,’ they can obviate the need to employ a battery of human moderators. And since they can review and act, once again, in the blink of an eye, the brand can review every piece of content before it is permitted to go live, allowing online communities to survive and thrive.
Different methods of automation are in use, from simple lists of words and phrases to be filtered out, to blocklists of IP addresses that are ‘persona non grata,’ to more advanced Artificial Intelligence (AI) based tools. These will no doubt continue to be refined and perfected as time goes by. At this point, using an automated tool alongside some method of manual moderation seems to be the sensible option for content-sensitive brands, until they gain confidence in a fully automated solution.
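A minimal sketch of rule-based automated moderation, combining a banned-word list with an IP blocklist, might look like this in Python. The word list, the blocked address (taken from a documentation-only IP range) and the function name are all invented for the example; production systems use far larger curated lists and, increasingly, ML classifiers layered on top of such rules.

```python
import re

# Illustrative lists only; real deployments maintain these at scale.
BANNED_WORDS = {"scam", "xxx"}
BLOCKED_IPS = {"203.0.113.7"}   # documentation-range address, a stand-in

def auto_moderate(text, sender_ip):
    """Return (allowed, reason) for a piece of UGC in a single pass."""
    if sender_ip in BLOCKED_IPS:
        return False, "sender IP is persona non grata"
    words = set(re.findall(r"[a-z']+", text.lower()))
    hits = words & BANNED_WORDS
    if hits:
        return False, f"banned terms: {sorted(hits)}"
    return True, "clean"

print(auto_moderate("Great product, five stars!", "198.51.100.1"))  # (True, 'clean')
print(auto_moderate("Click this scam link", "198.51.100.1"))        # rejected
print(auto_moderate("hello", "203.0.113.7"))                        # rejected
```

Because each check runs in microseconds, every submission can be screened before going live, which is what makes automated pre-moderation feasible at platform scale.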
As a GDPR compliant, ISO (27001:2013 & 9001:2015) certified company, oWorkers is uniquely positioned to handle automated solutions for content moderation, whether provided by the client, or one that we access from our scores of partnerships with technology companies. Our facilities are secure, as is the technology backbone we use for enabling employees to work from home during the Covid-19 driven ‘shelter in place’ regulations.
oWorkers handles the growing need for content moderation
The need for all types of content moderation is growing rapidly. With user generated content a vital cog in the marketing wheels of many organizations, it naturally follows that moderation grows with it. Moderation seeks to enhance a brand’s value, along with its reach and engagement.
With several unicorn marketplaces as longtime clients, oWorkers understands the challenges of this work and is equipped to handle them. With centers in three of the most sought-after delivery locations in the world, oWorkers employs a multicultural team that enables it to offer services in 22 languages.
You can leave your content moderation requirements to an expert. You can leave it to oWorkers.