Types of Content Moderation: How to Choose the Right Method


Content may still be king, but in the times we live in, even the king needs to be moderated: toned down, regulated, or sometimes rejected altogether.

While the world's 7-billion-plus population is usually seen as consumers, which is true for most products and services that companies, businesses and professionals produce, when it comes to content they are all producers as well.

The graduation photographs shared on Instagram, the book review on Amazon and the greetings on Facebook are examples of content created by these 7 billion, or at least by the 4 or 5 billion who access the internet. This is known as User Generated Content (UGC): content generated by users or visitors to websites, as opposed to the 'official' content created by the owners of websites, blogs and other web properties.

The internet has brought about many changes in people's lives, the power to produce content on the fly being one of them. But "with great power comes great responsibility," as the Spider-Man comics have reminded us so often that the saying is now also known as the Peter Parker principle, even though it is known to predate Spider-Man by several hundred years and was used by Prime Minister Winston Churchill in one of his addresses to the British Parliament.

Unfortunately, it appears that some of us human beings are unable to exercise this power with responsibility, and so, as a measure of collective responsibility, all of us pay the price. This is one of the main reasons content moderation is needed.

There are others, too.

As a new-age BPO, oWorkers has developed its expertise and grown by fulfilling the needs of the present-day business corporation, such as content moderation, a requirement that was not even remotely conceivable when many of the large BPO companies were established. Hence, oWorkers is uniquely positioned to manage all types of content moderation requirements of organizations.

 

Need for content moderation

It seems that the real culprit is content. If there were no content, there would be no need for the various types of content moderation, would there? Because there is content, there needs to be moderation.

So, the question really is, what is the need for user generated content (UGC)?

UGC on platforms

Over the last 20 years, social media platforms have come to occupy an important place in our lives. They promote interaction between people, in their own unique ways, and enable people to share content that becomes UGC.

While most of humanity is responsible and adheres to defined and accepted rules and norms, a small percentage is not. The content this minority creates may not be wholesome and could violate the rules of the platform, the norms of accepted social behavior, or both. Left unchecked, it could poison impressionable minds with content that is extreme in violence or sexuality, cause rifts between people and groups along religious, national or other lines, or damage the social fabric in other ways.

So, why don’t we disallow UGC on these platforms?

The whole reason for the existence of these platforms is UGC. They promote communication, interaction and sharing between people and organizations and groups. If there is no UGC, these platforms would be bare shells, not vibrant communities and marketplaces. They must permit UGC to remain relevant. And if they must permit UGC, they must moderate it so that content on their platform remains wholesome.

UGC on spaces owned by companies and organizations

An online presence has become a necessity for organizations of all types; a website is akin to being ‘proof of life’ of an organization. If you don’t have a website you don’t exist.

Equally important nowadays is the need to create, develop and maintain thriving online communities which have the brand as a central theme. Many of these communities are built on different social media platforms, leveraging their tools for reaching out to and engaging people.

These communities are designed to work for the brand or organization and help in generating a substantial volume of content without much effort on part of the host. Through the easy sharing and communication and outreach facilities available on most such platforms, the brand also seeks to reach out to newer audiences and customer segments.

Prospective customers often seek unbiased opinions about a company or brand, which they can get from these communities. Given a choice between taking an opinion about a product from its seller or from a buyer, which one do you think would be more reliable? The buyer's review, of course. And that is the value of UGC for companies.

It seems customers have more power than they are given credit for.

Brands also need to watch out for competitors getting through their defences and subtly disparaging the brand that owns the web space while talking up their own.

For companies, the cost of this value comes in the form of moderation. For the communities and spaces they create on a platform, they need to ensure the content fulfils the basic criteria of acceptability. Secondly, as part of moderation, they will also want to ensure that the interaction on a forum is not inimical to the interests of their brand.

Regardless of the need and where it stems from, oWorkers has the expertise to deliver the goods. Being active, contributing members of the local communities we work in positions us as a favored employer, generating walk-in traffic of candidates seeking employment. Our hiring team selects people based on the requirements of the different projects we are executing, and our dedicated training teams then polish the hired candidates and make them job-ready in a short period of time. Because our resources are employees, not the contractors or freelancers some of our competitors seem to prefer, they commit for the long term, even as the company invests in tracking their performance and facilitating long-term career planning.

 

Types of content moderation

A process for managing their online communities is the need of the hour for organizations. It helps promote the brand's messages while keeping users as well as the brand safe, and expands the user network.

Many people argue that 'No Moderation' should be given pride of place in any discussion of the types of content moderation. After all, no decision is also a decision, as it forces the continuance of the status quo. It is a decision in support of the present state of affairs. It is a decision for 'no change.'

And that is perhaps a valid argument. However, it is a valid choice only at the stage of debating whether or not to moderate the UGC in one's online community. Our present discussion assumes that choice has been exercised in favor of 'moderate,' and takes it forward to how that is to be done, now that the initial decision has been taken.

With our ready supply of resources, we can also offer clients the benefit of contingent hiring, taking care of unexpected, or even expected, volume spikes without burdening them with the cost of those resources for the remaining period. oWorkers can hire almost a hundred additional people within 48 hours.

Pre-moderation

Let us take the example of the process of entry of visitors to a country, variously known as passport control or visa regimen in different places.

The authorities could construct a barrier at the entry point so that credentials of all visitors could be checked and only the ones that meet the entry criteria be allowed to enter while the others are turned back.

This way the authorities are certain that only authorised, acceptable personnel are entering, limiting, even eliminating the damage that could be caused by the entry of unwelcome people. However, this entails a cost for the host country as they will need to hire a battery of resources to ensure that each visitor’s credentials are checked.

Visitors will be unhappy, as they are stopped by a barrier and must submit themselves to scrutiny, possibly after a long journey. If the pre-entry checks are draconian, the flow of visitors to the nation could eventually fall.

Pre-moderation works in the same manner. Every piece of content posted by users is checked and authorized for publishing, and only then does it become visible to others. This ensures protection from malicious content of any kind, as it is screened out. Users, however, are less than happy, as they are subjected to a censorship process and their UGC shows up with a delay. This can leave online communities feeling stilted and jaded.
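As a minimal sketch of this flow, consider a review queue that holds every submission back until it is approved; the function names and the stand-in approval rule below are illustrative, not any platform's actual implementation:

```python
# Minimal sketch of a pre-moderation gate: nothing is published
# until a moderator (or an automated check) approves it.
from collections import deque

pending = deque()   # review queue; content here is invisible to users
published = []      # content visible to the community

def submit(post):
    """User submits content; it is held for review, not shown."""
    pending.append(post)

def review(approve):
    """A moderator approves or rejects the oldest pending post."""
    post = pending.popleft()
    if approve(post):
        published.append(post)   # only now does it become visible
    # rejected posts are simply never published

# usage: a trivial rule standing in for a human moderator
submit("Congrats on graduating!")
submit("some offensive text")
review(lambda p: "offensive" not in p)  # published
review(lambda p: "offensive" not in p)  # rejected
print(published)  # ['Congrats on graduating!']
```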

Post-moderation

Continuing with our analogy, let us say that the immigration authorities decide to dispense with the border control system and implement a different one. They decide to let everyone in while leaving some defined information and credentials about themselves at an appointed desk in the arrival hall. The immigration authorities would gradually review the information, based on available resources, identify the ones who should not have been permitted, and attempt to search for them and deport them.

This would no doubt ensure a smooth passage for all comers, without the hassle of queuing up and being scrutinized. The nation might be viewed as a welcoming place which might encourage more arrivals.

The authorities will save time and money on real-time checks. However, if unacceptable aliens have gotten through, they may well have done some damage by the time the authorities locate and deport them.

This is how post-moderation works. UGC is allowed to go through when posted, giving life to the community and satisfaction to participants. However, by the time offensive content is identified and removed, it will have been viewed by many, and left its mark. In today's world of easy access to technology, it is also possible that screenshots of the content were taken while it was visible and keep circulating long after the content is removed.
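The same sketch, inverted for post-moderation, might look like this; again, the names and the spam rule are stand-ins for illustration only:

```python
# Minimal sketch of post-moderation: content goes live immediately
# and is queued for review; offending items are removed after the fact.
from collections import deque

published = []        # visible to users as soon as it is posted
review_queue = deque()

def submit(post):
    published.append(post)      # live right away, no waiting
    review_queue.append(post)   # reviewed later, as resources allow

def review_next(is_acceptable):
    post = review_queue.popleft()
    if not is_acceptable(post) and post in published:
        published.remove(post)  # taken down, but it has already been seen

submit("holiday photos")
submit("spam link")
review_next(lambda p: "spam" not in p)
review_next(lambda p: "spam" not in p)
print(published)  # ['holiday photos']
```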

Reactive moderation

Now, let us introduce one more innovation into the immigration system, to relate to one more of the types of content moderation methods available to us. The immigration authorities are now so crunched for funds that they cannot even check the information left behind at official entry points, as they are busy battling unauthorized entries at a weak border to the south.

Since verification of aliens needs to be done, they ask each alien to provide details of one local contact person instead of the detailed information earlier asked for. They request this local contact, each time an alien comes in, to report suspicious activities to them, so that investigation can be carried out and action taken, if required.

However, this system is based on the sincerity of a distributed population. Not all may be keen to participate, and some may even be complicit in the activities of the alien. This will be easy and cheap for the authorities, but could leave some illegal arrivals undetected.

In reactive moderation, community members are encouraged to weigh in on submissions through buttons and tools made available to them while they view the content, eliciting their views on whether the content is relevant and should be permitted to stay. It is not fool-proof, but it achieves something and can be an inexpensive method for brands at low risk from malicious UGC.
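A hedged sketch of that reporting mechanism might look like the following, where the report button simply counts flags; the threshold of three is an arbitrary assumption:

```python
# Minimal sketch of reactive moderation: content stays up unless
# enough community members report it.
REPORT_THRESHOLD = 3
reports = {}       # post -> number of user reports
published = set()

def submit(post):
    published.add(post)

def report(post):
    """Called when a user clicks a 'report' button on the content."""
    reports[post] = reports.get(post, 0) + 1
    if reports[post] >= REPORT_THRESHOLD:
        published.discard(post)   # removed (or escalated) once flagged enough

submit("questionable post")
for _ in range(3):
    report("questionable post")
print("questionable post" in published)  # False
```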

Distributed moderation

Now, resources and budgets get crunched further, and the immigration authorities cannot provide much support to this issue. They are not in a position to take any action even if cases are identified and highlighted by others.

In a way, they leave it to the regular law-enforcement agencies of the nation to handle the situations that might emerge as a result of permitting entry to everyone. The nation does have a budget and resources to tackle law and order issues. These resources can be deployed whether these issues emanate from wrongful acts of residents or of aliens, could they not?

In the context of moderation, community members are provided with rating and/or voting mechanisms through which to give their inputs. If a piece of content is routinely rated low, it gradually gets pushed so far down the sequence that, for all practical purposes, it ceases to exist. Again, this is not fool-proof, as it depends on the commitment of community members, who need to feel ownership of the forum, but it does achieve something.
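A minimal sketch of such vote-based ranking is shown below; the averaging rule is one possible choice, not a prescribed algorithm:

```python
# Minimal sketch of distributed moderation: members rate content and
# low-rated items sink so far down the listing they are effectively gone.
votes = {}  # post -> list of ratings from community members

def rate(post, score):
    votes.setdefault(post, []).append(score)

def ranked_feed():
    """Posts ordered by average rating; poorly rated posts sink."""
    avg = lambda xs: sum(xs) / len(xs)
    return sorted(votes, key=lambda p: avg(votes[p]), reverse=True)

rate("helpful answer", 5); rate("helpful answer", 4)
rate("abusive rant", 1);   rate("abusive rant", 1)
print(ranked_feed())  # ['helpful answer', 'abusive rant']
```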

Automated moderation

As we can perhaps make out, all the types of content moderation discussed so far are manual methods. It is also fairly apparent that for companies sensitive to inappropriate content being posted, pre-moderation would work best. But it has limitations: it obstructs the free flow of interaction within the group or community, and it is expensive, as manual resources need to be deployed.

This is where automated moderation plays an important role. As automated tools can handle large volumes of data in the 'blink of an eye,' they can obviate the need for a battery of human moderators. And since they can review and act, once again in the 'blink of an eye,' the brand can review every piece of content before it is permitted to go live while still allowing online communities to survive and thrive.

There are different methods of automation in use, from the simple list of words and phrases to be filtered out, to IP addresses that are 'persona non grata,' to more advanced Artificial Intelligence (AI) based tools. These will no doubt continue to be refined and perfected as time goes by. At this point, using an automated tool along with some method of manual moderation might be the best option for content-sensitive brands, until they gain confidence in a fully automated solution.
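The simpler of these methods can be sketched in a few lines; the banned phrases and blocked IP address below are placeholders, and a production system would put AI classifiers behind the same decision point:

```python
# Minimal sketch of the simpler automated methods mentioned above:
# a banned word/phrase list and a blocked-IP list.
BANNED_PHRASES = {"buy followers", "offensive phrase"}  # illustrative only
BLOCKED_IPS = {"203.0.113.7"}                           # 'persona non grata'

def auto_moderate(text, sender_ip):
    """Return True if the content may go live, False to hold it."""
    if sender_ip in BLOCKED_IPS:
        return False
    lowered = text.lower()
    if any(phrase in lowered for phrase in BANNED_PHRASES):
        return False
    return True  # borderline cases could be routed to a human instead

print(auto_moderate("Great product!", "198.51.100.2"))      # True
print(auto_moderate("Buy followers now!", "198.51.100.2"))  # False
```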

As a GDPR compliant, ISO (27001:2013 & 9001:2015) certified company, oWorkers is uniquely positioned to handle automated solutions for content moderation, whether provided by the client, or one that we access from our scores of partnerships with technology companies. Our facilities are secure, as is the technology backbone we use for enabling employees to work from home during the Covid-19 driven ‘shelter in place’ regulations.

 

oWorkers handles the growing need for content moderation

The need for all types of content moderation is growing rapidly. With user generated content a vital cog in the marketing wheels of many organizations, it naturally follows that moderation grows with it, seeking to enhance a brand's value along with its reach and engagement.

With several unicorn marketplaces as longtime clients, oWorkers understands the challenges of this work and is equipped to handle them. With centers in three of the most sought-after delivery locations in the world, oWorkers employs a multicultural team which enables it to offer services in 22 languages.

You can leave your content moderation requirements to an expert. You can leave it to oWorkers.

The What, Why and How to do Content Tagging


In the physical world that humans have historically lived in, and still do despite the deep inroads digitization has made into our ways of life, we are used to the concept of tags, or tagging, used to provide information about an object that is not otherwise evident, displayed or available.

  • A tag could be a ‘price tag’ attached to objects displayed on the shelves of a shop.
  • A tag could be stickers on artefacts displayed in a museum identifying their source.
  • A tag could be similar colored pins affixed to the shorts of children to identify them as belonging to one specific group.
  • A tag is an identifier or a descriptor and the process of tagging is the process of creating an identification or description by affixing a suitable tag on the object to be identified or described.

 

What is content tagging

The digital world tries to keep things simple and mostly uses terms and phrases in the same way they would be used in normal parlance. Tag, or tagging, is a good example.

Techopedia describes a tag as “a piece of information that describes the data or content that it is assigned to. Tags are non-hierarchical keywords used for Internet bookmarks, digital images, videos, files and so on. A tag doesn’t carry any information or semantics itself.” It could be considered as metadata for the content for which it acts as a tag.

One could look at tags as markers of content on the internet that provide additional information about the content and/ or help us in locating content in the unimaginably large universe of cyberspace. Search engines use tags to identify content and produce accurate results.

Added to a piece of content on the web, a content tag seeks to connect the publication to other pieces of content that are similarly tagged.

Blog tags

Blog posts are based on themes or subjects, and tags are created so that a post gets associated with a topic or theme. In some cases, the Content Management System (CMS) where the post resides displays the tags associated with the site's posts as a tag cloud, with the more frequently used tags appearing more prominently. By clicking on a tag, a user can call up all the posts to which that tag has been assigned. Most blogs permit multiple tags to be associated with a post.
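As an illustration of how a tag cloud might weight its entries, the following sketch scales a font size by tag frequency; the size range is an arbitrary assumption:

```python
# Minimal sketch of tag-cloud weighting: more frequently used tags
# are rendered more prominently.
from collections import Counter

post_tags = [
    ["travel", "photography"],
    ["travel", "food"],
    ["travel"],
    ["photography"],
]

counts = Counter(tag for tags in post_tags for tag in tags)
max_count = max(counts.values())

for tag, n in counts.most_common():
    # scale font size between 12pt and 32pt by relative frequency
    size = 12 + round(20 * n / max_count)
    print(f"{tag}: {n} posts -> {size}pt")
```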

Social Media tags

Social platforms use tags widely. Content can be brought to the attention of other users by tagging them. Tagging creates a link that adds the content to the tagged user’s timeline.

Twitter has made popular the use of a modified version known as the hashtag. A hashtag on Twitter is used to collate information about a topic on the platform. The use of hashtags has since been adopted by many other social media platforms.

Tags could incorporate one or more of the following pieces of information about the content (a sketch of such a record follows the list):

  • Titles and subtitles
  • The volume of content
  • Editors and authors
  • Copyright information
  • Key phrases, terms, words
  • Bibliographic information
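A sketch of a record carrying these fields might look like the following; the field names are illustrative rather than any standard schema:

```python
# A sketch of a tag/metadata record carrying the fields listed above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentTags:
    title: str
    subtitle: str = ""
    word_count: int = 0                      # 'volume of content'
    authors: List[str] = field(default_factory=list)
    copyright_notice: str = ""
    key_phrases: List[str] = field(default_factory=list)
    bibliography: List[str] = field(default_factory=list)

tags = ContentTags(
    title="The What, Why and How to do Content Tagging",
    word_count=1800,
    authors=["Editorial team"],
    key_phrases=["content tagging", "metadata", "SEO"],
)
```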

oWorkers, with its focus on data-based BPO services, has developed expertise in tagging content according to guidelines and objectives, over many projects executed for clients in the last eight years. It has repeatedly been identified as one of the top three data services BPO providers in the world. Partnering with oWorkers can give you a head start in your efforts at tagging your content.

 

Why is content tagging beneficial?

Being reasonable, rational creatures, human beings would not tag content unless they saw some advantage in it. How does tagging help?

At the basic level, in the digital world, tagging is a process through which content is classified into classes or categories, which also links it to other pieces of content with a similar classification. It gives a layer of structure to the publications and content you manage, giving you a tool through which you can, first, measure and analyse it and, second, use those insights to structure it in ways that benefit the organization or company.

Some benefits of tagging:

  • Better searchability is the first and foremost benefit. Tags let content consumers efficiently reach other publications tagged similarly to what they are currently reading.
  • It helps in creating an archive, a historical record of publications, where it is possible to understand what has gone on in the past. This can provide some guidance on future additions to the repository.
  • With the help of this information, you could create a strategy that could choose to fill gaps in your content mix or focus more on tags that are popular. Possessing the knowledge gives you the power to decide.
  • Efficient tagging is used by search engines which consequently drives traffic to these publications. This can help drive advertising revenues for the publisher. It can also drive syndication opportunities as publishers are interested in content that receives more eyeballs.
  • Tags also free content from being visible only as a subset of the platform it is posted or hosted on, giving it an independent, or additional, identity. It becomes searchable across multiple channels.
  • An effective system of tagging provides users with relevant information about your content which could be in the form of text or audio or images or video.

While clients' businesses benefit from effective tagging through oWorkers, they also get the benefit of the unique position oWorkers occupies in the communities it works in. Being a preferred employer, it is able to attract a lot of walk-in talent without spending money on advertising for positions. This not only affords us the luxury of choice for client projects, but also provides the flexibility to support sudden ramps in volume, whether on account of seasonality or driven by specific actions. We can hire a hundred people in 48 hours, if called upon to. This becomes a significant cost-saving measure for clients, as they do not need to maintain staffing at peak levels.

 

Different types of data

Content is of various types, often divided into textual, image, audio and video content.

Textual content is relatively simple to understand for search engines and for marketing teams to leverage. It can be further enhanced by adding metadata like tags for the various reasons described earlier.

The other forms of content, however, do not offer a natural 'handle' for search engines and marketing teams to leverage. How does one identify or search for an audio, image or video file? There are no characters or words to look for. Natural Language Processing (NLP) technologies have been used to convert audio, where available, to text for use as a 'handle.' However, speech-to-text conversion routines are not perfect. If the context is sensitive, the conversion cannot be left to an NLP tool and forgotten; it has to be humanly supervised. And image files, in any case, do not lend themselves to speech-to-text conversion at all.
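The human-in-the-loop pattern described here can be sketched as follows; the transcribe function is a stand-in for any speech-to-text engine, and the 0.9 confidence threshold is an assumption for illustration:

```python
# Sketch of supervised speech-to-text: low-confidence transcripts
# are routed to a human reviewer before being used as a 'handle.'
def transcribe(audio_path):
    # placeholder: a real engine would process the audio file here
    return "transcribed text", 0.82  # (text, confidence)

def queue_for_human_review(audio_path, draft_text):
    print(f"Needs human check: {audio_path}")
    return draft_text  # used provisionally until a person verifies it

def text_handle_for(audio_path, min_confidence=0.9):
    text, confidence = transcribe(audio_path)
    if confidence < min_confidence:
        return queue_for_human_review(audio_path, text)
    return text  # confident enough to use as a searchable 'handle'

print(text_handle_for("interview.mp3"))
```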

For such data, the addition of metadata becomes key to their use in any application, whether it is for marketing leverage or for searchability. With the explosion of bandwidth available to users at lower and lower prices, the consumption of videos has gone through the roof. There is a huge opportunity for marketing teams to leverage by ensuring the content tagging of such data is done in a suitable manner.

Our leadership team comes with over 20 years of hands-on experience in the business and understands the nuances of the business. They lead projects from the front and have the knowledge to guide each of them, from inception through to maturity. Regardless of the type of data you wish to tag, oWorkers is equipped to do it for you.

 

Best practice suggestions for content tagging

Look at it from the visitors’ perspective

SEO should be a consideration while deciding on tags. How would a potential visitor or reader or consumer search for the content you have put out? Is it an acronym or the full form? Is it a colloquialism or the official form of the word?

Do not overdo tagging

It is believed that putting excessive tags on content can actually reduce its searchability. For a blog post, a common rule of thumb is no more than one tag for every hundred words of content; a 600-word post, for example, would carry at most six tags.

Be specific

Because that is what people searching for content will be. Search engines allow free-format typing, so users type in exactly what they have in mind. Hence, keeping tags as close to the content as possible is considered useful.

Provide a bank to choose from

Tags serve the important purpose of catering to the tastes of readers by collecting together content that carries the same tag. It is a disappointment for a reader to find that little or no other content answers to that tag. Hence, it is considered desirable to have a wealth of content answering to each tag used.

oWorkers follows the best practice of employing an Internal Quality (IQ) team. It monitors the performance of the delivery teams, gives them feedback, and calibrates output expectations with clients. It is also responsible for implementing best practices and undertaking process improvement initiatives in tandem with the delivery teams. The IQ team acts as the eyes and ears of senior management on the shopfloor and reports directly to them.

oWorkers operates out of three global locations and employs a multicultural team that enables it to offer services in 22 of the most commonly spoken languages of the world. The centers are equipped to operate 24×7 for clients whose business requires quicker turnarounds.

 

How is content tagging done?

Manual tagging

Manual tagging is always the starting point when you begin to create and publish content on the web. With tagging being done by a small group of people who are aligned with the content strategy of the company, it is possible to ensure that tagging stays in line with the company philosophy and direction. This is also known as tagging by the publisher.

As with everything manual, this is an expensive proposition, as it takes up the time of qualified resources who could be doing other things for the company. Besides, as the company ramps up its volume of content, capacity can become a challenge.

Automated tagging

When we are trying to automate everything, why not tagging? Especially since it can help us overcome the capacity constraints of manual tagging.

There are many technologies that rely on NLP, semantic extraction engines and emotion analytics to add contextual metadata to your content. The challenge is the same as with most automation of unstructured information: engines and tools do not have the fine sensibilities of the human mind and do not always make the right choices.
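As a toy illustration of the idea, the sketch below picks frequent non-trivial words as candidate tags; real engines use NLP and semantic extraction, so this only shows the shape of the process:

```python
# Toy automated tagging: the most frequent non-stopwords become
# candidate tags for the text.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it", "for"}

def suggest_tags(text, max_tags=5):
    words = re.findall(r"[a-z]+", text.lower())
    candidates = [w for w in words if w not in STOPWORDS and len(w) > 3]
    return [word for word, _ in Counter(candidates).most_common(max_tags)]

sample = "Content tagging adds metadata to content so search engines can find content."
print(suggest_tags(sample))  # ['content', 'tagging', 'adds', 'metadata', 'search']
```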

Public tagging

This is another way content tagging can be done. As can perhaps be made out, this can be done by anybody and everybody, and could be considered the opposite of publisher tagging. Visitors to the website choose how to tag the content they are consuming; they are the ones who decide its relevance and context. Social bookmarking sites like Digg.com and 9rules.com could be considered examples of public tagging.

Thanks to its partnerships with leading providers of technology, oWorkers has access to the latest technologies. These technologies can also be used on client projects, enabling clients to benefit from them.

Delivery takes place from super-secure facilities with protocols that further enhance the security of your data. oWorkers is GDPR compliant, ISO (27001:2013 & 9001:2015) certified, and able to operate either from office or from home, given the constraints imposed by the pandemic.

 

The oWorkers advantage

oWorkers operates in a cost and resource-managed environment that results in low operating costs. This, in turn, allows it to offer better pricing to clients, who can choose between input-based pricing (dollars per manhour) and output-based pricing (dollars per unit of output). Clients from Western Europe and the US report savings in excess of 80% over their pre-outsourcing costs.

Besides, the content tagging work you outsource to us will help us in employing a few more people from the disadvantaged communities we work with and give them their ticket to the global digital workplace.

The What, Why and How of Forum Moderation


The definition of a forum can be found in most traditional dictionaries. For example, the Merriam-Webster dictionary defines a forum as:

  • “the marketplace or public place of an ancient Roman city forming the center of judicial and public business,” or
  • “a public meeting place for open discussion,” or
  • “a medium (such as a newspaper or online service) of open discussion or expression of ideas.”

Since we live in the digital age, where much of human activity seems to take place over the internet, merely knowing what a 'forum' is may not be sufficient; we need to know what an 'internet forum' is.

According to pcmag.com, an internet forum is “A website that provides an online exchange of information between people about a particular topic. It provides a venue for questions and answers and may be monitored to keep the content appropriate. Also called a “discussion board” or “discussion group,” an Internet forum is similar to an Internet newsgroup, but uses the Web browser for access. Before the Web, text-only forums were common on bulletin boards and proprietary online services. However, Internet forums include all the extras people expect from the Web, including images, videos, downloads and links, sometimes functioning as a mini-portal on the topic.”

Participation in internet forums can be anonymous or registration-based. Forums sometimes get confused with 'chat rooms,' but chat rooms facilitate live, interactive exchanges, whereas a forum is generally not real-time.

Most forums fall under one of the following categories: product, network, ambassador and support. While falling under one primary category, a forum often overlaps with at least one other type.

Coming of age in the digital era, oWorkers has been fortunate to witness the internet boom from its inception, which has shaped its thinking and helped it become a relevant player in the marketplace for data related services. In just eight years of existence, it has already been selected as one of the top three providers of data based BPO services in the world.

What are some examples of internet forums with a large participation?

Quora – Quora is an internet forum where you can ask a question, any question, and others, anyone, can provide a response. Users can also create customised spaces, like a homepage where topics of relevance to them can be collected together and displayed for easy access. User responses to questions can also be voted on by other users, resulting in greater visibility for the answers considered to be the best.

Reddit – This is a discussion forum on the internet with a simple, easy design and access. Users can subscribe to groups and sub-groups (called subreddits) based on their areas of interest. It is known as a thriving community where people help you find the right answers. A new discussion thread can be started by anyone. As many people like to see their Reddit feed before anything else, it is also sometimes fondly known as 'the front page of the internet.'

 

The need for forum moderation

A forum is a type of website that owes its existence to vibrant and meaningful interaction among users. Hence, there needs to be a mechanism for that interaction to take place, and each internet forum has its own unique method for making it happen.

Users participate mostly based on their own interpretation of the right way to participate. There is an underlying assumption here that the forum has specified its purpose and rules and regulations as well as what is acceptable and what is not on the forum. It is to be assumed that users, while signing up for it, would have checked the box which said that they have read the rules and regulations and agreed to abide by them.

Thanks to the reach of the internet, barring issues that have a local importance or flavor, many forums are widely participated in from around the world. And people around the world communicate in many different languages. Hence, if moderation is a need, then the ability to operate in these languages becomes as much of a need.

With its centers across three distinct geographies, and an embedded policy of working with a multi-cultural and multi-ethnic team, oWorkers offers the ability to handle moderation services in 22 languages.

Sometimes, despite their best intentions, user participation can fall foul of the specified rules.

This, we could say, is need number one.

Internet activity, eventually, is a reflection of the real world. There are a large number of rules and regulations in the real world, but only a small minority of people end up as criminals, vitiating the society we live in. The same thing happens online. Most people will make all efforts to ensure they are compliant and that their actions do not create an environment that might be harmful for others.

But a small minority will go beyond accepted rules and insist on pushing their agenda through offensive participation.

Let us call this need number two.

These two, involuntarily breaking a rule of the forum and wilfully posting offensive content, are the two primary reasons that fuel the need for forum moderation.

Moderation enables the forum to stay on its avowed path, and true to the users who joined on the strength of that path and those objectives. If the forum starts seeing irrelevant or malicious content, it is likely to lose the users who do not want to wade through it to find the nuggets of suitable content. Irrelevant content is also likely to hurt its SEO karma, which could affect new users joining and keeping the forum a vibrant, thriving community.

Thus, moderation can be said to be important for a forum by ensuring its health through the management of the content that gets created.

 

Forum moderation – making it easy for moderators

A moderator is the lynchpin of a forum and can influence its eventual success or failure. That could be said of any job, of course; what makes an online platform more sensitive is its ability to immediately reach a large number of people.

With its position as a preferred employer in each of its three locations, oWorkers attracts a steady stream of walk-in candidates interested in working for it. This not only keeps its hiring costs low, as it does not need to advertise much to attract talent, it also enables oWorkers to choose from the best available talent and deploy them to projects based on need as well as aptitude.

It makes sense to strengthen the hands of the moderators who carry out this task.

What are some of the things a moderator should be aware of?

There is a reason for the forum to exist

There is a reason the forum exists. It strikes a chord with its users, which is why they come to it. That consideration and understanding should underpin everything else that is done to keep the forum sane and its users secure. Participation on the forum, as well as any moderation, needs to happen under that overarching objective.

Help users – most are genuine

Forum moderation needs to be carried out with the assumption that most users are genuine and caring and sensitive to the needs of others, and hence need to be handled as such. Infractions can be genuine oversights or an inadequate understanding of the rules. It does not necessarily mean that the user is out to sabotage the forum.

It is not personal

In the heat of 'battle' it can sometimes happen that a moderator loses objectivity and starts seeing violative participation as a personal attack. That needs to be guarded against, as it can lead to decisions beyond the scope of the job. If she steps back, the moderator will realize that none of the users actually knows her, or even that she is the one moderating the forum. If needed, she should ask the organization for help.

Be consistent and fair

Though easier said than done, handling issues with an even hand is always desirable. The organization is likely to have rules covering most common situations that need to be referred to in times of need. If in doubt, consult with your peer group or escalate.

Provide inputs to the organization

As the front-end of the organization most frequently interacting with users, a moderator has the greatest visibility into user behavior. This visibility needs to be fed back into the organization for the continuous improvement of the forum. Equally, the moderator is also likely to be the best placed to identify where the existing rules and guidelines for the forum are falling short. By feeding this information back to the management, she can ensure they stay relevant to the community and, eventually, make her own job easier.

oWorkers’ ability to attract talent also gives it an almost unfair advantage in providing staffing for unplanned peaks in volumes. It can hire almost a hundred people extra within just 48 hours. Beat that!

 

Forum Moderation should result in consequences

Setting up rules perhaps has no value if violating them does not lead to any consequence. Imagine if murder is a crime in a jurisdiction but when it happens, everyone looks the other way and the perpetrator gets away scot-free. Such a situation is likely to encourage more people to commit the crime. Perhaps even more than if it was never called out as a crime.

The online world being a reflection of the real world, similar rules apply and human beings behave in a similar manner. This is why, if there are rules for participation and moderation detects a violation, the perpetrator has to face consequences, ideally built into the rules and regulations.

Despite the work pressures that moderators routinely experience, oWorkers gets extremely positive ratings from employees, past as well as present, 4.6 out of 5 and upwards, on platforms like Glassdoor. It is a statement not only on the employee policies and treatment, but also on their management team, who themselves come with hands-on experience of over 20 years in the industry.

What are some of the tools available to moderators for violation of forum moderation guidelines?

Inappropriate content can be edited

If you find that the language is unsuitable, that inappropriate external links have been shared, or that there is any other issue that makes the content inappropriate, like the sharing of personal contact details, it can be handled simply by editing out the offensive content.

Individual items can also be marked as spam so that they stop being available to other users.

Discussion threads can be locked or moved

Occasionally a thread runs out of steam and stops getting traction; even the originator no longer checks in. That could be a suitable case for locking, perhaps along with a message requesting people interested in the subject to initiate a new thread. Locking could also be initiated if a duplicate thread has been created while another one is still live, or when a solution has been reached, a logical point for a thread to close.

A discussion thread can be moved by a moderator to its rightful location if inadvertently created by a user in a location unsuitable for it.

Deletion of a thread

Deleting a thread is perhaps a more severe action which could be undertaken if spam is detected in it. Deletion can also be done if the thread is not serving any meaningful purpose with regard to the original subject and may even have degenerated into profanities and personal attacks.

Account deactivation

This is a higher-level action and typically taken against repeat offenders or in the event of a serious transgression.
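Taken together, these tools form an escalation ladder, which can be sketched as follows; the action names, thresholds and escalation rule are illustrative assumptions, not any forum's actual policy:

```python
# A sketch of the moderator's escalation ladder, from lightest
# to heaviest action.
from enum import Enum

class Action(Enum):
    EDIT_CONTENT = 1        # remove the offending part, keep the post
    MARK_AS_SPAM = 2        # hide the item from other users
    LOCK_THREAD = 3         # no further replies
    MOVE_THREAD = 4         # relocate to the right sub-forum
    DELETE_THREAD = 5       # remove entirely
    DEACTIVATE_ACCOUNT = 6  # repeat or serious offenders

def choose_action(violation_count, severity):
    """Pick a response: severe or repeated violations escalate."""
    if severity == "severe" or violation_count >= 3:
        return Action.DEACTIVATE_ACCOUNT
    if severity == "spam":
        return Action.MARK_AS_SPAM
    return Action.EDIT_CONTENT

print(choose_action(violation_count=1, severity="minor"))  # Action.EDIT_CONTENT
```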

With its partnerships with IT companies, oWorkers can access the latest technologies, even emerging ones. This is another benefit for clients, as these technologies are deployed on their projects. oWorkers is not only GDPR compliant but also ISO (27001:2013 & 9001:2015) certified.

 

Forum moderation through a specialist partner

oWorkers is now a leading provider of outsourced moderation services. Companies from around the world trust oWorkers with their business, including several unicorn marketplaces. 85% of the clients of oWorkers are IT companies themselves.

With its efficient operational processes, like the saving of cost in hiring, oWorkers is well positioned to offer the finest pricing in the industry, with clients often realizing savings of almost 80%, especially the ones from the US and Western Europe.

Its three global centers offer quick turnaround and are equipped to operate on a 24×7 basis, as and when required by a client. Power your growth with support from oWorkers.

Facebook Comment Moderation Can Work For You


It is perhaps a reasonable assumption that if you are reading this online article, you are connected to the internet. It is perhaps also reasonable to assume that if you have some familiarity with the internet, you have heard of Facebook, if not actually used it. In fact, the chances are high that you have used it: by most estimates, more than half of the world's internet users do.

It is also common knowledge that it takes all sorts to make up this world. Social media platforms like Facebook, which promote open communication and sharing, often find themselves sitting ducks when a person chooses to post offensive or malicious content, and they struggle to cope through various types of moderation processes.

Thankfully, such people are in a minority, but they exist. Despite the existence of platform policies that define what can and cannot be posted, they either choose to pretend otherwise, interpret guidelines to their own narrow advantage, or have plainly mischievous intent. Whatever the reason, once posted, the content becomes widely available.

We know that Facebook spends a fortune tracking such content and taking action against it and its perpetrators, through initiatives like Facebook comment moderation. Though content can be posted in any media format, such as audio, image, video or text, text makes up a fairly significant proportion of what is posted, much of it in the form of comments. It follows that comments constitute a significant part of the offending content that needs monitoring and removal.

For a service provider, it is an advantage to be born at such a time, as modern practices and technologies become part of your DNA from birth, obviating the need to learn them later. oWorkers, an eight-year-old BPO company, is proud to have been selected as one of the top three providers of data related services in the world.

 

Facebook for companies

The popularity of social media platforms has led to a lot of companies rushing to them in an effort to engage with their current and prospective customers and talk up their products and brands. Though designed to facilitate communication and interaction between people, beginning with university students in the US, social media platforms, like all technologies, have kept evolving and pivoting. Being private businesses, their eventual goal is revenue and profit, towards which they steer the business when opportunity knocks.

As adoption by individuals spiralled, companies also started beating a path to the doors of these platforms in an effort to engage with the users. After all, for most companies, eventual customers are individuals. So, if they can be found in droves around social media platforms, that is where companies must go too.

Platforms also evolved to create spaces and opportunities for companies to build a presence and drive the discussions they want to drive, which, unsurprisingly, are mostly around their brands and products. Users who are favorably disposed towards the brand become the early participants driving the conversation. Most brands hope that this will draw a much larger set of users to their social media pages and properties, eventually resulting in conversions and revenue realization.

It works like mutual back-scratching between Facebook and these companies. Just as companies seek to benefit from the traffic on the platform, the traffic these companies drive to their groups and communities adds to the adoption of the platform, so why not?

Facebook comment moderation becomes as much, perhaps even more, of a requirement for these companies that create their own space and promote conversation about their products and brand. That space reflects their voice and brand, and consequently their business, and must be managed: not only against malicious content and comments posted by users, but also against spam and abuse. This keeps the brand's fans safe and presents a wholesome view to the external world.

Over the years, recognizing the role moderation can play in promoting healthy interaction on the platform as well as in groups and communities, Facebook, apart from the moderation it carries out on its own, has also created tools for the administrators of groups and communities, so that they can do the same in their own spaces and perform Facebook comment moderation from a simple interface.

Increasing adoption of social media platforms by individuals also creates a deeper pool of talent required to handle new-age services like moderation. This further augments the capacity of oWorkers to hire the best talent based on its position as a preferred employer. From the candidates that queue up for a job at oWorkers, we are able to hire the best and deploy them to projects most suited to their skills and aptitude, which we assess based on the series of IQ and EQ tests done at the time of hiring.

 

Types of comments

While on the subject of comment moderation, it is perhaps useful to understand the channels through which comments can be generated on a Facebook page.

Organic comments

Being a social media platform, Facebook is designed to promote unfettered communication and sharing, and users can do so on a Facebook page too. That is, after all, the basic purpose and design of the platform. This content is also available to all users.

As an administrator of the page, if you are one, you will receive information as and when a new comment is posted by a user. You also have the power to turn off the default capability that users have of posting on your page, if you so desire.

It helps to have the multilingual capability that oWorkers possesses as a result of its multicultural and multi-ethnic talent pool; it can handle moderation in 22 languages, as comments can emanate from anywhere in the world in any language.

Comments on posts

The purpose of a Facebook page for a business being to promote its brand and products, it often posts content that serves this purpose. It also serves as an invitation to visitors to the page and fans of the brand to leave their thoughts and comments on it, and even share it further, as a way of pushing the communication forward.

While spam comments automatically get hidden (while still being visible to the poster and page administrators), admins have the option of turning on profanity filters for their pages. Of course, it needs to be understood that Facebook comment moderation through such automated solutions will have limitations. It may be able to weed out the obvious ones but, without the ability to create a context, borderline cases may be incorrectly handled.
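A generic sketch of what such a filter does with obvious and borderline cases follows; the word list and rules are placeholders, and this is not Facebook's actual implementation or API:

```python
# Generic comment filter sketch: obvious profanity is hidden,
# borderline cases (here, comments with links) go to a human admin.
PROFANITY = {"badword1", "badword2"}  # placeholder word list

def classify_comment(text):
    words = set(text.lower().split())
    if words & PROFANITY:
        return "hidden"          # filtered automatically
    if "http://" in text or "https://" in text:
        return "needs_review"    # borderline: route to a human admin
    return "visible"

print(classify_comment("Love this product!"))         # visible
print(classify_comment("Check https://example.com"))  # needs_review
```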

oWorkers' ability to attract talent also gives it the ability to hire for short-term peaks. When client projects experience unplanned spikes in volume, as can happen with comments, we can hire up to a hundred additional people within 48 hours. This means the client saves money, as they do not need to keep buffer resources available.

Activity on advertisements

Many companies advertise on Facebook. On account of the detailed profile of users that is available to Facebook, many companies find that they are able to reach their target segment of customers much better through Facebook advertising than through traditional media, which is usually a scatter-gun exercise, liberally spraying content in the hope of catching a few relevant ones.

Users can leave their footprint on these advertising messages as well, in the form of comments. Because these comments are considered activity on the advertisement rather than on the page, they do not form part of the feed administrators receive, and are often much less managed than comments on posts.

With employed staff, as opposed to freelancers and contractors, as some of our competitors seem to prefer, oWorkers is well placed to move people around. It also enables people to grow within the company, creating a reliable pool of supervisory resources.

Reviews

Users also leave reviews for the brand, including a rating on a 5-point scale.

These reviews and ratings are of great value to the company as they, at least the complimentary ones, are proof to the world that people without an interest in the brand (users) have reviewed the product and rated it highly.

This section can also get reviews that are not complimentary. That is par for the course. The company needs to be able to detect it and handle it efficiently and effectively so that it can become a turnaround or service recovery story. Some companies try to remove or reduce the negative comments. This leaves the page with a sanitised look and may appear artificial to users. So, caution needs to be exercised in how far you go with your Facebook comment moderation.

Talking of reviews, oWorkers gets reviewed by its staff, both past and present, on platforms like Glassdoor. We are proud to say that our scores have consistently been above 4.65 on a scale of 5.

Direct Messages

Facebook users can also send direct messages to the company; something like an email. The company’s alacrity in responding can earn for it the ‘very responsive to messages’ label on its profile. It may not result in customer conversions or additional revenue, but it certainly would add up towards the wholesomeness of the company’s profile.

The leadership team of oWorkers comes with hands-on experience in the industry of over 20 years. They are able to provide guidance where required.

Benefits of Facebook comment moderation

It must be remembered that moderation is used here in its widest possible meaning and is not limited to 'shooting from the hip' and killing content found to be offensive, as some people seem to believe. In its wider sense, moderation refers to the act of tracking and managing the user activity taking place on your web properties.

There are several benefits that could accrue to the brand as a result:

Influences revenue

Buyers look for authentic information about a product they wish to buy from unbiased sources who do not have any interest in the sale of the product. Facebook comments and reviews on the company’s page are a great source of such information for prospective buyers, even as the company leverages its social media presence to make itself visible to a larger universe. As these comments may not always be complimentary, they need to be closely monitored.

Source of information from the marketplace

Not too long ago, organizations used surveys to get information from the ground about consumer preferences and other developments in the marketplace, supplemented through other means like representatives directly interacting with different stakeholders. While surveys collected a lot of information, a survey was an artificial setting in which respondents were asked to imagine situations and respond; the results might not be natural or accurate.

Interaction on social media, however, is completely natural. It happens when the consumer wants it to happen, in the heat of the moment, to post a review, respond to a comment or anything else. Hence, it is seen as the most realistic representation of the marketplace and consumer sentiments.

Elimination of malicious content

This is the narrow definition of moderation that many people go by. However, this could also be one of the greatest benefits arising out of moderation, with its ability to identify and remove offensive content and keep the wholesomeness of the site intact, enabling target customers to continue using it without worry.

Channel for quick resolution to customer issues

With a device in their hands at most times, and social media accounts almost permanently logged in, for many people social media becomes the easiest way to reach a company, rather than having to look up a phone number or email ID. Even if these channels were not initially set up for customer service, they perforce end up dealing with comments that can only be classified as customer service. Through moderation, companies get a chance to intercept such issues, resolve and respond to them, and convert possibly disgruntled customers into happy ones.

With its technology pedigree, GDPR compliance and ISO (27001:2013 & 9001:2015) certification, oWorkers provides a combination of human and technology solutions for handling moderation needs of its global clients. 85% of our clients are technology companies, with several unicorn marketplaces being among them. We also have deep partnerships with technology companies that enable us to leverage the latest technologies for client needs.

 

Good practices for Facebook comment moderation

Be even handed – Do not take sides. Try to ensure that similar issues get similar responses or treatment. If the company has a policy of responding to all comments, don’t pick and choose.

Respond quickly – Turnaround times, if published, should be adhered to. Speed of response will always be a parameter customers will judge you on. By default, the faster the better.

Go beyond a bot – Many companies deploy automated response and chat solutions. They have their value and customers understand that. When the time comes for a human response, respond like one, with emotion, personalization and care; not as a bot. Incorrect (not inappropriate) language may also be fine as long as it means there is a caring human sending it out.

Take the bull by its horns – Shying away from sensitive issues will not make them go away. Whether it is a customer issue, or inappropriate content, or spam, the earlier you address it the better it is for everyone in the community. If difficult issues are raised in an open forum, they need to be answered in the same open forum.

Use your judgment – What starts as a comment does not need to finish as a comment. Many situations are better handled out of the public glare. If you need to share sensitive information, like personal particulars, it should be a direct interaction between the company and the customer/ user. A long explanation with links and attachments is perhaps better done as an email exchange. The comments forum is to enable the customer to initiate a request/ issue/ requirement. It is up to you how you conclude it.

 

Partner with oWorkers for best results

Our clients vouch for our capability and note savings of almost 80% after outsourcing to us. They also find unique the 'choice' we offer in pricing, between per unit of output and per unit of input, with the client picking the one they prefer.

Our three centers, located in three of the most sought-after geographies in the world, offer compelling turnaround times, driven by time zone differences as well as their 24×7 operations.

We work with people from the weaker sections of society. Each new project enables us to employ a few more, train them and equip them to be a part of the global digital economy.

The role played by content moderation tools


Over 3 million years ago, even before the present version of humans emerged, tools shaped out of stone are believed to have been used by ancestors of the present-day humans.

About a million years ago, ancestors of modern-day humans discovered the ability to light a fire.

Roughly 12,000 years ago, humans discovered agriculture as a means of sustenance and livelihood, moving away from their foraging and hunting past.

10,000 to 15,000 years ago, man discovered the art of making pottery, bricks and clothes. The wheel is also understood to have been invented in the same period.

Iron, gunpowder, the compass, the mechanical clock, the printing press and the steam engine followed in the few thousand years thereafter.

Though the term 'tool' itself is understood to have been coined as recently as the 12th century, mankind's search for tools that enable him to do more, faster and better has been going on since time immemorial. If anything, it has only gathered pace as time has gone by. In fact, the use of tools is one of the ways of distinguishing man from other animals.

Some of these inventions may have, at times, created unforeseen or unintended consequences, such as disease, but the objective behind the relentless striving has always been noble: to enable mankind to do more, faster, better.

We may not be as old as some of the tools invented by humans, but we have made rapid progress in the few years we have been in existence. oWorkers prides itself on being selected as one of the three top providers of data-based BPO services in the world, despite not yet being a teenager.

 

Introduction of content moderation tools

Why should the digital age be any different?

Man’s desire to introduce tools that enable him to do something better and faster applies to all his endeavors. Moreover, with the history of all the tools that have been invented over millions of years, pathways to the creation or discovery of the next set of tools are, perhaps, also clearer.

The use of tools for an activity is not a natural starting point. Many activities start off organically, as natural processes, based on human needs and desires. Social media platforms started as a means of sharing thoughts and ideas and communicating with others. They grew as they were found to be useful.

While they were growing, unintended uses of platforms began. A few saw an opportunity in leveraging the reach of these platforms and started sharing objectionable content with vulnerable audiences.

When this was recognized to be a problem, moderation systems, likely mostly manual at first, were introduced.

Usage and adoption continued to grow. The volume of malicious content also continued to grow, requiring more and more bodies to be 'thrown at the task' and creating profitability issues for platform owners, mostly private businesses. Thus began the search for tools to overcome the challenges of using people for moderation, such as recurring cost, speed of review and variations in the interpretation and application of standards.

Once content moderation tools entered the frame there has been no looking back. Most organizations with a substantial moderation need now entrust the heavy lifting to automated systems and tools.

That being said, for better or for worse, automated solutions are widely used and some of them are very capable, but, so far, they have not been able to match the capacity of the human brain. The ability to apply context to a situation, identify nuance, draw meaning out of unstructured content and take all content in stride, regardless of format, remains unique to human beings.

This is the reason that, while the heavy lifting is increasingly entrusted to automation tools, contentious issues and decision-making in cases of reasonable doubt are still left to human beings.

Even for operating the tools, smart people are needed. With its unique place in the communities it is located in, oWorkers attracts the best talent, with which it is able to staff its many different client projects.

We are also able to offer additional staffing to meet short-term, unplanned spikes in volume. We can hire a hundred additional staff within 48 hours, thanks to the liberal supply of walk-in talent.

 

Human beings as content moderation tools

As we have seen, human beings are the best moderators, best anything in fact. Of course, they have various limitations too, but there is no denying their intellectual superiority to any machine.

Human moderation has resulted in the evolution of several methods of content moderation:

Pre-moderation

The content submitted by a user is reviewed before it is made available to others. Though this is seen as the ideal method, it causes delays in publishing, leading to user dissatisfaction, as they want to see immediate results. Besides, it uses up a lot of resources and becomes expensive.

Post-moderation

The user generated content (UGC) is allowed to be published, with moderators playing catch-up. In other words, they continue the review process and, in the event of coming across offensive content, will take it down even though it may have been published. While this results in a better user experience, it enables offensive content to slip through.

Reactive moderation

All users are given the means to report and complain about the content that is available. Complaints drive content into a review queue which, then, leads to it being handled on its merits by the review team. Clearly not fool-proof, but an inexpensive solution that serves the purpose of websites with a high tolerance threshold.

Distributed moderation

This method requires users to vote on the content they access and its suitability for the site, which could be in a variety of ways. The eventual result is that content voted down by users keeps going down in rankings, with the lowest ones becoming virtually non-existent or invisible.
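
To make the differences between these methods concrete, here is a minimal sketch in Python of how the first three workflows route content. All names and rules here are illustrative assumptions, not any platform's actual implementation.

    from collections import deque

    review_queue = deque()   # items awaiting a human moderator
    published = {}           # post_id -> text currently visible on the site

    def submit_post(post_id, text, strategy):
        if strategy == "pre":
            # Pre-moderation: nothing goes live until a human approves it.
            review_queue.append((post_id, text))
        else:
            # Post- and reactive moderation publish immediately.
            published[post_id] = text
            if strategy == "post":
                # Post-moderation: moderators play catch-up on live content.
                review_queue.append((post_id, text))

    def report_post(post_id):
        # Reactive moderation: a user report pushes live content into the queue.
        if post_id in published:
            review_queue.append((post_id, published[post_id]))

    def moderate_next(is_acceptable):
        # A human moderator decides; offending live content is taken down.
        post_id, text = review_queue.popleft()
        if is_acceptable(text):
            published.setdefault(post_id, text)   # approve pre-moderated posts
        else:
            published.pop(post_id, None)          # reject, or take down if live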

With all our centers equipped to operate 24×7, oWorkers can provide quick turnaround on transactions, over and above any benefit that we might get on account of time-zone differences.

We have made the choice of working with employed staff, not the freelancers and contractors some of our competitors use. This gives us flexibility in the deployment of resources, as well as a trained middle-management team of supervisors who have grown within the company. We are registered as a local company in our delivery locations, and pay social and local taxes for our staff.

Our staff, both past and present, rate us 4.65 or above on platforms like Glassdoor.

 

Adoption of tools for content moderation

When the need for tools manifested itself, as a result of growing volumes, limited human capacity and the rising cost of human moderation, it became possible to apply the pre-moderation method across a much larger swathe of content, because machines, once trained, can handle far more information at a time than a human can.

Besides, with the introduction of technologies like Artificial Intelligence (AI), unstructured information, a no-go zone thus far for computers and machines, suddenly became reachable.

Based on AI or otherwise, the tools adopted for content moderation are not meant to be merely tools that do their automated thing in isolation, in complete disregard of what the human moderators are doing. Present-day tools are expected to do their 'tool thing' while allowing human moderators to leverage them and do the 'human thing.' In other words, tools should provide an interface that not only allows the automated moderation functionality to operate, but also gives human operators a window to review and handle escalations. This obviates the need to move content between platforms and the attendant challenges of doing so.
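
One common pattern for such an interface, sketched below with invented thresholds and function names, is for the tool to act on its own only when its confidence is high, and to route everything in between to the human review queue.

    def route(item_id, score, approve_below=0.2, remove_above=0.95):
        """Route one piece of content given a model's 'offensiveness' score.

        The thresholds are invented for illustration; real systems tune
        them to match their own policies and risk tolerance.
        """
        if score >= remove_above:
            return "auto-remove"     # high confidence: the tool acts alone
        if score <= approve_below:
            return "auto-approve"    # clearly benign: publish directly
        return "human-review"        # reasonable doubt: escalate to a person

    print(route("post-1", 0.97))     # auto-remove
    print(route("post-2", 0.55))     # human-review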

Some of the other features reliable content moderation tools are expected to offer:

  • The ability to handle a variety of formats like video, audio, image and text
  • The flexibility to handle emails, blogs, comments, reviews and all other types of content
  • A filtering mechanism for keeping out obviously objectionable content such as pornography, vulgarity, etc.
  • Capabilities like Natural Language Processing (NLP) that make it possible to evaluate content such as audio and video and create context around them
  • An interface with a dashboard for online monitoring of traffic and performance
  • All web spaces, including community forums, websites and social media handles, should be accessible through the tool
  • The ability to automate, delegate and monitor tasks

oWorkers operates in a highly secure environment designed to keep client data secure. It is ISO (27001:2013 & 9001:2015) certified and GDPR compliant. It has long-standing relationships with technology companies, and can access the latest technologies for client projects.

 

Limitations of content moderation tools

Blessings are generally mixed, and that applies to tools for content moderation too. While we should pursue all opportunities for automation, it is beneficial to understand the limitations of these tools instead of going in blind.

Automation requires scale, not everyone has it

Each industry is different and each company is unique, which is what sets them apart in a crowded marketplace. Their requirements, too, are unique to some extent. It is, therefore, logical that they need to develop unique automated solutions to handle their unique moderation requirements.

Since companies operate at different scales, the smaller ones will struggle to put together the volume that justifies an investment in such tools. They will, therefore, have to rely on off-the-shelf solutions that, while being very good at what they do, could fall short of handling their unique requirements in an ideal manner. They will need to understand the strengths and weaknesses of the tool they adopt, in order to ensure they are able to plug the holes.

As good or as bad as the training they get

Being unthinking, these tools depend on human intelligence to 'learn' what they need to do, and then go about the learned task with an efficiency at scale that cannot be matched by humans.

These tools are, thus, limited by the training they receive. For AI models that are trained with the help of Machine Learning datasets, the larger and more varied the datasets, the greater the learning for the tools. Similarly, the smaller and less varied the datasets, the lower the quality of learning, leading to reduced reliability and accuracy.
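
As a toy illustration of this dependence on data, here is how a rudimentary text classifier of the kind these tools build on might be trained, using the scikit-learn library. The four training examples are invented, and it is precisely this scarcity that would make the resulting model unreliable.

    # Requires scikit-learn. The tiny dataset below is invented for
    # illustration; real moderation models train on vastly larger sets.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    texts = ["have a great day", "you are wonderful",
             "I will hurt you", "people like you are vermin"]
    labels = [0, 0, 1, 1]   # 0 = acceptable, 1 = offensive

    vectorizer = TfidfVectorizer().fit(texts)
    model = LogisticRegression().fit(vectorizer.transform(texts), labels)

    # With only four examples the score means little; larger and more
    # varied datasets are what make such a tool reliable and accurate.
    print(model.predict_proba(vectorizer.transform(["you are vermin"]))[:, 1])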

Absence of context

This is the obvious disadvantage of not possessing an instrument as fine as the human brain which, intuitively, knows. The moment a tool encounters an unfamiliar situation, it is likely to deliver unexpected results, as its context is limited to what it has been fed by its human operators.

An image of a female breast may be considered to be a case of nudity, but an image of a woman breastfeeding an infant is not nudity and is permitted on most sites. A human can make out the difference but a tool will struggle to, unless painstakingly taught the difference.

Struggle with unstructured data

Computers have been taught to understand humans through sets of characters, arranged in one sequence for a certain meaning and in another sequence for a different meaning. Such structured sets of characters came to be known as software code or programming.

Tools will, however, struggle to understand sets of characters not arranged in those sequences, or data in any other format, such as audio, video and images. This is why an audio file needs to be converted into text before a tool can even begin to extract meaning out of it. And once transcribed to text, speech loses part of its context: were the words spoken in anger, or softly?

An image is merely a set of randomly arranged pixels, till ML datasets teach the tool the meaning attached to each arrangement.

A lot of progress has been made with AI models, but we are still far from content moderation tools achieving anything close to complete understanding.

Training datasets may be biased

While the internet is global and now penetrates to the darkest, remotest corners of the globe, the work that is being done in creating automation tools is mostly done in English.

So what?

It could result in prejudicial treatment of content that is not in English.

A related risk is that the biases of the data annotator preparing the ML datasets are also likely to creep into the training and impair its judgment to some degree.

oWorkers has a multi-ethnic, multi-cultural workforce in each of its three delivery centers. With this workforce, we are in a position to support clients in 22 languages.

Lack of an audit trail

In a software program, it is possible to review the code and identify an error, if the need arises, by going through the lines of code.

Content moderation tools, trained with the help of ML datasets, do not afford the same transparency. It is not a linear relationship. If a tool takes a particular decision, it is very difficult to identify the exact reason why it did so. Thousands and millions of data points might have gone into making up its training. Many of them might have been identical, many others with minor variations. What caused the algorithm to be constructed in a certain manner is extremely difficult to unravel. Hence, if a decision taken by an algorithm gets challenged, while it can be changed by a human being reviewing the case, providing a logical explanation for it does not seem possible at this point.

 

Humans and tools need to co-exist

The foregoing creates an ideal environment where human beings and automated tools need to co-exist for delivering the best results.

A combination that an established player like oWorkers, with its focus on data related services like moderation and a leadership team with over 20 years of hands-on experience in the industry, can offer.

Our customers routinely note substantial savings upon outsourcing to oWorkers and appreciate the transparency and choice they get in pricing. 85% of our customers are technology companies, including several unicorn marketplaces. We hope you will soon be one of them.

How Is YouTube Content Moderation Done?

What is YouTube?

“Where have you been?” might well be the answer if the question was “What is YouTube?”

Launched in 2005, at a time when the internet was beginning to make its presence felt around the world and straining at the leash to penetrate to the far corners of the globe, YouTube is the second most frequented website today.

It is a social media platform that lets users create, watch and share videos. It has more than a billion active monthly users. Its users upload over 500 hours of video content every minute, minute after minute, from all corners of the globe. These users are also consumers, and collectively watch over a billion hours of video each day. This means that each human being on the planet watches about 8 minutes of video on YouTube every day. Remember that a video is uploaded by one person but watched by many, hence the difference between the upload and download hours.
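
A quick back-of-the-envelope check of that last figure, taking the world population to be roughly 7.8 billion (an assumption for illustration):

    # Rough sanity check: minutes of YouTube watched per person per day.
    hours_watched_per_day = 1e9          # over a billion hours, per the text
    world_population = 7.8e9             # assumed figure for illustration
    minutes_per_person = hours_watched_per_day * 60 / world_population
    print(round(minutes_per_person, 1))  # ~7.7, i.e. about 8 minutes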

Google saw the potential in YouTube fairly early and bought it for $1.65 billion in 2006. With the changes introduced and under Google’s stewardship, YouTube generates $20 billion in revenue annually.

Coming into existence after Google's takeover of YouTube, oWorkers is a BPO player that specializes in supporting requirements emanating from the digitalization happening around us. Technically advanced, it operates from highly secure facilities, with protocols for ensuring the security of client data backed by its ISO certifications (27001:2013 & 9001:2015).

 

The logic for YouTube content moderation

Social media platforms like YouTube have made content publishers out of all human beings.

In the days of yore, publishing was an activity restricted to a few individuals and organizations. They would source content, establish veracity, create an output for readers and viewers, submit it to an editorial process, ensure that the content being put out complied with the rules and regulations in place, and then make it available to the target audiences. Publishers were few and identifiable, and it was generally easy to trace a piece of content back to where it originated. This also kept the publishing community on a leash, in terms of possible repercussions if they stepped out of line.

With the internet and social media, everyone is a publisher. There are 4 billion active users on social media, of whom about a billion are on YouTube, not exclusively in most cases. These users, while they consume content, now also have the power to publish it, through a simple process of upload and submit, or type and submit. The comments they leave, the opinions they post, the photos they share and the videos they upload are all content being published by them.

A photo, for example, which used to be a personal keepsake shared with close friends and family members, becomes a piece of published content once uploaded on social media. It can be viewed, commented on and, in most cases, shared further. It becomes a living entity of the world wide web.

This publishing, done by the 4 billion social media users, differs from traditional publishing in that it does not necessarily have a process or checks and balances before it sees the light of day on a platform. A user wishes to upload, she goes ahead and does it, probably without much thought to the impact it might have on others.

If this appears like the introduction to a horror story, let me clarify that most of the content being uploaded, an overwhelmingly large proportion, is perfectly acceptable content that can be viewed and shared further. It meets the content guidelines put out by social media platforms, as well as the unspoken rules of civil engagement in a society.

However, a small percentage is not. The hate speech exhorting followers to violence. The graphic violence perpetrated by a terror group. The demonic rites carried out by a religious cult. Pornographic material. These are examples of such content that should not be put out there for open consumption. Only a small number cross the line, but they do.

oWorkers draws a majority of its workforce from the millennials, many of whom are consumers as well as producers of content themselves, and hence familiar with the context. Its positioning as an employer of choice enables it to draw long lines of job applicants, allowing it to choose the most suitable resources for its various projects. This also enables us to hire at speed to fulfil short-term, unplanned spikes in client volume which would, otherwise, cost clients a lot on account of having to maintain an idle workforce for handling a few days of unplanned surges.

As an open platform, YouTube is not immune to such events. This creates the need for YouTube content moderation. Something akin to the editorial process of the traditional publishing and content creation industry, so that platforms can be kept safe and orderly, where people find it pleasurable to exchange ideas and opinions, and content, without fear.

How does YouTube moderate?

YouTube has had a profound influence on popular culture over the years of its existence. It has also enabled many people to express their creativity through the video content they could upload that could be accessed by millions, and made many millionaires out of this process.

However, as we have seen earlier, moderation has become a requirement driven by a few ‘loose cannons’ who take advantage of and interpret the ‘freedom of speech’ that platforms like YouTube offer, for their own devious and suspect benefit.

How, then, does YouTube moderate?

 

Setting up publishing guidelines

Like almost all other platforms, YouTube content moderation requires policies to be set out in no uncertain terms. Users of the platform are required to confirm that they agree to abide by the policies of the platform, while signing up for it. In the absence of explicit guidelines, users could contend that they were not aware and get away with murder. Hence, setting up the rules and regulations is generally an important step in taking away that excuse from potential wrongdoers.

An extract from their Community Guidelines:

“YouTube has always had a set of Community Guidelines that outline what type of content isn’t allowed on YouTube. These policies apply to all types of content on our platform, including videos, comments, links and thumbnails. Our Community Guidelines are a key part of our broader suite of policies and are regularly evaluated in consultation with outside experts and YouTube creators to keep pace with emerging challenges.

We enforce these Community Guidelines using a combination of human reviewers and machine learning, and apply them to everyone equally – regardless of the subject or the creator’s background, political viewpoint, position or affiliation.

Our policies aim to make YouTube a safer community while still giving creators the freedom to share a broad range of experiences and perspectives.”

They cover a wide range of subjects, such as:

  • Fake engagement
  • Impersonation
  • Spam, deceptive practices and scams
  • Child Safety
  • Nudity and Sexual Content
  • Suicide and self-injury
  • Vulgar language
  • Hate speech
  • Violent or graphic content

and many others.

They even spell out each in some detail. For example, this is what their vulgar language policy states:

“Some language may not be appropriate for viewers under 18. We may consider the following factors when deciding whether to age-restrict or remove content. Keep in mind that this isn’t a complete list.

  • Use of sexually explicit language or narratives
  • Use of excessive profanity in your video 
  • Use of heavy profanity in your video’s title, thumbnail or associated metadata

Here are some examples of content which may be age-restricted:

  • A video focused on the use of profanities such as a compilation or clip taken out of context
  • A video featuring road rage or sustained rant with heavy profanities
  • A video with use of heavy profanities during a physical confrontation or to describe acts of violence”

 

YouTube content moderation – how is it done?

Rules and regulations can only go so far, and no further. Forget social media, in real life too we have rules and regulations articulated in reasonable detail. Despite that, transgressions take place, houses get broken into, people get murdered and vehicles speed through stop signs.

Why?

Because someone has reached a point where breaking the rule has a greater payoff for the transgressor than abiding by it. While nobody does a formal break-even analysis before committing a crime, the person's moral, emotional, physical and mental state has perhaps reached a point where committing the crime just makes more sense than anything else, rules or no rules.

This is why rules and regulations need to be backed up by an enforcement mechanism, without which they will remain an academic exercise; they sound good but nobody really cares about them.

How it goes about managing its processes, like moderating offensive content, is YouTube’s internal business. However, the company is making efforts to articulate its strategy in clear terms and share it with users and others who may be interested.

It has put out a video, what else, along with its community guidelines where two senior functionaries of the company explain the efforts it makes at what we know as moderation.

According to the video, content can be flagged by any logged-in user. As each individual viewer has a limited perspective, YouTube has also created a 'trusted flagger' program of people who can more easily understand and flag offensive content. This is generally applied to content pertaining to areas with a higher level of sensitivity, such as government, defence and non-profits. Trusted flaggers also have access to training.

The flagged content is reviewed by, who else, reviewers. Reviewers are a team of experts, well-versed in the platform guidelines, who are entrusted with the job of taking decisions on the flagged content. YouTube content moderation experts provide coverage throughout the day, and night, covering all time zones. They are a multi-lingual group widely distributed across the world, as content can emanate from anywhere, in any language.

oWorkers mirrors the infrastructure by providing support 24×7 across its three global centers. With its avowed policy of multi-cultural and multi-ethnic hiring, it supports work in over 22 of the most common languages of the world.

YouTube understands that much of the identified content was not put out by users with harmful intentions. It is possible they were not able to identify the content as offensive. Hence, the first time content is identified as offensive, a warning may be issued to the user, followed by a ‘strike’ against her if the offense is repeated. If there are three strikes within a 90-day period, the account is blocked. The inadvertence of the offense can be gauged from the statistic that 94% of users who get a first ‘strike’ never get a second. YouTube also makes available an appeal process to ‘struck’ users.
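
The strike policy described above boils down to a small piece of date bookkeeping. Here is a minimal sketch of that logic; the function name and example dates are invented, and this is not YouTube's actual code.

    from datetime import date, timedelta

    STRIKE_WINDOW = timedelta(days=90)

    def account_status(strike_dates, today):
        """Return an account's status given the dates of its past strikes."""
        recent = [d for d in strike_dates if today - d <= STRIKE_WINDOW]
        if len(recent) >= 3:
            return "blocked"           # three strikes within 90 days
        if recent:
            return "on notice"         # at least one strike still active
        return "in good standing"      # warnings only, or a clean record

    print(account_status([date(2021, 1, 5), date(2021, 2, 1),
                          date(2021, 3, 20)], today=date(2021, 3, 21)))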

Since 2017, YouTube has increasingly introduced Machine Learning (ML) into the mix. Computer programs are taught to identify offensive content from examples and by drawing connections, including from samples of what is not offensive. The ML initiative has enabled moderation efforts to handle much larger volumes, at much greater speed. It also enabled the work to continue more or less unimpacted during the Covid-19 induced lockdowns, when a lot of staff was not available. Flagged content is reviewed and adjudicated upon; in some cases, when the algorithm has a statistically high level of confidence that the ML-identified content is offensive, it may be removed even without a review. Reviewers keep feeding their decisions back into the ML process to make it smarter.

With its close relationships with technology companies, built over many years, oWorkers has access to the most advanced technologies, whether for ML or AI (Artificial Intelligence) or anything else. These technologies are deployed for client work, many of whom are also technology companies. It is not an accident that oWorkers delivers cutting-edge technology solutions to its clients.

Of course, as much of it is a human-dependent activity, variations in treatment can happen. There are also some documented exceptions, termed EDSA (Educational, Documentary, Scientific and Artistic) content, which may be allowed through despite not being in line with some of the principles. Snippets of child abuse may be permitted, for example, if the purpose is to create education, knowledge and awareness around the subject.

Its efforts appear to be bearing fruit. YouTube has even started measuring the efficacy of its moderation process by calculating the 'violative view rate': the views a video identified as offensive received before being taken down. It says that since the introduction of ML into the mix, this rate has gone down by 70%, testifying to the speed with which ML is helping identify offensive content. It claims that 94% of the content taken down was identified with the help of automated systems, and that a majority of those videos had garnered fewer than 10 views. It claims to have taken down over 83 million videos since it started releasing enforcement reports over three years ago.

 

YouTube content moderation for channels

Like any other social media platform, businesses seek to expand their message and reach through YouTube as well. Again, like other platforms, they need to monitor and moderate the user generated content that gets created on their space.

This is where specialist providers like oWorkers come into the picture and help companies monitor and moderate their YouTube channels and comments. 

oWorkers is a specialist, focused on data related BPO services such as content moderation. As one of the top three data related providers, having supported global clients for over 8 years, and led by a management team with over 20 years of hands-on experience in services of the digital age, it is a natural choice as partner for YouTube content moderation.

Many of our clients, especially from the US and Western Europe, consistently report savings of up to 80% when they outsource to us. We offer transparency in pricing, usually giving a choice between dollars per unit of output and dollars per unit of input to a client.

We work with employed staff, not the freelancers and contractors some of our competitors seem to prefer in order to avoid long-term commitments and responsibility. We regularly receive ratings of 4.65 and above, on a scale of 5, from past as well as present employees on platforms like Glassdoor.

The Benefits of UGC Moderation

There was a time, not too long back, when publishing was done by a few people and organizations that were equipped to do so. It was possible to mostly trace back each piece of content that was published to its publisher, should the need arise. Publishers of content followed established and trusted processes for ensuring what they put out was verifiable and accurate, and also maintained high standards of quality as it reflected on their company and brand. They also had processes to enable the published content to reach the target audience through networks of distributors, stores, shops, libraries, and other outlets. One could not just wake up one fine day and become a publisher. Everyone else was a consumer of the content created by publishers.

Much has changed in publishing since the advent and rapid growth of the internet and the many social media platforms that have mushroomed as the internet has penetrated to the deepest and darkest corners of the globe. As human interaction moved from direct physical interaction where communication was done mainly in physical proximity at close quarters and informally or verbally, to communication and sharing of information over the internet with people spread out across the globe, it has taken the shape of content. Once created, it acquires life. It is believed that even if content is deleted, it continues to be available in some shape or form in the deep crevices of the digital world.

As we share and exchange ideas on social platforms powered by the internet, we create content. Each time we share pictures of a newborn, we create content. Each time someone posts a complaint to the public handle of a government department, content is created. Each time an extremist organization raises a slogan for action against a community, and posts it online, content is created. From being mere consumers a generation ago, each of the 7 billion plus human beings, while being consumers, now also have the power and means to become producers of content, and well over half who use the internet and social media platforms with regularity, already are.

Expertise is needed to manage what would otherwise be a runaway train, drowning us all under the sheer weight of the content created. BPO providers like oWorkers have stepped up to build expertise in handling services related to the creation of content on the web. oWorkers has been recognized as one of the three best BPO providers of data services in the world, a testament to the capability of its staff as well as a leadership team with over 20 years of hands-on experience in the industry.

 

User Generated Content (UGC) and UGC Moderation

UGC refers to User Generated Content, the truckloads of content that the Average Joe, folks like you and me, are creating on platforms on the World Wide Web. In slightly more technical terms, UGC is content published or created by people who have no commercial or other connection with the subject they are creating content about.

Truckloads is not an exaggeration. If we look at the internet as a platform, more than 60% of the world’s population are active users of the internet. Over 90% of these internet users, a little over 4 billion, are active users of social media.

What does this mean?

This means generation of a voluminous amount of content.

According to domo.com, which runs a cloud-based operating system that unifies every component of your business, and also collects and publishes information on internet usage statistics, every minute of the day, day after day, the following happens:

  • YouTube users upload 48 hours of new video
  • Instagram users share 3,600 new photos
  • Brands and organizations on Facebook receive 34,722 “likes”
  • Over 100,000 tweets are sent

This, as you can guess, is only a sample. There are many more social media platforms than these four, as we all know, and there are multiple ways in which content can be created on each. If you are interested in knowing more, you could look at the infographics periodically published by domo.

Social media platforms need to undertake moderation to ensure the sanctity of the space and to keep it safe for everyone. This may include monitoring and removing malicious content such as hate speech, pornography and violence. Platforms are, after all, also businesses, and they need to protect their business interest by maximizing participation.

Being a people-driven activity, it requires a constant supply of competent resources. This is where oWorkers stands out. With its deep connections in the local communities, it is now a preferred employer and attracts a stream of jobseekers on a continuous basis. The pandemic has done nothing to stem the flow; it has only moved online. oWorkers is able to hire the most suitable resources for client projects, and even hire replacements for attriting employees, in good time.

The related advantage this provides, which translates to a significant cost benefit for their clients, is the ability to ramp up at short notice. oWorkers can hire almost a hundred additional resources in 48 hours to cater to unexpected spikes in volume. From the clients’ point of view, this replaces the resources that they would otherwise need to hire and keep idle when volumes were low. It could be a big number for many clients.

 

Companies need UGC Moderation too

While individuals use the internet and social media platforms and create organic content, since the time they have become aware of its possibilities for their business, companies actively promote and solicit user participation. They create communities on social media platforms dedicated to their own brands and try to stir activity around it in an effort to build a self-sustaining community brought together by their common interest in the brand.

While it provides many benefits, as recounted later in this piece, this solicited UGC also creates the need for UGC moderation: a widely distributed, unmonitored group, even one of well-wishers, cannot be left to its own devices to drive the groundswell of popular opinion for the brand.

Outsiders may be ignorant

Like they say, ‘the road to hell is paved with good intentions.’ UGC contributors may be well intentioned, but they come from a place of feelings and sentiment, not necessarily supported by research and data that the company might possess. This could create situations and content that is inimical to the brand’s interests and hence may need to be monitored.

Competitors could sabotage content

If the site is open to brand followers and well-wishers, it is also open to competitors. Like spies sneaking into enemy territory in Hollywood movies, competitors could unleash unfavorable UGC through the medium of moles that creep into your territory. They could even use your property to push their own products and brands through fake accounts.

Compliance with brand guidelines

While it is understood that the common user is not a representative of the company and is not expected to abide by its policies and guidelines, each brand does wish to project a certain kind of image and persona. Some may wish to project a family-friendly persona, some others may like to project an irreverent and edgy image, and so on. They would perhaps expect even their UGC creators to be able to link in with their theme and build on it. They may need to look out for content that militates against their target personas.

Responsibility rests with the company

While UGC may have merits, furtherance of the objectives of the company lies squarely on the shoulders of its authorized representatives, and not that of the user who creates content. Hence, while taking action on a particular piece of content may be a choice, monitoring and overseeing that process is not. It needs to be done. The company may choose to leave negative comments untouched for the sake of creating a trustworthy site, but they need to know that negative comments have been posted.

With clients from around the world, oWorkers understands the expectations of corporate clients. It is GDPR compliant, as part of the basic requirements of operating from the Eurozone, and operates from highly secure facilities to ensure the security of client data. Its ISO (27001:2013 & 9001:2015) certification further strengthens this environment. It is also equipped to provide physical segregation between clients with the help of access control.

The strong partnerships oWorkers has built with technology companies give it access to the latest technology which, again, benefits clients, as it is their projects that it is deployed on.

 

Benefits of UGC

If UGC moderation is a requirement that stems from its lack of reliability, why don’t we just get rid of it, or don’t allow it? After all, it is in the hands of creators of online spaces to decide what nature of participation they would like to permit on their space, if any at all. No UGC, no need for moderation. Right?

But that may not be a preferred alternative. UGC delivers significant benefits to companies that they are unwilling to eliminate.

UGC drives confidence in the brand

While advertising has brought many products to the attention of target customers, its messaging is taken with a pinch of salt. Consumers are smart enough to realize that the advertiser has a vested interest and is likely to only put out the good word about the product. Hence, caution needs to be exercised before acting upon the stimulus.

UGC, on the other hand, does not suffer from this disbelief. It has increasingly become the go-to source for validation before purchase decisions are taken. The ability of a brand to openly seek and display such feedback creates confidence in the brand.

Creates a virtual, extended team

Every employed resource or hired vendor costs the company money. The greater the requirement for content creation, the greater is the likely cost of doing so.

Enter UGC, and the need for an extended team, and related costs, vanishes.

If the company is able to create a thriving, vibrant community around its brand, its members will create marketing content for the company simply by communicating around the brand. This generates a vast amount of marketing content at no cost, except what the company may need to spend on UGC moderation, which is likely to be a fraction of the cost saved on additional hired resources.

UGC comes at almost no cost

As a corollary to the above, fans and members of the community created around the brand, expect no remuneration for their work. They do it because they like the brand and identify with it. It creates a sense of community for them and enables them to be a part of something larger than themselves. It also allows them to connect and interact with other similarly disposed people. This enduring goodwill creation tool comes at no cost for the company.

Gives you good SEO karma

A key initiative for marketing teams in the internet world is to get good search engine rankings for their web properties. This is an organic process and often takes years, and many times does not even deliver beneficial results as there are many other moving parts that are also changing shape and form at the same time.

Getting UGC into the frame enhances dwell time, the amount of time a visitor spends on your site. Visitors are known to favor UGC over brand-created content and spend more time on it. It also gives you a cache of keywords far richer than your own marketers may be able to conjure up. And it contains ratings and review scores, which are also known to drive SEO.

Is it, therefore, any surprise that companies would rather live with the ill consequences of UGC, handling them through UGC moderation, than eschew it altogether?

The ability of oWorkers to provide support in 22 languages is advantageous for global clients operating in diverse geographies as well as companies looking for geographical growth. They don’t need to look for a new partner each time they expand. oWorkers becomes a support for their growth and not a hindrance.

 

Outsourcing for best results

Outsourcing critical tasks to a specialist is no longer an alien concept in business.

oWorkers has built a reputation as a reliable partner. It offers pricing that adds value and is transparent, often giving a choice to the client between two competing mechanisms. Many clients note saving more than 80% cost after engaging oWorkers as a partner.

It has stuck to its policy of working with employees, and not contractors or freelancers. This provides the company flexibility in deployment as well as a competent and experienced middle management talent pool. It has repeatedly been ranked above 4.65 on a scale of 5, on platforms like Glassdoor.

Its 24×7 ready facilities are equipped to handle most requirements and deliver quick turnaround. Several unicorn marketplaces around the world choose oWorkers as their partner for UGC moderation.

The Best Social Media Moderation Tools

Freedom of speech is a concept that is both used and abused.

It is understood as the right of individuals and groups to express themselves without fear of retribution. It is recognized as a basic human right in the Universal Declaration of Human Rights and enshrined in the constitutions of many nations that have one.

Freedom of speech is such a fundamental concept that it could be interpreted to even mean that we have the freedom of speech to define what freedom of speech means.

Having said that, Article 19 of the International Covenant on Civil and Political Rights states that "everyone shall have the right to hold opinions without interference" and "everyone shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice."

Perhaps to prevent, or at least guard against, misuse, the right is further qualified by stating that the exercise of these rights carries "special duties and responsibilities" and may "therefore be subject to certain restrictions" when necessary "for respect of the rights or reputation of others" or "for the protection of national security or of public order (ordre public), or of public health or morals."

Whatever the language the speech is in, and whether free or not, oWorkers has the skills to understand and handle it. With its policy of employing a multicultural and multi-ethnic team, it has developed the ability to provide services in 22 languages. It stands atop the BPO provider world, identified as one of the top three providers of data-based services in the world.

 

Burgeoning content fuelled by social media

Social media has magnified the availability of content. Every person on social media is both a consumer and a publisher of content. The days of publishing being the preserve of a few are passé. The teenager planning to meet her friends through a series of messages, the homemaker sharing pictures of her latest culinary achievement and the senior leaving a message for his grandson's college graduation are all publishing content on social media platforms. Nobody is telling them what to share and what not to share. Nobody is editing their content before it becomes available on the platform, widely accessible.

The easy access to publishing on platforms with wide membership and usage is also attractive for people with their own agendas, who might have scant regard for the policies of the platform or the accepted norms of civil society. Thankfully, these people are a small minority, but they create the need for the massive infrastructure around moderation, which in turn fuels the need for social media moderation tools so that the processing can be done effectively.

Whether it is a hate speech inciting followers to violence against an identified group, or videos of graphic violence, or pornographic images, the process of moderation seeks to identify malicious content before it is able to do its dirty work.

Social media moderation can be defined as the process through which user generated content on social media platforms is managed in order to ensure that it adheres to the policies of the platform as well as accepted civil society norms.

The leadership team of oWorkers has over 20 years of hands-on experience in the industry and is able to guide the team in developing new skills. It is now a force to reckon with in content moderation, a service BPOs did not even provide 15 years ago.

 

Social media moderation tools for companies

As we know, companies have made a beeline for social media platforms, drawn by the presence of larger and larger numbers of present and prospective customers, and the prospect of a platform through which they could be reached relatively inexpensively as compared to traditional advertising. Of course, with platforms waking up to the revenue opportunity, the difference has gradually become smaller and smaller.

The need for companies to leverage social media stems from a continuous need for keeping their brands and products and services active and fresh and central to what is happening in the world. For this, they create communities and groups on social media platforms where their brand is the hero and the discussion happens around it. Once this conversation becomes organic, it requires less fueling by the company itself, and hence less resources, and does the work of promoting the brand on its own, with existing users looping in more and more new users through the platform.

Participation in these communities, and the resultant content that gets created, and that companies seek to promote and generate, is known as User Generated Content (UGC). It is the elixir all companies look for to keep their message alive and their brand healthy. And why not? If I am interested in buying a product, will I trust what the interested party, the company selling it, has to say about the product, or will I trust the feedback of a user who has no stake in whether one more unit gets sold or not? The latter, obviously.

But creating platforms where such content can be generated, with its obvious benefits, has attendant risks as well. Firstly, these spaces are subject to the ever-increasing volume of content, just as the platforms themselves are. That may not be bad in itself. But that burgeoning content includes malicious, defamatory and unsuitable material as well, just as it does on the platforms. Hence, deploying social media moderation tools to keep these web spaces squeaky clean has become a requirement for companies using social media for various purposes.

With its centers located in three distinct geographies, oWorkers provides the benefit of business continuity to interested clients. All its centers are equipped to operate 24×7. Clients from the US and Western Europe have lauded the fact that they save almost 80% of their pre-outsourcing cost once they outsource to oWorkers. Several unicorn marketplaces around the world trust oWorkers with their content moderation requirements.

 

Human beings as social media moderation tools

For better or for worse, the human brain remains a peerless organ. While mankind continues to seek automation as a means of saving labor, as well as ensuring adherence to guidelines and saving recurring cost of people, there are limits to what automation can do, despite the fact that the boundaries are constantly being pushed. Artificial Intelligence (AI) for example, that has been in development for many years, has now created another frontier, by getting machines to understand and act on unstructured information.

As far as the human brain is concerned, however, there is nothing beyond its reach. Before an activity is considered for automation, it is probably already being done by humans. Not that human-performed tasks are without issues. Far from it. Humans think, and that is an issue in repetitive tasks. They make mistakes. Humans have emotions. Repetitive tasks can tire them out and lead to burnout. And they take money to maintain, as they need to be kept fed, clothed and housed at the very least.

Nevertheless, human beings have proved themselves to be indispensable, yet again, this time for the purpose of moderating UGC on social media. There are many ways this most intelligent of social media moderation tools can be deployed.

Pre-moderation

This is what is considered to be the ideal way of moderation, especially for web properties that are particularly sensitive to offensive content. The content submitted by a user is reviewed and, if found acceptable, authorized for publishing.

While it provides control over content, the process is resource hungry and can cause delays, leaving users dissatisfied. After all, if you are publishing content on social media, you want to see the result immediately. Conversation can become stilted as a result. Besides, it can be a nightmare doing this for a site that generates heavy traffic.

Post-moderation

The ‘post’ in post-moderation can be understood in two ways:

  1. The moderation is done afterwards (‘post’ means after)
  2. The moderation is of a ‘post,’ of something that has already been ‘posted.’

The result is the same. The moderators attempt to play catch-up with the posts and remove the ones found offensive to the site. They also evolve smart search and monitoring criteria based on past experience so that they do not need to go through the full set and can do sample monitoring.

This provides a better experience to users, as their posts are visible immediately, but could allow some malicious content to be visible before it can be brought down.

Reactive moderation

This mechanism assumes all content is kosher unless identified otherwise by a user. Users and visitors are presented with a facility, like a Report button, through which they could express displeasure or highlight its inappropriateness. Different sites could do it differently. Once it has been flagged, the moderation team gets into the act and reviews the content for suitability and takes action in accordance with their findings.

It is not fool-proof, but it is inexpensive. It relies on the sincerity of the participants and the amount of ownership they feel for the site and for keeping it safe and clean.
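
In practice, sites often wait for more than one report before spending a reviewer's time on an item. A minimal sketch of that idea, with an invented report threshold:

    from collections import Counter, deque

    REPORT_THRESHOLD = 3      # invented: reports needed to trigger a review
    report_counts = Counter()
    review_queue = deque()

    def report(post_id):
        """Register a user report; queue the post once reports accumulate."""
        report_counts[post_id] += 1
        if report_counts[post_id] == REPORT_THRESHOLD:
            review_queue.append(post_id)

    for _ in range(3):
        report("post-42")
    print(list(review_queue))   # ['post-42']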

Distributed moderation

This method is also based on user inputs. It requires users to vote on the content they access and its suitability for the site, which could be in a variety of ways. The eventual result is that content voted down by users keeps going down in rankings, with the lowest ones becoming virtually non-existent or invisible.

This is a cost-effective method of moderation and may be suitable for sites that are less sensitive to offensive content, if perchance it is left visible.
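
A minimal sketch of the ranking idea follows; the scoring rule and visibility threshold are invented for illustration, and real sites weight votes in many different ways.

    def visible_ranking(posts, hide_below=-5):
        """Rank posts by net votes and bury those voted far enough down."""
        scored = [(p["ups"] - p["downs"], p["id"]) for p in posts]
        scored.sort(reverse=True)                  # best-liked content first
        return [pid for score, pid in scored if score > hide_below]

    posts = [{"id": "recipe", "ups": 40, "downs": 2},
             {"id": "spam", "ups": 1, "downs": 30}]
    print(visible_ranking(posts))   # ['recipe']; 'spam' is effectively invisible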

Whatever the method adopted, oWorkers has the right resources for it, owing to its ability to attract the best talent, being a preferred employer in each of the locations it operates from. Its employees, both past and present, routinely rate it above 4.6 on a scale of 5 on platforms like Glassdoor.

The added benefit, which gets passed on to clients through the pricing mechanism, is the ability to provide up to 100 extra resources within 48 hours, to meet short term peaks in demand. Clients working with vendors who don’t have this ability, normally end up paying to retain additional resources on the bench.

 

Automated social media moderation tools

At some stage it becomes imperative to move beyond human processing and explore automation options. The drivers could be many: high processing cost, delays caused by humans processing items one after another, the propensity to deviate from set rules, errors, or burnout. Imagine if an automated solution could handle pre-moderation. Pre-moderation is difficult because there is intense pressure to publish immediately; otherwise the contributor gets cheesed off and the conversation flags. Delays happen because a human can process only one transaction at a time, and since content could be created anywhere, anytime, there could be loads of it waiting in the queue. Imagine if, instead of humans, a smart automated solution, with the ability to process millions of transactions in a short period, say a second, could do it. It could solve many of the issues faced with moderation and UGC today.
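
Even a trivial scorer illustrates the throughput argument: an automated solution can sweep an entire backlog in one pass, while a human works through it item by item. In the sketch below, score_batch is an invented stand-in for a real trained moderation model.

    def score_batch(posts):
        """Toy scorer: flag posts containing any blocklisted word."""
        blocklist = {"scam", "hate"}
        return [any(word in post.lower() for word in blocklist)
                for post in posts]

    backlog = ["great product!", "this is a scam", "lovely photo"]
    flags = score_batch(backlog)   # the whole queue is scored in one pass
    held_for_review = [p for p, f in zip(backlog, flags) if f]
    print(held_for_review)         # ['this is a scam']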

If only…

If only machines could be taught to understand unstructured content.

The reason humans have been almost indispensable for the process is that the content that needs to be moderated is unformatted. With great effort humans have been able to make computers understand formatted content. Software code is a type of formatted content based on which a computer takes actions. But unstructured content is something else.

The content could be an image, or an audio file, or a video or just plain unstructured text, gibberish if you please. A human brain, with its fine sensibilities, can understand the content, but a machine cannot, unless taught how to.

This is where AI comes into the picture.

With AI, machines are learning to understand, interpret and act on unstructured content, be it text or audio or image or video. Through detailed training programs, computers are being taught to read/ see/ view the content and make connections with actions. For example, an AI solution for an autonomous vehicle might involve exposing the engine to a traffic light in all possible shapes, sizes and forms, and connecting that 'view' with an action: move if green, stop if red.

The same thing is happening in content moderation. Automated solutions are making progress and, hopefully, will soon be doing the heavy lifting instead of humans.

It is to be expected that reliable social media moderation tools should:

  • Be able to handle content in any format, including text, audio, video and images
  • Be equipped to handle content of all types, including reviews, blogs, emails, comments, etc.
  • Provide a dashboard facility where traffic can be monitored
  • Allow automation of tasks where possible
  • Have NLP capability to interpret audio files and establish perspective for textual information
  • Permit delegation of specific tasks
  • Have filters for profanity, vulgarity, pornographic content, violence, etc.

A large number of oWorkers clients are technology companies. These clients keep oWorkers honest and up-to-date regarding technology. oWorkers operates from highly secure facilities and is ISO (27001:2013 & 9001:2015) certified, which ensures the safety of client data. It is also GDPR compliant. For clients that require it, physical segregation of client workspaces is possible by means of access control.

The Rise and Rise of Social Media Moderation Services

The development of and advancements in telecommunication technologies and the birth of the internet, in many ways freed the BPO, or Business Process Outsourcing, industry from its physical shackles.

The industry had existed ever since consumer franchises in the West started growing on the back of a prosperous economy in the decades after the Great Depression. The need had been felt for segregating business processes that were not customer-facing and did not need to occupy expensive real estate in the downtown office centers of big corporations. This led to the creation of support centers, mostly on the outskirts of the big city, so that documentation could be physically moved between the two.

With increased mobility and cheaper long-distance travel and transportation, these support centers could be moved further away from the business centers, to take advantage of cheaper real estate and more easily available workforce in the hinterland. Somewhere along the way, with increasing volumes and some companies finding it challenging to run both their business as well as support services, specialist BPO operators came into existence.

The advancements in telecommunications and the availability of the internet to the common man finally gave flight to the industry and freed it from its physical shackles. It now became possible to move white-collar jobs to pretty much any part of the world where the required skills were available, and get it done without losing anything by way of turnaround time. In fact, because of time zone differences, in some cases turnaround times actually improved as the processing center would be opening by the time the business center was ready to shut, do the processing during its business day and send the work back by the time the business center opened for work the following day. It was almost as if some elves had descended and completed your work while you were asleep.

oWorkers has been at the forefront of the business process outsourcing (BPO) services unleashed by the revolution in telecommunications technology. It has identified data-based services as the area of operation in which it excels, covering a host of services including, among others, moderation services. It has been identified as one of the top three data-based BPO services providers in the world.

 

The rise of social media moderation services

Of course, nobody can see the future. And change is a law of nature. We know that, don’t we?

Can we predict which services will comprise the bulk of outsourced processing work, say, 25 years from now?

Obviously, we can't. Of course, some wise folks will make predictions, betting on nobody remembering them when that future arrives, or on there being more pressing issues to handle at the time than the accuracy of old predictions.

At the time BPO services started to be sourced from all corners of the globe, with geographies like India and the Philippines perhaps leading the way, the focus was on outsourcing bulk processing activities that would deliver substantial benefit to the bottom line, like call centers, medical transcription or data entry.

At this time, social media was perhaps not even a gleam in the eye of Mark Zuckerberg, the founder of Facebook, which is the largest social media platform today. It follows, therefore, that neither would social media moderation have been a gleam in the eye of a regulator, nor social media moderation services in the eye of a BPO provider.

It is now a matter of historical record that social media has taken the world by storm in just about two decades, rising on the back of internet-connected computing devices reaching the hands of ever more people.

Launched to connect students at the university its founder was attending, and coded from his dorm room, Facebook soon went beyond campus and attracted people who found it a useful way to connect with friends and family and exchange ideas and thoughts. Many other social media platforms soon followed suit, launching their own versions with their own unique offerings. Based on monthly active usage, the most popular social media platforms today are:

  • Facebook – 2.7 billion
  • YouTube – 2 billion
  • WhatsApp – 2 billion
  • Instagram – 1.16 billion
  • TikTok – 689 million
  • Snapchat – 433 million
  • Reddit – 430 million
  • Pinterest – 416 million
  • Twitter – 353 million
  • LinkedIn – 310 million

The above is based on information published by the Search Engine Journal.

Initially meant for interaction between human beings, these platforms' success has also made them objects of interest for companies. Forever in search of the next customer and the next revenue dollar, companies go where humans, and potential customers, are. They are creating communities around their brands and products and trying to promote healthy engagement that fulfils several objectives at once. The adoption of social media by organizations has provided a further impetus to the usage of social media platforms, as well as to social media moderation services: now it is not only the platforms that need these services, but also every organization and business that manages a group or community on a platform.

Whatever the reason for moderation, with its well-trained team of people, oWorkers is well equipped. It operates with the huge advantage of having access to an almost endless supply of resources, one of the major considerations in the industry. This is because it is a preferred employer in all its delivery locations, thanks to its deep engagement with local communities.

The steady supply also enables oWorkers to provide just-in-time resources to clients in case of unexpected spikes in volumes. This has turned out to be a boon for clients who would otherwise have to hire resources to meet peak volumes and keep them idle the rest of the time, an unnecessary cost.

 

The need for moderation

We perhaps know most of the reasons that create a need for moderation of the content on the big social media platforms that can be accessed by billions of users around the world.

Despite free speech being a laudable objective, each platform has rules that it defines for engagement. Many of these are rules of social conduct that cannot be controlled by setting up automated thresholds or conditions. While technology tools can be used to some degree, in many cases a final call can only be taken through a human review.
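
The division of labor this implies is usually a thresholded routing scheme: the machine acts on clear-cut cases and queues the rest for people. Here is a minimal Python sketch; the thresholds and labels are invented for illustration:

    # Minimal sketch of hybrid routing: automation for clear cases,
    # human review for everything in between. Thresholds are invented.
    def route(violation_score: float) -> str:
        """Route a post based on an automated model's violation score (0 to 1)."""
        if violation_score >= 0.95:   # near-certain violation: act automatically
            return "auto-remove"
        if violation_score <= 0.05:   # near-certain clean: publish
            return "auto-approve"
        return "human-review"         # the gray zone needs a person

    for score in (0.99, 0.50, 0.02):
        print(score, "->", route(score))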

Going beyond platform-specific rules, there are many unspoken, unwritten codes that have evolved over millennia and now operate in civil society. These also need to be abided by, as do the specific laws enacted by governments around the world.

This is why posting an audio recording of a speech exhorting violence against followers of another faith is not acceptable. This is why images of child nudity, or adult nudity for that matter, are not acceptable. This is why a video of graphic violence is not acceptable.

And this is why we need social media moderation services. So that vitriol is not spread through the medium of social media. So that lunatics feeling secure sitting in a dark corner of the web do not make life miserable for others.

The partnerships with technology companies that oWorkers has forged give it access to the most modern technologies at all times. This eventually benefits clients, as the technology is used for processing their transactions. In any case, 85% of oWorkers clients are tech companies, who ensure that only the best technology is used.

Apart from being GDPR compliant, oWorkers is also ISO certified, and creates physical segregation between projects, where required, with the aid of access control.

 

Social media moderation services for a company

But what about the social media presence of a business? Why does that require moderation?

As we know, companies are creating spaces on social media platforms where they promote conversations about themselves and their business. The intent is to create an environment that portrays their company and its products in a favorable light so that buying opportunities amongst customers can be maximized.

This is a space in which the company has the greatest investment and interest. It follows that the success or failure of these communities reflects on the company that promoted them, and a failure bodes ill for future sales.

Who would have an interest in moderating this space? The company would.

How would it help?

First and foremost, by eliminating offensive content, it keeps the space safe for interested users to participate in, without fear of being spammed, abused or threatened.

Rules being the starting point for most such engagements, the company will want to enforce them in the interest of its brand. The community is centered around the brand, and people may need reminding of that. The recently concluded Olympics, for instance, may not be an appropriate subject, except in the context of the brand whose space it is being discussed on.

These being open, worldwide platforms, activity can happen at any time of the day or night. One offensive tweet, and several weeks of effort could be nullified. Such possibilities make social media moderation services relevant for companies that require proactive crisis management.

Social media now serves as an important customer service touchpoint as well. Since many people have an internet-ready device close at hand during most of their waking hours, and are always connected to some platforms, it is easy for them to express themselves on social media. In case of an issue with a product, instead of trying to locate contact numbers or email IDs, it is much simpler to post the grievance on a social media platform. This makes it mandatory for the company to monitor and respond, as an unchecked grievance could snowball into a public relations catastrophe. By default, social media platforms have also become customer servicing channels.

The process of moderation provides companies insights into customer behavior. These insights are valuable as they are based on natural behavior of customers in settings they feel safe in, as opposed to the more ‘managed’ traditional mechanisms such as customer surveys.

Building communities on social media for your company and brands offers great ROI, but those communities need to be managed.

Operating out of three distinct geographies of the world, oWorkers practises the philosophy of multi-ethnic and multicultural teams. This diversity is the bedrock of their multilingual capability with which they become partners in growth for clients as they expand to geographies where the language changes. At present, oWorkers can support work in 22 of the most commonly used languages of the world.

 

Outsource social media moderation services to oWorkers

Human resources is one of the keys that unlocks the possibilities in an outsourcing engagement. oWorkers has chosen to rely on the model of hired resources over that of freelancers and contractors that some of its competitors seem to prefer. Having insourced workers provides flexibility to oWorkers in deployment while enabling employees to demonstrate their capability and grow in the company. These workers are well cared for, as demonstrated by the high ratings that employees, both past and present, leave on platforms like Glassdoor.

Their clients value the transparency they get in pricing, with a choice between input-based and output-based pricing. They also frequently note cost savings of almost 80% after outsourcing to oWorkers.

oWorkers is ably led by a team of professionals with over 20 years of hands-on experience in the industry. With clients from all over the world, they are able to provide a quick turnaround, either by leveraging the difference in time zones or by operating their centers 24×7, which all centers are equipped for.

The major question in outsourcing has moved from “should we?” to “how should we?” Clients of oWorkers clearly recognize that outsourcing adds value to their capability set by taking away the responsibility for non-core activities at reasonable rates.

How to Create Social Media Moderation Guidelines

“Because it’s there” were British climber George Mallory’s words when asked why he wished to climb Mount Everest. The following year, in June 1924, Mallory, who has mythical status in the world of climbing, and his climbing partner Andrew Irvine disappeared on the mountain. Some say it was on the way to the peak, others say on the way back, which would potentially make them the first to set foot on the highest point in the world. The first recorded and documented ascent, however, was by Edmund Hillary and Tenzing Norgay in 1953, and they are recognized as the first.

“Because it’s there” might also be the response of the billions of users of social media when asked why they are on social media or why they use it. And posts like “Had breakfast, feeling Good” or photoshopped pictures of a vacation in an aspirational, expensive destination, might have fuelled some of the speculation regarding the relevance of social media. Whether serving a useful purpose or not, social media has taken over the world like a force of nature and now is a fact of our lives, and no longer a choice.

oWorkers has been there, too. In over seven years of providing data-based BPO services to global clients, it has earned several accolades, including being counted among the three best BPO services providers in the world in its category. Its leadership team, with over 20 years of hands-on experience in the industry, continuously looks for new peaks to climb in its effort to lead the team to greater heights.

 

Requirement for social media moderation guidelines

Climbing Everest was one of the ultimate feats of human endeavor, like rowing solo across the Pacific Ocean, sledding across Antarctica to the geographic South Pole, or swimming the length of the Amazon. But that was then, when Everest was a fabled presence deep in the wilderness, beyond the unpopulated countryside of rural Nepal on one side and Tibet on the other, the tallest peak as the Himalayas rose majestically from the plains of northern India and Nepal and then plateaued out into Tibet.

Today it is a commercial venture. If you have the money and the desire, ante up and you will be on your way to the top in a few years, since bookings apparently run that long. The unfortunate outcome of all this traffic has been the desecration of the mountain, with towns and communication towers being set up and discarded equipment, like used oxygen cylinders, defiling the mountain face. As a result, rules have had to be framed so that climbers are aware of the dos and don'ts. Not just that: periodic cleaning expeditions have to be undertaken so that discarded equipment neither endangers other climbers nor creates an ecological disaster in the future.

The birth of social media is generally traced back to the creation of Facebook, apparently coded by its founder Mark Zuckerberg in his dorm at Harvard. It had a purpose when it was created: enabling students to connect with each other in a non-intrusive way. And that remains; it continues to be a non-intrusive way of connecting with other people, no longer limited to student communities. Its use has spread like wildfire across the world, on the wings of the rapidly spreading reach of the internet, and has encouraged the birth of many other social media platforms in its wake.

No longer limited to a reasonably homogeneous community, as happened in the case of Everest, increased usage appears to have introduced malpractices in social media that the world has been forced to sit up and take notice of. Large numbers can give a feeling of safety, of being hidden in a crowd, which can encourage people to be obnoxious. That is perhaps what has happened with social media. Sitting in their own dark corner of the world, communicating only with machines rather than real people who can talk back, some people feel powerful and inflict on an unsuspecting population content that is designed to push their agenda and spread discomfort and strife. Whether it is spreading messages of hate against a community, glorifying violence with gory images, or sharing pornographic videos, anything is possible. Civil society does not look kindly upon such content being openly accessible.

The result has been the birth of what we know as social media moderation, that seeks to head off offensive content before it can reach the masses, accompanied by social media moderation guidelines.

oWorkers has been involved in social media moderation from the get-go. Its relationships with technology companies have enabled it to access cutting-edge technology for its work, which works in favor of clients, since it is on client work that these technologies are used. As a GDPR compliant and ISO (27001:2013 & 9001:2015) certified company, it gives clients confidence in the security of their data.

 

What do social media moderation guidelines look like?

Expectation setting is one of the basic principles of society. When a person signs up on a social media platform (account creation is always required), it is the platform's responsibility, having made itself open and accessible and having invited participation, to provide at the first step a clear understanding of its rules and regulations, so that the user knows what she is signing up for. If unhappy with what she sees, she is free to walk away; the platform is not in any way forcing her to participate. But once she signs up, the expectation is that she understands the regulations and agrees to abide by them.

It might be instructive to look at the guidelines of Facebook, the largest social media platform, with over 2 billion users, almost 30% of humanity.

Facebook divides its guidelines into sections, presumably for ease of access and so that bite-sized chunks can be consumed at a time. While it publishes many policies and guidelines, we will focus on a few that, based on their titles and classification, appear to relate to offensive content. They are listed under the following sections:

  • Violence and Incitement
  • Dangerous Individuals and Organizations
  • Coordinating Harm and Publicizing Crime
  • Regulated Goods
  • Deception
  • Suicide and Self-Injury
  • Child Sexual Exploitation, Abuse and Nudity
  • Sexual Exploitation of Adults
  • Bullying and Harassment
  • Exploitation
  • Privacy Violations
  • Hate Speech
  • Graphic Content
  • Nudity and Sexual Activity
  • Sexual Solicitation

To understand it a little better, here is a section explaining why they have put the Violence and Incitement policy in place:

“We aim to prevent potential offline harm that may be related to content on Facebook. While we understand that people commonly express disdain or disagreement by threatening or calling for violence in non-serious ways, we remove language that incites or facilitates serious violence. We remove content, disable accounts and work with law enforcement when we believe that there is a genuine risk of physical harm or direct threats to public safety. We also try to consider the language and context in order to distinguish casual statements from content that constitutes a credible threat to public or personal safety. In determining whether a threat is credible, we may also consider additional information such as a person’s public visibility and the risks to their physical safety.”

For those who may be interested, these guidelines are available on the websites of the respective social media platforms.

It must be clarified, however, that the practice of reviewing content, and removing it where required, is an internal process of the platform company.

Regardless of platform, oWorkers has the trained manpower to moderate social media content for you, aided by the endless supply of the best talent available in the marketplace owing to its standing as a preferred employer. Fresh resources are then taken up by the dedicated training teams in each of its locations, which polish the rough diamonds into shape for client engagements.

 

Social media adoption by companies

Adoption of social media by companies is a precursor for the creation of social media moderation guidelines.

Social media moved beyond its original target market of college students very early in its life. It has since also moved beyond being used only for casual person-to-person exchanges.

Constantly on the lookout for the next revenue dollar, companies have perforce had to get on the social media bandwagon. If social media is where people are headed, that is where companies must head as well. If a company could be depicted in a comic book, looking at masses of people potentially within its target segment, it would probably be shown with $$ signs in its eyes.

Since the usage of social media platforms has experienced a steep adoption curve, companies have had to evolve strategies for it as well. Through social media, they hoped to get their message across to a much larger set of people at a fraction of the cost of traditional media; this was true in the early days of adoption, though pricing may be more competitive now. They hoped to create a community around their brands where the discussion would be about their company and products, positive hopefully. They hoped to reach new customers, as happy customers find it easy to share the message with many more people on the same platform. They also hoped to leverage social media as a channel for customer service: most people now carry an internet-connected device all the time, and if they need service on a product, rather than work out how to reach the parent company, they might find it easier to locate it on its social media sites.

The usage of social media by companies has expanded rapidly, in tandem with the growth in its adoption by individuals. There are now millions of spaces on social media platforms that companies have created for their own use.

Dollars being important to companies, the pricing offered by oWorkers becomes a differentiator. It offers a choice between output-based and input-based pricing to prospective clients. Most clients, especially those from the US and Western Europe, have noted savings of up to 80% after their work has been outsourced to oWorkers.

 

Setting up social media moderation guidelines

If companies are trying to benefit from the reach of social media, they will have to live with the ills of the platforms as well, the main one being the misuse of their space for propagating thoughts and ideas that may be either out of line with the rules of the site or abhorrent from a social and civic perspective.

So, what should they do?

They need to do what the platform owners do: moderate the content on their social media properties.

How do they do it?

While implementation may vary from company to company, the first step is usually the articulation of guidelines that users need to adhere to. Theirs may not be the road to hell, but they have to pave it with good intentions. Here is what they might need to bear in mind while setting up social media moderation guidelines for their space (a configuration sketch follows these points):

Define the purpose of creating the community

This could be a high-level way of setting out expectations in easy language before you get into more detailed explanations later.

Specify what is acceptable and what is not

This ensures that participants have clarity and cannot later plead ignorance. You could also specify the legal and regulatory reasons, if any, for keeping some types of content on the ‘not acceptable’ side of the list.

Articulate consequences

A rule without any consequence of violation is just a homily. It has no place in business. Consequences for violators should be defined, and carried out. If there is an appeal mechanism where identified violators could seek a review, that should also be defined.

Timeline expectations

If the social media channel is being used for customer service, it would make sense to define turnaround times for response and action. Some companies may also need to set up processes and timelines for handling emergencies.

Define responsibility for moderation

All the good work in setting up a framework can come to nought if the responsibility for doing it is not defined. This might also be an opportunity to evaluate outsourcing versus doing it in-house.

Be polite

When you dip in and participate in the community, being polite and professional, regardless of provocation, is mandatory.
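
To show how these points might hang together in practice, here is a hypothetical Python sketch that encodes such guidelines as a structure an enforcement process could consult. Every name, category, consequence and turnaround figure below is invented for illustration:

    # Hypothetical sketch of a company's moderation guidelines as a data
    # structure; all values are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class ModerationGuidelines:
        purpose: str
        banned_categories: list[str] = field(default_factory=list)
        # Escalating consequences, applied by offence count
        consequences: list[str] = field(default_factory=list)
        response_sla_hours: int = 24          # turnaround time for replies
        moderation_owner: str = "unassigned"  # team responsible for enforcement

    guidelines = ModerationGuidelines(
        purpose="Discussion of our products and customer support",
        banned_categories=["hate speech", "spam", "graphic violence"],
        consequences=["warning", "7-day suspension", "permanent ban"],
        response_sla_hours=4,
        moderation_owner="community-team",
    )

    def consequence_for(offence_count: int) -> str:
        """Map a user's offence count to the matching (or final) consequence."""
        idx = min(offence_count - 1, len(guidelines.consequences) - 1)
        return guidelines.consequences[idx]

    print(consequence_for(1))  # warning
    print(consequence_for(5))  # permanent ban

Encoding the guidelines this way keeps the purpose, the prohibitions, the escalation ladder and the turnaround expectations in one place, whoever ends up responsible for enforcing them.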

Regardless of the guidelines of a company, oWorkers has the skills to deliver the goods. Their workers being employees, not freelancers or contractors as preferred by some competitors, gives them the flexibility of redeployment. They have also been able to build supervisory experience through managing the career and growth of staff members. They routinely get scores of 4.65 and above, on a scale of 5, from past and present employees, on platforms like Glassdoor.

 

In Conclusion

With their adopted policy of multi-cultural and multi-ethnic teams, oWorkers is able to offer services in 22 languages, regardless of the social media moderation guidelines of a client. This becomes a growth enabler for clients when they seek to expand to new geographies.

It has delivery locations in 3 distinct geographies of the world, widely recognized as among the most suitable for the business. This creates the possibility of one center serving as the business continuity backup for another, should client needs require this arrangement. In any case, all centers are equipped to operate on a 24×7 basis.

Their access to a continuous supply of manpower makes ramping up and down easy, a huge cost saving for clients.

They have been able to create a pathway for entry into the digital workforce for many from less privileged backgrounds. Your work will enable them to do the same for a few more.