Guide On How To Outsource Content Moderation

Let us pick out a few headline items of user generated content (UGC) created on the web every minute:

  • 500 hours of video content are uploaded to YouTube
  • 347,222 stories are posted by Instagram users
  • 147,000 photos are uploaded on Facebook
  • 41,666,667 messages are shared by WhatsApp users
  • 69,444 jobs are applied for by LinkedIn users

These figures were published in August 2020.

There are thousands of other popular platforms, like Twitter, Reddit, Viber, Tumblr, eToro, Goodreads and Snapchat, each with huge amounts of content being produced and consumed by users, by anyone who signs up.

Also, there are 1.2 billion websites (1,196,298,727 to be exact, according to Netcraft’s September 2020 data), some of which also permit, and even encourage, what is referred to as user generated content (UGC), for business and brand engagement, or in the hope of making their platform the next Facebook or YouTube. Every industry and business has a digital presence. There are online applications for credit cards, online stores for buying clothing and online appointments with doctors. Each time we make an input into an Internet application, we create content.


What is Content Moderation?

Civil society has rules and regulations and, beyond that, some generally accepted guidelines of behavior and social norms. We also know that it takes all sorts to make the world. With over 4 billion Internet users, more than half the global population, who can say what nature of content might be posted by any one of them?

This is what creates the need for content moderation and content moderation solutions.

While a debate has been going on regarding ownership of content on these platforms, and whose responsibility it is, platforms have stepped in and created processes and systems through which content moderation is carried out. Rough estimates put the number of human content moderators around the world at over 100,000, apart from the tools and AI engines deployed for the purpose.

The human moderators review text, read comments, eyeball images and view videos in an effort to make the web a wholesome place. The increasingly diverse nature of applications available on the internet, and the variety of content being created, is making content moderation an increasingly challenging activity, quite apart from the psychological scars it leaves on the people who must go through raw content in order to moderate it.

At an individual business and website level too, the intent is to create wholesome content that will attract the right audience who will create content appropriate for that space. Hence, not just the large public platforms, but individual websites also make an effort to moderate the content that gets created on their website or even on their pages on the popular social sites.  


Should you outsource content moderation?

For the moment the debate is pretty much settled, with outsourcing being the preferred option for many websites and platforms that need it, though there are dissenting voices advocating that platform owners do it themselves.

There are many reasons for it:

  1. You don’t have the expertise or the resources. Content moderation is nothing like your business, whether you sell credit cards or run an eCommerce store. It requires specialised skills.
  2. While you may have guidelines for moderating your own online presence, the overarching guidelines are those of civil society and acceptable behavior, which are common to all businesses. Hence, you are able to dip into a deeper pool of talent that provides content moderation solutions.
  3. It keeps your employees focused on your business, where their expertise lies. There is really no point in taking them away from a job they can do well to put them on a job they know nothing about.
  4. As it does not require advanced educational qualifications or extensive work experience, outsourced resources are available at a fraction of the cost you might be paying your own employees.


Choosing a Provider for Content Moderation Solutions

Let us look at how you should go about the task of identifying a suitable provider when you outsource content moderation, and the parameters you should include in the evaluation exercise.

Listed against each parameter are a set of capabilities, in bullet form, that you should enquire about and an interested vendor should be able to demonstrate to enhance their credentials for being selected for content moderation services that you seek.

Familiarity with work and demonstration of quality

  • Familiarity with the type of work needed, demonstrated by showcasing similar work done for other clients; this also shows how well they have understood your requirement.
  • If not familiar, offers compensating benefits, like lower pricing.
  • Willing to connect you to existing clients who will support vendor’s capability claims.

Having delivered content moderation services to many clients over eight years of operation, oWorkers sees the continued expansion of client relationships, and the additional projects entrusted by those clients, as its sign of success. With clients’ agreement to display their names on our website, our list of referenceable clients runs long.

Attrition and People Management practices

  • Follows contemporary People Management practices that are fair and transparent.
  • Incorporation of work from home related practices.
  • Presents data on attrition, which should be lower than that of comparable vendors; if not, there should be logical reasons.
  • Permits unsupervised interaction with a set of frontline workers.

oWorkers has chosen to work with employees on our rolls, not freelancers and consultants on short contracts. While this affects our attrition numbers, as we carry a larger headcount, we believe it works well for us in the long run: we are able to build long-term relationships with our staff and help them progress. Despite this strategic choice, our attrition numbers are best in class. Our employees consistently rate us 4.6 or more on Glassdoor, on a scale of 5.


Preparedness for virtual workforce

  • People Management policies and work contracts permit employees to work from home when required.
  • Hiring can be done remotely.
  • Virtual onboarding and training capability.
  • Technology deployed enables a distributed workforce to seamlessly login and operate and communicate.
  • Supervisors trained on handling a virtual workforce.
  • Quality team has tools through which monitoring and coaching can be carried out.

Ever since the world went into lockdown because of the pandemic, oWorkers has been quick off the blocks in creating infrastructure that enables us to continue seamless operations whether we operate from the workplace, from home or from anywhere in the world. You get this advantage when you outsource content moderation to us. Not just delivery: each of our support functions, like Hiring, Training, Quality and Workforce Planning, has been included in the process of establishing tools, with the result that today these units continue to operate and support delivery, anywhere to anywhere.

Workforce Planning and Management

  • Has a team that keeps track of requirement of resources for each project at any point of time and actual availability.
  • This team also has the ability to normalise available headcount to reach target numbers, especially when there is a shortfall.
  • Can demonstrate examples of mobilising and removing resources quickly to meet resource requirements.
  • Distributed workforce is able to log onto a common platform for tracking and visibility as well as communication.

When you outsource content moderation to oWorkers, our technology infrastructure lets us give staff the flexibility to work either from home or from the office in a seamless manner, the choice being the employee’s. The communication tools we have deployed bring everyone together.

Internal Quality process

  • Has an Internal Quality team that is independent of the delivery unit.
  • The team can access transactions and agents in a virtual (work from home) setting as well as in a physical workspace and interact with them for monitoring and coaching.
  • Performance scorecards of this team are also linked to business performance but independent of the delivery unit.
  • Ability to demonstrate interaction between the Internal Quality team and senior management, and examples of actions taken based on their recommendations.

Quality Assurance (QA) and Quality Control (QC) processes are a part of the DNA at oWorkers and deployed by default when you outsource content moderation to us. This process enables our senior management to stay informed about the developments and performance of each of our engagements, and also keeps the delivery team on its toes. Our Quality team represents you internally and ensures that the work that reaches you is near perfect. This team has helped us consistently deliver 98% accuracy levels.


Pricing

  • Price offered is competitive.
  • All aspects of content moderation services required have been covered in the commercial proposal.
  • If the price offered is lower than benchmark, explanation of reasons as well as comfort that it is sustainable.
  • If the price offered is higher than benchmark, explanation of the additional value they offer.

oWorkers’ growing relationships are a testimony to the value we add for our clients. Being the cheapest provider is not our goal; being the most valuable provider is. With our pricing options, of choosing between dollars per unit of time and dollars per unit of output, clients can choose what works best for them.

Support for Multilingual delivery

  • Experience with the languages that are our immediate requirement for content moderation services.
  • Experience with languages that might be our next set of requirements.
  • Overall languages covered.
  • Ability to add a language outside the list, if required.

oWorkers supports over 22 languages from its three global centers in the heart of the BPO world. Our network of partners also provides us the flexibility of offering additional languages, should there be a need beyond our current coverage.


Tools, Technology and Security

  • Tools and technologies used for the work on offer.
  • Ability to access and learn other tools that may be required.
  • Ability to secure the data that they will be working on, including storage and transmission.
  • Physical access control and monitoring of facilities.

Though we offer the most current technologies required for content moderation solutions, we retain the nimbleness and openness to acquire and train our staff on additional tools mandated by our clients. Only if it works for you will it work for us. Of course, having some of the unicorn marketplaces of the world as clients does help in ensuring our technology is current.

oWorkers is ISO (27001 and 9001) certified and GDPR compliant. Our security protocols extend to the staff who work from home. We also offer physical segregation of work areas, if required.

Scalability and access to human resources

  • Level of scalability offered for hiring resources.
  • Access to a pool of short-term workers.
  • Ability and capacity to train at short notice.

When you outsource content moderation, volumes can fluctuate as a result of seasonality and other factors. While providers may want to suggest hiring resources to handle the peak volumes, for a client that means extra cost during periods when volumes are low. Hence, our ability to scale up and down by 100 resources in 48 hours is a key differentiator.

We are able to do this because of the deep connections we have with our local communities where we are active contributors. Being a desirable place of work for many gives oWorkers the edge during quick ramps as well as lowers our cost of attracting talent throughout the year.

Financial health

  • Financials of the company for the last three years demonstrating stability and profitability.
  • Projections for the next three years, if available.

oWorkers operates as a locally registered company in each of the three current locations it works out of. It is deeply rooted in the communities it operates in, pays local and social taxes and works with hired staff and not freelancers.


Next Steps

oWorkers has chosen its playing area as data entry, annotation and moderation services and is counted as a leader in the space. It has been ranked as one of the top three data entry companies in the world. With over 20 years of hands-on experience, our senior leadership is involved in day-to-day operations, client interactions and guidance to delivery teams. We operate from three global centers and provide overnight delivery to many clients, either from one of our three global centers or through our 24×7 operational machinery. When you outsource content moderation to oWorkers, you get built-in business contingency, with our three global centers acting as back-ups for each other. Our clients save up to 80% of their pre-outsourcing cost when they work with us. We hope you will, too.

What is content moderation – an overview

We may not always have the same name for it, but all of us understand what is meant by content. A dictionary may define it summarily and confusingly as something that is contained within something, but that does not come close to the context in which the term is used here. Of course, dictionaries are upping their game as well, as the “something that is to be expressed through some medium, as speech, writing, or any of various arts” definition illustrates.

If you are reading these lines and words, what you are reading is known as the content of the article with a title as given at the top of this page. You would, perhaps, already know that.

And moderation? What is moderation?

Cambridge dictionary defines moderation as the “quality of doing something within defined limits.” In other words, the act of being moderate and avoiding extremes.


What is content moderation?

What does one get when one combines content with moderation?

We know that a humongous amount of content is produced and consumed, every day, every hour, every minute and every second. Each time we react on social media, or upload a picture on Facebook, send money through our internet banking account, or send an email, or forward a WhatsApp message, content is being created. And consumed. Someone is seeing our social media reaction, someone is receiving the WhatsApp message and someone is receiving the money we have sent.

How much is the data being produced and consumed?

Domo has been publishing an annual ‘Data Never Sleeps’ infographic for several years, which summarizes the content being produced (and consumed) every minute on the world wide web. The eighth version was published in Q3 of 2020. According to it, these are some of the things that happen on the web IN ONE MINUTE:

  • 347,222 stories are posted by Instagram users
  • 4,497,420 searches are run on Google
  • 41,666,667 WhatsApp messages are shared
  • 69,444 jobs are applied for by LinkedIn users
  • 208,333 participants are hosted by Zoom in various meetings
  • 147,000 photos are uploaded by Facebook users

All this in ONE MINUTE. Besides, the rate of sharing, posting, and uploading is only increasing.

Since the web is open for use, or abuse, by anyone and everyone, a mechanism is needed to ensure that the content being posted, that can be consumed by any of the seven billion people in the world, is suitable. While each platform has its own set of policies and guidelines, they are, in general, also guided by the many unwritten rules of civil society pertaining to interaction between humans.

Content moderation is the process through which this is done. It is the process of screening and monitoring user generated or open content, based on platform rules as well as rules of civil society, with a view to determining if the content is suitable for publishing. All platforms that rely on user generated content (referred to as UGC), such as dating sites, ecommerce platforms, social media, marketplaces, will have some form of content moderation as a control process.

oWorkers has over eight years of experience in moderating web content for clients from all over the world. As traffic on the internet grows, so does the volume of content, and with it the need for moderation. oWorkers is completely focused on data-based projects, such as moderation, and has been repeatedly recognized as one of the top three providers of data-based BPO services in the world. It is a provider that knows what content moderation means.


The value of User Generated Content (UGC)

If UGC is a nuisance, why don’t we just disallow it? After all, the keys to the website are with its owners. They can decide the level of access to be given to visitors, and whether they should be allowed to leave their footprint on the web property or not.

The simple reason is that UGC creates great value for businesses. How?

  1. Marketing gurus say that a satisfied customer is the best advertisement for a business. As a company, you are expected to talk up your products and services and say things that induce people to buy. In a globally connected world, consumers have access to competitive information and know this. They don’t necessarily trust what you have to say about your brand. But when they see other real people, like them, saying things about a product they are interested in, that holds value for them, because it is unbiased feedback. Hence businesses want UGC that is complimentary.
  2. It is free. While companies spend loads of money on promoting their products and advertising, they don’t need to spend a penny on the feedback that organically accumulates on their web property. They just need to create and maintain the space for it. Of course, they need to ensure that the products and services they are selling deliver what they promise, else the UGC could misfire.
  3. The brand gets exposure. Social media users like to talk about themselves. If they like the company’s products and are invested enough, they are likely to feature them in their social media messaging, spreading the word across their networks. Again, brand promotion at no cost. Some people who do this regularly, and are seen as trustworthy and reliable, acquire the tag of influencers: people who can influence the purchasing decisions of other consumers. Some companies invest in influencers, who then need to balance their own brand value of trustworthiness with their sponsoring brand’s objectives.
  4. Organically created UGC provides good SEO karma to the brand as well and improves its search engine rankings. The more the content on a website, the more opportunity search engines have to locate matches. In addition, when the content is user generated, it provides a variety greater than what a small group of people creating content may be able to come up with. As users, they may have more flexibility in language and also be better able to mirror terms that are likely to be used by other users. As a result, you get a keyword-rich website.

What is content moderation becomes a relevant question for companies to know and understand since, with such rich pickings from UGC, most self-respecting, confident brands will want to benefit from it.

With its deep roots in the communities where it has located its delivery centers, oWorkers attracts the best local talent, which it then trains with the help of a dedicated training team whose job is to make hired resources ‘fit for purpose’ for client projects, several of them content moderation projects. The trained resources become adept at handling different types of moderation requirements for clients. Under the guidance of a leadership team that has over 20 years of hands-on experience in the business, oWorkers delivers customised solutions for clients. With its ability to attract talent, oWorkers can also handle temporary spikes in volumes with aplomb. We can ramp up by almost a hundred people in 48 hours.


Common methods of content moderation

So, we now have answers to the ‘what is content moderation’ question. But that is not enough. It needs to be done. How is it done?


Pre-moderation

As the name suggests, in this method, content is not allowed to go live until it has passed through the process of moderation instituted by the property owner. This is the safest method of moderation and poses no risk to the brand from harmful content getting published.

The downside of this method is that it requires a large moderation capacity which could slow down the process of approval. Content creators, who normally like to see their content being visible as soon as they create it, might lose interest in interacting with the brand. It could also call into question the spontaneity and validity of the content.


Post-moderation

Letting content go live and continuing to monitor what has been published is also a popular method of moderation. The pros and cons are the opposite of those of pre-publishing moderation. The content gets published immediately, which creates a more vibrant and interactive community. A user posting content is perhaps most interested at the time she has created it and is willing to do more with it. Immediate publication also raises the perception of honesty about the content.

The downside is that malicious content could be posted and potentially do some damage before it is caught. Even though it is removed after a review, with modern technologies easily accessible, images and screen-shots taken by visitors cannot be erased.


Reactive moderation

This method relies on visitors to the website to report content that they deem to be harmful. It is then reviewed by site administrators and acted upon as they deem fit.

This method relies not only upon users’ ability to identify inappropriate content but also upon their engagement with the brand being of a magnitude where they will report it to the site administration rather than just move on to the next website. Hence, there are obvious gaps possible if you use this strategy. However, if you do have a set of engaged users, it can save the company a lot of money in moderation expenses.


Distributed moderation

Visitors are asked to rate the content visible on the website, including UGC, on its suitability and appropriateness. The theory is that suitable content will get better ratings and become more visible at the top of listings, while inappropriate content, being voted low, will slip to the bottom of the lists and gradually disappear from view.

This is a slow method, though inexpensive to implement, and it suffers from all the shortcomings of relying on a group of users who may not have a deep commitment to the brand’s objectives.
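The rating-based mechanism described above can be sketched in a few lines of code. This is only an illustration of the idea, assuming a simple Post record and a visibility threshold; no real platform's data model is implied.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    text: str
    score: int  # net user votes (upvotes minus downvotes); illustrative field

def visible_posts(posts: List[Post], threshold: int = 0) -> List[Post]:
    """Hide posts rated below the threshold and show the rest, best first."""
    kept = [p for p in posts if p.score >= threshold]
    # Higher-rated content rises to the top; low-rated content drops out of view.
    return sorted(kept, key=lambda p: p.score, reverse=True)
```

With a list containing a highly rated post, a mildly rated one and a heavily downvoted one, only the first two would be returned, ordered by score.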


Automated moderation

There are many automated tools that can be deployed for content moderation.

There are simple ones that rely on finding matches against a set of words and phrases that are not acceptable, and can either replace them with alternates or refuse permission for the entire piece of content to be published.
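A minimal sketch of such a word-match filter might look like the following; the blocklist entries and the `moderate` function are purely illustrative assumptions, and real deployments use far larger, regularly updated lists.

```python
import re
from typing import Optional

# Hypothetical set of disallowed terms, for illustration only.
BLOCKLIST = {"badword", "slur"}

def moderate(text: str, replace: bool = True) -> Optional[str]:
    """Mask blocked words, or return None to refuse publication entirely."""
    tokens = re.findall(r"\w+|\W+", text)  # split into words and separators
    if not any(t.lower() in BLOCKLIST for t in tokens):
        return text  # nothing objectionable found; publish as-is
    if not replace:
        return None  # refuse permission to the entire piece of content
    # Replace each blocked word with asterisks of the same length.
    return "".join("*" * len(t) if t.lower() in BLOCKLIST else t for t in tokens)
```

Calling `moderate("contains badword here")` would return the text with the blocked word masked, while `replace=False` rejects the whole submission.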

There are also more advanced ones that rely on technologies like Natural Language Processing (NLP) that have some ability to understand the context in which a certain word or phrase has been used and then mete out appropriate treatment to it.

Artificial Intelligence (AI) based solutions are also fast gaining ground and could well become the go-to solutions; site owners may not even need to know what content moderation is.

In addition to its hiring and training capability, oWorkers has forged partnerships with leading technology providers, creating for itself the ability to access the right technology when needed. These technologies also help clients as they are eventually deployed for delivery on their projects.

Add to this the advantages of operating from super secure facilities with protocols that further enhance the security of client data, and you have a winning combination.

oWorkers is GDPR compliant as well as ISO (27001:2013 & 9001:2015) certified. It is able to operate either from office or from home, given the constraints imposed on account of the pandemic.


The oWorkers advantage

If more reasons are needed to engage oWorkers in your content moderation journey, they operate out of three global delivery locations.

So what, you might ask? How does that help me?

This gives them access to a multicultural talent pool through which they are able to support client moderation requirements in 22 languages. Content creation is happening from all corners of the world and can be in any language. You can’t keep adding vendors each time someone posts in a new language from a far corner of the world.

Thanks to their management practices, costs are managed and kept in check, which is why they offer some of the best pricing amongst competitors. Clients report savings of almost 80% after engaging them in their projects, especially clients from the US and Western Europe, without any compromise on delivery quality.

You will not even need to know what content moderation means.

How To Outsource Image Moderation

“A picture speaks a thousand words.”

Whoever coined this phrase, believed to be of early 20th century vintage and American provenance, could not have foreseen the challenges created by pictures almost a hundred years later.

In a world drowning in user generated content (UGC), much of it in the form of images, that picture speaking a thousand words, which was uploaded by one of the over 4 billion internet users of the world, could be viewed by any of the other over 4 billion minus one users. These could be males and females, children and adults, Asians and Europeans, judges and policemen, Christians and Buddhists, rich and poor. In short – anyone.

That picture could be saying different things to different people in many different languages, whispering to some, shouting to others, holy to some, vile to others. Like someone said: “One man’s meat is another man’s poison.”

What one person may consider good, enjoyable, or beneficial may be disliked by someone else. We have different sensibilities, tastes, likes, expectations. And there are millions of children and young adults in the mix as well.

Images are especially sensitive. Text needs to be read. Videos are richer, but need to be viewed. Images, on the other hand, convey their story in an instant, whether it is one of a thousand words or a million. While an overwhelmingly large percentage of images uploaded will be safe and innocuous, there is always a possibility of a small percentage being vile and malicious, either inadvertently or as a deliberate act.

So, what does one do? Surely we are too far down the ‘freedom to share content’ road to retrace our steps.

We moderate the images that are being uploaded.

As a business or owner of a website that invites participation from users, you may not be concerned about the large volume of images being uploaded every minute on popular platforms like Instagram or Facebook, but you are concerned that if offensive content were to find its way to your website, it would mar the reputation of your brand. Hence the need for content moderation on your website, if it is open for external participation.

What could go wrong? For starters, there could be porn and gore. That presumably removes the need for more examples; these alone are enough to kill your brand.

If you permit open sharing of images on your website, you should consider a moderation solution that will sanitize your web presence and enable you to sleep soundly at night.

How do we moderate? 

With the help of image moderation solutions, either inhouse or with the support of a partner.

It was a debate at some stage whether to outsource image moderation or do it inhouse, but with growth in traffic, evolution of tools and technologies and requirement of a specialised skill-set to monitor, it stands pretty much settled for the time being, in favor of outsourcing. As an independent business, you perhaps neither have the expertise nor wish to invest in creating it. You’d much rather be left to do your business, where you have an expertise, and leave the moderation job to the specialists in that space.

On the journey of outsourcing to an image moderation solutions provider, there will come a point, after you have selected a provider, done the negotiations, signed a contract, provided training and many other things, when your work will be handled by an outsourced partner who will, in a way, become an extended part of your team. Hence the selection of a partner becomes a critical step and should be done with care. And because it needs to be done with care, it will consume resources and money, and hence the outsourcer will hope that the exercise does not need to be done again in a hurry, and that it is the beginning of a long-term relationship.

How should you go about selecting a partner for image moderation services so that it becomes a stable, lasting engagement?

The following paragraphs offer suggestions on parameters to use for the purpose.


Quality of Work

In any engagement, the quality of work is of paramount importance, perhaps even more so when you outsource image moderation, as there are brand and legal implications. Prior experience of doing similar work for another client is of great value in this assessment, since one can see examples and judge whether the work being done is of acceptable quality or not. If acceptable quality can also be supported by a confirmation from existing clients, that is even better. Of course, not having prior experience should not be an automatic disqualifier if the vendor can offer benefits that offset this limitation.

oWorkers has over 8 years of experience in serving clients from different industries. Our clients are mostly referenceable and prominently displayed on our website. Our team will monitor and eliminate content that has been agreed to be harmful, significantly reducing your business’ exposure to not only poor customer experience but also legal and PR issues and other potentially harmful responses.


Speed of Response

At a generic level, speed of work is a prized and desirable ability. The faster you work, the more you can do, allowing the business to leverage its assets for greater productivity and output.

When the context shifts to image moderation services, speed acquires an entirely new dimension, since we are talking about moderating content that is constantly coming in, with millions of users around the world expecting it to start showing up as soon as they press the ‘Submit’ or ‘Send’ key.

Our moderation teams will be at work so that you and your team can sleep peacefully.  With three centers located in the most desirable outsourcing geographies of the world, oWorkers provides 24×7 monitoring because someone, somewhere is always awake and logged in to the internet. 



Pricing

When you outsource image moderation, as a rule of thumb, it should not cost you more than your inhouse processing. Getting the benefit of volume and specialisation is one of the objectives of outsourcing. The partner, owing to the much larger volume of image moderation work handled, is expected to be in a position to attract the right resources at reasonable cost, unlike your business, for which it is not a core activity.

Comparing quotes from different interested providers will provide a good idea about the general pricing trends. While low is preferable, if the offered price is much lower, or higher, than others, it should be looked into carefully to understand the reason.

oWorkers clients consistently report 80% savings in cost after outsourcing, as compared to pre-outsourcing costs. With options like shared resources, outsourcing is a great option even for companies with low volumes of content.


Tools and Technology

Given the speed at which content is being created and shared around the world, a purely manual method of moderation, especially for the popular social media platforms, is often not a workable proposition, though it might be for an individual business. The vendor should be in a position to offer tools that automate a significant part of the process, with only disputed cases being routed for manual adjudication, and with manual moderators periodically dipping in to check the output of the automated service.
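The hybrid pipeline described here, where automation handles the confident cases and humans handle the disputed ones, can be sketched roughly as follows. The thresholds, the scoring scale and the audit sampling rate are illustrative assumptions, not any particular vendor's implementation:

```python
# Illustrative sketch of threshold-based moderation routing.
# A model assigns each item a risk score in [0, 1]; confident
# decisions are automated, the uncertain band goes to moderators.

APPROVE_BELOW = 0.2   # assumed: scores under this auto-publish
REMOVE_ABOVE = 0.9    # assumed: scores over this auto-remove

def route(risk_score: float) -> str:
    """Return the action for a piece of content given its model risk score."""
    if risk_score < APPROVE_BELOW:
        return "auto_approve"
    if risk_score > REMOVE_ABOVE:
        return "auto_remove"
    return "human_review"  # disputed cases are adjudicated manually

def needs_audit(item_id: int, sample_rate: int = 100) -> bool:
    """Periodically sample automated decisions for a human spot-check."""
    return item_id % sample_rate == 0
```

In practice the two thresholds would be tuned so that the size of the human-review band matches the moderation team's capacity.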

Through our partnerships with leading providers, oWorkers offers clients access to the latest technologies when they outsource image moderation to us. In addition, we offer the flexibility for our resources to work on client moderation systems, if they come with one.


Business Contingency

When you outsource image moderation, you outsource a perennial activity, one that keeps happening, or can happen, at any point of time, from any corner of the globe. Someone, somewhere, could be accessing your website and uploading content to it. Depending on the service standards your business wishes to offer, and that the partner needs to comply with, what if the facility or location from which these services are delivered is impacted by a natural calamity, or strife of some sort? Does content being uploaded then lie in the queue unattended? Or get auto-published after a defined cooling-off period? Both outcomes are sub-optimal for a business. Ideally, the partner should offer the facility of switching to another site that is not impacted. It is understood, of course, that the switch will not cater to the entire volume, else idle space and headcount would need to be retained.

With facilities in three of the most popular BPO delivery locations in the world, oWorkers offers you unmatched business contingency by switching partial volumes from one site to another. A multi-site engagement can also be offered, which obviates the need for switching.


Hiring and Scaling Capacity

This is a core BPO skill, as it is a people-dependent business, and it is as relevant for image moderation solutions as for any other work supported by BPOs. Not only does the provider need to ensure a constant supply of interested jobseekers to hire from, it also needs access to a pool deep enough to hire a large chunk in one go, whether for a short period or for a single project, with those resources redeployed elsewhere once the specific need is over.

oWorkers has been consistently ranked highly on Glassdoor by its employees. It must also be highlighted that our clients’ work is done by our own employees; we do not rely on freelancers or contractors. We are a preferred employer in most of the communities we work in, which gives us access to deep talent pools at low cost; candidates come to us instead of us having to seek them out. We offer the capacity to scale up by up to 100 resources in 48 hours.



Training

Cost of resources is an issue of huge consequence for BPOs, since frontline agent cost is perhaps the single biggest item on the Income Statement. Moreover, attrition being a reality for all BPOs, the hiring engine needs to keep working all the time. Joined at the hip to hiring is the training machinery. If hiring throws fresh talent into the company, it is the job of the training machinery to take them under its wing and impart the basic skills required for the workplace, followed by project-specific training to turn them into revenue-generating resources. In any case, the BPO practice of hiring ‘raw’ resources, who do not possess advanced educational degrees or any material business experience, makes training an indispensable function for image moderation solutions.

With committed training teams attached to each of its centers, oWorkers is ready 24×7 to take upon itself the task of making new hires job-ready in the quickest time, starting with common skills like typing, language and soft-skills, and graduating to client project-specific skills at the other end. They also carry out retraining of agents who need more support even though they may have ‘gone live.’


Multilingual Capability

On the face of it, image moderation services may not appear to be an activity impacted by language. Language might be considered germane to activities like text moderation, which must use one or more languages. An image, on the other hand, is an image. It might say a thousand words, but whoever views the image will understand those thousand words in their own language. Look deeper, though, and language has an impact on an image too. A poster, for example, is an image. Text can be presented in the form of an image. This brings multilingual capability into the equation.

oWorkers offers support in 22 major languages of the world and has the capacity to add more based on specific client requirements. 


Internal Quality Process

When you outsource image moderation, the delivery team, the most critical team in a BPO, often struggles with balancing many competing priorities. On occasion, this could result in a compromise on the high standards that they might otherwise strive to maintain. 

So that such events do not become visible to the client and mar the relationship, an Internal Quality (IQ) team has become an indispensable part of a BPO’s setup. The IQ team takes samples to verify the health of the process. Where deviations are found, corrective action is instituted, including but not limited to counselling of the team member.

Quality Assurance and Quality Control are a part of the DNA at oWorkers, led by an independent team of Internal Quality specialists conversant with modern quality standards and techniques like ‘Six Sigma,’ ‘ISO’ and ‘Lean.’ They are also the eyes and ears that enable the senior management to stay connected with each project. 


Financial Condition and Management Support

Eventually, business is about money. Investing money, spending money, making money. The Income Statement and Balance Sheet are where all the different elements of a business meet to portray its financial health. It is the common purpose which binds all functions together. 

When you outsource image moderation, it is your expectation that the vendor will deliver value for the revenue it earns from you. It will make necessary investments and incur required costs. This is where the financial condition of the company becomes relevant. A healthy financial condition will allow the business the wriggle-room to make the investments and expenses, knowing that revenue will be earned. If it does not have wriggle-room, it will cut corners and try to avoid value-adding investments and expenses. Eventually your outsourced work will suffer.

Financial decisions are taken by senior management who also provide guidance to the organisation on strategy and management. The greater the involvement of senior management, the sharper the focus of the teams involved in client delivery.

oWorkers has been a profitable enterprise from inception. It follows financial discipline in all respects and refuses engagements that will put pressure on its financial condition, to avoid adverse impact on other engagements.

Our senior leadership has over 20 years of hands-on experience in the business and is alive to each and every project handled by oWorkers.

What is User Generated Content: A Guide


It is user generated content that most companies and brands today actively seek out, in the hope of building their success on it.

But what is it? What is user generated content?

With the advent of social media, starting at around the turn of the century with Mark Zuckerberg’s creation of Facebook in his Harvard dorm, the creation of user generated content, or UGC, has gone on almost unchecked. After all, any content created by a user is user generated content, right? There can be no argument against that.

Social media platforms, meant to promote the free and easy exchange of communication and ideas, caught the fancy of people, and their membership grew rapidly, aided by the spread of internet access around the world. Unlike emails, where you sent a message that went to an inbox, and received a response, if at all, at some uncertain point of time in future, social media provided instant gratification. The content created by one was immediately available to others, some of whom would also respond, which could be accessed by the original creator instantly as well. This instant access is what sharing and communication thrived on, leading to the rapid increase in adoption and growth of social media.

More social media platforms soon came into existence, attracted by the growing awareness and usage, and started attracting their own target populations, sometimes catering to specific niches, either by usage or by population segment. It is believed that over half the world’s population actively uses one or more social media platforms; many create UGC, and virtually everyone consumes what is available out there as content, mostly the UGC of other users.

As a data services focused BPO, oWorkers has a team vastly experienced in handling all aspects of user generated content where support is needed. We have been selected as one of the top three data based BPO service providers in the world on multiple occasions.


What is User Generated Content: The Business View

User generated content, or UGC, is a term for content created on spaces a company has set up for showcasing or promoting its products and brands, by people who do not represent the company in any capacity and have no stake, good or bad, in its health. This content usually stays within the confines of the space, as well as the rules the company has set up for participation on that platform. It can take many shapes: a product review, a comment on a blog, a video of a personal experience, and many others. It is neither paid advertising nor created by employees who owe their livelihood to the company.

At first meant for and limited to individual participation, with a free and flowing exchange of ideas and information, social media platforms were bound to attract companies sooner or later, drawn by the sheer weight of humans joining the platforms and the free and open exchange of ideas and communication taking place.


Why?

Because businesses have to go where people are. Because they need to sell. Whether they produce trucks or software programs or hotel rooms or consulting services, they need someone to buy them. These buying decisions are invariably taken by an individual or a group of individuals, either for themselves or on behalf of another company or organization. Hence, companies selling products and services simply have to go where they can find people.

oWorkers’ ability to attract walk-in talent, stemming from the close contacts we have with our communities as well as being recognized as preferred employment providers, enables us to retain the choicest of talent for client projects related to UGC.

The consistent flow of walk-in applicants also gives oWorkers the leverage to hire resources rapidly for the short-term peaks clients may face in volumes. This gives our clients a huge cost advantage, as they do not need to hire and keep idle resources just for the times when they might see unexpected spikes. We can hire an additional 100 resources within 48 hours. Most of our competitors find it difficult to beat that.


UGC has other benefits too

UGC is authentic

If you are looking for a hotel for your planned vacation to the Maldives, which source will you trust: the hotel’s brochure and website, or reviews from other vacationers (UGC) like yourself on some independent website?

We know that more and more people now trust independent feedback when making such decisions. The hotel’s own website has a stake in the matter. They want more business; why would they tell you the bad things? Hence, UGC is trustworthy.

UGC creates community

As social animals, human beings have a need to belong to something they see as being larger than themselves. When a brand creates a community, it attracts people who are users of the brand or interested in it in some way.

Finding others with similar interests and desires creates a community around the brand which can become an engine of growth for it. Common, shared experiences, with the brand being central to them, is a powerful unifying factor, and a tool for creation of goodwill for the brand.

It creates an army of marketers

Can we imagine the number of people a company would need to hire for creating positive content around its products? What is user generated content is something they would also need to be taught.

While the exact number may vary from one brand to another, and from one piece of content to another, what can be taken as a safe assumption is that that number will be large, and will keep getting larger as the company seeks to expand the scale of its content creation.

Now imagine the same scenario but with content being created by members of the community the brand has been able to create around itself.

Now, the company does not need to hire anyone for creating content. Content creation will become a self-driven activity which will also keep growing in volume as the community size increases.

It is free

While there could be other costs that might arise, like moderation, the creation of content does not cost the company anything. It goes on, and on, and on, all by itself, with some guidance and intervention from time to time.

The company does not pay them anything. It has no statutory obligations, like funding their retirement accounts. And they do it gladly and willingly, unlike many hired marketing resources who gripe constantly about their employers while trying to produce content that portrays the same employer in a good light. Think about it.

Working with oWorkers provides business continuity cover to our clients, across our three facilities in three different geographies of the world. Even without BCP cover, our centers are equipped to operate 24×7, delivering the finest turnaround times.


What is User Generated Content: Types

While anything and everything that is created by users qualifies as content, or, more specifically, UGC, it might be useful to understand the various formats in which such content is created.


Text

This is language in a written format, using the characters and symbols human beings created for the purpose, a step on from the mostly hieroglyphic, or pictorial, content understood to have been used by early man.

There are a variety of scripts used in the modern world, with the Latin alphabet of 26 characters perhaps the most widespread, at least in the Western world. Next comes the Chinese script, consisting of thousands of characters, also called logograms, used in many countries in Asia, with China and Japan being the largest. The Arabic alphabet is used by languages such as Urdu and Pashto and is among the few widely used scripts read from right to left. Popular in South Asia, specifically the Indian subcontinent, is the Devanagari script, which consists of 47 characters and forms the basis of over a hundred languages and dialects.

Modern computing devices can now operate on multiple scripts, making it possible to create and share communication in different languages and scripts. Understanding what user generated content is can be confusing, considering the many language options people have for communicating.

Text continues to be the simplest and most common format in which content is created and shared across the world. From comments to articles to posts to reviews to descriptions. Even computers can understand text, when written in a defined format, also known as software code.

With our policy of employing multicultural and multi-ethnic talent in all our facilities, oWorkers has developed the ability to offer support in 22 languages. UGC can come from anywhere, in any language. Our wide coverage ensures clients get complete support. We have, so far, not come across a requirement that goes beyond these 22 languages.


Audio

Our aural senses rely on audio impulses: sounds and notes and words and poems and speeches that can be heard, appreciated, understood and responded to. Hearing is one of the basic human senses.

Language, again, forms one of the main building blocks of audio-based communication, in other words, the spoken and understood word. Languages developed in different ways in different parts of the world, but with a common objective, of facilitating communication between humans.

Many people prefer to communicate through audio messages, instead of typing out the message in whatever is their script or language of choice. They consider speaking and hearing to be more natural actions, perhaps.

Our delivery centers are locally registered and we pay social and local taxes for our staff. oWorkers has consciously followed a policy of working with employed staff, instead of the freelancers and contractors that some other BPOs rely on. Employed staff give us flexibility in deployment and provide mid-level leadership resources as people grow in the company. We are consistently rated 4.65 or better on a scale of 5 by our past and present staff members on platforms like Glassdoor.


Images

“A picture is worth a thousand words” is an adage many of us are familiar with. We often struggle to communicate in reams and reams of text what we can convey through an image, or a set of images, with far greater facility.

It is perhaps more direct as well, without interpretation, the way it is, as necessarily happens when trying to communicate through words. The unspoilt beauty of a snow-capped mountain peak, the fierce gaze of a wild animal, the outline of tall buildings in a busy urban center, can be expressed through images far more eloquently than words.

With modern technology making everyone a photographer, clicking a picture and posting it on social media is child’s play now for most people.

The pricing offered by oWorkers is competitive and offered in a transparent manner, with clients getting a choice between dollars per unit of input, say manhours, or dollars per unit of output. Our clients from the US and Western Europe regularly note savings of close to 80% when they work with us. We have the answers to the ‘what is user generated content’ question so that our clients don’t need to seek them.


Video

Video remains the richest form of content, combining the qualities of both audio and images, each unique and rich content in its own right. Combining them enhances the experience manifold and gives it a life-like quality.

There is a reason why movies are popular entertainment, and not galleries displaying still photographs or museums where you can listen to audio recordings. It is perhaps because it gives one a life-like experience, closest to real life. Governments and administrations all over the world are now scrambling to install video equipment on street corners to keep an eye on the proceedings. It is because video gives the most accurate account.

Once again, with video recording ability being present in many smartphones, recording and sharing videos to communicate is becoming increasingly popular.

With a number of technology companies as clients, our technology capability is constantly being tested and pushed. oWorkers operates from super secure facilities and follows strict protocols to ensure the data security of clients. We are ISO (27001:2013 & 9001:2015) certified and GDPR compliant.


The oWorkers advantage

With a pedigree of over 8 years and led by a team with over 20 years of hands-on experience in the industry, oWorkers is the first choice for many companies looking for UGC related support.

We work with several unicorn marketplaces around the world.

The work we do for our clients enables us to employ a few more people from communities at a disadvantage and give them an entry into the global digital economy. Your choice of oWorkers as a partner will enable us to do the same for a few more people.

Why is Facebook Content Moderation important?


Many voices today suggest that the supremacy of Facebook among social media platforms has been challenged by upstarts whose formats appeal to younger people and first-time users.

According to the website Datareportal, as of July 2021 there were over 4.48 billion registered users of social media platforms in the world, with as many as 520 million signing up in the first seven months of the year. More than half the global population uses social media in one way or another; the proportion is much larger, close to 70%, if we count only ‘eligible users’, which removes from the denominator demographics such as children, who cannot have social media accounts.

Facebook has 2.8 billion registered monthly active users. YouTube has 2.3 billion and WhatsApp, now a part of Facebook, has 2 billion. Instagram has 1.4 billion.

In a free market, the emergence of competition is inevitable. Each new player will make an effort at creating its own niche and try to address a segment whose needs, it believes, are not met by the existing products. Different platforms have been successful, to varying degrees, in challenging the supremacy of Facebook. However, as is evident from the data, Facebook remains numero uno in the social media space, and even more dominant if data for its group products like WhatsApp and Messenger are included which, on their own, are also easily in the top 10.

And that has been the situation ever since its founder, Mark Zuckerberg, sat down in his Harvard dorm to write the code for creating it. Some of us have been exposed to the early days of Facebook thanks to The Social Network, a Hollywood movie based on its founding and the surrounding controversies. Of course, it is a work of art, with creative license, and hence to be taken with a pinch of salt.

Starting life at a time when social media had begun to fire the world’s imagination, oWorkers is steeped in the knowledge of the digital world. It has chosen to specialize in data related service offerings, including those related to social media. In its brief existence of 8 years, it is already counted among the top three providers of data based BPO services in the world.


Need for Facebook content moderation

As many users, as many opinions.

From the business perspective, its reach has perhaps exceeded the wildest imagination of its founders who, it is believed, intended to create a platform through which students, at Harvard as well as other universities, could communicate; its eventual popularity, extending to a third of the global population, would no doubt have been a pleasant surprise. The popularity of Facebook has created huge revenue opportunities for the platform: advertising, data, and businesses looking to reach out to target segments for their products and services. The ways of monetizing the opportunity have continued to expand.

But blessings are never unmixed.

On the one hand, the openness of social media platforms, and the ability of users to access and create information, feeds into the modern-day narrative of freedom of speech and lowering of censorship barriers.

On the other hand, the same openness appears to be an invitation to some people to create content that might be considered to be vile and offensive by many others.

Like what?

Like hate speeches denigrating followers of a community or faith or group and exhorting people to violence.

Like images of graphic violence posted by an adherent of a terrorist organization.

Like pornographic videos.

And much more.

Social media platforms, eventually, are a mirror of real life. Just as in real life there is a small percentage of criminals who need to be managed, the percentage of creators of offensive content is fairly small. But since these are open platforms, efforts need to be made to ensure such content does not reach its target audience and cause the harm it aims to create. This is what creates the need for Facebook content moderation.

Our global clients find the oWorkers pricing mechanism to be transparent and attractive, enabling them to save up to 80% of their original costs. This is especially true for clients from Western Europe and the US. Clients also appreciate the choice they get, of a price based on the input, like manhours, or a price based on the output produced.


Facebook content moderation – setting standards

In the interest of transparency, setting expectations is important. When we call someone out, it is done while weighing the called-out action or comment against an ‘expectation.’ That expectation, in day-to-day life could be borne out of commonly understood and accepted practices of human behavior that, even if not defined in letter, are generally well understood by most people. Of course, a more transparent form is to set the expectations, or guidelines, in letter, so that the room for ambiguity and interpretation can be reduced.

Organizations make an effort to articulate guidelines defining expected behavior when interacting with the company on their property, like a website or a community page. Facebook is no different. In fact, since social media is a business that depends on a wide variety of people, or users, joining, setting out standards and expectations is particularly important, as it provides a baseline against which participation can be evaluated. Moreover, users of the platform have to agree to abide by the rules and regulations that have been set out before they can begin participating.

‘Facebook Community Standards’ define what content is acceptable and what is unacceptable on the platform, and can be viewed on the Facebook website. An extract from the page:

“The goal of our Community Standards is to create a place for expression and give people a voice. The Facebook company wants people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable. In some cases, we allow content—which would otherwise go against our standards—if it’s newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments.

Our commitment to expression is paramount, but we recognize the internet creates new and increased opportunities for abuse. For these reasons, when we limit expression, we do it in service of one or more of the following values:


Authenticity

We want to make sure the content people see on Facebook is authentic. We believe that authenticity creates a better environment for sharing, and that’s why we don’t want people using Facebook to misrepresent who they are or what they’re doing.


Safety

We’re committed to making Facebook a safe place. Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.


Privacy

We’re committed to protecting personal privacy and information. Privacy gives people the freedom to be themselves, choose how and when to share on Facebook and connect more easily.


Dignity

We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.”

Community standards for Facebook content moderation “apply to everyone, all around the world, and to all types of content.” The standards are divided into sections for ease of reference. A few prominent ones are:

Violence and Criminal Behavior

This includes:

“Violence and Incitement

Dangerous Individuals and Organizations

Coordinating Harm and Publicizing Crime

Regulated Goods

Fraud and Deception”


Safety

This includes:

“Suicide and Self-Injury

Child Sexual Exploitation

Abuse and Nudity

Sexual Exploitation of Adults

Bullying and Harassment

Human Exploitation

Privacy Violations

Image Privacy Rights”

Objectionable Content

This includes:

“Hate Speech

Violent and Graphic Content

Adult Nudity and Sexual Activity

Sexual Solicitation”

oWorkers, with three strategically located global delivery centers that can each operate on a 24×7 basis should a client require it, is well equipped to provide business contingency to clients by splitting volumes across centers while presenting a common front to the client.


Enforcement of Facebook content moderation guidelines

A law without a mechanism for enforcement is usually considered to be toothless.

Facebook employs a combination of technology and people to enforce its guidelines in a two-step process of ‘detection’ and ‘taking action.’


Detection

Technology is playing an increasingly important role in detecting violations before they are reported or even viewed, in line with global trends. While human beings are better at evaluating content, and understand context and fine nuance much better than a machine, they have limitations in capacity and cost money on an ongoing basis. With the huge amount of content uploaded every second, detection of offensive content by humans alone is a losing battle.

Technology, on the other hand, can process and review millions of pieces concurrently. And, once developed, the running costs are fairly low. With technologies like Artificial Intelligence (AI) gaining ground, there is increasing reliance on technology for flagging off potentially harmful content.

Technology even helps reviewers prioritize content.
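Prioritization, in this context, amounts to ordering the review queue by estimated severity and reach rather than by arrival time. A minimal sketch of the idea follows; the scoring rule and the example numbers are invented for illustration, not Facebook's actual ranking:

```python
import heapq

def priority(severity: float, expected_views: int) -> float:
    """Assumed scoring rule: likelihood/severity of harm times reach."""
    return severity * expected_views

class ReviewQueue:
    """Queue that surfaces the highest-priority item to reviewers first."""

    def __init__(self):
        self._heap = []  # heapq is a min-heap, so priorities are negated

    def push(self, item_id: str, severity: float, expected_views: int):
        heapq.heappush(self._heap, (-priority(severity, expected_views), item_id))

    def pop(self) -> str:
        return heapq.heappop(self._heap)[1]

q = ReviewQueue()
q.push("cat-photo", 0.1, 50_000)         # low severity, high reach
q.push("graphic-violence", 0.95, 20_000) # high severity, moderate reach
print(q.pop())  # graphic-violence is reviewed first
```

The effect is that a borderline post likely to be seen by millions is examined before an obvious but obscure one, which is the practical point of technology-assisted prioritization.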

With the relationships it has built with technology providers, oWorkers today has access to the latest technology, which benefits clients because these technologies are eventually used on their projects. The super secure facilities it operates from, and its ISO (27001:2013 & 9001:2015) certifications, provide additional comfort to clients. It is also GDPR compliant.

Taking action

Technology works in concert with teams of reviewers to obstruct or delete offensive content. Facebook says, “Our technology proactively detects and removes the vast majority of violating content before anyone reports it. Engineers, data scientists and review teams work together to update and improve this technology over time. Meanwhile, our technology helps review teams prioritize content…Most of this happens automatically, with technology working behind the scenes to remove violating content—often before anyone sees it. Other times, our technology will detect potentially violating content but send it to review teams to check and take action on it.”

With its unique positioning as a preferred employer, oWorkers attracts a steady stream of walk-in jobseekers. Not only does this reduce hiring costs, reflected in the attractive pricing it is able to offer, it also provides a choice of talent for various projects, including content moderation. The steady stream also gives it the ability to hire for short-term peaks in demand, up to 100 additional resources within 48 hours.

Reviewer teams for Facebook content moderation work across the world, provide 24×7 coverage, and can handle content in more than 50 languages. After all, offensive content is not the preserve of any particular language, culture or geography.

Having actively practised employing multi-ethnic and multi-cultural teams in all their offices, oWorkers has gained a multi-lingual support capability as a by-product. They are able to provide support in 22 of the most common global languages.

“As potential content violations get routed to review teams, each reviewer is assigned a queue of posts to individually evaluate. Sometimes, this review means simply looking at a post to determine whether it goes against our policies, such as an image containing adult nudity, in instances when our technology didn’t detect it first.

In other cases, context is key. For example, our technology might be unsure whether a post contains bullying, a policy area that requires extra context and nuance because it often reflects the nature of personal relationships. In this case, we’ll send the post to review teams that have the right subject matter and language expertise for further review. If necessary, they can also escalate it to subject matter experts on the Global Operations or Content Policy teams.”

Facebook has a number of actions in its arsenal that can be initiated in the event of a violation. These typically include strikes, with each additional strike reducing the user's privileges; restrictions on accounts; disablement of accounts; restrictions on the accounts of public figures during periods of civil unrest; and removal of pages and groups.
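A strike-based system like the one described can be pictured as a simple escalation ladder. The thresholds and actions below are illustrative assumptions, not Facebook's published policy:

```python
def action_for_strikes(strikes):
    """Map an account's strike count to an enforcement action.

    Each additional strike reduces the user's privileges. The specific
    thresholds and action names here are hypothetical, for illustration only.
    """
    if strikes == 0:
        return "no action"
    elif strikes == 1:
        return "warning"
    elif strikes <= 3:
        return "feature restrictions"       # e.g. limits on posting or going live
    elif strikes <= 5:
        return "temporary account restriction"
    else:
        return "account disabled"

print(action_for_strikes(2))  # → feature restrictions
```

The point of such a ladder is proportionality: a first offense draws a warning, while repeat offenders progressively lose access.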


Supporting Facebook content moderation

oWorkers believes in working with employed staff, as opposed to some of its competitors, who prefer contractors and freelancers. Though it might mean carrying additional cost at times, it provides flexibility in the deployment of resources while providing an experienced middle-management layer in the organization. The team is led by managers with over 20 years of hands-on experience in the industry.

We work with many less privileged communities. The work they do for us becomes their ticket into the global digital economy. The work you outsource to us will enable us to usher a few more youngsters from these communities into the global digital economy.

What you need to know about Social Media Moderation


Social Media is an integral part of modern-day life. In technical terms it could be described as a technology through which people can interact with each other with the help of a computing device like a laptop or mobile phone, which is connected to the internet. It is estimated that more than half the population of the world uses social media, of course some more than others.

Interactions happen over a platform that has been created for the purpose. While each platform may have its own unique features and positioning, their collective success can be seen in them being among the most well-known and frequently-used technologies today. Of course, apart from the fact that the founders of some of these platforms are among the richest people in the world.

The design of these platforms is meant to facilitate the free flow of information and ideas among participants, enabling access to the hidden corners of the platform to all comers. Of course, in the course of development, now there are facilities that enable interactions to be limited to defined sets of users, should that choice be made. Each platform operates on the basis of a set of rules and regulations that users have to sign up for.

The interactions that take place on these platforms are now referred to as User Generated Content (UGC); content that could be published (generated/ posted) by anyone using the platform, and not limited to a defined set of publishers as might have been the case in the publishing world of yore.

As one of the top three data services BPO providers in the world, with a leadership team that has over 20 years of hands-on experience, oWorkers has been active for over 7 years in providing a variety of back-office services, including social media related, enabling clients to focus on their core business.


The need for social media moderation

So, it’s great, right? People can connect with each other. They can share ideas and thoughts. Even pictures and videos, of a vacation, an event in the family, anything. People also claim that thanks to social media they have been able to connect back with friends and family members they had lost touch with many years back.

The openness and popularity of social media themselves have become challenges.

We know one rotten apple can spoil the whole basket. The social media basket consists of over 3.5 billion users. That is a lot of apples. Even at extremely low percentages, a few of these apples being bad, or turning bad from time to time, is always a possibility.

What does that mean?

It can result in content being published that not only violates the policies of the platform but is also unacceptable from a social, ethical and moral standpoint. Users on the platform are not mere consumers; they are publishers as well. These publishers, while they operate within the context of the society we live in and ought to abide by its guiding principles, can be in a frame of mind where they stop caring about consequences. They feel safe and powerful ensconced in their own dark corner of the internet, from where they can unleash vitriol on the unsuspecting world.

Expressions of anger, hate and perversion can find expression in the form of content shared on social media. Think of a video of child pornography. Think of an audio recording of a religious fanatic urging followers to perpetrate violence on non-believers. Think of gory images of physical violence and battery.

In the internet age, publishing is instant. You press the Save/ Upload button and your content can become available. Unless held back for a review.

This is where social media moderation comes into the picture in an effort to weed out such content and limit the damage such offensive content could do if permitted to remain available. It could be defined as the review and management of UGC that is being uploaded every second, in order to keep the platform clean and wholesome.

With the help of its mature relationships with technology companies, oWorkers is uniquely placed to leverage the latest as well as emerging technologies for the work they do for clients. As a GDPR compliant, ISO (27001:2013 & 9001:2015) certified provider, oWorkers maintains the highest standards in data security, keeping client information secure.


Social Media for business

As with many good things, social media originated as a means of communication between people. We have all heard the stories about how the Facebook founder did the coding for the platform in his dorm at Harvard, initially meant as a tool for students to communicate with each other.

However, it is always difficult to foresee what the future holds.

The popularity of social media platforms is now a part of history. With more and more people joining social media platforms and using them extensively, businesses and organizations, initially on the sidelines, started salivating at the prospect of the opportunities they could visualize, both in terms of taking their brand messages across and reaching large swathes of target consumers in a more economical and focused manner than traditional media permitted.

Gradually, social media has become an indispensable tool for companies. Social accounts are the virtual avatars of the brand. They engage fans, reach out to prospective customers and, in general, build a community around the brand and the company that can keep doing the good work even while the company's employees are asleep. It is used to drive revenue generation (isn't that the only goal of all businesses, apart from a profitable bottom line?), reach customers in a targeted, profiled way, and offer customer service. Since each social media transaction is based on clicking something, which gets recorded, it also becomes a useful tool for gathering information about customers and trends.

With its usefulness now beyond doubt, in order to leverage the reach of social media, companies set up communities, groups and pages to create communities around their products and brands, through which they hope to spread the good word about themselves as well as reach out to many new customers. These communities have also become sources of information and feedback regarding the company from genuine, unbiased users.

And since it is a community they are creating, they need to take responsibility for ensuring that it stays within the defined rules and regulations of the community, as well as generally accepted social norms. This becomes the other use case for social media moderation.

If you are interested in buying a company’s product and looking for information, will you trust the company’s own information sources about its quality or feedback from users who have no personal interest in you buying that brand?

Genuine users, of course. Right?

This explains why social media platforms have become so critical to all businesses. They need to create thriving, participating communities around their brand.

Several unicorn marketplaces trust oWorkers with key activities. Many of them, especially the ones located in the US and Western Europe, have noted savings of almost 80% after outsourcing to oWorkers. They also appreciate the transparent options they get in pricing, between dollars per unit of input and dollars per unit of output.


Benefits of social media moderation for companies

How does it help?

Ensures correct information is available to customers and visitors

It is now routine for customers, or prospective customers, to visit the social media avatars of companies to know more about them. They could be looking for a service center, or a number to call for some requirement. Moderation ensures that the information available to visitors is accurate. Some may consider it unethical, but instances of companies placing incorrect information on competitor websites are not unheard of. You don't want to be the victim of such a scheme.

Influences buying decisions

This could be seen as an extension of the previous point. It is estimated that over half of all buyers seek feedback about their target product and company from its social media presence. As discussed elsewhere, genuine, unbiased feedback is worth much more than a company's own motivated information packs and advertising. Revenue being the bloodstream of a commercial organization, this alone is reason enough for social media moderation.

Creates the right brand persona

A thriving, participative web property is the dream of all brands. If the community gets rocked by offensive posts and media from time to time, which is unacceptable to the main target population, it will put them off and reduce the web property to a dark, desolate piece of wasteland. The brand needs to prevent that from happening.

Prevents people from voting with their feet

Most participants have no stake in the failure or success of the company. They don’t get any money out of talking well of the product, or participating. They might do it for various reasons of their own, including earning bragging rights for using a certain product and genuinely trying to help others with their honest feedback. Besides, they generally have a device at hand through which a quick comment is always possible.

People can choose not to participate or leave at the slightest provocation. Prevent that from happening by ensuring social media moderation in your community.

Gets quick, objective analytics and feedback

Connected participants are a great free resource. They talk well of your products and even give you feedback when something goes wrong, say a spelling error in one of your latest promotion campaigns. You’d rather get the bad news from a person well disposed towards your company, than someone who is not, who will simply walk away. This valuable, free resource can be nurtured if you stay on top of what is happening in your community.

Whatever your reason for initiating the activity, the 24/7 operation run by oWorkers ensures quick turnaround on transactions. It can also provide business continuity in case a particular location becomes inaccessible on account of a natural or man-made crisis.


You are not alone

Every activity and task adds up, consumes resources and takes away from the basic purpose of an enterprise.

Would you rather evaluate the risk in a new application for insurance or worry about how to moderate?

Would you rather design better semiconductor chips or worry about how to moderate?

Social media moderation is no different. In fact, it is considered a particularly difficult task, as moderators are tasked with experiencing the darkest, most pernicious parts of what the web has to offer, and managing it so that others don't get exposed and negatively impacted. In short, they face the bullets so that others can stay safe.

The job does not leave any physical scars that can be identified and treated. It works gradually on the psyche, leaving deep scars that might express themselves in unknown, unpredictable ways.

Companies like oWorkers, who have been providing these services to clients across the world, have developed strategies to keep their employees safe while ensuring the work is not neglected.

Being preferred employers in their delivery locations gives them a choice of talent as they receive a continuous stream of walk-in candidates looking for jobs. Being engaged in multiple data-based services for different clients gives them the flexibility of staff rotation as well as the ability to hire a ‘fresh pair of eyes’ when the existing pair is reaching the limit of its capacity. In fact, the access to talent pools enables oWorkers to handle unexpected peaks in volume without breaking a sweat. This can be a huge advantage for clients as they don’t have to pay for keeping idle resources.

Companies are increasingly opting for outsourced services like those provided by oWorkers for ensuring the work is done efficiently while not exposing their own staff to jobs that they are not skilled for, apart from saving them from the exposure to the dark side of the web.

Additionally, oWorkers typically operates in less privileged communities. Client work enables them to engage underprivileged youngsters from local communities and usher them into the global digital economy. Your work will enable them to do the same for a few more.

What is the human cost of online content moderation?


Moderation is hardly a new concept for humanity. It has probably existed as long as thinking, sentient human beings have existed. In the English language sense of the word, it refers to the adoption of a middle path, the avoidance of extremes. Youngsters might be advised to ‘moderate’ their pitch when they are talking to a senior in their organization, for fear of being labelled as unwilling to listen or learn. Regular tipplers might be advised ‘moderation’ the next time they go visiting a bar or pub, particularly if they need to drive after that. A rabid anti-establishment protestor could be advised to tone down his rhetoric and demonstrate ‘moderation’ in his speeches for fear of being acted against by the establishment.

The subject on which moderation has been or is being advised can be referred to as the content. Content of any kind is expressed in one of four commonly understood formats: text, audio, image and video. Of course, the underlying subject of the content could be anything. It could be an idea or a movie or a news item or a cartoon or an article or a conversation. Any content could be the subject of moderation. For the youngster talking to a senior, it could be a proposal he has put forward which he is trying to justify. For the anti-establishment protestor, the content could be his views on the government's policies.

While all content is relevant, the content that is conveyed or expressed through the spoken word is often ephemeral, and, in most cases, is forgotten after an interaction is over. Published content, on the other hand, becomes ‘permanent.’ It gets a life and its reach extends beyond the moment it was conceived and could continue to influence readers and viewers for a long time.

With its focus on data and related services, oWorkers appreciates the nuanced differences and is able to support clients in their content moderation requirements. oWorkers has been identified as one of the top three BPOs for provision of data services to clients, on multiple occasions.


What is online content moderation?

The advent of the internet and its gradually increasing adoption over the last quarter of a century has changed our lives in many ways.

Before the internet, publishing content was the responsibility of a few. Publishing houses would publish content in the form of books and periodicals. News and media houses would publish content in the form of news and current events. Advertisers would publish content related to their products and services, made visible through other businesses selling space on billboards, spots on TV or radio, or column space in the classified sections of media. This content would be consumed by other businesses as well as by people.

Today, with the internet, these lines have blurred. While everyone is still a consumer of content, now everyone is also a publisher of content. Whether it is my pet’s antics, or the great food at the new sushi joint in town, or the flooding of roads after a brief spell of rain, or views on the latest edition of the Olympics in Tokyo, I can create content about anything and post it so that it is available for the consumption of anyone who is interested or who may accidentally chance upon it. As we saw earlier, once content is published, it gets a permanence that is difficult to erase.

Hence, it becomes that much more necessary to exercise care while publishing online. This is where the human cost of online content moderation begins to make its presence felt.

With its position as an employer of choice in all the communities it operates in, oWorkers has access to the choicest of talent. It has the flexibility of deploying resources based on their preference as well as aptitude, and also the complexity and demands of the job. This access to a continuous supply of talent also enables oWorkers to support short-term spikes in client volumes, with ease. By committing the ability to hire a hundred additional resources in 48 hours, oWorkers relieves clients with such requirements of a huge cost in the form of idle resources during the remaining, non-spike period.


What does publishing have to do with moderation?


In the publishing of yore, since the publisher was almost always clearly identified, the responsibility fell upon the identified publisher in case inappropriate content saw the light of day. As they could be identified, and held accountable, they perhaps took their jobs seriously and ensured inappropriate content was edited out, so that the world only saw content that was kosher. Even the creators of content could be expected to self-censor, since they were answerable to the editor or publisher for what they submitted. So, it seemed to work well.

While publishing on the internet, however, such relationships and controls do not exist. There is no single channel or even a set of defined channels through which published content flows. It can come from anywhere. A goat farmer in rural Arkansas could be posting pictures of the milking process of his goats while a college student in Sweden writes about the retreating glaciers in her country. An unnamed person in an unnamed location could also be uploading a video of a woman being shot because of her refusal to comply with the orders of the ruling dispensation.

There is no external check. You are your own creator as well as editor. You feel secure under the cloak of anonymity you believe exists in the corner of the world you are in and posting from. You feel powerful.

Social media makes it even easier not only to create content but also to share it. That is the purpose of the existence of social media platforms; to encourage free and flowing communication and exchange of ideas and thoughts between people. Their rapid growth is a testimony to their success. While all platforms define rules of posting and participation, which most people abide by, they can be flouted. Hence the need to create a system of checks which leads into the issue of the human cost of online content moderation.

With its centers in three distinct geographical locations, and a hiring policy that supports a multi-cultural, multi-ethnic workplace, oWorkers now has the ability to support online content moderation in 22 languages. If content is being created from all corners of the world, it can be in any language. For moderating content one needs to understand it first.


What do we mean by the human cost of online content moderation?

As we have seen, content creators can do many things in the online world. They can post messages of hate in an effort to create rifts and push their agenda. They can upload pornographic material unsuitable for the many teenagers who also throng the online world. They can spread malware and cyber threats. They can spread fake messages as well as doctored videos in an effort to create confusion and anarchy. And much more.

Civil society cannot permit this to happen. Hence content needs to be moderated; content that is being uploaded in unimaginably large volumes from around the world. Many methods of moderation have been attempted, but the one that works best is where content is evaluated before being made accessible to viewers. Humans have been attempting to create technology that will do this for them and have placed their reliance on Artificial Intelligence (AI) to deliver the goods.

Unfortunately, at this point in time, and we have to say this with mixed feelings, AI-based solutions are no match for human capability. AI engines cannot match the ability of that wonderful organ, the human brain, to understand and evaluate the fine nuances of each piece of content, nuances the AI engine likely has no clue about despite all the training it has been given. What this means is that human beings have to be deployed to view and evaluate the horrible content that has been referred to on multiple occasions in this article, so that others can be kept safe. That is the human cost of online content moderation.
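This division of labor between machines and humans is often implemented as confidence-based routing: the AI engine handles the clear-cut cases and hands the nuanced ones to people. The thresholds and labels in this sketch are illustrative assumptions, not any platform's real values:

```python
def route(model_confidence):
    """Route a post based on a classifier's confidence (0-1) that it violates policy.

    Illustrative sketch: the 0.95 and 0.40 cutoffs are hypothetical. In practice
    such thresholds would be tuned per policy area and language.
    """
    if model_confidence >= 0.95:
        return "auto-remove"            # machine is near-certain: remove immediately
    elif model_confidence >= 0.40:
        return "human review queue"     # ambiguous: a human moderator decides
    else:
        return "publish"                # likely benign: allow, subject to user reports

print(route(0.70))  # → human review queue
```

The human cost concentrates in that middle band: precisely the content too ambiguous for a machine is what a person must sit with and judge.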

oWorkers is at the forefront of technology with its enduring partnerships with technology companies, through which it has access to the latest technologies. Clients also benefit from these partnerships, as these technologies are used for their work. They are GDPR compliant and ISO (27001:2013 & 9001:2015) certified, and were one of the first BPOs to take action in the wake of the Covid-19 pandemic and create infrastructure that enabled staff to work from home in a secure environment.


What do they have to do?

Moderators need to put themselves in the line of fire to protect others. It is not that every second person in the world is out to create gory content designed to give nightmares to the person watching it. Most content is kosher, and most of the non-kosher may be only mildly offensive, more a non-observance of a platform guideline than content designed to create controversy or trigger civil wars. It is because of that tiny fraction of content, offensive in many of the ways defined above and generally unfit for human consumption, that human moderators are deployed to identify such content and prevent it from going public.

And when they go through such content, it is already too late. Some of us might recall our own experiences of watching a horror movie on the big screen and its impact on us for a long time. Despite knowing that what we were watching was unreal, a work of fiction. Online moderators know what they are reviewing is real. Someone is making those hate speeches. Someone has recorded an instance of child pornography. It leaves an indelible impression on the psyche of the person who is reviewing it.

That is the human cost of online content moderation. Many moderators put up a brave face and laugh off suggestions that it might be impacting them deeply. However, increasingly, they are recognizing and understanding the impact it has on them.

A debate has been under way on the subject for a long time. Some of the biggest names in business run the most popular social media platforms and they are the ones most in need of content moderation. While commercial considerations on the part of human moderators are understandable, and the reason many of them put up their hands for doing the work, the deleterious side effects need to be recognized if there is to be any hope of managing them. If not recognized, the people impacted by them will continue to suffer in silence, along with their close family and friends, while the rest of the world looks for the next business opportunity. That is the human cost of online content moderation.

As they work with employed staff, and not contractors or freelancers, oWorkers considers staff development to be their responsibility. While online content moderation is a challenging task, with the wide variety of projects they handle, staff rotation is common. This is required for their psychological stability as well as personal development.


Outsourcing moderation

As the volume of content has kept growing, the ability of platforms to manage it has kept shrinking. After all, how many people can one hire and deploy for moderating content? Besides, each additional resource you hire costs money.

Many large organizations have opted for the outsourced model for handling moderation. It costs less, and there is lower recognition of psychological conditions as medical issues, resulting in lower liability for employers. Moreover, the geographies to which this work is outsourced perhaps have other, more immediate and pressing concerns than the strict implementation of labor laws and working conditions.

Additionally, export income, which accrues to them from such jobs, is important for these geographies, resulting in greater tolerance for outsourcers bending rules. And for the people doing the job, it is a welcome white-collar employment opportunity.

The outsourcers make an effort to make a noise about the great working conditions that they provide to the outsourced workers and how well they look after them. Some of that might even be true. However, the unfortunate fact appears to be that while they are being held accountable for the impact on workers in the developed world, outsourcers getting online content moderation work done by people in the less developed parts of the world seem to be getting away fairly cheaply while the impacted workers may be left struggling with the cost to their health that might manifest itself only after a few years. That is the human cost of online content moderation.

Clients of oWorkers note savings of close to 80% when they outsource their work to us. This is particularly true of clients from Western Europe and the US. They also appreciate the transparency in pricing, with the choice of dollars per unit of input or dollars per unit of output that oWorkers typically offers to its clients.

oWorkers possesses the ability to offer business contingency based on its three-location presence, if needed by clients. For a service like online moderation, it can be very useful as content is being created all the time. Several unicorn marketplaces choose oWorkers as their outsourcing partner. We hope you will, too.

What is content moderation?

We may not realize it, but each time we interact with any of the social media platforms, we are either consuming content that someone else has created or creating content that we would like others to consume. A review I wrote of the last book I read, photographs of my niece’s wedding last week, an update on travelling to the Maldives, are different forms of content I am creating that I want to share with others, get them to read/ view it and interact with me.

Similarly, many other people I know are doing the same, with the same intent of getting others, like me, to consume and interact with them. In this case they are the producers and I am the consumer. Many platforms have also evolved as tools for disseminating information. It is common for authorities to keep their Twitter handle updated so that people can get accurate information on what is going on, especially in times of a crisis or emergency. During the Covid-19 pandemic, social platforms have widely been used by people to spread information on availability of beds in hospitals, drugs, etc.

Of course, social media platforms, while they have the potential for good, can also be abused, like anything else. They are often leveraged for spreading malicious information about communities and groups. They are used for spreading rumors and lies, to the extent of threatening the law and order situation.


Social media for business

An unimaginable amount of data is being created every moment that is being shared and consumed over social media platforms.

As consumers are spending so much time on social media platforms, it creates a natural interest for organizations who have always been seeking ways and means of reaching their target populations in the most effective manner. Hence, if consumers, of all kinds, are already present here, can businesses be far behind?

Social media platforms provide an opportunity for companies to interact with consumers in as close to a natural setting as possible. They leverage these platforms to create awareness about their products and services; in other words, promoting them. At the same time, they can keep gathering information on consumer tastes and preferences, a kind of market research and survey in a natural setting, as opposed to the artificial setting of a survey form being filled in.

On account of being digital platforms which require the creation of a profile or account by users, consumer demographic information is available to these platforms at a level of detail that was hitherto not possible. Advertising in tabloids, newspapers and billboards was a scatter-gun approach: you spray a lot of bullets around in the hope that a few will find the target. In the case of social platforms, for a fee, the platform will make demographic data available to companies, who can then drill down to the specific segment of people they wish to reach, without wasting their message on others. Thus, there is a greater 'bang for the buck' available to corporations.


The case for content moderation

Organizations are busy leveraging social media platforms to further their interests. They are busy creating communities and groups in a bid to bolster their presence and appeal and brand recall.

At the same time, the raw platform exists for all users who may choose to sign up and create an account on it. They could choose to engage with their own groups and communities or with the world at large. They could choose to become part of the communities created and sponsored by other users, like the organizations we referred to, or create groups and communities themselves. Whichever method of engagement they choose, a wide choice of both consumption and production of content remains available to all users.

Taking the scattergun analogy a little further, we need to watch out for the ‘loose cannons’ amongst social media platform users. By and large every individual is a responsible, caring being. However, there could be some who are not. Without delving into the reasons that make them so, we know that crime is a reality, murders do take place, rapes and thefts happen. On the ‘civil code’ side of the divide, contracts get violated, leading to litigation and court cases.

Social media is yet another platform that is subject to all the variety and vagaries of the human mind and psyche. For many, it is a safe corner where they can be themselves, away from the prying eyes of the world, without being watched or judged, at least immediately. This ‘safety’ of the world wide web can be toxic and heady, and can lead people to create content that violates the rules civil society has defined for itself. There could be graphic content that is poison for the young minds that throng to these platforms. There could be hateful content with the potential to incite people against people and communities against communities. Content could also be outright illegal.

In order that these platforms do not degenerate into an anything-goes free-for-all, they need to be moderated, or watched over. This is the responsibility we have to ourselves and to civil society. This is what is commonly known as content moderation, a practice followed by all social media platforms that position themselves as open platforms.

oWorkers has established itself as a premier data services BPO. One of our key offerings is social media content moderation. We have supported global clients in managing increasing volumes of content by deploying tools along with trained human resources. We have been identified as one of the top three data services BPO providers in the world.

With a presence spanning three geographies and ability to deliver services in over 22 languages, oWorkers is a one-stop shop for many of our clients.


How it works

In simple terms, content moderation is the practice of monitoring content and moderating it where required. Moderation can take two basic forms: the content can be modified, or it can be disapproved for display (or deleted), in which case it will not be available to other users of the platform. Of course, each platform and community owner has many different ways of executing these two actions.
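The two basic outcomes can be pictured as a single decision step. The following is a minimal, hypothetical sketch: the banned-word list and the "modification" step (trimming whitespace) are illustrative stand-ins, not a real policy.

```python
# Hypothetical sketch of the two basic moderation outcomes: modify-and-approve,
# or disapprove for display. The word list and rules below are illustrative.

BANNED_WORDS = {"spamlink", "slur"}  # placeholder policy, not a real list

def moderate(post: str) -> dict:
    """Return an approval (possibly modified content) or a removal decision."""
    if any(word.lower() in BANNED_WORDS for word in post.split()):
        # Disapprove: the content will not be shown to other users
        return {"action": "remove", "content": None}
    # Approve, possibly after modification (here, just trimming whitespace)
    return {"action": "approve", "content": post.strip()}
```

A real platform would layer many such checks, human and automated, behind the same two outcomes.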

Setting expectations is perhaps the logical place to start when one wishes to implement rules. Platform owners may have Terms and Conditions that users must accept before using the platform. Organizations that leverage these platforms by developing, nurturing and supporting vibrant communities around their products and services may also wish to lay down their own Terms and Conditions, or Guidelines, for the users and visitors of their space. Once that is done, actions like deletion and modification become justified and simple to take. At the very least, one will not face a “you never told me” objection.

oWorkers runs a battery of tests before hiring people for this activity. As can be imagined, reviewing disturbing content can leave emotional scars on the people reviewing it. Being a preferred employer in the regions we operate in helps us attract talent and gives us choices. Hired resources are provided with training before being deployed on client engagements.

Our access to a deep pool of resources also enables us to cater to peaks and troughs in volumes, which can be a costly exercise for clients.


Techniques of content moderation

You can choose from a variety of moderation methods. Of course, one will need to take into account the profile of users, the purpose of the community, the time sensitivity of content and other factors before deciding on one method, or a combination of methods. The common ones are:

No moderation

This is also a choice, sometimes forced by circumstances. You may consider your community to be small, or made up of homogeneous, like-minded people, and hence decide upon no moderation. Or it could be a principled stand. However you arrive at the decision, this too is a moderation method, in a manner of speaking.


Pre-moderation

Content is screened by a moderator before it becomes visible or available on the platform. It becomes visible only if the moderator decides that it can be made visible.

The advantage of using this method is obvious. It gives the owner of the space the highest level of control. You can control exactly what you would like there and eliminate what you don’t.

On the flip side, this creates the greatest lag in making the content available. There could be peaks and troughs in volumes of course but any manual activity will consume some time, creating a possibility of backlog and delay. Users who are creating content may get put off by these delays and may look at the site as one that ‘manages’ interactions instead of letting them flow freely.

Besides, of course, it is expensive, as it relies on human reviewers.


Post-moderation

The difference in this method of content moderation is that it allows the content to be published without passing through a checkpoint. The content is verified after it has already been published.

The big advantage this method has is that it satisfies users who like to see their content visible immediately. It will also make the community or platform appear to be open, permitting all content to be published.

On the flip side, it could permit content that needs to be deleted, to also be visible for some time. While it will be deleted, some viewers may have seen it, reacted to it, copied it and shared it further.

Distributed moderation

In this method you leverage your users, almost entirely, for the content moderation that needs to be done. The community is set up in a manner that users’ interaction with the content leads to its promotion and demotion on the platform. Thus, unpopular content or content that is voted down, will gradually cease to be visible to new visitors; unless they make the effort to scroll to the ends of the page/ community for the sake of finding that content.
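The promotion-and-demotion mechanic can be sketched in a few lines. This is a toy model with assumed field names and an arbitrary visibility threshold, not any platform's actual ranking algorithm.

```python
# Toy sketch of distributed moderation: the community's up/down votes order
# the feed, and content voted below a threshold stops being shown at all.

def rank_feed(posts, min_score=-5):
    """Sort posts by net votes; drop anything voted below min_score."""
    visible = [p for p in posts if p["up"] - p["down"] >= min_score]
    return sorted(visible, key=lambda p: p["up"] - p["down"], reverse=True)

feed = rank_feed([
    {"id": 1, "up": 10, "down": 2},
    {"id": 2, "up": 0, "down": 9},   # heavily down-voted: no longer visible
    {"id": 3, "up": 4, "down": 1},
])
# feed contains posts 1 and 3, in that order; post 2 has been demoted away
```

Real platforms also weight votes by recency and voter reputation, but the demotion principle is the same.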

This could be an effective method where the community is homogeneous and aligned in its views of what is appropriate and inappropriate, both among its members and with the owners of the community.

Reactive moderation

This method assumes that all content is good unless a consumer points out the contrary. The site makes reporting tools available, allowing offended users to highlight such content.

It is extremely cost-effective as far as the requirement of people for moderation is concerned. However, there is very little control on content. You are relying on someone to be offended enough to raise an objection so that you come to know of it, review it and take a call.

There is really no assurance that all content that violates policy, or is illegal or is hateful and hurtful, can be removed through this mechanism. Consumers also need to be aware and motivated to report.

Automated moderation

Then there are automated solutions, mainly in the form of Artificial Intelligence (AI). AI and Machine Learning (ML) have gradually been gathering pace. While humans have almost perfected the art of getting machines to understand structured text, what we know as software code, doing the same with unstructured text has been a challenge. With AI, that frontier is also being crossed now.

Training data sets are being created that mimic human judgment and separate acceptable content from unacceptable content; AI models are then trained on these sets.
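The training idea can be illustrated with a deliberately tiny word-counting classifier. Real moderation models are neural networks trained on millions of labelled examples; this sketch, with made-up labels and data, only shows the principle of learning acceptability from labelled content.

```python
# Illustrative sketch: "train" a minimal classifier on labelled examples,
# then score unseen text. Labels and example data are invented for the demo.
from collections import Counter

def train(examples):
    """examples: list of (text, label) pairs, where label is 'ok' or 'bad'."""
    counts = {"ok": Counter(), "bad": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(model, text):
    """Pick the label whose training vocabulary best matches the text."""
    words = text.lower().split()
    score_ok = sum(model["ok"][w] for w in words)
    score_bad = sum(model["bad"][w] for w in words)
    return "bad" if score_bad > score_ok else "ok"

model = train([
    ("great product love it", "ok"),
    ("hate speech attack", "bad"),
    ("love this community", "ok"),
    ("attack them now", "bad"),
])
```

The production-scale version of this loop, with far richer features and models, is what the AI engines described above are built on.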

These tools have the advantage of speed and coverage. They can scan an input almost as soon as it has been created. In addition, they don’t miss anything. However, a human watch is still needed to tackle the issues beyond their ken, as well as ensure that their actions are correct.

With its wide-ranging partnerships with technology companies, oWorkers is able to access the latest technologies relevant to content moderation. These technologies are also deployed for client projects. Together with our GDPR compliance and ISO (27001:2013 & 9001:2015) certifications, this gives us the breadth of capability to handle any moderation job for our clients.


Content moderation through an outsourced partner

The need has been established.

Will you do it inhouse? Or would you like to hire a specialist partner like oWorkers to execute on your behalf, like many other organizations have done?

The advantages of outsourcing are obvious:

  • You can focus on your core tasks
  • You get the knowledge and skills of an experienced specialist partner; otherwise you will have to build that inhouse
  • With its hiring and training ability, a partner like oWorkers can attract and hire the right talent at the best price
  • It is a more cost-effective solution

Of course, the choice is yours.

Using AI for content moderation: a logical progression


There are many things that keep getting debated by humanity, with opinions ranging widely across the spectrum.

Then there are some issues which seem to be universally accepted, at least by the impacted constituencies.

The need for companies to seek user generated content (UGC), and the need to moderate that UGC, seem to be two issues that find universal acceptance, at least among the companies that solicit it and the people who consume it. Using AI for content moderation appears to be a widely accepted practice too.

oWorkers has been moderating content for clients from around the world for over 8 years. It is a growing practice with the increasing volume of content being generated. As a GDPR compliant, ISO (27001:2013 & 9001:2015) certified company, oWorkers has forged relationships with a number of technology companies, which provides us access to cutting-edge technology, whether it is for content moderation, or any other service.


Need for generating UGC

The internet has dramatically altered how information is shared and consumed.

Perhaps just a generation ago there was a dearth of information, and people hungry for it would seek out sources like newspapers, television, radio and other, more subject-specific, sources.

Today we have a surfeit of information. We are flooded by information from all sides. At the click of a button or two on the internet we can access any piece of information, from anywhere in the world, that has been put up for open access.

The shoe is now on the other foot. We now seem to have ‘too much’ information, if that is possible, and the challenge now is to separate the wheat from the chaff, distil the truth from the lies, the real from the fake, the relevant from the irrelevant.

This applies to communication generated by businesses as well.

For a business, financial soundness and money-making ability, within legal boundaries, take precedence over almost everything else. Businesses have long sought to portray themselves and their products in positive hues. Understandably so; one cannot expect a business to speak poorly of the products and services it is selling. This, again perhaps understandably, leads their target customers to take their messages with a ‘pinch’ of salt, if not more. Consumers are not fools. They see and hear what businesses have to say, seek out more information, and then use their own personal algorithms to determine the merit of the pitched products and make their buying decisions.

Companies were initially restricted to channels like print advertisements, billboards, and radio and TV spots for communicating their message. The growth of social media has provided them with a new platform that, used judiciously, can reach much larger numbers at lower cost, and they have been quick to jump on the social media bandwagon for product promotions. While costs may have risen with greater adoption, what has definitely increased with the rising popularity of social media is information clutter, or overload.

With target customers already skeptical of advertiser claims, this clutter makes it even more difficult for them to ‘get the message’ that advertisers are trying to convey.

Enter user generated content.

Social media users around the world have shown a tendency towards looking for genuine user feedback and comments on the products and services they are interested in, as one of their evaluation parameters, instead of placing reliance on what is claimed by the business itself. Recognizing this trend, companies have moved fast, as they do when they need to, in finding ways to generate UGC in support of their products and services.

This is why generating UGC has come to occupy a prominent place in the marketing budgets of many companies.

As active, contributing members of the communities where our delivery centers are located, oWorkers has access to the best talent as a preferred employer. Whatever the reason UGC is generated, our talented staff, who are employees, and not freelancers or contractors as preferred by some of our competitors, are equipped to deliver the goods.

A related benefit of access to a continuous supply of talent is the ability to handle short-term spikes in client volumes. These could be seasonal or these could be driven by other events. Our deep supply pool enables us to meet these short-term requirements, resulting in significant savings for clients who would otherwise need to keep resources idle for significant periods of time when volumes are lower.


Need for moderation

What we are deep down could, at times, be very different from what we are in front of others. How we behave when we believe we are safe from the prying eyes of the world could be very different from our ‘society’ face.

The internet reaches the deepest recesses of the world, where users can interact with content from a feeling of power, of being able to do what they want without censure, of expressing themselves in ways they cannot with others. This sometimes gives rise to content that can be frightening as well as damaging to others who happen to access it. This is one of the reasons why content needs to be moderated.

Most social media platforms provide spaces where companies, brands, even individuals, can create their own spaces, like groups and pages, where they can initiate conversations about the themes they are interested in. These are often leveraged by brands to generate conversations about their products and services. This is the unbiased content interested buyers look for when they take a decision. The platform also gives the brand an opportunity to reach newer customer segments.

Abusive or offensive content can be posted both on the open platform and in the sections of it created and managed by companies and individuals. Platform and space owners typically set out the rules for participation, thereby setting the expectation that action could be taken if someone steps out of line. Moderation can be exercised even when participation violates unwritten rules of civil society: it does not need to be written into the rules of a website for someone to know that pornography or graphic violence should not be posted on any openly accessible platform.

This is where the need for content moderation as well as using AI for content moderation emanates from.

Operating from super secure facilities in each of its three delivery locations, oWorkers offers support in 22 languages as it employs a multicultural team. While images may not have a language, textual, audio and video content do.


How content is moderated

There are a few universally understood methods of content moderation that can be done manually. In brief, these are:


Pre-moderation

In this method, content is reviewed and authorized before it becomes visible on a platform. It exercises the best control over malicious content being visible to visitors, but tends to delay publishing, stifling the vibrancy of the community.


Post-moderation

Here, content is allowed to become visible immediately, while it keeps getting reviewed, and is removed if found unsuitable. The advantage is that content is not held back, which encourages participants to contribute. However, it is possible that malicious content has been viewed, and even clicked, by some visitors before it could be removed.

Reactive moderation

Reactive moderation is based on the inputs provided by community members on the suitability of content. Buttons and tools are made available to them for their inputs. Based on the inputs received, a decision is taken on content units. This is an inexpensive method for the company owning the space being moderated.
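The report-and-escalate flow might be sketched as follows; the three-report threshold and the function names are arbitrary illustrations, not a description of any specific platform.

```python
# Hypothetical sketch of reactive moderation: content stays up until enough
# community reports accumulate, then it is queued for human review.

REVIEW_THRESHOLD = 3  # illustrative; real platforms tune this per content type

def handle_report(reports: dict, post_id: str) -> str:
    """Record one report; escalate once the threshold is reached."""
    reports[post_id] = reports.get(post_id, 0) + 1
    if reports[post_id] >= REVIEW_THRESHOLD:
        return "queued_for_review"
    return "logged"
```

The low cost of this method comes precisely from the fact that moderators only see content that crosses the threshold.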

Distributed moderation

Here again, community members are provided with rating and/or voting mechanisms through which they provide their inputs. It is expected that content voted low by visitors will gradually be pushed down to a point where it stops being visible. Again, an inexpensive method, but it may not suit sensitive brands and portals.

It is generally accepted that pre-moderation is the most preferred form of moderation. If it needs to be moderated, it needs to be moderated before it is visible on the platform.

This is where automation and using AI for content moderation come into the picture. With automation and AI, it is possible to moderate content before it becomes visible. These tools are able to overcome some of the limitations of manual moderation, such as capacity, speed and cost, because of which many platforms resort to Post, Reactive or Distributed forms of moderation.

As a BPO focused on back-office services, including content moderation, oWorkers has been identified as one of the top three data services providers in the world. On more than one occasion.


Using AI for content moderation for different types of content

All content can be divided into Text, Audio, Image and Video content and all of these need to be moderated. It is not that moderation is reserved for any particular form of content. How can using AI for content moderation up the game for each of these formats?

Textual content

This might be the easiest type of content for a program to handle, as it is made up of defined characters in sequences. Software program codes are written as sequences of defined characters. Each character is unique and can be identified by the machine. Hence, at the basic level, machines can understand each of the characters that constitute a word, phrase or sentence.

The challenge, however, arises when textual content is unstructured or contextual meaning is required to be understood. Machines cannot match the intuitive and contextual thinking power of human beings. Thankfully!

AI is now making inroads into areas that traditional software programming could not address. Using natural language processing (NLP) algorithms, it is becoming possible to do sentiment and emotion analysis on textual content, thereby creating perspective for the text to be placed in. These algorithms are able to identify fake news and even issue scamming alerts.
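As a simpler cousin of such NLP models, rule-based pattern matching can already flag common scam phrasing. The patterns below are purely illustrative; a production system would combine rules like these with a trained statistical model.

```python
# Illustrative rule-based screening: regular expressions flag common scam
# phrasing before a statistical model (not shown) scores the remainder.
import re

SCAM_PATTERNS = [
    r"you (have )?won",        # "you have won a prize"
    r"send .* (money|fee)",    # requests for payment
    r"verify your account",    # phishing bait
]

def scam_alert(text: str) -> bool:
    """True if the text matches any known scam pattern."""
    return any(re.search(p, text.lower()) for p in SCAM_PATTERNS)
```

Rules are cheap and transparent but brittle, which is why they usually serve as a first pass ahead of the learned models described above.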


Image content

While the human eye has the capacity to look at an image and assign meaning to it, for a machine, an image is merely a random collection of dots, or pixels, perhaps of different colors. It is unstructured information for a machine. The human eye and mind can work together to make sense of it, but not a machine. The question is: how does one get a machine or a program to make sense of unstructured information?

With the rapidly evolving AI and ML (Machine Learning) industry, computer vision-based programs are being deployed to seek out objectionable content. To become useful, the AI engine has to undergo a long process of training with the help of ML. Feeding in thousands, even millions, of images, and connecting what the computer sees, or reads, in each image to conclusions, creates the bedrock for using AI for content moderation of images. A point is reached at which, when the next image comes in, no longer a training image, the AI engine is able to draw conclusions on its suitability based on what it has been taught. Combinations of text and images can also be interpreted using a combination of techniques.
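How a program "sees" pixels can be made concrete with a toy heuristic over raw RGB values. The colour rule here is an arbitrary stand-in; real image moderation relies on trained convolutional networks, not hand-written colour thresholds.

```python
# Toy illustration of how a program sees an image: a flat list of RGB pixels.
# The heuristic flags images dominated by a colour range; real moderation
# would use a trained computer-vision model instead of this stand-in rule.

def flagged_pixel_ratio(pixels, predicate):
    """Fraction of pixels for which predicate(r, g, b) is true."""
    if not pixels:
        return 0.0
    hits = sum(1 for r, g, b in pixels if predicate(r, g, b))
    return hits / len(pixels)

# Example rule: flag strongly red pixels (an arbitrary illustrative policy)
def is_reddish(r, g, b):
    return r > 200 and g < 80 and b < 80
```

The point is only that the machine's raw input is numbers per pixel; everything an AI engine "understands" about an image is learned on top of that.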


Audio content

While humans can listen to an audio clip and understand the communication, a computer cannot. Computers can understand structured textual data to an extent, based upon which software programs are created. However, audio is beyond the ken of computers.

The first step is to convert the audio into text, which computers have some hope of understanding. Here again, while advances have been made in NLP techniques for understanding spoken and contextual language, they remain far from where the human mind is. However, converting the audio and applying NLP processing takes us to the same place as we are with textual content, at which point the techniques relevant for text can be applied.
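The two-step pipeline, transcribe then reuse the text-moderation step, can be sketched with a stubbed transcriber. The stub and the blocklist are hypothetical; a real system would call a speech-to-text engine at that step.

```python
# Sketch of the audio pipeline: transcribe first, then apply text moderation.
# The transcriber here is a stand-in stub for a real speech-to-text engine.

def transcribe(audio_clip: dict) -> str:
    # Stub: a real system would run speech recognition on the waveform.
    return audio_clip.get("transcript", "")

BLOCKLIST = {"threat", "slur"}  # illustrative terms only

def moderate_audio(audio_clip: dict) -> str:
    """Transcribe the clip, then run the same check used for text."""
    words = transcribe(audio_clip).lower().split()
    return "reject" if any(w in BLOCKLIST for w in words) else "approve"
```

Once the audio is text, every text technique, from word lists to NLP models, applies unchanged.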


Video content

Video is the richest form of content, being a combination of audio and image frames processed so rapidly in sequence that they appear as continuous playback to the human eye. How does a computer come close to the level of understanding that a human can draw from it?

There is a reason we are discussing video content at the end, after text, audio and images: all the techniques used for those formats apply to video, as it is a composite of them all. Image frames will be analysed with computer vision and AI. Textual and audio data will be analysed with NLP-based AI techniques. The one additional level of complexity in video content is the information generated by the changes that take place as the images progress. It is no longer a single image read in isolation; it is a sequence of images, along with the changes from one frame to the next, and audio that keeps time with the images.
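The composite nature of video moderation can be sketched by combining stubbed per-frame and audio checks: the clip passes only if every frame and the transcript pass. Both checks are placeholders for the computer-vision and NLP models discussed above.

```python
# Sketch of composite video moderation: every frame goes through an image
# check, the soundtrack through a text check, and either path can reject.
# Both checks are illustrative stubs for the real models discussed above.

def check_frame(frame: dict) -> bool:
    """Stub image check: True means the frame looks acceptable."""
    return frame.get("ok", True)

def check_audio(transcript: str) -> bool:
    """Stub audio check on the transcript; the rule is a placeholder."""
    return "slur" not in transcript.lower()

def moderate_video(frames, transcript: str) -> str:
    if all(check_frame(f) for f in frames) and check_audio(transcript):
        return "approve"
    return "reject"
```

Real systems additionally model motion between frames, the extra layer of complexity noted above, but the combine-all-checks structure stays the same.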

oWorkers has been among the first to create an environment for its staff to work from home in times of the pandemic, as and when required. All staff can now operate fully either from home or office, depending on the situation and preference. This ensures that clients are not left exposed on account of the non-availability of resources.


A logical progression to AI

The basic issue being faced by companies is that there is just too much content being generated for any meaningful manual moderation exercise to be feasible. Hence, using AI for content moderation is no longer a choice. It is a necessity. It is now a question of how fast and how accurately we can build and train our AI engines to do it for us.

Of course, human intelligence and oversight will always be required. But the balance has to change with most of the heavy lifting being done by machines, with humans stepping in either to adjudicate on doubtful cases which machines are unable to decide on, or to periodically monitor and ensure that the machines are in line with expectations.

Our efforts have resulted in many youngsters being able to make a transition from their challenging circumstances to becoming a part of the global digital workforce. Your work will enable us to support a few more people to make the transition.

Use of moderation for business explained through social media moderation examples


It is not something that gets printed in the Annual Report of public companies or promoted through paid advertising by private companies. This is one reason social media moderation examples, especially in terms of names of companies and brands, are difficult to come by. Many times, it is deduced through the process of logical extension and extrapolation; if Company A is doing social media moderation, Company B must be doing it as well. Or, if Brand P has a presence on social media, and they are generating a lot of positive comments, it is only possible if they do social media moderation. Attribution of social media moderation being done by a company, based either on evidence or acknowledgment, may not be easy to do.

That being said, it is not that the companies doing moderation themselves or through an outsourced arrangement, are indulging in illegal activities that they need to keep under wraps. Under the commonly understood practice of giving fair and reasonable notice, all platforms and all communities created on a particular social media platform start by publishing the rules of engagement. The generally accepted practice requires participants to sign up to become a part of the platform or community, one of the steps being to agree to abide by the rules put in place by the owner of that space. If they do not like the rules, they are free to stay away and not participate. Having accepted the rules and signed up for participation, it is an expectation that they will abide by what they have signed up for. And, if they step out of line, the site owners would be within their rights to moderate the content in a manner that is conducive to achieving the objectives with which they set up the space and related rules.

Social media moderation examples are, perhaps, best understood in the context of the different ways in which moderation can be done.

Some claim that not moderating social media is also a decision and hence should be counted as an example of moderation. There is perhaps some merit in the contention, and it may be counted as one in some write-ups, but for our purpose, having noted it as a possible decision, we will discuss the various types of social media moderation that might be classified as ‘active.’

Counted as one of the top three BPO service providers in the world in its chosen area of data and back-office services, oWorkers is an expert in moderating social media communities on behalf of clients across industries from all over the world. Clients, especially from the US and Western Europe, report savings of almost 80% compared with their costs before outsourcing the work to oWorkers.



Pre-moderation

As perhaps conveyed by the name, in this method the content is checked before it is released to the public. This could be viewed as the classic method of authorization before any action is taken. Accuracy and control take precedence over engagement and user experience. User generated content (UGC) is placed in a queue that is fed to the moderating resources, who pick off one item at a time and take a view on its suitability for the website. If found suitable, it is released and becomes available to users.


This allows website owners to exercise control over the content. They can ensure that unsuitable content is not available to users.


It degrades user experience. When someone participates on a social media platform, the objective is the free flow of communication and ideas. Holding back content created by users makes the experience stilted and could make for a dull, dead platform if user expectations are different.

Social media moderation examples that rely on this method also incur a significant cost as human resources are required in adequate numbers to review each piece of content, in good enough time.


This could be the appropriate method for platforms that are sensitive to abusive content, such as ones directed at young adults, or where undesirable or libelous content could cause damage to the brand or the subject being promoted, such as celebrities. It could also be used where the turnaround time expected is not fast, allowing for review and authorization without upsetting the community.

It is understood that many startups offering educational services and content rely on pre-moderation to ensure that offensive or motivated content does not get through. The content on their platform is usually meant for access with or without subscription, and generally there is no immediacy required by the content submitter for it to be published.

The oWorkers advantage

As a preferred employer in all the locations it works in, oWorkers receives a steady stream of walk-in talent. It gets to choose resources based on the requirement of different client projects, including for pre-moderation which often requires resources in large numbers.



Post-moderation

In this method, content is not held back for review prior to being published. The participating user gets the satisfaction of seeing her UGC become visible as soon as she has pressed the SEND or UPLOAD button. What she might not know is that while the content has become visible, a copy has also been placed in a queue for review, where a moderator takes a call, as in the pre-moderation process, on whether to permit the content to continue being visible.

Thus, in a way, this works in the reverse way to pre-moderation. All content is assumed to be kosher unless found to be otherwise, while it is the other way round in pre-moderation.
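The publish-first, review-later flow described above can be sketched with a simple queue; the names and the review callback are illustrative, not any platform's real design.

```python
# Sketch of post-moderation: the post is published immediately, and a copy
# is queued for later review; unacceptable posts are taken down after the fact.
from collections import deque

published = {}          # post_id -> content, visible to users
review_queue = deque()  # post ids awaiting a moderator

def publish(post_id: str, content: str) -> None:
    published[post_id] = content   # visible right away
    review_queue.append(post_id)   # but still reviewed afterwards

def review_next(is_acceptable) -> None:
    """Moderator reviews the oldest queued post and removes it if needed."""
    post_id = review_queue.popleft()
    if post_id in published and not is_acceptable(published[post_id]):
        del published[post_id]     # taken down after having been visible
```

The window between `publish` and `review_next` is exactly the exposure risk this method carries.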


This method promotes user engagement by permitting a free flow of communication and exchanges without the need for an authorization in each case.


The human cost continues to be an issue. While content does become visible, the need for a quick review does not go away because it is now visible. Unacceptable content needs to be removed as soon as possible to limit the damage it can do.

Of course, the other downside is that with technology in everyone’s hands, inappropriate content, even if it becomes accessible for a short period of time, can be copied and spread very quickly. It may become difficult to limit the damage once that has happened.


It is understood that YouTube relies upon a form of post-moderation to maintain the quality of content available on its platform.

The oWorkers advantage

oWorkers operates with hired staff, not contractors and freelancers as some of its competitors do, which delivers the best social media moderation examples. It regularly receives scores of 4.65 and above from past and present employees on platforms like Glassdoor, and pays social taxes for its staff in the locations it operates from.


Reactive moderation

Reactive moderation is where the website or community owner does not proactively seek to identify and eliminate offensive content, and instead relies upon feedback from the participating community itself. Content that is identified and flagged by one or more members of the community is then reviewed and adjudicated upon. The members flagging the content may also need to be kept informed of the fate of the content, and the reasons for whatever action is taken, especially if the content is not acted upon as they suggested. Typically, the platform will make available a button that can be clicked if someone wishes to report a particular piece of content.


Among social media moderation examples, this method gives priority to the engagement that happens on the platform, over the need for control. It also accords respect to its members and assumes that it has members who are engaged and mature and will take the initiative to flag content that may be inimical to the interests of the group, without taking undue umbrage because of its presence.

Since each piece of content need not be adjudicated upon, the requirement of resources needed for the activity reduces substantially.


Like in any case where pre-moderation is not being done, there is a risk of malicious content making it to the website and being rapidly copied and spread, leading to bad press for the community.

Also, the responsibility of publishing standards, and of acting on flagged items, still remains with the owners of the website.


Facebook is understood to use reactive moderation for its platform, reviewing and acting upon content reported by users. This may not be the only method of moderation used by Facebook; it is also known that Facebook might be using different methods of moderation for different types of content on its platform.

The oWorkers advantage

Its ability to attract talent also gives oWorkers the flexibility to provide short-term resources to handle the peaks and troughs in client volumes that can occur in any form of moderation; it can hire an additional 100 resources within 48 hours. This enables clients to make substantial savings, as they do not need to hire and retain resources merely to cover peak or unexpectedly high volumes.


Distributed moderation

This can be considered a variation of reactive moderation. The website owners do not engage in primary moderation, or moderation of their own volition. Instead, the community is set up so that members provide feedback on each piece of content published, in the form of a rating. Content that consistently receives high ratings is the most visible to users, while lower-rated content is pushed further and further down. In time, the lowest-ranked content can become almost invisible. From the perspective of identifying and removing offensive content on sensitive websites, among social media moderation examples, this method is not widely used.
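The rating-driven visibility described above can be sketched in a few lines. The scoring (simple vote sums) and the hide threshold are assumptions chosen for illustration; real platforms use more elaborate formulas:

```python
def rank_by_rating(items, hide_below=-5):
    """Distributed-moderation ranking sketch.

    items: list of (content, votes) pairs, where votes is a list of +1/-1
    member ratings. Content is ordered by total score; anything scoring
    below the hide_below threshold becomes effectively invisible.
    """
    scored = [(sum(votes), content) for content, votes in items]
    visible = [(score, content) for score, content in scored if score >= hide_below]
    visible.sort(key=lambda pair: pair[0], reverse=True)
    return [content for score, content in visible]
```

Note that no moderator appears anywhere in the function: ranking, and eventual disappearance of bad content, is driven entirely by member votes, which is why this method costs the community owner so little.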


This method also gives priority to the engagement that happens on the platform, over the need for control. It also accords respect to its members and assumes that it has members who are engaged and mature and will take the initiative to rate content so that it may find its correct place in the database.

As there is no role for a moderator, with content being rated up or down by users, this method places little financial burden on the community owner.


As in any case where pre-moderation is not performed, there is a risk of malicious content making it to the website and being rapidly copied and spread, leading to bad press for the community.

Without oversight from the owner, there is a possibility that content could slide out of control if users do not take an active part in managing and rating it.


An example is Slashdot, where news stories submitted by users and editors are evaluated and rated by the community. Comments can also be added to each story.

The oWorkers advantage

Operating out of three distinct geographies, with its avowed policy of hiring a multicultural workforce, oWorkers has developed, and now offers clients, the ability to process content in 22 languages. After all, social media interaction can take place from any part of the world, in any language.


Automated moderation

The story is not complete without the inclusion of automated moderation. As volumes have risen and the world has become more divided and dangerous, automated moderation holds out hope for many platforms and companies. It is hoped that it will be able to pre-moderate sensitive content at a cost, and in volumes, that human beings cannot match.

One of the challenges has been that much of the content being uploaded is unstructured: photos, audio, video, or even unformatted text. Traditional technologies have been unable to handle such content. The growth of Artificial Intelligence (AI) holds the promise that it may eventually do much of the heavy lifting, though current automation is largely limited to applying filters in one form or another.


As volumes rise, automation often becomes a savior for businesses, offering advantages such as handling much larger volumes at much lower running costs. In addition, it is expected to apply standards strictly and consistently, something humans sometimes fail to do.


Automation has not, however, been able to develop the intuition and fine sensibilities of the human brain. While it is expected to do the majority of the work, when it is unable to take a decision it should defer to its human masters.
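That hybrid pattern, where the machine handles the bulk and escalates uncertain cases to a person, can be sketched as follows. The keyword blocklist and the "confidence" heuristic are deliberately trivial stand-ins for a real AI model; every name and threshold here is an assumption for illustration only:

```python
# Illustrative stand-in for an AI classifier: a keyword filter.
BLOCKLIST = {"scam", "attack"}

def auto_moderate(text, confidence_threshold=0.8):
    """Return "approve", "reject", or "escalate_to_human" for a piece of text.

    Clear-cut cases (no blocklisted words, or nothing but blocklisted words)
    are decided automatically; ambiguous mixes fall below the confidence
    threshold and are deferred to a human reviewer.
    """
    words = set(text.lower().split())
    hits = words & BLOCKLIST
    # Toy "confidence" score: 1.0 when the verdict is unambiguous,
    # otherwise the fraction of words that matched the blocklist.
    confidence = 1.0 if (not hits or hits == words) else len(hits) / len(words)
    if confidence >= confidence_threshold:
        return "reject" if hits else "approve"
    return "escalate_to_human"  # the machine defers to its human masters
```

The structure, rather than the toy classifier, is the point: automation decides the high-confidence majority cheaply, and only the residue consumes human moderator time.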


Automation-based examples may be few at this point, but they are expected to become commonplace once the technologies become reliable. Automated moderation could be used across the industry spectrum, with some customisation for each situation in which it is applied.

The oWorkers advantage

Operating from secure facilities, oWorkers is ISO (27001:2013 & 9001:2015) certified and GDPR compliant. Its time-tested partnerships with technology companies provide it access to the latest processing technologies for client work.



With all three sites capable of running 24×7 operations, oWorkers can not only provide quick turnaround but also offer business continuity by splitting volumes across two or more locations.

85% of its clients are technology companies, including some unicorn marketplaces. oWorkers is led by a team with over 20 years of hands-on experience in the industry, and will have several social media moderation examples to share with prospective clients such as yourself.