The Challenges of Commercial Content Moderation

The internet has taken over our lives in many ways, all in a quarter of a century, a blink-and-you-miss-it duration in the context of humanity and our planet. It has steadily made inroads into our day-to-day lives, gradually accounting for more and more of what we do and how we do it.

  • We no longer need to go to a ticket counter or a travel agent to book a flight ticket. We can do it on the internet.
  • We no longer need to get into a conference room with ten other people to have a meeting or a discussion. We can do it over the internet.
  • We no longer need to get four people across a table in a room to play bridge. We can do it on the internet.
  • We no longer need to go to bookstores to buy a book. We can do it through the internet.

There is no end to the list of things that can be done online. Of course, the experience is probably going to be different, and some may complain about the loss of a lifestyle they were used to, but there is no denying that the internet has fundamentally changed our way of life.

Add social media in the mix and the potion becomes even more potent.

Social media facilitates an almost endless flow of ideas and content in the form of text, audio, images and video, shared in groups, communities and circles that span the whole world. It covers not only individuals but entities and organizations of all types: businesses and not-for-profits, societies and trusts, schools and colleges, private and public, and everything in between. Since such a large part of humanity uses social media, commercial entities are naturally drawn to it like flies to honey, hoping to communicate their sales messages to users and advance their agendas.

Whether for commercial content moderation or any other kind, oWorkers is clear about its focus areas and specializes in data-based activities such as moderation. In its journey of eight years, it has supported a wide set of global clients with content moderation work, enabling them to focus on their primary business. Its success can be measured by the growth in these relationships over the period. oWorkers has been recognized as one of the three best providers of back-office BPO services in the world on multiple occasions. Being GDPR compliant and ISO (27001:2013 & 9001:2015) certified, oWorkers keeps your business secure when you partner with it.

 

What is commercial content moderation?

In the pre-internet days, content for mass consumption could be produced and distributed only by a limited set of entities, who could be identified and held accountable reasonably easily. As a result, content producers generally exercised reasonable restraint while publishing.

Today, every person using the internet, especially social media, is a content producer as well as a consumer. The graduate in Atlanta can share her ordeals while looking for a job, the Dubai-based executive can try to impress people with pictures of his recent vacation to Switzerland, and the homemaker in Melbourne can upload videos of her cooking skills. With more than half the world's seven-billion-plus people using the internet in one way or another, the volume of content being generated and consumed is astonishing.

The name given to this type of content creation is User Generated Content, or UGC.

Not everyone sharing content is concerned about the rules of the game, or the responsibility of existing in a civil society. While most participants and content creators will be conscious of their responsibilities, a few may not be, and end up poisoning the entire ecosystem.

What can go wrong? What can these content creators do? Here are some samples:

  • They can deliver hate speeches and messaging and create schisms in society
  • Divisive messaging, either political or religious, can threaten the law and order situation
  • Graphic violence and pornographic videos can be uploaded
  • Spam and malware threats can be spread
  • Fake news and misleading articles can create confusion and divisions
  • Cyberbullying can be used to target individuals and groups

These are just a few examples. The bottom line is that such content, often in the name of ‘free speech,’ can be used to further the narrow aims of a few misguided souls, creating division, hatred, violence and prejudice in society at large. Especially vulnerable groups, like children, also access these platforms and stand to be harmed by such content.

The process of identifying and removing such content is referred to as content moderation. UGC is the main reason content moderation is needed, since anyone and everyone can create it, not just a few identified individuals or organizations.

But what is commercial content moderation?

Taking the discussion a step forward: when the moderation exercise is done on web properties that have a commercial angle to them, the moderation becomes commercial.

That covers almost everything. All the major social media platforms are commercial ventures, be it Facebook, YouTube, Twitter or Instagram. Some of their founders and owners, like Mark Zuckerberg and Jack Dorsey, are billionaires many times over.

Then come the commercial entities, the for-profit enterprises, who park themselves on a corner of these platforms and seek to create a space where they can reach out to the users of that platform, as well as provide a space where they can carry out promotional activities for their own business. They need to keep their space squeaky clean and ensure that users are not turned away by what they see there. They need to do content moderation in their own spaces too. That is also commercial content moderation.

The management team of oWorkers, with over 20 years of hands-on experience in the industry, understands moderation and its many methods. Under their guidance, oWorkers has gone from strength to strength. With several unicorn marketplaces as long-time clients, oWorkers understands the challenges of this work as well as client concerns. Its centers in three of the most sought-after delivery locations in the world, staffed by a multicultural team, enable it to offer multilingual services to clients.

 

How is it done?

It is a messy, ungainly process, with major platforms spending billions to keep their properties free of malicious content. Some use in-house resources, while others outsource the activity to specialist BPO companies that do the work of screening out unwelcome content. Many adopt a combination of the two.

The preferred approach for most companies that need moderation is to review content before it becomes visible to visitors. However, the volume of content being uploaded often makes this impossible: pre-moderation delays publication, upsets the users generating the content, reduces the vibrancy of the platform and impinges on free-flowing conversations.

As a result, owners use a number of post-publish techniques. One is post-moderation, where moderators keep working through published content and removing objectionable items, though by then some users will probably have seen them. Another involves the users and visitors themselves, who either flag objectionable content for review and action by administrators, or vote it down so that it gradually vanishes from sight. Depending on the sensitivity of the content, one or more of these methods could be used, each with its own pros and cons.
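The flag-and-vote mechanics described above can be sketched in a few lines of code. This is a hypothetical illustration, not any platform's actual system; the thresholds and names are assumptions chosen for clarity.

```python
from dataclasses import dataclass

FLAG_REVIEW_THRESHOLD = 3   # assumed: flags needed before admin review
HIDE_SCORE_THRESHOLD = -5   # assumed: net votes at or below this hide a post

@dataclass
class Post:
    text: str
    flags: int = 0
    score: int = 0          # upvotes minus downvotes
    hidden: bool = False
    needs_review: bool = False

def flag(post: Post) -> None:
    """A user flags objectionable content; enough flags queue it for review."""
    post.flags += 1
    if post.flags >= FLAG_REVIEW_THRESHOLD:
        post.needs_review = True

def vote(post: Post, delta: int) -> None:
    """Community voting; heavily downvoted content gradually vanishes."""
    post.score += delta
    if post.score <= HIDE_SCORE_THRESHOLD:
        post.hidden = True

def admin_review(post: Post, objectionable: bool) -> None:
    """A moderator acts on flagged content."""
    post.needs_review = False
    if objectionable:
        post.hidden = True
```

Note that in both paths the content is visible until the threshold is crossed, which is exactly the weakness of post-publish moderation described above.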

Artificial Intelligence is the big hope of platforms that need commercial content moderation. The expectation is that AI will become smart enough to moderate a large proportion of content before it is published, overcoming the volume, capacity and cost constraints that make manual pre-moderation impractical.
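In practice, AI-assisted pre-moderation usually means letting an automated scorer decide the clear-cut cases and routing only the uncertain middle to human moderators. The sketch below illustrates that routing; the word-list "model" and the thresholds are stand-in assumptions, not a real classifier.

```python
BLOCK_ABOVE = 0.9   # assumed: confidently malicious, reject pre-publish
ALLOW_BELOW = 0.1   # assumed: confidently safe, publish immediately

BAD_WORDS = {"spam", "malware"}  # toy stand-in for a trained model

def toxicity_score(text: str) -> float:
    """Toy scorer: fraction of words that are flagged terms."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in BAD_WORDS for w in words) / len(words)

def route(text: str) -> str:
    """Decide what happens to a piece of content before it is published."""
    score = toxicity_score(text)
    if score >= BLOCK_ABOVE:
        return "reject"
    if score <= ALLOW_BELOW:
        return "publish"
    return "human_review"   # the uncertain middle goes to a moderator
```

The design point is the middle band: the better the model, the narrower that band, and the less content humans have to see.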

Until that becomes feasible, manual moderation is the way to go.

Of course, no moderation is also a method, with ‘free speech’ often offered in justification. It may be workable in groups where access is controlled in some manner, but not on open platforms accessible to everyone.

Thanks to its position as a preferred employer in the communities it works in, oWorkers has access to talented staff, who are trained to make them ready for the job of moderation, whether pre- or post-publication. Walk-in traffic also keeps hiring costs low, as oWorkers does not need to advertise for candidates, a saving that eventually gets passed back to clients as better pricing.

The walk-in talent pool gives oWorkers, and consequently its clients, other benefits too, such as flexibility in handling ramps. It enables oWorkers to meet the short-term requirements most clients have from time to time without breaking a sweat: it can hire almost a hundred additional people within 48 hours. This is a substantial saving for clients who might otherwise have to hire resources for the full year and keep them idle when work volumes are normal.

 

What about moderators?

So far, we have been talking of moderation as an abstract activity. But as long as it is a manual activity, it has to be performed by human beings. These are the foot soldiers in the war against malicious content and its perpetrators: the people who spend hours every day, or night, peering at computer screens, perhaps in an isolated, secluded space, since they are the ones who authorize what becomes visible to others. They need to be familiar with the rules of the platform they are moderating for, and to apply them accurately.

So what? Many of us do that.

What is different about the work of commercial content moderators is that, in the line of duty, they must go through, review and take a stand on hurtful, malicious, graphically violent, pornographic, divisive, hateful, fake and bullying content. In the process of protecting others, they expose themselves to that body of content and all the psychological harm it can cause. They put themselves in the line of fire. No doubt a commercial consideration impels them to do it, but the risks of the job are now reasonably well known.

There is a lively debate in progress on the subject, since some of the largest names in business are the ones who need content moderation the most. With the scale that content generation has acquired, the money that needs to be spent in doing it is not peanuts any more. Most large companies use the strategy of hiving off the activity to a supplier, possibly located in a geography with access to cheap resources, who is better at hiring people at lower costs for doing the work, resulting in a lower bill to the platform or corporation for whom it is being done.

The platforms put a lot of effort into conveying to the world how well the people who do this work are taken care of: comfortable office spaces, fair wages and working conditions, benefits. While all that may be true, the unfortunate fact seems to be that the psychological scars of the job are, at the end of the day, borne by the individual and perhaps her near and dear ones. This aspect of the job is now receiving increased attention globally. Some of the largest corporations in the US have reached multimillion-dollar settlements with groups of moderators for the enduring scars left on their psyches by this work. Workers in the countries the work is outsourced to, however, may not even have access to adequate psychological care and support, given the social stigma attached to seeking it.

oWorkers is sensitive to the peculiar requirements of this work and the toll it can extract from people doing it. An advantage we have over competitors is that we work with employed staff, and not contractors or freelancers as some of them seem to prefer. This gives us a stake in the well-being and development of our staff members. We regularly rotate our staff members across different engagements. This helps them in staying fresh, getting experience, and avoiding the ill-effects deep exposure to work like moderation might have.

oWorkers has been one of the earliest BPOs to equip staff to work from home, ensuring their clients’ businesses remained unaffected during the peak of the Covid-19 driven lockdowns around the world. Today, oWorkers is fully equipped to work from the office as well as home, depending on the situation on any given day.

 

The oWorkers advantage

Many clients, particularly from Western Europe and the US, report savings of almost 80% after outsourcing work to oWorkers, along with the benefit of its transparent pricing models.

We have supported many deserving youngsters from challenged backgrounds to get a job in the global digital marketplace, in each of the three locations we have centers in. Your commercial content moderation work will help us extend the opportunity to a few more.
