What is the human cost of online content moderation?
Moderation is hardly a new concept for humanity. It has probably existed as long as thinking, sentient human beings have existed. In the English-language sense of the word, it refers to the adoption of a middle path, the avoidance of extremes. Youngsters might be advised to ‘moderate’ their pitch when talking to a senior in their organization, for fear of being labelled unwilling to listen or learn. Regular tipplers might be advised to exercise ‘moderation’ the next time they visit a bar or pub, particularly if they need to drive afterwards. A rabid anti-establishment protestor could be advised to tone down his rhetoric and demonstrate ‘moderation’ in his speeches, for fear of reprisals from the establishment.
The subject on which moderation is advised or exercised can be referred to as the content. Content of any kind is expressed in one of four commonly understood formats: text, audio, image and video. Of course, the underlying subject of the content could be anything. It could be an idea or a movie or a news item or a cartoon or an article or a conversation. Any content could be the subject of moderation. For the youngster talking to a senior, it could be a proposal he has put forward and is trying to justify. For the anti-establishment protestor, the content could be his views on the government’s policies.
While all content is relevant, content conveyed through the spoken word is often ephemeral and, in most cases, forgotten once an interaction is over. Published content, on the other hand, becomes ‘permanent.’ It takes on a life of its own: its reach extends beyond the moment it was conceived, and it can continue to influence readers and viewers for a long time.
With its focus on data and related services, oWorkers appreciates these nuanced differences and is able to support clients in their content moderation requirements. oWorkers has been identified, on multiple occasions, as one of the top three BPOs for the provision of data services to clients.
What is online content moderation?
The advent of the internet and its gradually increasing adoption over the last quarter of a century has changed our lives in many ways.
Before the internet, publishing content was the responsibility of a few. Publishing houses would publish content in the form of books and periodicals. News and media houses would publish content in the form of news and current events. Advertisers would publish content related to their products and services, made visible through other businesses selling space on billboards, spots on TV or radio, or column space in the classified sections of print media. This content would be consumed by other businesses as well as by the public.
Today, with the internet, these lines have blurred. While everyone is still a consumer of content, everyone is now also a publisher of content. Whether it is my pet’s antics, or the great food at the new sushi joint in town, or the flooding of roads after a brief spell of rain, or views on the latest edition of the Olympics in Tokyo, I can create content about anything and post it so that it is available to anyone who is interested or who may simply chance upon it. As we saw earlier, once content is published, it acquires a permanence that is difficult to erase.
Hence, it becomes that much more necessary to exercise care while publishing online. This is where the human cost of online content moderation begins to make its presence felt.
With its position as an employer of choice in all the communities it operates in, oWorkers has access to the choicest talent. It has the flexibility to deploy resources based on their preferences and aptitude, as well as on the complexity and demands of the job. This continuous supply of talent also enables oWorkers to absorb short-term spikes in client volumes with ease. By committing to hire up to a hundred additional resources within 48 hours, oWorkers spares clients with such requirements the huge cost of carrying idle resources during the remaining, non-spike periods.
What does publishing have to do with moderation?
Everything.
In the publishing of yore, since the publisher was almost always clearly identified, the responsibility fell upon that publisher if inappropriate content saw the light of day. Because they could be identified and held accountable, publishers perhaps took their jobs seriously and ensured inappropriate content was edited out, so that the world only saw content that was kosher. Even the creators of content could be expected to self-censor, since they were answerable to the editor or publisher for what they submitted. So, it seemed to work well.
With publishing on the internet, however, such relationships and controls do not exist. There is no single channel, or even a defined set of channels, through which published content flows. It can come from anywhere. A goat farmer in rural Arkansas could be posting pictures of the milking process of his goats while a college student in Sweden writes about the retreating glaciers in her country. An unnamed person in an unnamed location could also be uploading a video of a woman being shot for refusing to comply with the orders of the ruling dispensation.
There is no external check. You are your own creator as well as editor. You feel secure under the cloak of anonymity you believe your corner of the world affords you. You feel powerful.
Social media makes it even easier not only to create content but also to share it. That is the very purpose of social media platforms: to encourage free-flowing communication and the exchange of ideas and thoughts between people. Their rapid growth is a testament to their success. While all platforms define rules of posting and participation, which most people abide by, those rules can be flouted. Hence the need for a system of checks, which leads into the issue of the human cost of online content moderation.
With its centers in three distinct geographical locations, and a hiring policy that supports a multi-cultural, multi-ethnic workplace, oWorkers can now support online content moderation in 22 languages. Since content is created in all corners of the world, it can arrive in any language, and to moderate content one must first understand it.
What do we mean by the human cost of online content moderation?
As we have seen, content creators can do many things in the online world. They can post messages of hate in an effort to create rifts and push their agenda. They can upload pornographic material unsuitable for the many teenagers who also throng the online world. They can spread malware and cyber threats. They can spread fake messages as well as doctored videos in an effort to create confusion and anarchy. And much more.
Civil society cannot permit this to happen. Hence content needs to be moderated: content that is being uploaded in unimaginably large volumes from around the world. Many methods of moderation have been attempted, but the one that works best is the one where content is evaluated before it is permitted to become accessible to viewers. Humans have been attempting to create technology that will do this for them, and have placed their reliance on Artificial Intelligence (AI) to deliver the goods.
Unfortunately, at this point in time, and we have to say this with mixed feelings, AI-based solutions are no match for human capability. AI engines cannot match the ability of that wonderful organ, the human brain, to understand and evaluate the fine nuances of each piece of content, nuances the AI engine likely has no clue about despite all the training it has been given. What this means is that human beings have to be deployed to view and evaluate the horrible content referred to on multiple occasions in this article, so that others can be kept safe. That is the human cost of online content moderation.
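In practice, most platforms combine the two: a model screens everything, and humans handle what the model cannot decide. The sketch below is a minimal illustration of that triage pattern in Python. The thresholds, the ContentItem structure and the model_score stand-in are all hypothetical, not any particular platform's implementation; the point is to show where the human fits.

```python
from dataclasses import dataclass

# Hypothetical thresholds; a real platform tunes these per policy and model.
AUTO_REMOVE_ABOVE = 0.95   # model is confident the item violates policy
AUTO_APPROVE_BELOW = 0.05  # model is confident the item is safe

@dataclass
class ContentItem:
    item_id: str
    body: str

def model_score(item: ContentItem) -> float:
    """Stand-in for a trained classifier returning P(policy violation).

    A real system would call an ML model here; this toy keyword check
    exists only so the sketch runs end to end.
    """
    words = set(item.body.lower().split())
    if words & {"gore", "beheading"}:
        return 0.99   # toy "obvious violation"
    if words & {"puppy", "sushi"}:
        return 0.01   # toy "obviously safe"
    return 0.50       # model is unsure

def triage(item: ContentItem, human_queue: list) -> str:
    """Machines take the clear-cut cases; humans get everything else."""
    score = model_score(item)
    if score >= AUTO_REMOVE_ABOVE:
        return "removed"        # no human needs to see it
    if score <= AUTO_APPROVE_BELOW:
        return "approved"       # no human needs to see it
    human_queue.append(item)    # ambiguous: a person must look at it
    return "pending_human_review"

queue: list = []
print(triage(ContentItem("1", "my puppy learns a trick"), queue))    # approved
print(triage(ContentItem("2", "graphic gore footage"), queue))       # removed
print(triage(ContentItem("3", "borderline political rant"), queue))  # pending_human_review
```

Notice what this design implies: the better the model gets at the easy cases, the more the human review queue tends to concentrate exactly the ambiguous and disturbing material the model cannot resolve.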
oWorkers stays at the forefront of technology through enduring partnerships with technology companies that give it access to the latest tools, and clients benefit from these partnerships because those tools are used on their work. oWorkers is GDPR compliant, ISO (27001:2013 & 9001:2015) certified, and was one of the first BPOs to act in the wake of the Covid-19 pandemic, creating infrastructure that enabled staff to work from home in a secure environment.
What do moderators have to do?
Moderators need to put themselves in the line of fire to protect others. It is not that every second person in the world is out to produce gory content designed to give viewers nightmares. Most content is kosher, and most of the non-kosher content may be only mildly offensive, more a breach of a platform guideline than content designed to create controversy or trigger civil wars. It is the tiny fraction of content that is offensive in the ways described above, and generally unfit for human consumption, that human moderators are deployed to identify and prevent from going public.
And when they go through such content, it is already too late. Some of us might recall watching a horror movie on the big screen and feeling its impact for a long time afterwards, despite knowing that what we were watching was unreal, a work of fiction. Online moderators know what they are reviewing is real. Someone is making those hate speeches. Someone has recorded an instance of child pornography. It leaves an indelible impression on the psyche of the person reviewing it.
That is the human cost of online content moderation. Many moderators put on a brave face and laugh off suggestions that the work might be affecting them deeply. Increasingly, however, they are recognizing and understanding the impact it has on them.
A debate has been under way on the subject for a long time. Some of the biggest names in business run the most popular social media platforms, and they are the ones most in need of content moderation. While the commercial considerations on the part of human moderators are understandable, and are the reason many of them put up their hands for the work, the deleterious side effects need to be recognized if there is to be any hope of managing them. If they are not recognized, the people affected will continue to suffer in silence, along with their close family and friends, while the rest of the world looks for the next business opportunity. That is the human cost of online content moderation.
As it works with employed staff, and not contractors or freelancers, oWorkers considers staff development to be its responsibility. Since online content moderation is a challenging task, staff are regularly rotated across the wide variety of projects oWorkers handles. This rotation supports both their psychological stability and their personal development.
Outsourcing moderation
As the volume of content has kept growing, the ability of platforms to manage it has kept shrinking. After all, how many people can one hire and deploy for moderating content? Besides, each additional resource hired costs money.
Many large organizations have opted to outsource the handling of moderation. It costs less, and psychological conditions are less likely to be recognized as medical, resulting in lower liability for employers. Moreover, the geographies to which this work is outsourced perhaps have more immediate and pressing concerns than the strict enforcement of labor laws and working conditions.
Additionally, the export income that accrues from such jobs is important to these geographies, resulting in greater tolerance for outsourcers bending the rules. And for the people doing the job, it is a welcome white-collar employment opportunity.
The outsourcers make a noise about the great working conditions they provide to outsourced workers and how well they look after them. Some of that might even be true. However, the unfortunate fact appears to be that while outsourcers are held accountable for the impact on workers in the developed world, those getting online content moderation done by people in less developed parts of the world seem to get away fairly cheaply. The affected workers may be left struggling with a cost to their health that manifests itself only years later. That is the human cost of online content moderation.
Clients of oWorkers note savings of close to 80% when they outsource their work to us. This is particularly true of clients from Western Europe and the US. They also appreciate the transparency in pricing, with the choice oWorkers typically offers between paying in dollars per unit of input and dollars per unit of output.
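To make the two pricing models concrete, here is a toy calculation in Python. The volumes and rates are hypothetical figures chosen purely for illustration, not actual oWorkers pricing: under input-based pricing the client pays for every item submitted, while under output-based pricing the client pays only for items actually processed.

```python
# Hypothetical figures for illustration only, not actual oWorkers rates.
items_submitted = 100_000   # items sent for moderation in a month
items_processed = 92_000    # items actually reviewed and dispositioned

rate_per_input = 0.020      # dollars per item submitted
rate_per_output = 0.021     # dollars per item processed

cost_input_based = items_submitted * rate_per_input    # $2,000.00
cost_output_based = items_processed * rate_per_output  # $1,932.00

print(f"Input-based pricing:  ${cost_input_based:,.2f}")
print(f"Output-based pricing: ${cost_output_based:,.2f}")
```

Which model comes out cheaper depends on how much of the submitted volume actually needs full processing, which is why having the choice between the two matters.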
Based on its presence in three locations, oWorkers can offer business continuity to clients who need it. For a service like online moderation, where content is being created around the clock, this can be very useful. Several unicorn marketplaces choose oWorkers as their outsourcing partner. We hope you will, too.