Multilingual content moderation
Your User-Generated Content moderated in 20+ languages
SIMPLY, QUICKLY, EFFECTIVELY
with an AI TECHNOLOGY + DEDICATED HUMAN MODERATORS solution
Multilingual content moderation services
For all media & online communities today, content moderation is mandatory: internet trolls, fake news, racism, nudity… Keep your users happy and your content legal with multilingual content moderation services, 24/7.
If you are an online media outlet, online publisher or large online community in need of a multilingual content moderation company, you have knocked at the right door.
From image conformity to comment moderation, from profile checks to video upload moderation, we offer a solution combining AI technology with dedicated teams of highly efficient human moderators working 24/7 in 20+ languages: English, German, French, Italian, Spanish, Portuguese, Dutch, Polish, Finnish, Danish, Irish, Swedish, Ukrainian, Russian, Hungarian, Norwegian, Slovak, Croatian, Romanian, Turkish, Greek, Bulgarian, Hindi and Arabic (others on demand).
Some big names who trust us
for their content moderation outsourcing
- Italian UGC
CONTENT MODERATION SOLUTIONS
- Online press
- Social networks
- Online communities
- Online marketplaces
- Kids' platforms
OUR DELIVERY MODEL
We fully manage your operation from A to Z
Use our solution, combining AI technology with multilingual human moderators, within a few days to moderate:
- Obscene content, abusive or intimidating comments
- Libelous material
- Breach of copyright
- Child abuse
- Safety issues
- Off-topic comments
- Illegal items
- Nudity / porn
- Drugs / weapons / alcohol
- Celebrity-related problems
For a variety of moderation types including :
- Social media moderation
- Community moderation
- Online media moderation
- UGC moderation
On any type of content:
- Audio files
- Profiles (user names)
On any moderation PROCESS:
Per hour, with SLA & KPI agreements, or per unit
From 5 €/h to 13 €/h equivalent
From 5.5 $/h to 15 $/h equivalent
Depending on the size of the team, the types of content, and the volume and languages to be moderated
“Content is King” is what Bill Gates prophetically titled an essay he wrote in 1996 that was published on the Microsoft website. The phrase has been quoted countless times ever since. Here are a couple of quotes from that essay:
“Content is where I expect much of the real money will be made on the Internet, just as it was in broadcasting.”
“One of the exciting things about the Internet is that anyone with a PC and a modem can publish whatever content they can create. In a sense, the Internet is the multimedia equivalent of the photocopier. It allows material to be duplicated at low cost, no matter the size of the audience.”
At that time, producing content was the preserve of a few: publishing houses, writers and movie makers. The rest of us were consumers of that content. It was reasonably easy to trace content back to its producer.
A generation later, the world has changed. While content might still be king, all of us, apart from being consumers, are now also producers of content. We write blogs, create posts on various media, comment on the content of others, share pictures, and record and publish video calls. We do it in text, audio, image and video formats. Someone, somewhere, is doing it all the time. There is a bombardment of content from all over the world.
A lot of the content all of us are producing is being published on ‘open platforms’ where it can be consumed by the ‘public,’ and not just a defined and controlled set of users known to us.
Each piece of content affects each individual and each organisation in a different way. Some content could come across to some users as profane, some as vulgar, some as predatory, some as spreading fake news, some as insensitive, some as perpetrating fraud, some as hate-mongering and some as plainly illegal. Such content could not only lead to social and political issues, but also cause the platform and brand to fall foul of the law as well as of followers and customers.
Hence it needs to be managed, through a process of content moderation.
Content moderation, or just moderation, is the process employed by a platform or host or webmaster, to identify and regulate ‘offensive’ content. The process could vary from platform to platform and owner to owner and would involve creation of a set of rules that would guide the process of identification and regulation of the ‘offensive’ content.
The one thing that would be common amongst the platforms and site owners is that in most cases, the service is outsourced to a specialist agency, also known as a content moderation company.
Outsourcing being the choice of most, we can focus on selecting the right outsourcing company to partner with.
We will divide the process of evaluating companies offering a content moderation service into two sets of criteria:
- Criteria for evaluating their capability in content moderation services
- Criteria for evaluating their BPO credentials
Capability in content moderation services
As the heading of this section suggests, we will focus on criteria that will bolster, or undermine, an assessee company’s credentials regarding their ability to handle different aspects of content moderation services with ease and confidence.
Ability to handle different types of moderation
Platforms and businesses are likely to employ one type of service or another, as it is important for them to ensure they are not adversely impacted by inappropriate content they failed to detect. They could choose from the various types of services available, at times in combination, which is why it is important for a content moderation services company to be conversant with all of them. The most common types are:
Pre-moderation – Reviewing content before it is published and becomes visible. Provides greatest control but can be expensive and users are sometimes left guessing as to when their content will show up.
Post-moderation – Content becomes immediately visible, but moderators review it in a parallel queue and decide on action. While content can be removed, screenshots taken in the interim could defeat the purpose.
Reactive moderation – Relies on visitors flagging inappropriate content which, once flagged, gets reviewed by moderators and acted upon as deemed fit.
Distributed moderation – Employs a voting system from members to decide on whether members feel that the content is as per expectations of the community and permitted to stay, or would need to be addressed.
Automated moderation – Software tools assist human moderators in the process, such as word filters, which either flag or replace words and phrases identified as objectionable.
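As an illustration only, a word filter of the kind described above can be sketched in a few lines of Python. The blocklist entries here are placeholders, not a real list; a production system would load maintained, per-language lists and typically combine them with an ML classifier.

```python
import re

# Hypothetical blocklist for illustration; real deployments use
# maintained, per-language lists.
BLOCKLIST = {"spamword", "slur1"}

# One case-insensitive pattern matching any blocked word on word boundaries.
PATTERN = re.compile(
    r"\b(" + "|".join(map(re.escape, sorted(BLOCKLIST))) + r")\b",
    re.IGNORECASE,
)

def filter_comment(text: str) -> tuple[str, list[str]]:
    """Return the text with blocked words masked, plus the list of hits."""
    hits = [h.lower() for h in PATTERN.findall(text)]
    masked = PATTERN.sub(lambda m: "*" * len(m.group(0)), text)
    return masked, hits

masked, hits = filter_comment("This SPAMWORD comment is fine otherwise.")
# masked == "This ******** comment is fine otherwise."
# hits == ["spamword"]
```

A filter like this can either hold flagged items for human review (pre-moderation) or publish the masked version immediately (automated replacement), which is why the two approaches are often combined.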
oWorkers offers comprehensive moderation solutions with experience in setting up teams that facilitate brand and image management by clients. A testament to our acknowledged expertise is that clients often update their own guidelines on content moderation once they initiate discussions with us during their search for a moderation services company to outsource their work to.
While any single client will belong to only one industry, a vendor possessing multi-industry capability establishes credentials and familiarity with the variety of needs that different industries might throw up. Experience with multiple industries means exposure to, and a consequent ability to handle, a wider variety of situations and challenges.
oWorkers offers a team experienced in moderating content on a variety of platforms like Online Media, Gaming, Social Networks, Dating and Alliances, Online Communities, Online Marketplaces and Children's Platforms. The experience gained over many years of providing these services is available to the next client.
Ability to recognise and address offensiveness
This is where the ‘rubber meets the road,’ the ‘moment of truth’ when a moderator identifies offensive content and deals with it. Does the moderation services company have the wherewithal to identify and address offensiveness? On most occasions, a large business will have a well-defined policy that addresses most situations. But real life likes to throw curve balls. There will be situations not covered in any manual or SOP that, because of their nature, need to be handled immediately. This is where experience and prior exposure to a wide variety of situations come in handy.
Having moderated content for many years across different client engagements, oWorkers can proudly claim familiarity with, and ability to handle, the following types of offences:
- Obscene content
- Abusive or intimidating comments
- Defamatory content
- Libelous material
- Breach of copyright
- Child abuse
- Safety issues
- Off-topic comments
- Illegal items
- Nudity / porn
- Drugs / weapons / alcohol
- Celebrity-related issues like trolling
Like many other services, in a global world where businesses strive to cater to the requirements of customers around the world, multilingual capability is almost becoming a requirement for BPOs. After all, no business would like to go shopping for a new vendor each time it expands to cover a new geography with a different language. Getting a partner on board for a service is an onerous task, done with due care. Repeating the exercise for every new language might even negate the benefits of expansion.
From inception, oWorkers has been known for its multilingual capability, made possible in some part through its presence in three of the most popular geographies for content moderation services. It currently offers its services in over twenty of the most widely used languages in the world that include: English, German, French, Italian, Spanish, Portuguese, Dutch, Polish, Finnish, Danish, Irish, Swedish, Ukrainian, Russian, Hungarian, Norwegian, Slovak, Croatian, Romanian, Turkish, Greek, Bulgarian, Hindi and Arabic. More languages can be provided on demand.
Familiarity with content types to be moderated
Different types of websites and platforms will have differing moderation needs. Dating sites may differ from educational sites, and news platforms from gaming portals. Each will have its own unique need for a content moderation service and must be handled accordingly. A client may also need moderation across multiple platforms, since offering consumers an omni-channel service, giving customers the choice of how they wish to interact, is common practice.
Comments – Blogs and sites that invite open participation in the form of comments from visitors.
Reviews – Many sites are designed to invite feedback and comments, perhaps the most common being Ecommerce sites which invite user feedback and reviews for a variety of products. Apart from the risk of offensive content, they also run the risk of abuse by competitors.
Chats – Apart from person-to-person chats there are many collaboration platforms where communication flows fast and thick through in-product chats. Gaming perhaps is a good example where players chat while playing.
Forums (Fora) – Created with specific objectives, these can be hijacked by vested interests leading to a derailment of an otherwise healthy discussion.
Images – Do uploaded images adhere to guidelines? Are they likely to cause offense to some section of viewers?
Audio – Audio is just another file type today. Once recorded, it can be shared, played back, forwarded, etc. Audio uploaded to open sites needs to comply with standards of acceptability in general and the rules defined by the site owner in particular.
Video – The richer the content format, the greater the possibility of causing offense. Video, hence, is perhaps the most sensitive of media to possible abuse and thus needs to be handled with that much more care and attention.
Whatever the type of content, oWorkers' content moderation specialists are on hand to make the internet a pleasant experience for visitors and safer for sections at risk, like children. We use human expertise along with modern technologies, including live monitoring, to make each visit a positive experience. We ensure that no part of the content, whatever the media, goes unsupervised.
With a center in the Eurozone, with moderators whose life experiences are part of the shared experiences of the Western World, the cultural gap is eliminated leading to moderation of a high standard.
Every BPO has hundreds of moving parts that need to operate in consonance in order to deliver superior outcomes. The primary ones, that will combine with capability in moderation services to provide superior outcomes, are discussed in the ensuing paragraphs.
This is an essential component of any commercial engagement. One party, a content moderation services company in this case, provides a product or a service, and the other, the recipient of the product or service, pays for it, generating revenue for the provider. This is the reason a business exists.
In a B2B engagement, pricing is often a negotiated number since the basket of products and/ or services purchased is likely to be unique. The challenge that often arises is that pricing overshadows many other equally or more important elements of the contract since it is a tangible, quantifiable data point versus many other parts which are often subjective. It is recommended that pricing be considered after evaluation of the other elements of the contract has been completed.
oWorkers offers a transparent pricing mechanism, one that is even displayed on the website. Our content and social media moderation service is priced between 5 and 13 euros per hour (between 5.5 and 15 US dollars per hour). The exact price point is determined by the size of the team, the types of content, and the volume and languages to be moderated.
Business cannot wait. Each process forms an input into the next one. If one link in the chain moves slowly, it will cause the entire chain to move at the pace of the slowest link. The faster you process, the quicker the realisation of Revenue and more can be pushed through the processing machinery. In short, speed of processing is an essential requirement.
With centers in three global locations, oWorkers is able to offer the advantage of 24×7 delivery. For its global clients, it also provides speedy delivery taking advantage of the time difference with the client site. In many cases we are able to commit overnight delivery.
This is a wide term, and encompasses many elements; like Hiring, Training, Attrition Management, and the general welfare of employees. BPO services are driven by human resources. Let’s face it, if a process could have been automated, it would have been. BPO services are needed because many processes require human skill. From the BPO perspective, and perhaps counter-intuitively, such requirements are increasing even while efforts are on to automate processes that are currently manual. A social media moderation service provided by BPOs is a good example of such a requirement. A few years back, it did not exist.
Suffice it to say that the ability to manage all aspects that relate to the workforce is an essential ability for a provider to possess and display.
oWorkers can offer the flexibility to ramp up and down by 100 resources within 48 hours. This has been made possible by our deep engagement and active participation in the communities where we work. oWorkers is an aspirational employer for many, enabling us to attract a continuous supply of talent. Our training team takes new hires under its wing and trains them to become billable resources, enabling them to make a positive contribution to their families. Our attrition numbers are regularly better than most others in the industry, giving us an edge in knowledge retention and limiting our cost of hiring and training.
Technology and Data Security
It is in the interest of every BPO to ensure they deploy current technologies that ease their work and deliver better results, hand in hand with the highest levels of data security to protect the huge amounts of client data that they handle, made more complex by the need to permit employees to work from home due to the pandemic.
oWorkers’ partnerships with the best AI moderation tools help us get off to a quick start on most engagements, after which the human element comes in. The fact that 85% of our clients are technology companies keeps us on our toes. We operate from highly secure facilities with strict protocols for your data security. We are GDPR compliant and ISO 27001:2013 and ISO 9001:2015 certified. Our data security extends to work from home.
Management, Financial and Regulatory Stability
A content moderation services company will only be as effective as its management allows it to be; the activities of the company are governed by the leadership team. The company may be capable of delivering great value in a particular area, but unless the management shows an inclination to provide that service, it will not function smoothly, as the support needed from the rest of the organisation may not be forthcoming. Hence, assessing the commitment of the management team is important.
In addition, the company should ensure that it is on the right side of all legal requirements, as well as financially sound to provide an environment for the business units to operate with freedom.
oWorkers is a locally registered company in all its global centers. We pay local taxes as well as social taxes for our people. oWorkers has been a profitable company and takes on engagements after careful evaluation and discussion. Our management team comes with over twenty years of hands-on experience in the business.
A BPO services provider with a focus on data entry and related services, oWorkers delivers content moderation services to clients, with several unicorn marketplaces among them, across the globe. Many of our clients have reported savings of up to 80% after outsourcing to us. Like it has for many others, your search for great quality and quick turnaround at competitive pricing from a content moderation services company, should end with oWorkers.