On The Road To Distinguish Content Moderators From Community Managers

Created with freedom of speech in mind, the internet paved the way for growing digital communities to communicate on a global scale with small and large groups of individuals and organizations. At the same time, digital platforms made room for hate speech, child pornography, cyberbullying, and violence to spread with almost no control. The social networks where this kind of speech still unfolds took too long to act, but they are now coming to grips with it.

As more people and organizations began using online platforms, a whole new world of digital and social media careers flourished. The wish and need to develop and nurture communities led to the creation of the community manager role. In turn, social platforms started hiring content moderators to remove abusive content, rein in trolls, and scan user-generated content for compliance with the organization’s acceptable use policy.

Different Roles Mixed Up

However, how often do recruiters and candidates searching for content moderation positions come across job ads that also list community manager functions? And how often do community manager positions actually involve content moderation responsibilities? The mix-up between the two roles (and other digital occupations) remains deep-seated.

To add to the confusion, or simply to project a more appealing job title, social networks created Trust and Safety divisions staffed with Trust and Safety professionals who perform both content moderation and community management duties.

Whether a business aims to invest in content moderation, community management, or both, the differences between the two roles should be clear. Business analysts and sales professionals who sell content moderation and community management services should know the distinction as well. Otherwise, when managing departments that cover these functions and hiring professionals to perform them, businesses and third parties may get results different from what they expected. The clarification matters even more within large companies, including business process outsourcing (BPO) firms, where silos that obstruct communication, business processes, and productivity are likely to exist.

Growing Need for Both Content Moderators and Community Managers

Social platforms and messaging apps have faced numerous controversies, such as videos promoting terrorist attacks hosted on a video platform, fake accounts disrupting the US presidential elections, and viral misinformation spreading on a messaging app during Brazil’s presidential elections, which mass media then disseminated without verification.

These controversies, along with an array of other issues, showed the public and legislators a digital world where fraud, violence, and insecurity are rampant. Social platforms have since been prompted to restructure their moderation features and processes.

How are content moderators supposed to keep a digital community healthy?

Content moderators look for inappropriate content, delete or modify posts that violate the rules, and, when necessary, suspend or delete offenders’ accounts.
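As a rough illustration, the kind of decision described above can be sketched in code. Everything below (the action names and the decision rules) is an illustrative assumption, not any platform’s actual policy engine.

```python
# Minimal sketch of a moderator's decision, assuming hypothetical action
# names and rules. Real policies are far more nuanced than this.
from enum import Enum, auto


class Action(Enum):
    APPROVE = auto()
    EDIT_POST = auto()       # modify the post, e.g. redact the offending part
    DELETE_POST = auto()
    DELETE_ACCOUNT = auto()  # reserved for repeat or severe offenders


def decide_action(violates_policy: bool,
                  repeat_offender: bool,
                  fixable_by_edit: bool) -> Action:
    """Map a moderator's findings about a post to one of the actions above."""
    if not violates_policy:
        return Action.APPROVE
    if repeat_offender:
        return Action.DELETE_ACCOUNT
    return Action.EDIT_POST if fixable_by_edit else Action.DELETE_POST
```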

As digital users become more tribal, they are expected to favor simpler, niche digital platforms where members decide who joins the group. In that case, the community manager’s role will become all the more important.

The community manager sets up community guidelines, develops membership outreach programs, facilitates the integration of the community they promote and represent, and manages disputes. They also analyze data and produce reports, monitor community content about the brand, and suggest opportunities based on that monitoring.

The BPOs’ Role in the Midst of Digital Services Change

Mass media criticism of major social platforms for failing to curate the content published on their networks is growing. Additionally, international leaders and legislators have been passing laws that hold tech giants criminally liable when they fail to take prompt action against hateful and violent content.

Australia is the most recent landmark in international legislation affecting social media content: its parliament passed a bill in April 2019 that makes tech platforms criminally liable for failing to remove violent video and audio promptly. The bill came after an Australian gunman live-streamed himself shooting worshippers in two mosques in Christchurch, New Zealand. Canada, the US, and New Zealand are considering similar action.

This situation makes it necessary for social platforms, and for businesses that rely on content moderation, to deploy up-to-date artificial intelligence tools to screen content, along with a larger workforce of human moderators.
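For illustration, the division of labor between an AI screening tool and human moderators can be sketched as a simple routing step. The function names, thresholds, and toy scorer below are hypothetical assumptions standing in for a real classifier and review queue, not any platform’s actual pipeline.

```python
# Hypothetical sketch: an AI model pre-scores posts and routes the
# uncertain ones to human moderators. All names and thresholds are
# illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class Post:
    post_id: str
    text: str


def route_for_review(posts: List[Post],
                     score_fn: Callable[[str], float],
                     auto_remove_threshold: float = 0.95,
                     review_threshold: float = 0.60
                     ) -> Tuple[List[Post], List[Post], List[Post]]:
    """Split posts into auto-removed, human-review, and approved buckets.

    score_fn is assumed to return the probability that a post violates
    the acceptable use policy (e.g. the output of a text classifier).
    """
    auto_removed, needs_review, approved = [], [], []
    for post in posts:
        score = score_fn(post.text)
        if score >= auto_remove_threshold:
            auto_removed.append(post)    # clear-cut violations
        elif score >= review_threshold:
            needs_review.append(post)    # queued for a human moderator
        else:
            approved.append(post)        # published without review
    return auto_removed, needs_review, approved


# Toy keyword-based scorer standing in for a real model.
def toy_score(text: str) -> float:
    return 0.9 if "buy followers now" in text.lower() else 0.1


if __name__ == "__main__":
    sample = [Post("1", "Great event last night!"),
              Post("2", "BUY FOLLOWERS NOW cheap")]
    removed, review, ok = route_for_review(sample, toy_score)
    print(len(removed), len(review), len(ok))  # -> 0 1 1
```

The point of such a split is that automation handles the unambiguous cases at scale, while human moderators, and the larger workforce mentioned above, focus on the content that requires judgment.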

Faced with these needs, a few BPOs are providing moderation services to major social platforms and brands. Beyond the platform giants, businesses have also been contracting third-party services to look for and act against inappropriate content on the pages they run. The BPOs’ role in explaining to their clients the relevance of both community management and content moderation is critical to properly measuring return on investment and matching client and staff expectations, especially within this changing scenario.