Moderators, also known as “web cleaners”

The growth of the web and the ever-increasing number of internet users have inevitably led to the publication of billions of pieces of content: reviews, comments, photos, and videos. Among them: fake news, trolling, depictions of sexual assault, insults, threats, physical violence…

All of it is content that moderators analyze and remove every day, for our own good. But who cares about theirs?

| Why do we need moderators?

The internet is undoubtedly a web woven across the world, one that allows us to communicate with anyone, anywhere. The web continues to develop in this direction, with social media pushing us daily to share content, whether visual or written.

This limitless web allows everyone to express themselves as they please. Freedom of expression? Of course, we say yes! However, this boundless communication space does not only encourage the most benevolent to share their thoughts, and we find ourselves confronted with sensitive and malicious content. As the saying goes, “One person’s freedom ends where another’s begins.” This well-known expression means that, in a community, we must know how to limit our own liberties so as not to encroach on those of others. Perhaps it is not SO well-known after all?

Everyone is affected by unregulated content: a user can be shocked by a violent video, a person can be openly harassed because of their appearance, a company can go bankrupt over unjustified negative comments on Google Reviews (it is well known that people are far more likely to take the time to complain than to leave a positive review). This is why moderators naturally found their place on the web. Originally, they were volunteer internet users who tried to sort through content. Today, being a moderator has become a full-time job, and a particularly grueling one.

| Post-traumatic stress, burnout, depression…

Moderators filter the content of a platform by applying its rules to protect users and their mental health. But who protects these web cleaners?

Imagine spending your day confronted with insulting, racist, or even child sexual abuse content. Could you cope, without taking to heart what others are capable of sharing publicly? Protecting moderators has become a major issue for their employers and for the platforms that depend on them. Testimonials abound from moderators who suffer recurring nightmares, chronic stress, and moral exhaustion. Unfortunately, there is still no fully satisfactory solution to spare them from content that no one should have to see.

Companies are making efforts and adopting new standards to support moderators as best they can: well-being monitoring, psychological assistance, and sports facilities.

| What can we do for them?

At Webhelp, more than 80 new initiatives were implemented in 2021, including 24/7 support and the WebHEALTH program focused on physical well-being. The results have been positive, with gains in both loyalty and productivity. For example, since the launch of a program encouraging employees to share their experiences, absenteeism related to mental health has fallen by 50%.

Protecting internet users still requires a great deal of human effort, but human moderation remains the best way to keep content on the internet in check. Discover our first video on the subject with Martin Le Bars – Solution Director, DCS/Webhelp.

Feel free to share this article!
"Moderators, also known as “web cleaners”"