Moderation under Pressure
The rules are tightening for social media giants. In early June, the French government summoned Meta, Snapchat, TikTok, Twitch, YouTube, and X to a strategic meeting. Led by Aurore Bergé – Minister Delegate for Gender Equality and the Fight against Discrimination – and attended by ARCOM, the DGPN (General Directorate of the National Police), and Clara Chappaz, Minister Delegate for Artificial Intelligence and Digital Affairs, the meeting called on the platforms to explain the rise of hateful content online and account for their moderation efforts. In 2024, online hate surged by 16%, prompting the government to act. These new measures mark a turning point and introduce fresh challenges for moderation stakeholders.

| Legal Context
Since the adoption of the Digital Services Act (DSA) at the European level in 2022, major digital platforms operating in Europe have been subject to strengthened obligations in the fight against illegal content (hate speech, harassment, disinformation). The key solution for limiting such content? Moderation. Within this framework, the GDPR governs the processing of personal data. As a result, social media platforms are required to implement effective systems to detect and remove illegal content – while also upholding freedom of expression.
| Breakdown of the Moderation Obligations
The DSA imposes penalties if a platform fails to remove clearly illegal content within a reasonable timeframe (typically 24 hours after it is reported). These penalties can reach up to 6% of the platform’s global annual turnover.
But today, the stakes go beyond legal compliance. Ministers are demanding full transparency on the platforms’ internal practices:
- How many French-speaking moderators are employed?
- How many violations are tolerated before an account is suspended?
- What proportion of moderation is done by humans?
- How do recommendation algorithms work?
Platforms have until July 14 to provide clear written answers to these questions. Another hearing is already scheduled to assess progress and guide further action.
Beyond financial penalties, the real risk now is reputational: a lack of transparency or responsiveness can seriously damage the credibility of a brand, whether it’s a platform or an advertiser.
| Implications for Moderation Professionals
This regulatory tightening is reshaping expectations and practices around moderation:
- Increased content regulation demands robust and compliant processes.
- For brands, moderation is now a matter of both reputation and legal conformity.
- The sector requires hybrid skill sets: legal, linguistic, and technical.
- Moderation systems must operate 24/7 and in multiple languages.
- A dual approach is key: AI technology + human oversight (sketched below).
As a result, platforms can no longer handle it all internally: specialized outsourcing is becoming essential.
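To make the dual approach concrete, here is a minimal sketch of a hybrid triage flow: an AI risk score resolves clear-cut cases automatically, while ambiguous content is queued for a human moderator, with the 24-hour handling window from the DSA tracked per reported item. All names, thresholds, and field choices are illustrative assumptions, not a description of any particular platform’s system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum


class Decision(Enum):
    REMOVE = "remove"        # clearly illegal: take down automatically
    APPROVE = "approve"      # clearly benign: keep online without review
    HUMAN_REVIEW = "review"  # ambiguous: escalate to a human moderator


@dataclass
class ModerationItem:
    content_id: str
    text: str
    language: str
    reported_at: datetime
    # The DSA expectation cited above: clearly illegal content should
    # typically be handled within 24 hours of being reported.
    sla_deadline: datetime = field(init=False)

    def __post_init__(self) -> None:
        self.sla_deadline = self.reported_at + timedelta(hours=24)


def triage(item: ModerationItem, risk_score: float) -> Decision:
    """Route content based on an AI risk score in [0, 1].

    Thresholds are illustrative, not normative: high-confidence cases
    are resolved automatically, and everything in between goes to a
    human moderator – the "AI technology + human oversight" approach.
    """
    if risk_score >= 0.95:
        return Decision.REMOVE
    if risk_score <= 0.10:
        return Decision.APPROVE
    return Decision.HUMAN_REVIEW


if __name__ == "__main__":
    item = ModerationItem(
        content_id="post-42",
        text="...",  # the flagged user content would go here
        language="fr",
        reported_at=datetime.now(),
    )
    # In production the score would come from a trained classifier;
    # it is hard-coded here only to show the routing logic.
    print(triage(item, risk_score=0.62))  # -> Decision.HUMAN_REVIEW
    print("review before:", item.sla_deadline.isoformat())
```

In a real deployment, the thresholds would plausibly be tuned per language and per policy category, and the human review queue would be prioritized by how close each item is to its deadline – which is where 24/7, multilingual moderation teams come in.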
| And that’s exactly where Netino by Concentrix steps in…
As government pressure ramps up, Netino by Concentrix remains your trusted partner for deploying large-scale human and hybrid moderation solutions.
In a digital world under intense scrutiny, moderation is no longer a “nice-to-have” – it’s a prerequisite for online existence.
Feel free to share this article!
"Moderation under Pressure"