December 4, 2025 - written by Sylvina Neri

After Russia, Mehdi Bensaid sets his sights on Roblox


As Roblox and Free Fire surge in popularity among young players, policymakers are rethinking how to balance creativity, community, and child safety online. Morocco’s Minister Mehdi Bensaid wants stricter oversight, while Russia just blocked Roblox over “extremism” and alleged LGBT “propaganda.” The debate underscores a messy intersection of digital rights, youth protection, and geopolitical narratives.

What’s driving the push for tighter gaming rules

Roblox and Free Fire are no longer just games; they’re sprawling digital ecosystems where millions of young players create, socialize, and compete. That scale has sparked a fresh wave of scrutiny. In Morocco, Minister of Youth, Culture and Communication Mehdi Bensaid has called for regulation to better control minors’ access to platforms like Roblox and Free Fire, citing exposure to inappropriate content and the need for protective guardrails. His stance reflects a broader unease: as user-generated worlds expand, so do the risks, from unmoderated chat rooms to unofficial experiences that slip past community guidelines.

Meanwhile, Russia escalated the conversation dramatically by announcing a block on Roblox, accusing it of hosting extremist content and spreading LGBT “propaganda.” Its media regulator, Roskomnadzor, argued the platform negatively impacts children’s “spiritual and moral development.” Supporters framed the move as essential to protect minors; critics saw it as another turn of the ratchet in the country’s tightening digital controls. Whether you agree or not, it’s a potent reminder that gaming moderation is now entangled with culture wars and state-level information policies.

So where does this leave players, parents, and developers? First, age-gating and real content moderation are no longer optional — they’re central to platform legitimacy. Roblox’s layered moderation (filters, human review, safety teams) and Garena’s in-game reporting tools matter, but the bar is rising. Clearer age verification, context-aware parental controls, transparent enforcement logs, and faster takedowns of harmful content are becoming table stakes. For regions with divergent norms, localized compliance teams can help adapt safety frameworks without flattening global communities.

Second, there’s a cost to blunt instruments. Overbroad restrictions risk punishing the majority of creators and players who use these platforms responsibly. They also push communities into harder-to-moderate spaces (private servers, encrypted chats, or unofficial mods) where harm can be more difficult to detect. The smart path is proportional, evidence-based regulation: publish risk audits, co-design youth safety standards with civil society, and mandate transparency reports detailing abuse patterns and mitigation outcomes.

Third, players deserve a voice. Youth councils, bug bounty-style safety programs, and creator charters can help platforms align incentives: reward safe design, penalize exploitative mechanics, and center well-being over engagement hacks. The best safety roadmap blends policy and product: nudges that surface age-appropriate experiences, friction against dodgy links or chats, and robust guardian dashboards.

For now, the debate around Roblox and Free Fire is bigger than any single title. It’s about how we govern digital playgrounds where childhood, creativity, and commerce collide. The challenge is to protect kids without erasing the communities that make these platforms worth playing.