What is Proactive Moderation?

PROACTIVE VS. REACTIVE MODERATION

Many game studios and developers are aware of the toxicity and harassment happening in their games and have responded by putting what's known as "reactive moderation" measures in place. Usually, here's how that works:

  1. A studio sets up methods for players to report bad behavior.
  2. A player experiences toxic behavior and has to remember to file a report at a convenient time.
  3. The community or moderation teams at that studio then examine each report individually.
  4. If the report is accurate and indeed flags toxic behavior, the moderator then decides what action to take against the offending player.

This is slow, expensive, and inefficient.
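
As a rough illustration, here is what that report-driven flow looks like in code. This is a minimal Python sketch with entirely hypothetical names (PlayerReport, moderator_confirms_toxicity, apply_sanction), not any studio's actual system:

```python
from dataclasses import dataclass
from queue import Queue

# Hypothetical names throughout; no studio's real system is shown.

@dataclass
class PlayerReport:
    reporter_id: str
    offender_id: str
    description: str  # whatever the reporter remembered to write down

def moderator_confirms_toxicity(report: PlayerReport) -> bool:
    ...  # manual human judgment, with no evidence attached

def apply_sanction(offender_id: str) -> None:
    ...  # mute, suspend, or ban, per studio policy

def reactive_review_loop(reports: "Queue[PlayerReport]") -> None:
    # Each report waits until a human reviews it individually --
    # the per-report bottleneck described in the steps above.
    while not reports.empty():
        report = reports.get()
        if moderator_confirms_toxicity(report):
            apply_sanction(report.offender_id)
```

Every report costs an individual human review, and that review happens long after the incident itself.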

BUT WHAT IF THERE WERE A BETTER WAY?

Proactive moderation is all about looking for signs of toxicity as it happens.

Rather than relying on players to send incident reports, proactive voice moderation flags bad behavior as it happens and automatically captures the key data to escalate to moderators, enabling them to respond faster and more comprehensively to any unfolding toxicity.

Proactive moderation takes the onus off the player, helps moderation teams work more efficiently, and is cost-effective.
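
For contrast, here is a minimal Python sketch of the proactive flow just described, assuming a toxicity-scoring model is available. Every name here, and the 0.8 threshold, is illustrative only:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class VoiceClip:
    speaker_id: str
    audio: bytes
    timestamp: float

@dataclass
class Incident:
    clip: VoiceClip
    toxicity_score: float
    context: List[VoiceClip]  # surrounding clips, captured automatically

def on_voice_activity(
    clip: VoiceClip,
    recent_clips: List[VoiceClip],
    score_toxicity: Callable[[VoiceClip], float],
    escalate: Callable[[Incident], None],
) -> None:
    """Runs as the chat happens -- no player report required."""
    score = score_toxicity(clip)   # ML model, assumed to exist
    if score > 0.8:                # illustrative threshold
        # Capture the key data up front, so moderators see the
        # whole exchange instead of a bare accusation.
        escalate(Incident(clip, score, context=list(recent_clips)))
```

The key difference is in the last line: the evidence is gathered automatically at the moment of detection, so no one has to remember to file anything.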

IT'S THE WAY OF THE FUTURE.


TOXMOD IS GAMING'S ONLY VOICE-NATIVE MODERATION SOLUTION.

Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context.
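
The description above amounts to a two-stage pipeline: a fast triage pass over all voice chat, a deeper analysis of flagged conversations, and escalation with context. Here is a hedged sketch of that general design; it is not ToxMod's actual implementation or API, and every name is assumed:

```python
from typing import Callable, Iterable

def moderate_stream(
    clips: Iterable,                  # live voice-chat segments
    cheap_triage: Callable,           # fast pass: "worth a closer look?"
    deep_analysis: Callable,          # slower pass: weighs conversational nuance
    notify_moderators: Callable,      # escalation with supporting context
) -> None:
    for clip in clips:
        if not cheap_triage(clip):    # stage 1: most chat is benign,
            continue                  # so filter it out cheaply
        verdict = deep_analysis(clip) # stage 2: decide toxicity in context
        if verdict.is_toxic:
            # stage 3: give moderators the incident plus the context
            # they need to respond quickly and accurately
            notify_moderators(clip, verdict.context)
```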

Book Your Demo

