Modulate x TrustCon 2023

We had a great time at TrustCon 2023! CEO and co-founder Mike Pappas led a presentation on building effective (and enforceable) codes of conduct. Plus, we were joined by IntouchCX for a panel discussion about foundational practices in game design and development that encourage pro-social behaviors in gaming communities.


TrustCon 2023

It’s Time to Get Real in Gaming: Solving the Challenges of Moderation for Real-Time Player Support

 Tuesday, July 11, 3 pm PDT 
 Ballroom B 

On day one of TrustCon, CEO and co-founder Mike Pappas spoke in a panel discussion highlighting the vital importance of trust, safety, and inclusivity in game development and design as the boundaries between gaming and social platforms blur. Speakers Suzannah Fischer, Nyetta Jackson, Mike Pappas, Joi Podgorny, and David Vinson explored key themes:

  1. The Growing Symbiosis: Gaming and social worlds are increasingly intertwined, driving constant shifts in demographics and player profiles across age groups and platforms. We are seeing younger and younger players in games, especially in VR spaces.
  2. In-Game Safety: Real-time gaming experiences pose challenges for ensuring player safety, demanding innovative technological solutions to combat real-time harm effectively. Relying on player-generated reports of abuse or harm in games is often too slow or inaccurate; instead, we need real-time solutions powered by emerging technology.
  3. Regulation and Transparency: Increased visibility and accountability through regulation and transparency reporting are essential for creating safe gaming spaces that prioritize user safety. This means announcing regular updates to your stakeholders on the rates of problematic behavior on your platform and sharing progress updates on implementing solutions.
  4. Safety-by-Design: Incorporating safety measures into game development is crucial for building positive user experiences and fostering community loyalty as the industry expands into new Web3 and spatial experiences. Why wait until you've launched a game before implementing moderation practices and solutions? Building a game with safety mechanisms already in place helps to set expectations within internal moderation and trust & safety teams that users' safety and wellbeing is a priority, rather than an afterthought.

Moderator

David Vinson

David is the Director of Research, Product and Innovation at IntouchCX. With a PhD in Cognitive and Information Sciences, he has extensive experience in developing intelligent technology solutions that balance productivity and risk to ensure employee and customer success and well-being. He has published numerous articles and patents in the areas of behavior, data, and the mind. Currently, he focuses on implementing technology solutions that enable and protect communities and employees.

Panelists

Mike Pappas

Mike Pappas is the CEO and co-founder of Modulate, which builds ToxMod to detect hate speech, bullying, harassment, and other forms of toxicity in game voice chats using machine learning. Mike’s work at Modulate ranges from developing new partnerships with platforms looking to protect their players, to monitoring industry and regulatory trends, to reinforcing an internal and external culture of passion, respect, and personal growth.

Paula Kennedy Garcia

Paula has a robust 25-year industry career in CX. In her current role as EVP of Innovation and Product Strategy at IntouchCX, she is leading the charge for next-generation solutions and transformation, where R&D at IntouchNXT focuses on leading from the future to revolutionize experiences across all CX and Trust & Safety pathways.

Nyetta Jackson

As EVP of Client Solutions and Head of Player Experience at IntouchCX, Nyetta works with clients to help them achieve their goal of providing their players and customers with the best experience through innovation and strategic guidance.

Joi Podgorny

Joi is a Trust & Safety expert and Metaverse veteran who has spent the better part of the past two decades working on the bleeding edge of the technology and entertainment industries, from product management to content/brand development to leading international data privacy compliance, community management, and social media teams. Joi is currently working with OpenWeb as the GM of Trust & Safety, helping publisher partners reach their goals with OpenWeb's products while helping define, create, and promote an internet with healthier conversations.



TrustCon 2023

Defining the Space: Making the most of your Code of Conduct

 Thursday, July 13, 11 am PDT 
 Boardroom C 

Crafting an effective Code of Conduct is a complex endeavor. It involves defining the boundaries between harmless fun and genuine harm, taking into account diverse cultural norms and individual perspectives. Moreover, Codes of Conduct carry legal weight and must adhere to various regulations such as the EU's DSA, the UK Online Safety Bill, COPPA, and GDPR/CCPA.

In this workshop, Modulate CEO and co-founder Mike Pappas shared valuable insights and tactics for designing an impactful Code of Conduct. The conversation emphasized the need for collaboration, education, and careful consideration in designing effective Codes of Conduct to foster safe and inclusive gaming environments. Key insights and discussion points included:

  1. Audience Understanding: Codes of Conduct should cater to a broad audience, including regulators, legal departments, users, and internal teams, ensuring that all stakeholders comprehend the expectations.
  2. Balancing Transparency and Safety: While transparency is crucial, revealing certain information, such as user reports, must be carefully managed to avoid potential harm, as seen in online dating apps.
  3. Cross-Industry Perspectives: Learning from diverse industries like regulation, technology, travel, and media enriches the understanding of regulation and user conduct. Notably, gaming Codes of Conduct prioritize user-to-user interactions, while other industries focus on safeguarding customer support teams from community abuse.


ToxMod is gaming's only voice-native moderation solution.

Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context.

Book Your Demo



Sign up for Trust & Safety Lately

Stay up to date with the latest news, regulations, and industry trends that will impact Trust & Safety in games, from the experts at Modulate. Sign up today!