Ahead of the Live Service Gaming North America Summit 2025, we had the opportunity to speak with Carolyn Gudmundson, Director of Product at GGWP.

Carolyn shares her expert take on how studios can create strong engagement ecosystems that nurture player-led communities across multiple titles. She dives into the role of AI in proactive moderation, the importance of thoughtful onboarding, and strategies to build safe, inclusive, and enduring live service experiences.


1) What are the key elements of a successful engagement ecosystem that keeps players invested across multiple games in a portfolio, and how do community interactions play a role in that?

There’s a reason why live service games are multiplayer. The endless ways that we can interact with each other are what keep a game fresh and fun over long periods of time. While we can’t and shouldn’t try to overly control how players interact (after all, giving players space to be creative is what keeps things interesting), a key element in long-term success for any live service game is making sure that players are set up to have the best experience possible when interacting with each other.

Player interaction and the dynamics between players are often overlooked in terms of business impact because they're an inherently messier topic and harder to measure than something clear-cut like A/B testing a new button color, but they're worth digging into because the effects are significant. Especially when looking at players coming into a game for the first time, we see significantly lower retention rates among players who are exposed to toxicity in those early days.

2) What strategies have you seen work best for creating a player-driven ecosystem that supports both long-term engagement and a positive in-game culture?

When thinking about promoting positive in-game culture that drives long-term engagement, it’s important to think about the behaviors and interactions that enhance the gameplay experience rather than just focusing on the behaviors that detract from it.

Think about your game at its best, like that one match that becomes a cherished memory for everyone involved. From there, you can consider all of the different levers you have to maximize the chances of recreating that magic. How can the game design itself encourage positive play? What about the game lore? How do you onboard players so that they learn how to play in a way that makes the game more fun for everyone? New players may inadvertently act in ways that detract from other players’ experiences, so it helps to understand that learning curve and nudge them in the right direction with empathy. Ideally, a player should never need to read a lengthy set of rules, because they’ve been onboarded in a way that viscerally demonstrates the benefits of positive behavior.

3) How can AI tools and automation support the workflows of Live Ops and community teams in their efforts—whether it’s in content moderation, player support, or delivering more personalized engagement experiences?

One big shift that AI and automation have enabled is moving from a passive approach to moderation to a more proactive one. Back in the day, games relied solely on players to submit reports when other players behaved inappropriately. Player support teams could only wait to receive those reports, and often couldn’t take action on them because they lacked the information to verify the report or the time to investigate. That’s not just a bad experience for your moderation team, but a bad experience for players too. First, their game experience was negatively impacted by another player. Second, they had to take the time to file a report, likely only after the situation had escalated. Third, they felt revictimized when their report was ignored.

AI tools can help support a more effective approach by proactively looking at user behavior and flagging issues. When you can proactively detect early signs of toxicity, it’s often possible to nudge players back on the right path by giving them feedback about their behavior rather than waiting until it gets so bad that you need to penalize them with mutes or game suspensions/bans. If you’re waiting until things get bad enough that players are reporting the issue, the offending player has likely already harmed others’ experiences significantly or even caused other players to churn.
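
As a rough illustration of what "nudge first, penalize later" can look like in practice, here is a minimal sketch of a graduated-response policy. The toxicity score, thresholds, and action names are all hypothetical placeholders, not a description of GGWP's actual system.

```python
from enum import Enum

class Action(Enum):
    NONE = "no action"
    NUDGE = "friendly behavior reminder"
    MUTE = "temporary chat mute"
    SUSPEND = "game suspension"

def choose_intervention(toxicity_score: float, prior_nudges: int) -> Action:
    """Pick the lightest intervention likely to correct the behavior.

    `toxicity_score` is assumed to come from an upstream classifier (0.0-1.0);
    the thresholds and nudge counts here are illustrative only.
    """
    if toxicity_score < 0.3:
        return Action.NONE
    if toxicity_score < 0.7:
        # Early signs: give feedback before the behavior escalates.
        return Action.NUDGE if prior_nudges < 2 else Action.MUTE
    # Severe or repeated behavior warrants a stronger penalty.
    return Action.SUSPEND if prior_nudges >= 2 else Action.MUTE
```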

4) What are the key challenges in accurately identifying problematic behavior in real-time, and how can customized automation help moderators scale their impact?

Proactively detecting unwanted behavior is the first step. From there, player support teams can decide what to automate and what should be flagged for human review. The combination of AI and automation is so powerful because it can clear away all the noise that bogs down moderation teams and allow them to focus their time and energy on the cases that really need their attention. There are nuanced cases where not everyone will agree on what’s toxic or not, and we tend to focus on these examples because they’re interesting to talk about and debate. In reality, the bulk of toxic incidents aren’t so nuanced and interesting, and it’s helpful for mod teams to have an automated system they can trust to appropriately take action on clearly disruptive behavior.
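
To make the split between automation and human review concrete, here is a small hypothetical triage sketch: clear-cut, high-confidence categories are auto-actioned, low-confidence noise is dropped, and everything nuanced lands in a moderator queue. The categories and thresholds are placeholders, not a real configuration.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    player_id: str
    category: str      # e.g. "hate_speech", "spam", "borderline_banter"
    confidence: float  # classifier confidence, 0.0-1.0

def route(incident: Incident) -> str:
    """Route an incident to automated action, the review queue, or nowhere."""
    clear_cut = {"hate_speech", "spam", "threats"}
    if incident.category in clear_cut and incident.confidence >= 0.95:
        return "auto_action"    # trusted automated enforcement
    if incident.confidence <= 0.5:
        return "ignore"         # noise; don't bog down the moderation queue
    return "human_review"       # nuanced case needing moderator judgment
```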

When issues need human review, one of the biggest challenges is making sure a moderator has the context they need to make an informed decision. AI tools can organize and present relevant information that allows moderators to review issues more efficiently and effectively, like surfacing a player’s history of abusive behavior to give context to a recent incident, or flagging when a player might be abusing the reporting system to file fraudulent reports on other players.
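
One way to picture that context-gathering step is as a single bundle the review tool assembles for each incident. The fields below are assumptions about what such a bundle might contain, purely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ModeratorContext:
    """Everything a moderator might need to judge one incident in one view."""
    incident_id: str
    chat_excerpt: list[str]                    # messages around the flagged line
    prior_actions: list[str] = field(default_factory=list)  # past warnings, mutes
    abuse_history_score: float = 0.0           # rolling measure of past behavior
    reporter_flagged_for_fraud: bool = False   # reporter may be abusing reports

def build_context(incident_id: str) -> ModeratorContext:
    # In a real system these would be looked up from moderation data stores;
    # here they are stubbed to show the shape of the bundle.
    return ModeratorContext(
        incident_id=incident_id,
        chat_excerpt=["<surrounding messages>"],
        prior_actions=["prior warning"],
        abuse_history_score=0.4,
        reporter_flagged_for_fraud=False,
    )
```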

5) As many game studios develop products aimed at younger audiences—like Riot’s 2XKO—how can we ensure these are non-toxic and safe environments for players?

Younger audiences need extra care when it comes to ensuring a safe environment, and this is where the proactive approach to moderation becomes even more important. Especially for cases like child grooming, you can’t rely on someone to report that behavior, particularly when the target may not even know anything inappropriate is happening. By proactively monitoring for this type of behavior, you can stop it at the early signs before it escalates into a real-world threat. For example, if one player asks another player for their age, it may be a totally normal offhand question, but if that same player has asked ten other players for their age in the last week, it suddenly doesn’t feel normal anymore. Only a proactive system can catch something like that and flag it before it escalates.
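
A frequency check like that could be sketched roughly as follows. The one-week window, the threshold of ten, and the event shape are assumptions chosen to mirror the example above, not a description of any particular product.

```python
from collections import defaultdict
from datetime import datetime, timedelta

ASK_LIMIT = 10          # illustrative threshold, not a production policy
WINDOW = timedelta(days=7)

class AgeQuestionMonitor:
    def __init__(self):
        # asker_id -> list of (timestamp, target_id) events
        self._events = defaultdict(list)

    def record(self, asker_id: str, target_id: str, when: datetime) -> bool:
        """Record an 'asked for age' event; return True if the pattern looks risky."""
        events = self._events[asker_id]
        events.append((when, target_id))
        cutoff = when - WINDOW
        recent_targets = {target for ts, target in events if ts >= cutoff}
        # One question is normal; the same player asking many different
        # players within a week is the signal worth escalating to review.
        return len(recent_targets) >= ASK_LIMIT
```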

Overall, regardless of a player’s age, it’s important to make sure that toxicity doesn’t get in the way of having fun and connecting with others through playing games together.


Join Carolyn at the Live Service Gaming North America Summit! Download the agenda here and see what is lined up for this event.
