Star Stable Online is an online horse adventure game played by hundreds of thousands of kids and teenagers every month. In a global community of players aged eight to 18, bullying is a persistent threat. We delivered moderation strategies and in-game messaging anchored in social-emotional learning (SEL) for an AI-led pilot that measurably helped reduce toxicity and nurture inclusivity.
Our recommendations contributed to an impressive 5% drop in toxicity. A world first, the pilot headlined the Fair Play Summit at one of the industry's biggest annual events, the Game Developers Conference.
Read more about the outcome here.
Star Stable Online's moderators responded decisively to toxicity, but they weren't nurturing the positive behaviors that build inclusive communities. We gave them communication tools anchored in SEL to do both.
These tools empower moderators in two ways. First, they help moderators identify the emotions behind harmful behaviors. Second, they suggest responses moderators can use to nudge players toward positive ones.
We applied this principle in an experimental pilot project in which an AI moderated the game's open chat. Players flagged by the AI for writing potentially harmful messages were shown our texts: non-judgmental prompts that encouraged reflection and more positive ways to share their thoughts.
By promoting empathy, we reduced toxicity and nurtured the skills players need to be good members of the Star Stable Online community.