Gaming addiction or moral panic? Why the real threat to players is poor moderation
June 5, 2025 | UGC

In April 2025, a video claiming that gaming is as addictive as drugs went viral across social media. Published by a group called Game Quitters, a community for “people who want to quit playing video games and get their life back,” the video featured popular influencers like Mr. Beast and claimed to expose the gaming industry’s darkest secrets. It ticked all the moral panic boxes. And just like clockwork, media outlets followed suit, warning parents to unplug the Xbox, urging policymakers to take note, and telling developers to brace for another cycle of public scrutiny.
Dr. Rachel Kowert has seen this play out too many times before.
“Games are just the latest in a long line of scapegoats for broader social anxieties,” says Dr. Kowert, a research psychologist, author, and founder of Psychgeist, a multimedia studio that explores the science of games and pop culture. Dr. Kowert is a global thought leader in trust and safety, mental health, and the psychology of play, and a visiting researcher at the University of Cambridge. “If something bad is happening, humans want a simple explanation. And gaming makes an easy target.”
But Dr. Kowert isn’t having it. In a conversation with Ailís Daly, Head of Trust & Safety for EMEA at WebPurify, an IntouchCX company, she unpacks why the addiction narrative persists, why it’s harmful, and what trust and safety teams should actually be focusing on instead.
The myth of gaming addiction
First, let’s get this out of the way: “Gaming addiction” is not a recognized clinical condition.
“It’s a really catchy phrase,” says Dr. Kowert, “but from the scientific perspective, it’s not something that’s been consistently found or reported in the research.”
That doesn’t mean there aren’t people who use games in a way that becomes problematic, just as someone might binge food, work, or social media as a coping mechanism. But that’s a far cry from the kind of compulsive, chemical dependence that defines addiction.
The harm in framing gaming as inherently addictive, Dr. Kowert argues, is that it misdiagnoses the issue. “If I have a child and I think he’s addicted to games, my first instinct might be to take the games away. But what if those games are actually his only bridge to connection, or a much-needed stress reliever?”
The games, she explains, are often not the root problem but a symptom. “Maybe it’s depression, or anxiety. Games become a coping tool. Taking that away might do more harm than good.”
Scapegoating games distracts from the real issues
Part of the reason the addiction narrative sticks is its simplicity. Media thrives on digestible villains. But that simplicity comes at a cost: it diverts attention from the actual trust and safety challenges inside gaming communities.
“When it comes to real harms in games, we should be talking about social harms,” says Dr. Kowert. Harassment, toxicity, hate speech. These are the things doing measurable damage, she argues.
And unlike addiction, which remains a contested category in psychology, the evidence for these social harms is overwhelming.
Good moderation is a business advantage
So what works? According to Dr. Kowert, the best strategies are multi-layered. Effective moderation doesn’t rely on a single tool or policy; it requires alignment between systems, people, and platform design.
“Historically, policy teams didn’t talk to product teams, and moderators weren’t trained based on platform-specific risks. But that’s changing,” she says. “We’re moving out of those silos.”
She points to a success story from Activision and voice moderation company Modulate. By integrating real-time voice moderation into Call of Duty, they significantly reduced disruptive behavior. Not only that, they grew their player base and improved retention.
“It’s a perfect case study in ROI,” she notes. “Too often, trust and safety initiatives are seen as cost centers. But good moderation is a smart business decision.”
AI can help, but it’s not a silver bullet
There’s growing excitement about AI-powered tools for content moderation, particularly in gaming, where speed and scale matter. But Dr. Kowert urges caution. If AI tools are used without human oversight, they may reinforce biases or make opaque decisions without recourse.
“AI shouldn’t be correcting its own homework,” she says. “We need humans in the loop.”
Dr. Kowert stresses the importance of context. While automated systems can flag slurs or detect keyword patterns, they still struggle to understand nuance, sarcasm, or the intent behind a phrase. Without that layer of human judgment, platforms risk enforcing moderation decisions that feel arbitrary or unfair.
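To make the “humans in the loop” point concrete, here is a minimal sketch in Python of how that routing might look. Everything in it is illustrative: the word lists, labels, and function names are hypothetical, not any particular vendor’s API. The key idea is that the automated layer only acts on unambiguous cases and sends anything context-dependent to a person.

```python
from dataclasses import dataclass
from typing import Literal

# Hypothetical decision labels: the automated layer only acts on unambiguous
# cases; anything uncertain is routed to a person instead of being auto-actioned.
Decision = Literal["allow", "remove", "human_review"]

@dataclass
class Message:
    player_id: str
    text: str
    game: str

BLOCKLIST = {"slur_example"}            # placeholder for unambiguous violations
WATCHLIST = {"kill", "trash", "report"} # placeholder for context-dependent terms

def triage(msg: Message) -> Decision:
    """First-pass automated triage: route, don't judge, the ambiguous cases."""
    words = set(msg.text.lower().split())
    if words & BLOCKLIST:
        return "remove"        # high-confidence violation, safe to automate
    if words & WATCHLIST:
        return "human_review"  # could be trash talk or a threat; a person decides
    return "allow"

# Anything the filter is unsure about lands in a queue for human moderators.
review_queue: list[Message] = []
msg = Message("p123", "I'm going to kill you next round", "example_shooter")
if triage(msg) == "human_review":
    review_queue.append(msg)
```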
Dr. Kowert also points out that fairness and safety are especially complex in gaming because player cultures vary so dramatically from one title to the next. “In some games, trash-talking or saying things like ‘I’m going to kill you’ is not only accepted but expected,” she says. “In others, that same behavior is completely unwelcome.”
This variation, she argues, makes it nearly impossible to set a universal moderation standard, and reinforces the need for human input.
“Social media might have more consistent community norms,” she explains. “But with games you need a culturally specific approach. What is acceptable in one gaming community may not be acceptable in another, and you need humans to understand the context behind what’s being said.”
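One way teams could express that culturally specific approach in practice is with per-title policy configuration. The sketch below is purely illustrative (the game names, categories, and rules are hypothetical): the same category of behavior maps to different actions depending on the community’s norms, with human review as the cautious default.

```python
# Hypothetical per-game moderation policies: the same behavior category maps to
# different actions depending on the community's norms, with human review as
# the cautious default when no explicit rule exists.
POLICIES = {
    "competitive_shooter": {
        "trash_talk": "allow",            # expected part of this game's culture
        "threat_phrases": "human_review", # often banter, sometimes not
        "hate_speech": "remove",
    },
    "kids_sandbox": {
        "trash_talk": "human_review",     # not part of this community's norms
        "threat_phrases": "remove",
        "hate_speech": "remove",
    },
}

def action_for(game: str, category: str) -> str:
    """Look up the community-specific rule, defaulting to human review."""
    return POLICIES.get(game, {}).get(category, "human_review")

print(action_for("competitive_shooter", "trash_talk"))  # allow
print(action_for("kids_sandbox", "trash_talk"))         # human_review
```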
Proactive enforcement is cultural, not just procedural
“You can’t apply a one-size-fits-all approach,” says Dr. Kowert. “Fairness and safety have to be balanced based on the game’s culture.”
That’s where proactive enforcement — moderation that prevents harm before it happens — comes in. One underrated tool? Community guidelines.
“We could do a lot more with guidelines,” she says. “They should set the tone for what kind of culture you’re trying to create. And they should be communicated in a way that players actually absorb.”
Gaming companies, she suggests, could even use game mechanics to teach community norms. “Why not integrate it into the gameplay itself?”
Players need feedback and transparency
One of the most overlooked aspects of content moderation in gaming is feedback. Players want to know their actions matter. If they report someone for harassment, they want to know what happened.
Some platforms are starting to get this right. Dr. Kowert highlights Marvel Rivals, which sends users a message not just confirming their report, but explaining what action was taken.
“That’s a game-changer. It builds trust, and it encourages players to keep using the tools.”
Transparency, she says, is fundamental. Without it, people assume that nothing is happening behind the scenes, which erodes confidence in the system.
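As a rough illustration of that feedback loop (a sketch, not how Marvel Rivals or any specific platform implements it), closing a report could trigger a short, outcome-specific message back to the player who filed it, rather than a silent acknowledgement:

```python
from dataclasses import dataclass

@dataclass
class Report:
    report_id: str
    reporter_id: str
    outcome: str  # e.g. "temporary_mute" or "no_violation_found"

def outcome_message(report: Report) -> str:
    """Build a player-facing message that says what actually happened,
    rather than a silent or generic acknowledgement."""
    templates = {
        "temporary_mute": (
            "Thanks for your report. We reviewed it and the player "
            "has been temporarily muted."
        ),
        "no_violation_found": (
            "Thanks for your report. We reviewed it and didn't find "
            "a policy violation this time."
        ),
    }
    return templates.get(
        report.outcome, "Thanks for your report. Our team has reviewed it."
    )

# Usage: send this through whatever in-game notification channel the platform has.
print(outcome_message(Report("r42", "p123", "temporary_mute")))
```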
Why this narrative needs to change
Every time a public tragedy strikes, the media combs through the perpetrator’s past looking for a digital scapegoat. Inevitably, someone discovers they once played a violent video game. Cue the headlines.
“It’s gotten farcical,” says Dr. Kowert. “When the CEO of United Healthcare was shot, the headline was that the shooter played Among Us. It’s a cartoonish game for 10-year-olds. But that’s where the media wanted to go.”
Her frustration is palpable, not just as a scientist, but as someone who sees how these narratives harm players, parents, and developers alike.
If she could rewrite the narrative around games and mental health, Dr. Kowert says it would be this: Games are overwhelmingly positive, or at worst, neutral. Let’s stop pretending they’re society’s boogeymen.
What gives her hope
Despite the misinformation, despite the clickbait, Dr. Kowert remains optimistic. “The people working in trust and safety are some of the best people I’ve ever met. They genuinely want to make things better.”
That collective energy is what makes her hopeful for the future of gaming communities.
“We’re finally having the right conversations. We just need to make sure the public is listening.”
Let’s change the conversation
At WebPurify, we believe that protecting players means moving beyond headlines and panic. It means building moderation systems that reflect real community norms, combining smart AI with human insight, and giving players the transparency they deserve.
Want to explore how your platform can take a proactive, culturally aware approach to trust and safety? Get in touch with us or the team at IntouchCX to learn more.