Do’s and Don’ts of crisis communication on UGC platforms
October 30, 2023 | UGC

A single moment of crisis can spiral into a reputational quagmire for UGC platforms if not properly managed, causing long-lasting damage to your brand and community trust. As anyone who’s been through such a baptism of fire knows, at the heart of any crisis management strategy is effective, clear communication — a principle thoroughly examined in our new eBook, The Challenge of Moderating Harmful Content in Real-Time.
Our new eBook spells out WebPurify’s crisis response model for when traumatic news events go viral on UGC platforms (aspects of which you can emulate in your own company, should you choose to build a Trust and Safety solution internally). It also stresses the critical role of communication in these high-stakes scenarios: technology is an invaluable tool, but it’s the human element, enhanced by effective communication, that really counts.
Clear communication is just one ingredient of an effective response, and in this article we dive deeper into the subject, presenting a comprehensive list of Do’s and Don’ts for crisis communication. These guidelines aren’t just about damage control; they’re about establishing a dialogue that engenders trust, demonstrates empathy, and commits to actionable solutions for the safety of your users and the integrity of your platform.
How you communicate during a crisis determines whether you emerge with your reputation intact or severely tarnished.
Do… Have Clear Community Guidelines
Make sure you have a set of platform policies that establish clear boundaries on what is allowed and what isn’t. Clear guidelines serve as the backbone of any effective content moderation process, enabling your users to understand what is expected.
Do… Try to Anticipate Crises
Use automated alerts or dedicated teams to monitor breaking news or trends that could lead to your platform users sharing harmful content en masse. Being prepared allows you to act swiftly, minimizing the negative impact on your community. In situations like these, even 15 minutes’ notice of an event can make all the difference.
Do… Establish a Chain of Command
Make sure your team’s roles and responsibilities are clear and established in advance to allow for quick decision-making during a crisis. A well-defined hierarchy facilitates streamlined communication and actions.
Do… Train Your Moderation Team Quickly
As you’ll discover in the eBook, in WebPurify’s model our training guides reach the QC team an average of 15 minutes after a crisis appears on our radar. These guides prepare content moderators for the types of content they should expect to see, which will likely bear the hallmarks or identifiers of specific viral imagery. Quick training that enables quick identification of violative content ensures your team is well equipped to tackle the crisis as it unfolds.
Do… Hold a Post-Crisis Debrief
After the crisis, hold a debrief to discuss what worked, what didn’t, and how to improve future responses. Learning from each incident enhances your future preparedness.
Don’t… Rely Solely on AI
While automated systems are crucial for moderating at scale, they can’t replace human judgment, especially in complex cases. News events in particular are a gray area, even more so on news platforms themselves, and AI alone can’t parse their nuance or communicate context to users. A hybrid content moderation model is therefore the most effective in these moments.
Don’t… Delay Communicating With Your Users
Avoid long delays in public communication; they erode trust and create confusion. By communicating your platform’s stance quickly, you remove ambiguity and can potentially stop some users from innocently sharing harmful content. When traumatic events unfold in the news, many people share the unsettling scenes on social media thinking they’re spreading the word, unaware of how the images or video may affect other users.
Keep your users and advertisers (where relevant) informed of what measures are being taken to protect them during a crisis, possibly through public statements or in-app notifications. Open communication helps to build trust and alleviate anxiety.
Don’t… Skip Mental Health Resources
Consider the mental health impact on your users and moderators alike, and implement support mechanisms to help mitigate psychological harm. Wellbeing is a critical factor in maintaining user engagement over the longer term; people don’t want to visit someplace that feels unsafe. Failing to provide supportive spaces or resources for affected individuals can have long-term negative effects, and emotional support complements content moderation. For more on how to protect your front-line staff from harm, check out our eBook on Best Practices for Mental Wellness in Content Moderation.
Offer resources or spaces where users can discuss their feelings about what’s transpired and seek help, such as links to mental health resources, helplines, or emergency response services. Online platforms can be incredible disseminators of life-saving information during crises, and provide a way to share support at scale.
Don’t… Neglect Looking Ahead
A crisis can occur more than once. Let’s be honest: it will. Failing to plan for future incidents is a missed opportunity for improvement, so always strive to refine your processes for the next challenge. Additionally, bear in mind that graphic imagery or other traumatic content might resurface on your platform around anniversaries of tragic events – it’s prudent to keep an eye out for that.
WebPurify’s crisis response model serves as an invaluable resource for UGC platforms looking to navigate the increasingly complicated world of online content moderation. For a more detailed look into this approach, download our new eBook today.