
Best Practices for Trust and Safety Officers: An Interview with the TSPA’s Co-Founders

November 4, 2020 | Image Moderation, Profanity Filter, Video Moderation, UGC


The Trust and Safety Officer

The trend toward doing business online, and the need to address the safety and security issues it brings, has led to the emergence of a role that until recently operated behind the scenes: the trust & safety officer. This individual is typically tasked with helping to design and implement policies that give consumers confidence that their experience online will be a safe one, free from fraudulent transactions and unpleasant or potentially dangerous interactions.

With the position of trust and safety officer growing rapidly in recent years, guidelines for the role are not always easy to find. Those filling it may find it challenging to determine which policies to implement and which approaches to take, in conjunction with any guidelines established by the enterprise they work for.

This article is intended to help address this gap by reviewing some of the roles and responsibilities of modern trust & safety officers and featuring an interview with Adelin Cai and Clara Tsao, co-founders of the Trust & Safety Professional Association (TSPA). Adelin and Clara use their extensive knowledge in the field to discuss a number of issues commonly experienced by T&S officers along with their thoughts on how to handle them.

T&S Initiatives and T&S Roles

The tremendous popularity of doing business online in recent years has been driven by technological developments such as increased computing power, along with changing consumer behavior. Given the challenges involved in monitoring and policing consumer activity online, companies that do significant business online have increasingly developed trust & safety initiatives to address them.

The logic behind attending to these issues is compelling. As this article contends, by helping their customers avoid harms ranging from fraud to exposure to objectionable material, companies protect themselves from the damage to growth, and in the worst case to viability, that can follow incidents which compromise their trust & safety defenses.

Here at WebPurify, the officer we most commonly work with to address these issues is the “Head of Trust and Safety.” Beyond the general trust and safety officer, however, a variety of more specialized roles can be considered to fall under the trust and safety heading.

Some of the roles featured include:

  • Escalation agent: Escalates incidents which present a threat to the safe functioning of a company’s online operations.
  • Investigator/threat analyst: Looks into threats related to a company’s operations, for instance, threats of attacks on the facilities of a company running public events such as concerts or sporting events.
  • Law enforcement response: Interfaces with law enforcement to resolve incidents and to handle legal requests.
  • Trust and safety engineer: This position typically refers to software engineers tasked with designing anti-fraud systems and the tooling that enforces policies protecting a company’s customers.
  • Fraud analyst: Data analysts who search through company data for signs of fraudulent activity, flagging suspicious patterns that are then slated for further investigation.
  • Product risk specialist: This position focuses on the risks presented by a company’s products in terms of fraud, or customer exposure to unpleasant material. For instance, if a company offers a live streaming service, or other product that allows user generated content (UGC), this product presents trust and safety issues.

While trust & safety positions span a wide spectrum of roles, a main driver for their creation has been the dramatic increase in UGC in recent years. While such content helps drive consumer engagement with a brand and its products, it also raises the risk of exposing a company’s customers to objectionable content, whether pornographic, excessively violent, or hate-driven material.

Responsibilities of Trust and Safety Officers

Devising policies and procedures to prevent such material from being uploaded to or created on company platforms is typically a major responsibility of trust & safety professionals today. A variety of methods can be used to accomplish this. Software tools employing advanced profanity filters and artificial intelligence can screen for hate speech, violence, pornography, and many other concerning types of content. While software of this type is effective as a first layer of defense, however, human monitoring is often required for comprehensive protection, given the ingenuity that hackers and other bad actors bring to circumventing automated content controls.

Human content moderators, for instance, are often employed by companies that allow user-submitted images, videos, or comments on their platforms: they remove offensive material, block repeat offenders, and approve comments on platforms where approval is required before a comment is published.
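To make this two-layer approach concrete, here is a minimal sketch in Python of how an automated filter might act on high-confidence results itself while routing ambiguous items to a human review queue. Everything in it is hypothetical and for illustration only: the `check_text` placeholder stands in for a real profanity filter, trained classifier, or third-party moderation API, and the thresholds are arbitrary.

```python
from dataclasses import dataclass, field

@dataclass
class ModerationResult:
    label: str         # e.g. "profanity", "hate", or "clean"
    confidence: float  # classifier confidence, 0.0-1.0

@dataclass
class ReviewQueue:
    """Holds items the automated layer could not decide on its own."""
    items: list = field(default_factory=list)

    def add(self, content: str, result: ModerationResult) -> None:
        self.items.append((content, result))

def check_text(content: str) -> ModerationResult:
    # Placeholder for the automated first layer: in practice this would be
    # a profanity filter, a trained model, or a third-party moderation
    # service, not a hard-coded word list.
    banned = {"badword"}
    if any(word in content.lower() for word in banned):
        return ModerationResult("profanity", 0.95)
    return ModerationResult("clean", 0.85)

def moderate(content: str, queue: ReviewQueue,
             block_at: float = 0.9, approve_at: float = 0.7) -> str:
    """Act automatically only on high-confidence results; escalate the rest."""
    result = check_text(content)
    if result.label != "clean" and result.confidence >= block_at:
        return "blocked"        # clear violation: remove automatically
    if result.label == "clean" and result.confidence >= approve_at:
        return "approved"       # clearly fine: publish
    queue.add(content, result)  # ambiguous: route to a human moderator
    return "pending_review"

# Example usage:
queue = ReviewQueue()
print(moderate("hello world", queue))         # approved
print(moderate("badword in comment", queue))  # blocked
```

The design choice that matters here is the pair of confidence thresholds: automation handles only the clear-cut cases at either end, while everything in between goes to a person, mirroring the first-layer/second-layer division described above.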

Today, companies such as Twitter, Upwork, eBay, Airbnb, and many more have “Trust and Safety” initiatives.

An Interview with the Leaders of the Trust & Safety Professional Association

As the industry has grown, a professional group representing trust and safety officers has emerged. The Trust & Safety Professional Association was formed to support the global community of professionals charged with developing and enforcing principles and policies defining acceptable behavior and content online.

TSPA serves as a forum enabling professionals in this field to connect with their peers, locate resources for career development, and exchange best practices that can be used to navigate the unique challenges facing the profession.

To learn more about TSPA and its leaders’ thoughts on current trends in the industry, we interviewed co-founders Adelin Cai, Chair, and Clara Tsao, Interim Co-Executive Director.

How did you become involved with the Trust and Safety industry?

Clara: I first encountered trust and safety and content moderation while working at Microsoft, where I led a partnership focused on digital literacy and human rights in Myanmar. During this experience, I worked with community-based organizations and saw firsthand how language barriers in content moderation led to rampant online disinformation, offline chaos, violence, and eventually ethnic genocide. Years later, I joined the US Government, where I worked with several trust and safety teams focused on coordinating responses to a range of emerging threats, including terrorist use of the internet, foreign influence operations, and election security.

Adelin: I ended up working in tech by accident, as a result of graduating with an advanced degree in the middle of the last recession. I went from contracting with Google to working full time on the Sales team, then transitioned into an advertising and merchant policy role.

How would you define the responsibilities/duties of a modern Trust & Safety officer? Have these responsibilities evolved since you began in the field? If so, how?

Clara: We define trust and safety professionals as the people who determine acceptable behavior and content online. Many people have the misconception that trust and safety is focused only on content, what stays up and what gets taken down. However, as technology has become more sophisticated, the “behavior” of malicious actors has become even more important, alongside “who” these malicious actors are. In the case of Russian influence operations, more and more companies have had to evaluate where content is coming from, along with indicators like “country of origin,” which is especially important in proving out foreign influence operations and justifying takedowns. At Facebook, for example, the rise in behavior-based abuse has led teams to counter not just content but also “coordinated inauthentic behavior.” Additionally, while trust and safety teams have often spun out of traditional customer support or legal/policy functions, in recent years there has been an increase in engineers and product managers working under “trust and safety” in an earlier, more proactive way, preventing bad behavior from happening in the first place through safer product design.

Adelin: At a high level, the responsibility of a trust and safety professional is to think about the ways in which products can be misused or abused, and then figure out ways to prevent or enforce against that abuse. I don’t think this core responsibility has changed since I’ve worked in the field. What has changed is the way teams think about how they can prevent or fight abuse. Tooling and proportional enforcement have become more sophisticated over the years, and more thought goes into combating abuse beyond a binary “leave up”/“take down” framework. I think product teams are slowly becoming more willing to integrate trust and safety enforcement into the overall product experience, rather than treating it as a post-abuse punishment.

What do you see as some of the most important factors or issues impacting the T&S sector currently?

Clara: For trust and safety professionals working in content moderation, the COVID pandemic and “shelter-in-place” orders have had a huge impact on the ability to safely moderate certain content at home. Many companies have strict privacy restrictions to protect the identity of users, making remote moderation more difficult. Additionally, many professionals may not want to expose their loved ones or family members to toxic or potentially graphic content, especially when they are at home. Mental health and resilience are also much harder to monitor for remote teams, which may spend hours working nonstop given the increased number of users online. According to one recent Wall Street Journal article, Americans are now spending an average of 16 hours and 6 minutes a day with digital media, roughly four hours more than the pre-COVID average.

What are the main challenges you see affecting T&S professionals in their day-to-day work?

Clara: New challenges constantly emerge in this field, from pressures created by new global regulation to new threats from hate groups and novel behavioral tactics, all of which make the day-to-day work difficult. Additionally, many teams struggle with resourcing gaps, whether in third-party tools, internal tools, or support from product and engineering teams.

Adelin: There’s so much scrutiny of technology’s culpability in creating real-world harm, and there’s enormous pressure on trust and safety professionals to somehow, in a vacuum, solve problems that are deeply rooted in broader societal issues. Hate speech didn’t originate online. It happens online, but that online abuse is a mirror of the real world. Trust and safety professionals can only do so much with the tools they’re given, and they will continue to fight abuse to the best of their ability. A big challenge is that society keeps creating the content we expect trust and safety professionals to clean up.

What is the most important thing T&S professionals should focus on to help protect and preserve the quality of the customer experience?

Clara: Transparency in the decision-making process is important for developing and maintaining the trust of customers (though it is often difficult to put into practice, as there are varied interpretations of what constitutes acceptable policy).

Adelin: To be honest, their own health and psychological safety. You can’t be expected to do your best work when you’re stressed, tired, or burned out. It’s easy to experience all those things when you work on trust and safety.

While some UGC, such as sexual material, is clearly not acceptable on most platforms, how challenging is it for T&S professionals to determine what is and isn’t acceptable for customers to be exposed to in more ambiguous cases involving different political views and other such issues?

Clara: This is a very difficult challenge most trust and safety teams struggle with, and there is never a “right” solution that keeps everyone happy. One of our goals in building this organization is to help professionals learn best practices from each other (rather than reinventing the wheel) and map those tactics to their own company’s community values, policies, culture, and environment.

Adelin: Extremely challenging. This is a huge part of the trust and safety policy work. The way in which content policies are defined will vary from product to product, depending on the mission and values that each team brings. Before being able to determine what is or is not acceptable, it’s imperative that teams align on core principles to help inform their decision making on edge cases. Once core principles are articulated and clearly documented, then specific enforcement criteria can be created to support at-scale, consistent enforcement. When edge cases arise, returning to the previously established core principles can be a good way of moving more quickly towards an enforcement decision.

Given that serious public health issues such as the novel coronavirus, vaccines, etc. can attract misinformation and false claims, how can T&S professionals effectively address such issues?

Adelin: Find credible experts and sources, and rely on their expertise to help inform your team’s policies and processes. You don’t have to be an expert on vaccines or global pandemics to create policies or processes that undermine the effectiveness of misinformation or false claims spread through your products. The solutions for mitigating this abuse will vary from product to product, and don’t have to be based on content removals alone.

What are the objectives of the TSPA?

Clara: TSPA is a new, non-profit, membership-based organization that will support the global community of professionals who develop and enforce principles and policies defining acceptable behavior online. Its sibling organization, the Trust and Safety Foundation (TSF), will focus on improving society’s understanding of trust and safety, including the operational practices used in content moderation, through educational programs and multidisciplinary research.

Given the newness of the trust and safety officer position, how difficult has it been to explain the important role T&S professionals play to the press and other interested parties?

Clara: There has been increased interest in understanding the day-to-day work of trust and safety from policymakers, NGOs, researchers and other outside groups. Oftentimes there are many misconceptions or assumptions made about the profession that do not reflect reality. The role of our sibling non-profit organization, the Trust & Safety Foundation, is to improve society’s understanding of trust and safety through educational programs such as our weekly case-study series and investing in multidisciplinary research.

Adelin: While the function itself isn’t new, press coverage and the growing sophistication of users are shining a light on this previously less visible work. The great news is that reporters and other interested parties have a keen interest in learning more, so it hasn’t been challenging to explain the importance of the role. The challenge comes in describing how trust and safety is more than content moderation, especially because content moderation is so prominent in the public discourse at this moment.

What type of traction has the TSPA gained among T&S professionals?

Clara: We are still a fairly new organization (we didn’t launch until late June), so we will keep you posted. Our current founding corporate supporters include professionals from Airbnb, Automattic (including WordPress.com and Tumblr), Cloudflare, Facebook Inc. (including Instagram and WhatsApp), Google (including YouTube), Match Group (including Tinder, Hinge, Match, and OkCupid), Omidyar Network, Pinterest, Postmates, Slack, Twitter, and the Wikimedia Foundation.

The Future of Trust & Safety

With companies and consumers increasingly moving online for everything from purchasing food to chatting with friends to consuming entertainment, the trust and safety field seems likely to keep gaining momentum. Given the wide variety of activities that fall under the trust & safety umbrella, the responsibilities of trust & safety officers appear poised to expand as the role takes on added importance.

In light of this, the founding of the TSPA is well timed, offering those working in this growing field resources to keep up with the ever-increasing demands of the job.