
Internet safety for kids: why every parent should care about content moderation

September 11, 2023 | UGC

For most of us, the internet has been part of our lives for less than 30 years, but in that time it has become fundamental to how society functions. In the age of smartphones, content streaming, and online retail, a life lived “offline” is hard to imagine. The internet brings many benefits, but also a number of well-known risks – everything from scams and privacy concerns to troubling or illegal content. Even for adults, the internet can be a daunting and overwhelming place, but children are particularly vulnerable to the dark side of online life.

For years, companies ranging from major social media platforms to online retailers have grappled with the problem of illegal or violative user-generated content (UGC), employing ever more sophisticated solutions in an attempt to keep their users safe. Most of the time, the average internet user might be unaware of UGC moderation, but it’s a vital – if largely unseen – aspect of online life, especially when it comes to internet safety for kids.

Content moderation: the invisible shield

Simply put, UGC moderation is the process of analyzing and, if necessary, deleting or censoring user-uploaded content, in an effort to ensure that it meets a certain expected standard of behavior – typically outlined in a website’s community guidelines. Content moderation is the “invisible shield” that protects internet users – particularly children and other vulnerable people – from a host of problematic text, images, video, and audio, or combinations thereof. This includes everything from scams and bigoted content to explicit imagery or sales of illegal substances. Ideally, UGC moderation removes this violative content quickly, often before it’s posted.

The secret to successful content moderation is a combination of factors: clear and transparent community guidelines, consistent enforcement of those guidelines using a combination of AI and human moderation, and appropriate user controls to establish privacy preferences and report inappropriate content. Leading content moderation services like WebPurify offer powerful “hybrid” moderation solutions, including both AI and human intervention to parse large amounts of content quickly and take action on posts that are likely to be problematic.
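To make the hybrid approach concrete, here is a minimal sketch of how an AI-plus-human moderation flow typically routes content: an automated classifier scores each submission, clearly safe or clearly violative items are handled automatically, and ambiguous cases are escalated to human moderators. All names, thresholds, and the toy classifier below are illustrative assumptions, not WebPurify’s actual implementation.

```python
# Hypothetical sketch of a hybrid (AI + human) moderation flow.
# The thresholds and keyword-based "classifier" are stand-ins for
# a real trained model; they are not any vendor's actual system.

from dataclasses import dataclass

APPROVE_BELOW = 0.20   # low-risk content is published automatically
REJECT_ABOVE = 0.90    # high-risk content is blocked automatically


@dataclass
class Decision:
    action: str   # "approve", "reject", or "human_review"
    score: float


def ai_risk_score(text: str) -> float:
    """Toy stand-in for an AI classifier: fraction of flagged terms present."""
    flagged_terms = ("scam", "violence")
    hits = sum(term in text.lower() for term in flagged_terms)
    return hits / len(flagged_terms)


def moderate(text: str) -> Decision:
    score = ai_risk_score(text)
    if score < APPROVE_BELOW:
        return Decision("approve", score)
    if score > REJECT_ABOVE:
        return Decision("reject", score)
    # Ambiguous content goes to a human moderator queue for review.
    return Decision("human_review", score)
```

The key design point is the middle band: rather than forcing the AI to make every call, uncertain items are queued for human judgment, which is how hybrid systems keep both speed and accuracy.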


Why internet safety for kids must be taken seriously by online platforms

Dr. Adam Pletter is a licensed clinical psychologist who specializes in the treatment of children, adolescents, and young adults from his office in Bethesda, Maryland. He also founded iParent101, a program aimed at helping parents and children safely navigate the digital world. Dr. Pletter spoke with us, explaining why internet safety for kids needs to be taken seriously.

What does internet safety for children look like?

It’s a clear compromise. The internet is essential to our children’s lives and enhances them in many ways, but it also comes with a lot of potential dangers. The simplest, most direct analogy I can think of is to cars, and learning to drive. At a certain age, our children have an opportunity to learn to operate very useful and potentially very dangerous machines called cars. Parents should approach online safety for their children in the same way as teaching them to drive. It’s all about providing an environment where the child can learn and practice under some level of supervision until they can demonstrate appropriate, trustworthy, and safe behavior.

Why are children most at risk from harm online?

When it comes to keeping children safe in the digital world, there are three aspects to consider. The first is that children have an underdeveloped prefrontal cortex. You can think of that as the “braking system” of the brain, and it isn’t fully developed until around the age of 25. The second aspect is that children have an overactive emotional brain, and they’re less inhibited. They take risks, which aids in their trial-and-error learning about their world. So you’ve got weak braking and overactive emotions, and then the third thing is an unending, virtually unlimited amount of digital content on demand that is tailored to their interests.

Imagine the wackiest thing that a seven-year-old could think of. They can probably see a video about that thing with just a couple of clicks. Then imagine an older child looking up videos on nutrition, dieting, or health, maybe for legitimate educational and health reasons. They might very easily be led down a rabbit hole of disordered eating content, and all kinds of other concerning things presented as ‘helpful information’. A 12-year-old, with an underdeveloped regulatory system, is much more vulnerable to those kinds of influences than an adult might be.

Why parents should care about their child’s internet use

According to a study by UNICEF, “Globally, children and young people tend to become early users and prime innovators on the internet, and are often far ahead of their parents and other adults in terms of use, skills, and understanding.” Despite their technical skills, however, children are at high risk online from threats such as cyberbullying, predatory behavior, and inappropriate violent or sexual content. Often, they’re targeted in supposedly innocent social and “play” environments, including video gaming sites and social media. In cases where a child is more tech-savvy than the adults in their household, it can be hard for parents to identify when something has gone wrong.

Is your child at risk? Dr. Pletter suggests questions that parents should be asking:

  1. Is your child’s level of secrecy increasing? Do they seem like they’re hiding more from you, or lying to you?
  2. Are they more withdrawn?
  3. Are they more protective over their phone or tablet computer?
  4. Have they expressed to you that an online activity used to be fun, but now it’s not anymore?
  5. Do they appear to be sleep-deprived?

When it comes to proactively managing their kids’ internet safety, Dr. Pletter’s advice to parents is to involve themselves in their child’s online activities and turn access to those activities into opportunities for parent/child dialogue:

I always say that “parent” isn’t just a noun, it’s a verb too. Despite what they might say, our kids need monitoring, mentoring, and guidance. Just like any other complicated parenting situation, I recommend parents create an evolving, ‘level system’ where the child runs into certain boundaries or restrictions that slow or stop them, and help them regulate. And then, when the child wants to do something online, there can be a dialogue.

By creating a dialogue, where the child has to come to you in order to do what they want, you’re not stopping the child, you’re helping them slow down and think about what they’re going to do. And that helps the child practice regulating their thoughts and emotions.

How to check if the website your child visits is being moderated

If content moderation is the invisible shield that helps keep your child safe online, how does a parent ensure that a website has a robust content moderation policy or trust and safety plan? Below are some tips parents can use to check if a website their child visits meets these criteria:

Check the Website’s “About” or “FAQ” Sections
Often a trustworthy site will have information about its safety protocols and content moderation guidelines in its “About Us” or “FAQ” (Frequently Asked Questions) section. Look for details that explain how they handle inappropriate or harmful content and how they protect younger users.

Look for Age Guidelines or Ratings
Many websites and online services that target younger audiences provide age recommendations. These guidelines can often indicate that the website has some level of moderation in place to make it appropriate for a certain age group.

Read User Reviews
Third-party reviews or articles that discuss the website’s safety features can provide insights into how effective their content moderation is. Websites that are known for poor moderation will often have negative reviews from parents or watchdog organizations.

Check for Parental Controls
A website that offers parental controls likely takes safety and content moderation seriously. Parental controls allow parents to set restrictions on what their children can see or do on the website.

Search for a Community Guidelines or Terms of Service Page
Websites committed to safe online spaces often have Community Guidelines or Terms of Service that outline what is and isn’t acceptable behavior on the platform. Read through this to check if the platform has guidelines in place and if they are stringent enough to ensure safety.

Contact Customer Support
If you can’t find the information you’re looking for, consider directly reaching out to the website’s customer support team to ask about their content moderation policies. Their willingness and ability to answer these questions can be an indicator of how seriously they take this issue.

Observe the Content and Interactions
If the website involves community interaction (like forums or comments), take some time to observe the kind of content being posted and how quickly inappropriate content is removed. This can give you an idea of how effective their content moderation is. It’s worth creating an account yourself to do some exploring and vetting. Consider, too, nonchalantly asking your child about their thoughts on the site or app and why they like it so much. What they admire or criticize in their reply can be telling.

Look for Badges or Certifications
Some websites display badges or certifications from online safety organizations, such as National Online Safety. These indicate that the website has been vetted for safety features.

Check Privacy Settings and Data Handling Policies
Websites with strong content moderation usually also have robust privacy policies to protect their users’ data. Check how the website handles and protects user information to gauge their overall commitment to safety.

Stay Updated
Policies can change, so it’s a good idea to periodically review the safety features on websites your child frequents. Signing up for any safety alerts or newsletters from the website can keep you informed about any changes.

Taking the time to assess a website’s content moderation and trust and safety operations can go a long way in ensuring your child’s online experiences are both safe and enriching.

Meet WebPurify's moderation team, responsible for the detection of Child Sexual Abuse Material


WebPurify: a content moderation service built for the challenges of today

WebPurify works with thousands of companies, from startups to Fortune 1000 brands, gaming platforms to social media. We are driven by our mission to keep communities and vulnerable groups, including children, safe online. In addition to our text moderation AI models and profanity filter, WebPurify offers a powerful image moderation solution that combines AI and human moderators to deliver real-time detection of potentially violative images. Meanwhile, WebPurify’s live video moderation can review thousands of videos every day, based on either turnkey or custom criteria designed for a client company’s specific needs.

WebPurify equips brands with cutting-edge tools to mitigate multiple risks, from illegal content to harmful or disruptive content and conduct, across all media formats – text, image, video, audio, livestreams, and even the metaverse.

In conclusion

Content moderation is an essential tool in ensuring internet safety for kids. In a world where children can access virtually any content they choose with just a few clicks, it is more important than ever that their safety is prioritized by parents and online platforms alike. Our content moderation services are invaluable in keeping children safe from harmful content online, but parents can and must also play a crucial part in reducing risk by monitoring their children’s behavior and encouraging a dialogue about online activities.