
Why global platform regulations are everyone’s business

March 17, 2025 | UGC

For brands trying to navigate the growing number of online safety regulations, it can feel like trying to solve a Rubik’s Cube while blindfolded. With the European Union’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA), not to mention an influx of new laws being discussed across Europe, Canada, and beyond, staying compliant is no small feat. The regulatory landscape is shifting fast, and every platform, from social networks to online marketplaces, needs to keep up or risk significant consequences.

That’s why we sat down with Ailís Daly, WebPurify’s Head of Trust & Safety for EMEA, and Alexandra Popken, VP of Trust & Safety, to break down what these regulations mean, where they’re heading, and what companies need to know before enforcement gets real.


The regulatory floodgates are open (but are they working?)

The past few years have seen a surge of legislative action aimed at curbing online harm and ensuring digital accountability. “There’s been a tsunami of regulations across Europe, the UK, Canada, Singapore, Australia, Sri Lanka – to name just a few,” Ailís says. “And let’s not forget potential legislation in the US around child safety with the Senate KOSPA bill.”

For European platforms, acronyms such as DSA, DMA, and the AI Act have joined GDPR in creating a complex puzzle of compliance. “It’s no longer just about protecting data,” Ailís explains. “The new wave is about online safety, fair marketplaces, and ethical innovation.”

But not all regulations are created equal. The DSA, for example, has been broadly welcomed as a necessary framework for platform accountability, Ailís points out, while Sri Lanka’s Online Safety Act has sparked concerns over the suppression of free speech. There’s a balancing act at play here, with regulators aiming to protect users while ensuring platforms remain functional and innovative.

The question now is whether these laws can be effectively enforced. Ailís points out that regulators have been rapidly evolving, bringing in industry experts to help shape enforcement strategies. “The proof will be in the pudding,” she says. “How regulators respond to risk assessments, audits, and rule violations will show whether these laws truly have teeth.”

Compliance isn’t just for big tech – why every platform should care

It’s easy to assume that these regulations are aimed at tech behemoths such as Meta, Google, and TikTok, but that’s a dangerous misconception, Ailís says. The rules are broad and apply to a wide range of platforms, including smaller social networks, forums, and e-commerce sites.

“While very large online platforms (VLOPs) have stricter obligations, smaller platforms aren’t off the hook,” Ailís explains. “Platforms with fewer than 45 million users still face numerous compliance requirements, such as user complaint mechanisms and transparency for advertising.”

Speaking of advertising, the DSA is particularly strict: no targeted ads based on minors’ profiles or sensitive personal data. Under the DSA, all platforms except the smallest – those employing fewer than 50 people and with an annual turnover under €10 million – must implement redress mechanisms, cooperate with trusted flaggers, and provide advertising transparency.
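The size thresholds above can be expressed as a simple decision rule. The sketch below is purely illustrative (not legal advice, and not WebPurify code) and uses only the figures quoted in this article – the 45 million-user VLOP line and the under-50-employees / under-€10 million exemption; the DSA’s actual scoping rules carry more nuance:

```python
from dataclasses import dataclass

# Thresholds as quoted in the article; illustrative only.
VLOP_USER_THRESHOLD = 45_000_000       # EU monthly active users
MICRO_SME_EMPLOYEES = 50               # smallest-platform exemption: headcount
MICRO_SME_TURNOVER_EUR = 10_000_000    # and annual turnover

@dataclass
class Platform:
    eu_monthly_users: int
    employees: int
    annual_turnover_eur: float

def dsa_obligations(p: Platform) -> list[str]:
    """Return a rough list of duties for a platform's size tier."""
    if (p.employees < MICRO_SME_EMPLOYEES
            and p.annual_turnover_eur < MICRO_SME_TURNOVER_EUR):
        return []  # the smallest platforms are exempt from these duties
    duties = [
        "redress mechanisms",
        "trusted flagger cooperation",
        "advertising transparency",
    ]
    if p.eu_monthly_users >= VLOP_USER_THRESHOLD:
        # VLOPs face the stricter tier: risk assessments and audits.
        duties += ["risk assessments", "independent audits"]
    return duties
```

The point the rule makes concrete: most platforms land in the middle tier, with real obligations long before they approach VLOP scale.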

And the UK’s OSA brings a unique challenge: all in-scope providers must conduct risk assessments for illegal content and children’s access, regardless of size. For smaller companies, the deadlines are tight.

“Very importantly, in the UK, in-scope providers will have only until March 16, 2025, to complete their risk assessments,” Ailís warns. “That’s a tight deadline, and companies need to prepare now. The clock is ticking.”

What happens if companies fail? “Big fines,” Ailís says. “Under the DSA, up to 6% of global turnover; under the OSA, up to 10%.” Ouch.
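To make those percentages tangible, here is a back-of-the-envelope exposure calculation using only the two ceilings quoted above (illustrative only – actual penalties depend on many factors beyond a flat percentage):

```python
def max_fine_exposure(global_turnover: float) -> dict[str, float]:
    """Ceiling fine exposure from the percentages quoted in the article.

    Illustrative only: real penalty calculations involve aggravating and
    mitigating factors, not a flat cut of turnover.
    """
    return {
        "DSA (6%)": 0.06 * global_turnover,
        "OSA (10%)": 0.10 * global_turnover,
    }

# For a platform with EUR 500M in global turnover, the DSA ceiling alone
# is EUR 30M -- enough to make compliance a board-level concern.
```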

The reality of compliance: risk assessments, AI moderation, and appeals

These regulations don’t just ask platforms to do better – they require concrete workflows and processes. One example is the illegal content reporting requirement under the DSA.

Ailís walks through how this plays out in practice. “If a user in Europe spots illegal content on a platform, they must be able to easily report it. Whether it’s hate speech, illegal goods, or harmful imagery, platforms must have reporting mechanisms in place,” she explains. “Moderators must review the content, decide on enforcement, and communicate the outcome back to the reporter. And if enforcement isn’t applied, the user must have the right to appeal.

“It’s not optional – platforms must implement this or face severe consequences.”
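The flow Ailís describes – report, moderator review, enforcement decision, outcome notification, and appeal when no action is taken – is essentially a small state machine. The sketch below is a hypothetical illustration of those steps (the class and method names are invented, not WebPurify’s or any platform’s actual system):

```python
from enum import Enum, auto

class Status(Enum):
    REPORTED = auto()      # user flagged the content
    ACTIONED = auto()      # enforcement was applied
    NO_ACTION = auto()     # no enforcement; reporter may appeal
    APPEALED = auto()      # reporter contested the decision

class Report:
    """One user report moving through a DSA-style notice-and-action flow."""

    def __init__(self, content_id: str, reason: str):
        self.content_id = content_id
        self.reason = reason            # e.g. hate speech, illegal goods
        self.status = Status.REPORTED
        self.outcome_sent = False

    def review(self, violates_policy: bool) -> None:
        # Moderators review the content and decide on enforcement...
        self.status = Status.ACTIONED if violates_policy else Status.NO_ACTION
        # ...and the outcome must be communicated back to the reporter.
        self.outcome_sent = True

    def appeal(self) -> None:
        # If enforcement isn't applied, the reporter has the right to appeal.
        if self.status is not Status.NO_ACTION:
            raise ValueError("appeal only available when no action was taken")
        self.status = Status.APPEALED
```

Even in this toy form, the moving parts are visible: every report needs a reviewed decision, a notification back to the reporter, and an appeal path – which is exactly the infrastructure burden discussed next.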

This presents significant challenges for those platforms that lack the resources to build such infrastructure. All of a sudden, many platforms are faced with the task of implementing automated detection systems, establishing human review processes, and ensuring compliance with transparency requirements.

“Smaller platforms may not have the teams to handle risk assessments or build complex moderation workflows,” Ailís explains. “But these are now non-negotiables.”

Are regulators prepared for what comes next?

Regulators have long been accused of being out of touch with how digital platforms actually function. But Ailís believes that’s changing.

“I’ve sat across from UK and EU regulators, and they’re sharp. Many have industry backgrounds and understand the enforcement challenges,” she says. “Flexibility is key, and regulators must adapt as enforcement progresses.”

This is especially relevant given the growing trend of holding executives personally accountable. Under the OSA, CEOs could face criminal liability if their platforms fail to comply with the law. “We’re moving towards a world where executives will be held more accountable than ever before,” Ailís says.

Where WebPurify fits in: a compliance partner for the future

For companies struggling to navigate the new regulatory landscape, WebPurify offers more than moderation tools – we provide a strategic approach to compliance.

“Many organizations are encountering, perhaps for the first time, the real complexities of implementing moderation workflows at scale,” Ailís says. “We help companies not just meet their DSA and OSA obligations, but build scalable, future-proof Trust & Safety operations.”

From risk assessments and appeals processes to AI detection and human review workflows, WebPurify partners with platforms of all types to ensure they stay ahead of these compliance requirements without compromising user experience.

These regulations mark the beginning of a fundamental shift in how online platforms must operate. Companies that take compliance seriously today will have a major advantage over those that wait for enforcement to catch up with them.

“There’s no substitute for getting this right the first time,” Ailís adds.

Want to understand how these laws impact your business? Reach out to us today for expert guidance in building a future-proof Trust & Safety framework.

Stay ahead in Trust & Safety! Subscribe for expert insights, top moderation strategies, and the latest best practices.