Are you prepared for the new online safety laws? Understanding the DSA & OSA
December 16, 2024 | UGC

Imagine you’re a small but growing e-commerce platform, juggling product listings, customer reviews, and advertising campaigns. Suddenly, you’re hit with a fine because your platform doesn’t meet new transparency requirements. Or you’re one of the global social media networks we work with, scrambling to keep up with regulations that demand a complete overhaul of your content moderation workflows.
These are the real-world challenges WebPurify’s clients face as they learn to navigate the increasingly complex web of digital regulations being rolled out by governments across the globe. From the Digital Services Act (DSA) in the EU to the Online Safety Act (OSA) in the UK, online platforms must now adhere to strict rules that prioritize user safety, transparency, and accountability – and, increasingly, each country in which you operate will have its own rules and regulations.
So, how can platforms, both large and small, prepare for this regulatory shift? Our Head of Trust and Safety for EMEA, Ailís Daly, and VP of Trust & Safety, Alexandra Popken, break down what you need to know about the DSA and OSA in this video presentation and talk through the challenges they bring.
What is the Digital Services Act?
The DSA is a landmark EU regulation that aims to create a safer and more transparent digital space. It applies to a wide variety of online services, from social networks to marketplaces, with stricter requirements for platforms classified as very large online platforms (VLOPs) or very large online search engines (VLOSEs) – those with more than 45 million monthly active users in the EU. Key objectives of the DSA include:
- Protecting users’ fundamental rights online.
- Regulating content moderation and platform transparency.
- Banning targeted advertising aimed at minors and ads based on sensitive categories such as religion or sexual orientation.
“The DSA’s ultimate objective is to protect users by safeguarding fundamental rights online, establishing a powerful transparency and accountability framework for online platforms, and providing a single uniform framework across the EU,” Ailís says. “So organizations to which the DSA applies are required to provide greater transparency on their services and adopt procedures for handling takedown notices, informing users in certain circumstances, and addressing complaints. They must also refrain from certain practices, such as profiling. For example, you cannot target minors or vulnerable individuals with advertising. And there’s also a requirement to improve controls for service users.”
What is the Online Safety Act?
The UK’s OSA shares many of the DSA’s goals but has its own distinctions. For instance, the OSA imposes risk assessment obligations on all in-scope platforms, regardless of size. “This framework introduces a common horizontal, harmonized rulebook that’s applicable to all,” Ailís explains. “To be fair, I think that’s to be welcomed.
“These regulations all aim to provide greater online safety, ensuring a fair digital marketplace, and putting guardrails in place for online technology innovation.”
Key compliance requirements
Both the DSA and OSA set out robust frameworks that require platforms to implement various systems and processes:
1. User reporting and transparency
Under Article 16 of the DSA, platforms must offer mechanisms for users to report illegal content and appeal decisions. Ailís explains, “Perhaps they see a user-generated review that includes illegal hate speech…or they observe an account profile that has a link to illegal goods and services. The user must be able to report it and the platform must take appropriate action, whether that’s removing, geo-blocking, or permitting the content.”
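To make this more concrete, here is a minimal sketch of what an Article 16-style notice-and-action flow might look like in code. The types, field names, and appeal endpoint are hypothetical illustrations under assumed requirements – not WebPurify’s implementation and not a prescribed DSA schema.

```typescript
// Hypothetical types for an Article 16-style notice-and-action flow.
// Names and fields are illustrative only, not a prescribed DSA schema.

type ReportedContent = {
  contentId: string;
  reporterId: string;
  reason: "illegal_hate_speech" | "illegal_goods" | "other";
  details?: string;
};

type ModerationAction = "remove" | "geo_block" | "allow";

type ModerationDecision = {
  contentId: string;
  action: ModerationAction;
  blockedRegions?: string[];   // only relevant when the action is geo_block
  statementOfReasons: string;  // explanation sent to the affected users
  appealUrl: string;           // users must be able to contest the decision
  decidedAt: Date;
};

// Record a decision and build the notice that would go back to the reporter
// and to the user whose content was actioned.
function decide(
  report: ReportedContent,
  action: ModerationAction,
  reasons: string
): ModerationDecision {
  return {
    contentId: report.contentId,
    action,
    blockedRegions: action === "geo_block" ? ["EU"] : undefined,
    statementOfReasons: reasons,
    appealUrl: `https://example.com/appeals/${report.contentId}`, // hypothetical endpoint
    decidedAt: new Date(),
  };
}

// Example: a user-generated review reported as illegal hate speech is removed.
const decision = decide(
  { contentId: "review-123", reporterId: "user-456", reason: "illegal_hate_speech" },
  "remove",
  "Removed for violating our hate speech policy and applicable law."
);
console.log(decision.statementOfReasons);
```

However a platform structures this internally, the key point is that every report should lead to a recorded decision, a clear statement of reasons, and a route to appeal.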
2. Advertising restrictions
The DSA bans targeted advertising to minors and prohibits the use of sensitive personal data for profiling-based ads. These rules complement the EU’s General Data Protection Regulation (GDPR) but go further in setting stricter guidelines for platforms.
For more on advertising, see our ebooks on The Unseen Side of Advertising and Influencer Marketing, Online Shopping & Our Children.
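For teams building ad systems, a simple pre-flight check along these lines can help enforce those restrictions before a campaign goes live. The category list and request shape below are hypothetical illustrations of the idea, not legal guidance or a complete rule set.

```typescript
// Hypothetical pre-flight check for an ad-targeting request.
// The category names and AdTargetingRequest shape are illustrative only.

const SENSITIVE_CATEGORIES = new Set([
  "religion",
  "sexual_orientation",
  "health",
  "political_opinion",
  "ethnicity",
]);

type AdTargetingRequest = {
  audienceIsMinor: boolean;      // e.g. derived from declared age or age assurance
  targetingCategories: string[]; // categories the advertiser wants to target on
};

type TargetingCheck =
  | { allowed: true }
  | { allowed: false; reason: string };

function checkTargeting(req: AdTargetingRequest): TargetingCheck {
  // Targeted advertising aimed at minors is not permitted.
  if (req.audienceIsMinor) {
    return { allowed: false, reason: "Targeted advertising to minors is not permitted." };
  }
  // Profiling-based ads may not use sensitive categories of personal data.
  const sensitive = req.targetingCategories.filter((c) => SENSITIVE_CATEGORIES.has(c));
  if (sensitive.length > 0) {
    return {
      allowed: false,
      reason: `Sensitive categories not permitted: ${sensitive.join(", ")}`,
    };
  }
  return { allowed: true };
}

// Example: a campaign attempting to target on religion is rejected.
console.log(checkTargeting({ audienceIsMinor: false, targetingCategories: ["religion", "sports"] }));
```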
3. Risk assessments
Both regulations require comprehensive risk assessments, though the scope varies. The DSA focuses on systemic risks for VLOPs, while the OSA mandates risk assessments for all platforms. As Ailís notes, “The biggest challenge for risk assessments is the time, planning, and resources required to do them properly. It is not a one-person role.”
Challenges for smaller platforms in maintaining compliance
While larger platforms often have dedicated compliance teams, smaller platforms face unique challenges in meeting these requirements. Under the DSA, even platforms employing fewer than 50 people must implement measures such as trusted flaggers, third-party vetting, and user-facing transparency for advertising. But achieving compliance with limited budgets and smaller teams can be daunting.
To complicate matters further, platforms in the UK have only three months to comply with the OSA after Ofcom, the UK’s communications regulator, publishes its guidance.
How WebPurify can help
“Not every company required to comply has the deep expertise in scaled moderation that WebPurify does,” says Ailís. “Many organizations are encountering, perhaps for the first time, the real complexities of implementing automated detection systems, establishing robust human review processes, and ensuring transparency.”
The DSA and OSA mark a significant shift in the regulatory environment for many online platforms. Compliance is no longer optional, but a business-critical necessity – and many brands are having to figure this out on the fly.
Read more about our Trust & Safety consultancy services.