Digital Services Act (DSA)

  • By Paul Waite
  • 12 min read

The Digital Services Act represents one of the most significant regulatory shifts for tech companies operating in the European Union since the 2000 e-Commerce Directive. If you run an online platform, marketplace, or any digital service reaching EU users, understanding this regulation is essential for staying compliant and competitive.

This comprehensive guide breaks down what the DSA means for providers, users, and the broader digital ecosystem. You’ll learn which services are covered, what new obligations apply, and how enforcement is already reshaping how major platforms operate.

Overview of the Digital Services Act

The Digital Services Act (Regulation (EU) 2022/2065) is an EU regulation that entered into force on 16 November 2022 and became fully applicable from 17 February 2024. Its core mission is straightforward: create a safer, more transparent online environment across the European Union while establishing clear rules for how intermediary service providers must handle illegal content and protect users.

The DSA establishes a comprehensive framework that complements the Digital Markets Act, though the two regulations serve different purposes. While the Digital Services Act focuses on content moderation, user safety, and transparency obligations for digital services, the DMA targets the market power of so-called gatekeepers to promote fair competition. Think of the DSA as addressing what happens on platforms, while the DMA addresses how platforms interact with competitors and business users in the single market.

The regulation covers a broad spectrum of online intermediary services. This includes basic internet infrastructure like ISPs, hosting services such as cloud providers and web hosts, online platforms including social media platforms and online marketplaces, app stores, and search engines. The DSA applies a tiered approach to obligations, recognizing that larger services with greater reach pose different risks than smaller platforms. Services reaching more than 45 million monthly active users in the EU (approximately 10% of the EU population) face the strictest requirements as very large online platforms (VLOPs) or very large online search engines (VLOSEs).
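
To put the threshold in concrete terms, here is a minimal sketch of the designation arithmetic; the function name and constants are illustrative, not from any official tooling:

```python
# Minimal sketch of the VLOP/VLOSE designation arithmetic. Names and
# constants here are illustrative, not taken from any official tooling.

VLOP_THRESHOLD = 45_000_000    # more than 45 million average monthly active
                               # recipients in the EU triggers designation
EU_POPULATION = 450_000_000    # rough EU population figure used in this article

def crosses_designation_threshold(avg_monthly_active_eu_users: int) -> bool:
    """True if a service's EU reach exceeds the VLOP/VLOSE threshold."""
    return avg_monthly_active_eu_users > VLOP_THRESHOLD

print(crosses_designation_threshold(52_000_000))   # True: strictest tier applies
print(f"{VLOP_THRESHOLD / EU_POPULATION:.0%}")     # 10% of the EU population
```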

What the DSA changes for users and citizens

The DSA strengthens fundamental rights online in ways that directly affect how you experience digital services every day. For children and other vulnerable users, the regulation introduces specific protections that platforms must implement, marking a departure from the era of minimal platform accountability.

One of the most significant changes involves transparency around content moderation decisions. When a platform decides to remove content or suspend your account, you now have the right to receive clear information explaining that decision. This isn’t just a vague notification—providers must give you specific reasons tied to their terms of service or national or EU law. If you disagree with a platform’s decision, you can appeal directly through an internal complaint handling system that the platform must maintain.
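
As a rough illustration of what such a decision notice contains, the sketch below models the required information as a simple record; the field names are hypothetical, since the DSA prescribes the content, not an exact schema:

```python
# Hypothetical sketch of a "statement of reasons" record. Field names are
# illustrative; the DSA prescribes the information, not this exact schema.
from dataclasses import dataclass

@dataclass
class StatementOfReasons:
    decision: str               # e.g. "content_removed" or "account_suspended"
    facts: str                  # the specific content or conduct at issue
    ground: str                 # the terms-of-service clause or law relied on
    automated_means_used: bool  # whether automated detection was involved
    redress_options: list[str]  # how the user can contest the decision

example = StatementOfReasons(
    decision="content_removed",
    facts="Listing of 2024-03-01 offered goods prohibited in the EU.",
    ground="Terms of Service §4.2 (prohibited goods)",
    automated_means_used=True,
    redress_options=["internal complaint system", "out-of-court dispute body", "court"],
)
```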

The DSA also restricts how platforms can use your data for targeted advertising. Social media platforms and other online platforms are now prohibited from profiling minors for advertising purposes. Additionally, platforms cannot use sensitive personal data—including information about your political opinions, religious beliefs, or sexual orientation—to target ads at you. These provisions aim to reduce manipulation and protect users from harmful content that algorithmic systems might otherwise amplify.
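
A simplified pre-flight check capturing these two prohibitions might look like the following sketch; the category names are invented for illustration:

```python
# Simplified pre-flight check reflecting the two advertising prohibitions
# described above. Category names are invented for illustration.

SENSITIVE_CATEGORIES = {
    "political_opinions", "religious_beliefs", "sexual_orientation", "health",
}

def ad_targeting_allowed(user_is_minor: bool, targeting_categories: set[str]) -> bool:
    """Reject profiling-based ads aimed at minors or built on sensitive data."""
    if user_is_minor:
        return False   # no profiling-based advertising to minors
    if targeting_categories & SENSITIVE_CATEGORIES:
        return False   # no ad targeting based on sensitive personal data
    return True

print(ad_targeting_allowed(False, {"cycling", "travel"}))              # True
print(ad_targeting_allowed(False, {"political_opinions", "cycling"}))  # False
print(ad_targeting_allowed(True, {"cycling"}))                         # False
```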

For users concerned about dangerous products, the regulation requires online marketplaces to implement stronger safeguards. Platforms must trace sellers to help address illegal content and counterfeit goods, making it harder for bad actors to exploit EU consumers. If you encounter hate speech, gender-based violence content, or other harmful content, platforms must provide mechanisms for you to flag illegal content effectively.

When disputes arise that a platform’s internal processes don’t resolve, users can turn to out-of-court dispute resolution. Certified bodies in EU member state jurisdictions can review platform decisions, providing an alternative to costly litigation. Several member states have already established such systems covering major social media services.

What the DSA means for businesses and platforms

The DSA creates a harmonized rulebook across the EU single market, replacing the patchwork of national laws that previously governed how platforms must address illegal content. For companies operating across multiple member states, this means dealing with one set of rules rather than 27 different regulatory regimes—a significant simplification despite the new obligations.

Obligations under the DSA scale with a provider’s size and role in the digital ecosystem. Basic transparency and notice-and-action obligations apply to all intermediary service providers. Online platforms face extra duties around advertising transparency and seller traceability. VLOPs and VLOSEs bear the heaviest compliance burden, including mandatory systemic risk assessments and independent audits. The European Commission designated the first 19 VLOPs and VLOSEs in April 2023, including platforms operated by Meta, Alphabet, ByteDance, and others.

Key operational impacts for tech companies include:

| Requirement | What It Means |
| --- | --- |
| Updated Terms of Service | Clear policies explaining how content is moderated, including any automated tools |
| Internal Compliance | Designated compliance officers and internal procedures |
| Risk Assessments | Annual analysis of systemic risks (VLOPs/VLOSEs only) |
| Ad Transparency | Disclosures about who paid for each ad and the targeting parameters used |
| Dark Patterns Ban | No deceptive design that manipulates user choices |
| Recommender Systems Transparency | Explanation of how algorithms rank content |

Non-EU businesses offering services to users in the EU must comply with the DSA and designate an EU legal representative. This requirement has prompted structural and governance changes at major tech companies headquartered outside Europe.

The compliance timeline created urgency for platforms of different sizes. VLOPs and VLOSEs faced additional obligations from August 2023, while all other in-scope services had until 17 February 2024 to achieve full compliance. Though compliance costs have increased—particularly for mid-sized providers—the regulation also creates a more level playing field where smaller platforms can compete on transparency and trust rather than just scale.

Scope: which providers and services are covered?

The DSA applies to all intermediary service providers offering services to users in the EU, regardless of where they’re established. This extra-territorial reach means a company based in California, Singapore, or anywhere else must comply if EU users can access its services.

The regulation distinguishes between several categories of providers, each subject to different obligations:

Mere conduit services handle the transmission of information without modifying it. Internet service providers fall into this category, as do providers of basic internet infrastructure. They enjoy the broadest liability protections but still must cooperate with national authorities.

Caching services temporarily store information to make transmission more efficient. Content delivery networks often fall into this category. Like mere conduit services, they benefit from liability exemptions when they don’t modify content and act promptly on illegal material.

Hosting services store information at the request of users. This broad category includes cloud storage providers, web hosting companies, and any service that holds user content. The DSA requires these providers to implement notice-and-action mechanisms for illegal content.

Online platforms represent a subset of hosting services that not only store but also disseminate information to the public. This category encompasses social media platforms, online marketplaces where consumers can conclude distance contracts with traders, app stores, sharing economy platforms, and travel booking sites. Online platforms face additional obligations around advertising transparency and user protections.

Online search engines that allow users to search websites based on queries form another distinct category, with major search engines subject to VLOSE requirements.
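
Putting the taxonomy together, a rough decision-tree sketch might classify a service as follows; the flags are hypothetical, and search engines, as their own category, are not modeled here:

```python
# Rough decision-tree sketch of the provider taxonomy described above,
# simplified and with hypothetical flags; search engines are a separate
# category not modeled here.

def dsa_category(transmits_only: bool, caches_only: bool,
                 stores_content: bool, disseminates_to_public: bool) -> str:
    if transmits_only:
        return "mere conduit"         # e.g. internet service providers
    if caches_only:
        return "caching service"      # e.g. content delivery networks
    if stores_content and disseminates_to_public:
        return "online platform"      # e.g. social media, marketplaces, app stores
    if stores_content:
        return "hosting service"      # e.g. cloud storage, web hosting
    return "other intermediary"

print(dsa_category(False, False, True, True))   # "online platform"
```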

The threshold for very large online platforms and large online search engines is more than 45 million average monthly active recipients in the EU. Providers must publish their user numbers at least every six months, and the European Commission uses this data to designate VLOPs and VLOSEs.

Micro and small enterprises—those with fewer than 50 employees and less than €10 million in annual turnover—benefit from exemptions and lighter obligations. They’re still subject to basic rules around transparency and cooperation with authorities, but escape the more demanding requirements around risk assessment and compliance infrastructure.

Key obligations and compliance requirements

The DSA creates a tiered, cumulative system of obligations. All intermediaries face baseline duties, with hosting services, online platforms, and VLOPs/VLOSEs each bearing additional layers of responsibility.

Baseline obligations for all intermediaries establish the foundation. Every provider must designate contact points for users and national authorities, enabling effective communication. Transparency reporting becomes mandatory, with providers publishing regular reports on content moderation activities. Terms of service must clearly explain content moderation policies, including any use of automated tools. Providers must also cooperate with digital services coordinators when requested.
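
Because the obligations are cumulative, each tier inherits everything below it. The sketch below illustrates that layering; the obligation lists paraphrase this section and are not exhaustive:

```python
# Illustrative sketch of the cumulative layering; the obligation lists
# paraphrase this section and are not exhaustive.

OBLIGATIONS = {
    "intermediary": ["contact points", "transparency reports",
                     "clear terms of service", "cooperation with coordinators"],
    "hosting":      ["notice-and-action mechanism", "statements of reasons"],
    "platform":     ["trusted-flagger priority", "ad transparency",
                     "dark-pattern ban", "trader traceability (marketplaces)"],
    "vlop_vlose":   ["systemic risk assessments", "independent audits",
                     "researcher data access", "crisis response"],
}
TIERS = ["intermediary", "hosting", "platform", "vlop_vlose"]

def duties_for(tier: str) -> list[str]:
    """Each tier inherits every obligation of the tiers below it."""
    cutoff = TIERS.index(tier) + 1
    return [duty for t in TIERS[:cutoff] for duty in OBLIGATIONS[t]]

print(duties_for("platform"))   # baseline + hosting + platform duties
```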

Hosting services and online platforms must implement notice-and-action mechanisms that allow anyone to notify them of allegedly illegal content. When hosting services remove content or restrict access, they must inform the affected user and explain the decision. Online platforms face requirements to prioritize notices from trusted flaggers—entities with demonstrated expertise in identifying specific types of illegal content.
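
One way to picture the trusted-flagger rule is as a priority queue in the notice-intake pipeline, as in this hypothetical sketch (not a real API):

```python
# Hypothetical intake queue in which trusted-flagger notices are processed
# first, as the prioritization duty requires. Not a real API.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Notice:
    priority: int                       # 0 = trusted flagger, 1 = ordinary user
    content_id: str = field(compare=False)
    reason: str = field(compare=False)

queue: list[Notice] = []
heapq.heappush(queue, Notice(1, "post-81723", "suspected counterfeit listing"))
heapq.heappush(queue, Notice(0, "post-99104", "hate speech (trusted flagger)"))

while queue:
    notice = heapq.heappop(queue)       # trusted-flagger notice comes out first
    print(f"review {notice.content_id}: {notice.reason}")
```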

Platform-specific rules address several areas of concern. Dark patterns—interface designs that manipulate users into unintended choices—are prohibited. Online advertising must be transparent, with platforms required to show who paid for each ad, why it was targeted at a particular user, and key parameters used in targeting. Online marketplaces bear additional obligations around trader traceability, requiring them to collect and verify information about sellers before allowing them to offer products to consumers. This traceability requirement helps address illegal content and dangerous goods reaching EU consumers.
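
In practice, the ad transparency duty amounts to being able to show, for each ad, a small disclosure record along these lines; the field names are illustrative only:

```python
# Illustrative per-ad disclosure record; the field names are hypothetical,
# not a schema prescribed by the regulation.
from dataclasses import dataclass

@dataclass
class AdDisclosure:
    is_advertising: bool         # the item must be identifiable as an ad
    paid_for_by: str             # who paid for the ad
    presented_on_behalf_of: str  # on whose behalf it is shown, if different
    targeting_parameters: dict   # main parameters used to select this recipient

example = AdDisclosure(
    is_advertising=True,
    paid_for_by="Example Retail GmbH",
    presented_on_behalf_of="Example Retail GmbH",
    targeting_parameters={"age_range": "25-34", "interest": "cycling"},
)
```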

VLOPs and VLOSEs face the heaviest regime. These providers must conduct annual systemic risk assessments examining:

  • Dissemination of illegal content through their services

  • Negative effects on fundamental rights including free speech and privacy

  • Risks to public health and impacts on minors

  • Effects on democratic processes, including election interference

  • Risks related to gender-based violence, disinformation, and public security

Based on these assessments, very large online platforms must implement reasonable, proportionate mitigation measures. Independent audits verify compliance annually, with audit reports submitted to the European Commission. VLOPs must also provide data access for vetted researchers studying systemic risks, and maintain enhanced crisis response capabilities.

Enforcement and sanctions operate at multiple levels. Each EU member state must appoint a digital services coordinator—an independent authority primarily responsible for DSA enforcement nationally. The European Board for Digital Services facilitates cooperation among these coordinators and the Commission but lacks binding powers. For VLOPs and VLOSEs, the European Commission exercises direct supervisory authority.

Sanctions can be severe. Providers face fines of up to 6% of annual worldwide turnover for violations. Repeated or serious infringements can result in interim measures or even temporary service suspension. The Commission has already initiated proceedings against platforms like X for alleged compliance failures, signaling active enforcement.
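
The fine ceiling is simple arithmetic, as the sketch below shows with an invented turnover figure:

```python
# The fine ceiling as arithmetic: up to 6% of annual worldwide turnover.
# The turnover figure below is invented for illustration.

MAX_FINE_RATE = 0.06

def max_dsa_fine(annual_worldwide_turnover_eur: float) -> float:
    return MAX_FINE_RATE * annual_worldwide_turnover_eur

print(f"€{max_dsa_fine(80e9):,.0f}")   # €4,800,000,000 for €80bn turnover
```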

Legislative background, timeline, and future outlook

The DSA emerged from the EU’s recognition that the 2000 e-Commerce Directive could no longer address modern challenges like algorithmic amplification of harmful content, disinformation spread, and the outsized influence of major tech companies. The European Commission proposed the regulation on 15 December 2020 as part of a Digital Services Package that also included the Digital Markets Act.

The legislative process moved relatively quickly by EU standards. The Council of the European Union reached a general approach in late 2021, followed by European Parliament amendments in early 2022. Political agreement between the institutions came in April 2022; the Parliament adopted the text in July 2022 and the Council gave its approval that October. The regulation entered into force on 16 November 2022, twenty days after publication in the Official Journal.

Implementation followed a phased approach designed to give the largest platforms early deadlines while allowing more time for smaller providers. The Commission designated the first 19 VLOPs and VLOSEs in April 2023, based on user numbers submitted by platforms. These included major services from Meta (Facebook, Instagram), Alphabet (YouTube, Google Search, Google Maps, Google Play, Google Shopping), ByteDance (TikTok), Microsoft (Bing, LinkedIn), Apple (App Store), Amazon, Booking.com, Pinterest, Snapchat, Wikipedia, X, and Zalando. These designated platforms faced new rules starting in August 2023—four months after designation. All other in-scope services became subject to the full DSA from 17 February 2024.

Early enforcement actions demonstrate the regulation’s practical impact. The European Commission initiated formal proceedings against X in December 2023 over concerns about illegal content dissemination and transparency failures following the platform’s rebranding and policy changes. TikTok faced scrutiny over child safety measures and features potentially promoting addictive behavior. Amazon has been probed regarding illegal product listings. National authorities have begun similar investigations within their jurisdictions.

Looking ahead, the DSA is likely to exert the “Brussels effect”, influencing digital regulation globally as platforms standardize practices to meet EU requirements. The regulation interacts with related frameworks including the Digital Markets Act, the AI Act, and national initiatives like the UK Online Safety Act. As member states build enforcement capacity and the transparency database accumulates moderation data, expect more granular oversight of how platforms moderate content and protect users.

Debates continue about balancing online safety with free speech, the technical challenges of algorithmic transparency, and whether enforcement resources match regulatory ambitions. What’s clear is that the era of largely self-regulated tech companies setting their own rules in Europe has ended. The DSA creates enforceable standards that platforms must meet, with significant consequences for non-compliance.

Understanding these requirements now positions your organization to navigate the new obligations effectively—whether you’re running a small hosting service or managing compliance for a platform reaching millions of EU users.

