On 16 December 2020, the European Commission (EC) delivered on the plans set out in the European Digital Strategy by publishing two legislative proposals on the governance of digital services in the EU: the Digital Services Act (DSA) and the Digital Markets Act (DMA). While the DMA is intended to create a framework to regulate the behavior of so-called “gatekeepers”, the DSA aims to “define the responsibilities and obligations of providers of digital services, and online platforms in particular”, by regulating how providers of intermediary services interact with their customers and users, as well as their obligations regarding harmful or illegal content.

1. Contextual Grounds for Revision

In addition to the proliferation of cheap computing, accessible broadband, and ubiquitous mobile technology, three combined regulatory choices shaped the development of the internet from the 1990s onwards: low regulatory intervention; free e-commerce, attached in the EU to the country-of-origin principle; and a liability exemption regime conditional on the fulfillment of certain requirements. The third choice rests on the assumed neutrality of intermediaries as a guarantee of a free internet: intermediaries should enjoy a high degree of protection, becoming liable only when they actively moderate or post content, or are made aware of specific illegal content which they refuse to remove.

This model has been put to the test by the technological and socio-political developments underpinning the digital revolution that followed the proliferation of internet use. As digital services became increasingly important in intermediating the trade of goods and services and the sharing of information, illegal activities and other societal risks also emerged. Direct exchanges between users and incentives to reach and engage wide audiences, underpinned by the “attention economy” and service design, together with the systemic role platforms play in optimizing, selectively amplifying and shaping information flows online, have led to the emergence of new business models and to the proliferation of potential systemic societal risks: the amplification of illegal content, negative effects on the exercise of fundamental rights, and the intentional manipulation of these systems, leading to disinformation affecting health crises, elections or political participation, incitement to violence and self-harm, and more.

2. Specific Policy Objectives and General Outline

Dynamic grounds for a regulatory revision – resolving the outdated nature of the overall internet regulation, harmonizing intermediaries’ obligations, increasing accountability, inducing more and better content moderation, or increasing transparency in B2C and B2B relations – are all subsidiary to three decisive policy objectives. First, to move from a non-interventionist model to a more proactive regulatory model, proportionate to the intermediaries’ “role, size and impact in the online ecosystem”, offering a technology-neutral, modernized version of intermediary liability and tackling current problems (fake news, IP violations, disinformation, etc.) while upholding fundamental rights (freedom of expression, the right to information, privacy, etc.). Second, to institutionalize a structured dialogue between policymakers and specific categories of intermediary services such as online platforms, rebalancing the informational level playing field and carving out a ladder of tools for national authorities at EU level. Third, to create a more level playing field for European companies that face unfair competition from operators that do not respect European standards, modelled on the extraterritorial effect of the GDPR: the DSA applies to any entity establishing a relationship with European users, regardless of its place of establishment.

More concretely, the DSA updates parts of the E-Commerce Directive (ECD), while maintaining key principles that contributed to the development of the digital economy in the single market, such as the country-of-origin principle1, the ban on general monitoring and the limited liability regime, with slight adaptations. However, the DSA goes beyond harmonizing the liability exemption rules by introducing a new framework of due diligence obligations for a specific category of intermediaries – online platforms – while also relying extensively on reputational damage to incentivize compliance. Ultimately, its success lies in striking the right balance between competing fundamental rights, as the drive to make the internet more responsible, safer and more accountable easily leads to clashes and accusations of censorship.

The internal structure of the DSA is divided into three major parts: 1) updating and harmonizing the conditions for liability exemption of intermediary services in Chapter II – answering the goal of protecting internet freedom and freedom of expression; 2) due diligence obligations for specific categories of intermediaries in Chapter III – answering the goal of increasing operators’ transparency and responsibility; and 3) a comprehensive enforcement setup and rules in Chapter IV – ensuring an enforcement mechanism that allows for a structured dialogue with national and European authorities.

3. DSA Key Elements and Innovations

  • The DSA proposes a gradual regime, introducing a scale of obligations depending on the type and size of intermediary services – due diligence obligations grounded in the public role that specific intermediary services play as public spaces.
  • To accommodate the structure underlying the due diligence obligations, a new definition of “online platforms” was introduced within the list of intermediary services: services that not only host, but also disseminate content/information to the public at the request of the recipient of the service. Online platforms do not lose protection from liability, but are subject to extra obligations, the violation of which results not in civil liability, but in administrative measures, fines, and potential suspension of services. This creates a double-layered regime: a general liability protection (unchanged since the ECD) and a set of new obligations focused on increased responsibility. However, complying with these obligations is not a condition for benefiting from the liability exemption, as the two are intentionally kept separate in the DSA.
    • Within “online platforms”, the concept of Very Large Online Platforms (VLOPs) is introduced, defined quantitatively as platforms reaching 10% of the European population (45 million average monthly users). These are subject to a series of obligations that go beyond moderating illegal content and in fact also tackle harmful content. Examples:
      • Independent audits of compliance with the due diligence obligations, at their own expense
      • Assessment of systemic risks and application of risk prevention and mitigation measures, in cooperation with national and European authorities
      • Transparency rules on recommender and advertising systems.
  • Obligation for all hosting service providers to put in place an EU-wide harmonized notice and action (N&A) procedure, although without harmonizing the grounds for an acceptable notification and without providing for stay-down obligations or specific time frames for content removal. N&A mechanisms are considered a minimum requirement necessary to ensure users can effectively flag allegedly illegal content they encounter online.
  • Illegal vs harmful content – the DSA is neutral as to the type of content per se. However, it sets out tools in Chapters II and III for targeting both illegal and harmful content. While there is some harmonization at EU level, larger parts remain in national hands. In fact, the DSA is designed to leave space for the Union or national legislator to provide specific rules in areas that require different or complementary regulatory approaches.
  • Terms and conditions of online platforms – these must be coherent with the Charter of Fundamental Rights; on their basis, platforms can make reasoned removal or blocking decisions about content or users, subject to internal complaint mechanisms. Platforms will also need to be transparent as to the “policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review”.
  • The traceability of online vendors on marketplaces, or the Know Your Business Customer (KYBC), to fight piracy and counterfeiting online.
  • One of the new features of the DSA is the privileged role given to trusted flaggers in detecting illegal content. This provision attributes the status of trusted flagger only to some entities, leaving out of scope others with extensive experience, especially on counterfeit products and piracy. Online platforms must ensure that notices submitted by trusted flaggers are processed with priority; trusted flaggers do not, however, benefit from such a priority channel with micro and small service providers.
  • On targeted online ads: the business model of most online platforms today is based on advertising. The DSA seeks to introduce greater transparency into how online advertising works, especially targeted advertising, including commercial and political ads.
  • Finally, the enforcement governance – the proposal provides for the establishment of Digital Services Coordinators (DSCs) and a European Board for Digital Services, as well as additional powers for the EC in supervising VLOPs.
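Several of the mechanisms above are procedural rather than substantive. As an illustration only, the priority-channel requirement for trusted flaggers could be sketched as a simple priority queue; all class and field names below are hypothetical, not taken from the DSA:

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Notice:
    """A single N&A notice; ordering is by (priority, notice_id)."""
    priority: int                       # 0 = trusted flagger, 1 = regular user
    notice_id: int
    url: str = field(compare=False)     # location of the allegedly illegal content
    reason: str = field(compare=False)  # ground stated in the notification

class NoticeQueue:
    """Toy notice-and-action queue: trusted-flagger notices are processed
    first, mirroring the DSA's priority requirement for online platforms."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # preserves submission order within a priority class

    def submit(self, url: str, reason: str, trusted_flagger: bool = False) -> int:
        self._counter += 1
        heapq.heappush(
            self._heap,
            Notice(0 if trusted_flagger else 1, self._counter, url, reason),
        )
        return self._counter

    def next_notice(self) -> Notice:
        """Return the next notice to moderate (trusted flaggers first)."""
        return heapq.heappop(self._heap)
```

For example, if a regular user and then a trusted flagger each submit a notice, `next_notice()` returns the trusted flagger's notice first, even though it arrived later.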

4. General Intermediary Liability and its Scope

The DSA targets online intermediary services. Mirroring Articles 12-15 of the ECD, the DSA applies to the following subsets of intermediary services:

  1. “Mere conduit”: internet exchange points, Wi-Fi access points, virtual private networks, Voice over IP, and other interpersonal communication services.
  2. “Caching”: content delivery networks, reverse proxies, content adaptation proxies2.
  3. “Hosting”: cloud computing, web hosting, services enabling sharing content and information online, file storage and sharing.
  4. “Online platforms”, such as online marketplaces, app stores, collaborative economy platforms, and social networks, are explicitly included among providers of hosting services. However, these operators are defined as a special category of intermediaries for the first time. The distinctive factor is that online platforms not only host, but also disseminate content. Within online platforms, the concept of VLOPs is introduced, based on the number of European users. To benefit from the liability exemption, these platforms must act as any other hosting service.

On the liability regime, there are only two major novelties, as compared to the ECD:

  • The “Good Samaritan” clause – essentially, voluntary own-initiative (not mandated) measures aimed at detecting, identifying, and removing or disabling access to illegal content, or measures to ensure compliance with EU law, do not deprive providers of intermediary services of the liability exemption. It is key to note that technological development has made the traditional distinction between active and passive intermediaries obsolete, as virtually all online platforms optimize content nowadays.
  • The introduction and partial harmonization of minimal procedural information obligations of providers upon receiving an order from national administrative or judicial authorities. None of these information obligations are empowering provisions, in the sense that the legal obligation to comply with the orders as such must be based on national law. This may lead to legal uncertainty in the application of the country-of-origin principle, since these orders can be issued by all Member States, and not only by the Member State of establishment of the provider3. However, in line with the above, the enforcement of the information obligations is different from the enforcement of the content of the order as such.

5. Due Diligence Obligations

Obligations under the Digital Services Act

Image Credit: https://www.connectontech.com/wp-content/uploads/sites/38/2021/01/Baker-McKenzie-Digital-Services-Act.pdf

Chapter III contains the bulk of what is truly innovative in the DSA. The idea that operators need to act more responsibly is translated into extra, cumulative obligations, moving to an altogether more proportionate set of regimes according to role, size and impact in the online ecosystem. This indicates the regulator’s goal of inducing proactive action through a spectrum of administrative and procedural obligations, while keeping the insulation from liability partially in place.

The first level of obligations applies to all intermediary services, imposing transparency obligations, the duty to cooperate with authorities, and the alignment of their terms and conditions with the Charter of Fundamental Rights.

The second level adds the N&A mechanism to the first level of obligations.

The third level focuses on online platforms, adding a more detailed set of obligations to the first and second levels. Worth stressing that micro and small enterprises are exempted from these third-level obligations, which provide proper recourse against the takedown decisions described in the previous section, introduce the legal basis for the suspension of users, and add consumer protection obligations such as the KYBC.

The fourth level sets obligations for VLOPs. Unlike the DMA, which seeks to prevent platforms’ anticompetitive behavior towards businesses and end users, the DSA’s cluster on VLOPs essentially addresses systemic risk assessment and mitigation, specifically concerning content moderation systems, recommender systems, and systems for selecting and displaying advertising. In addition, further transparency obligations are imposed on the use of recommender systems and online advertising, and VLOPs will need to provide the data necessary to monitor and assess compliance upon request by the DSC. Finally, codes of conduct and crisis protocols are introduced as encouraged modes of self-regulation, and the EC is also supporting voluntary industry standards in a number of specific areas.
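The four cumulative levels described above can be sketched as a simple classification. This is an illustrative simplification only: the category names, the per-tier obligation lists, and the SME rule are condensed from the text, and the 45-million-user threshold (roughly 10% of the EU population) is the figure used in the Commission's proposal:

```python
# Illustrative sketch of the DSA's cumulative due diligence ladder.
# Tier names and contents are simplified summaries, not legal definitions.

TIER_OBLIGATIONS = {
    "intermediary": ["transparency reporting", "cooperation with authorities",
                     "terms aligned with the Charter of Fundamental Rights"],
    "hosting": ["notice & action mechanism"],
    "online_platform": ["internal complaint handling", "trusted-flagger priority",
                        "user suspension rules", "KYBC traceability"],
    "vlop": ["systemic risk assessment & mitigation", "independent audits",
             "recommender/ads transparency", "data access for supervisors"],
}

LADDER = ["intermediary", "hosting", "online_platform", "vlop"]
VLOP_THRESHOLD = 45_000_000  # avg. monthly EU users, ~10% of the population

def classify(hosts: bool, disseminates: bool, eu_users: int, is_sme: bool) -> str:
    """Map a provider's features to its (simplified) DSA category."""
    if hosts and disseminates:
        if eu_users >= VLOP_THRESHOLD:
            return "vlop"
        # micro and small enterprises are exempt from platform-specific duties
        return "hosting" if is_sme else "online_platform"
    return "hosting" if hosts else "intermediary"

def obligations(category: str) -> list[str]:
    """Obligations are cumulative: each tier inherits all lower tiers."""
    idx = LADDER.index(category)
    return [o for tier in LADDER[: idx + 1] for o in TIER_OBLIGATIONS[tier]]
```

For instance, `obligations(classify(hosts=True, disseminates=True, eu_users=50_000_000, is_sme=False))` yields all four tiers, while a small hosting-only provider gets just the first two.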

6. Implementation, Cooperation, Sanctions and Enforcement

The DSA introduces a set of detailed and complex enforcement measures and mechanisms, establishing a common European enforcement framework based on a mechanism with three levels:

  • National – creation of a Digital Services Coordinator (DSC) per Member State, the primary supervisors and enforcers, with powers of investigation and enforcement at a national level. The DSC of the country of establishment is the entity responsible for enforcing Chapters II, III and IV. Each DSC has the duty to enforce the DSA where appropriate vis-à-vis providers established in their territory, regardless of the place where the infringement or its effects take place. To support this overarching objective of the enforcement system, and to ensure that the DSC of origin is aware of all elements at stake when enforcing the rules, the DSA provides different tools to ensure cooperation between the DSCs in the country of destination of the service and those in the country of establishment, including the cooperation mechanism (Article 45), the rules on handling of complaints (Article 43), the possibility to set up joint investigations (Article 46), as well as additional enforcement and cooperation tools in the case of VLOPs (Article 50 and following). On the other hand, the DSA provides specific mechanisms to trigger the action of the DSC of establishment where deemed necessary. First, both the Board and the DSC of destination can respectively recommend to or request the DSC of establishment to take measures to ensure compliance with this Regulation, and report within predefined timelines. In case of disagreement, moreover, the DSC of establishment and the Board have the possibility to refer the matter to the Commission, which can request that specific investigative or enforcement measures be taken by the DSC of establishment. Finally, where the DSC of establishment has not taken any investigatory or enforcement measures concerning VLOPs pursuant to the request of the Commission, the Commission may take over the enforcement.
  • European Board of Digital Services – an ad-hoc independent advisory group to the EC and the DSC, tasked to recommend actions, gather views, etc.
  • European Commission – responsible for cross-border cooperation, with complementary enforcement and investigation powers vis-à-vis VLOPs, and for structuring the EU framework of the regulatory community. The involvement of the Commission in the enforcement of DSA obligations is envisaged in view of the particular challenges that may emerge in assessing and ensuring VLOPs’ compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level. This justifies the possibility for a given Digital Services Coordinator of establishment to request, on a voluntary basis, that the Commission intervene and exercise its investigatory and enforcement powers. Similarly, the Commission should also be able to intervene in cross-border situations concerning very large platforms where the Digital Services Coordinator of establishment has not taken any measures to address an infringement of the DSA despite the Commission’s request.

The Member State in which the main establishment of the provider of intermediary services is located has jurisdiction for enforcement purposes. For non-EU providers offering services in the EU, the place of the legal representative is the place of enforcement. If no such representative is designated, all Member States are competent, but repetitive proceedings in other Member States are prevented through a system of notification.

DSCs are given significant powers. Any actor with relevant information about infringements (not only intermediaries) may be required to cooperate. Even more significant is the power to conduct on-site inspections of any premises used by intermediaries, including the right to “examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium”.

The penalties system is thoroughly reformed. Member States are tasked with laying down rules on effective, proportionate and dissuasive penalties. Although penalty amounts are not harmonized, the DSA stipulates that fines may not exceed 6% of the annual income or turnover, with periodic penalty payments capped at 5% of the average daily turnover. Penalties for supplying inaccurate, incomplete or misleading information are capped at 1% of the annual income or turnover.
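Because these are ceilings expressed as percentages of turnover, the amounts at stake scale with the provider. A minimal sketch of the arithmetic, assuming a hypothetical provider with an invented annual turnover figure:

```python
# Illustrative calculation of the DSA proposal's penalty ceilings.
# The turnover figure is hypothetical; the percentages follow the proposal.

annual_turnover = 2_000_000_000  # EUR, hypothetical provider

max_fine = 0.06 * annual_turnover       # cap for infringements: 6%
max_info_fine = 0.01 * annual_turnover  # cap for inaccurate/misleading info: 1%

avg_daily_turnover = annual_turnover / 365
max_periodic_per_day = 0.05 * avg_daily_turnover  # periodic penalty cap: 5%/day

print(f"Maximum fine:              EUR {max_fine:,.0f}")
print(f"Maximum info-related fine: EUR {max_info_fine:,.0f}")
print(f"Max periodic penalty/day:  EUR {max_periodic_per_day:,.0f}")
```

For this hypothetical provider, the ceilings are EUR 120 million for an infringement, EUR 20 million for misleading information, and roughly EUR 274,000 per day in periodic penalty payments.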

VLOPs are subject to a separate and detailed enforcement regime. If a DSC finds such a platform to be in violation of any obligation specific to VLOPs, the platform is first made subject to an enhanced supervision regime. The infringement finding may result from the DSC’s own motion or from a request by the EC or the Board. The result is a request to draw up an action plan (and possibly a code of conduct). If the action plan is unsatisfactory, further independent audits may be ordered. The final step is a finding by the relevant DSC as to whether the problems have been rectified. If the answer is negative, action by the EC remains a possibility.

While national DSCs are the main enforcement actors for general issues, very significant powers have been given to the EC in relation to VLOPs. The EC may initiate enforcement proceedings on its own motion or following the various referral mechanisms also regulated in the DSA. If so, the EC has independent investigatory powers mirroring those of the DSCs in comparable cases. The VLOP under investigation may offer commitments to address the flagged problems. If the EC accepts, the case is closed, but it can be reopened where new facts arise, the commitments are breached, or it turns out that false information was provided.

A finding of non-compliance may be made, leading to penalties imposed on VLOPs to a similar extent as the regular penalties described above. In cases of persistent failure, the EC may ask DSCs to request national judicial authorities to order a temporary suspension of the service or a restriction of access.

7. Overview of MS Positions

  • FR, AT, DE and PL have expressed doubts about the relationship between the DSA and existing national provisions, fearing that the level of requirements in the DSA falls below their national standards. This may become a point of friction, as the goal of the DSA is to harmonize the digital single market – hence the choice of a regulation rather than a directive.
  • DE and FR have expressed their preference for broadening the regulator’s supervisory powers to all online content moderation policies, that is, to extend the scope of application of the rules to legal but harmful content (such as disinformation). This position may prove to be a controversial point, as most MS prefer to limit the scope of the DSA to illegal content.
  • FR has pushed for the inclusion of “online search engines” alongside “caching services”.
  • DE and FR believe the proposal can go further in terms of obligations and in strengthening consumer rights and protection vis-à-vis platforms.
  • FR, ES and AT have argued for greater involvement of the authorities of the country of destination, advocating the need to rethink the country-of-origin principle so that national authorities can act quickly and effectively. This will be a controversial point since the country-of-origin principle has broad and solid support from most MS. On the other hand, keeping with the country-of-origin principle brings complications in terms of the jurisdiction applicable in executing cross-border orders, especially considering the multiplicity of situations arising from the diversity of legal systems.
  • DK, ES and IT have been pushing for the introduction of stay-down mechanisms, in line with what is foreseen by the European Parliament.
  • PL and HU mentioned the possibility of having national judicial or administrative authorities restoring legal content taken down from an online platform.
  • IE, LU, CZ can be regarded as the most liberal group. They have raised greater doubts and asked for more evidence to support the EC´s choices.

Calendar for negotiations in the Council: according to Euractiv, a majority of EU countries have succeeded in rejecting the commitment to reach an agreement on both the DSA and the DMA by spring 2022. Instead, the European Council Conclusions of 22 October point towards reaching an agreement “as soon as possible”.

8. State of Play in the European Parliament

The Internal Market and Consumer Protection (IMCO) Committee, the committee responsible, appointed MEP Christel Schaldemose (S&D, Denmark) as rapporteur for the Digital Services Act in January 2021.

Christel Schaldemose presented her draft report on 28 May 2021. The rapporteur wants to introduce stricter rules on online marketplaces to better protect consumers, further transparency measures and requirements to ensure user protection, and to strengthen the implementation and enforcement provisions so that no Member State becomes a safe haven for online platforms. Importantly, the rapporteur suggested introducing a ban on targeted advertising, increasing transparency in this field, and giving users more control in the context of recommender systems. The next session for the consideration of amendments or compromise amendments will take place on 27-28 October. The Committee vote is scheduled for 8 November, and the plenary vote is expected in December, on a date to be confirmed. Recent information indicates that the Committee vote has been officially postponed, in which case so is the plenary vote.

The Committee on Civil Liberties, Justice and Home Affairs (LIBE), the Committee for Industry, Research and Energy (ITRE) and the Legal Affairs Committee (JURI) are associated committees. The LIBE Committee and the Culture Committee (CULT) adopted their opinions in July 2021 and September 2021, respectively. The ITRE Committee published its draft opinion on 27 May 2021, focusing on the administrative burden and requirements placed on smaller players and the resulting compliance costs – for instance, by introducing collective representation so that providers of intermediary services established in a third country can comply with the obligation to designate a legal representative in the EU when they have been unable to obtain such services after reasonable effort.

  1. In the logic of combating piracy and illegal products online, it is also clear that the authorities of the country of destination need greater leeway in order to avoid the slowness of judicial remedies, which could otherwise be undermined by some enforcement provisions. This could be a point of friction during the negotiations.
  2. On the chapter on due diligence obligations, both “mere conduit” and “caching” services bear the lowest level of obligations. In this context, it is important to stress the distinction between these two services and hosting services, within which fall the subsets of services most heavily burdened by the due diligence chapter.
  3. Example: if a comment on a social media platform is illegal in the Member State where the commenting user lives, but not in the Member State where the online platform is established, the user’s Member State of residence could address a direct order to the online platform to block the comment in its territory – but only if there is a national law enabling that Member State to send such an order to an online platform offering its services in its territory.
