
List of Relevant Laws Impacting Free Speech (EU) (2015-2022)

Legislation

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) became applicable in 2018. Article 17 enshrines the “right to erasure”, which gives the data subject the right to obtain from the controller the erasure of personal data concerning him or her without undue delay when, among other cases, the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed. The controller, in such cases, shall take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data. Exceptions apply when processing is necessary for exercising the right of freedom of expression and information, for compliance with a legal obligation, for reasons of public interest in the area of public health, for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, or for the establishment, exercise or defence of legal claims. This right (including its jurisprudential version as the “right to be forgotten”) might negatively affect access to information, particularly to information and personal data related to public figures and matters of public interest.

Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) delegates important legal adjudication powers to platforms and creates a liability regime that might lead to over-removals.

Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC lays down additional provisions harmonizing EU copyright law, particularly with regard to digital and cross-border uses of protected subject matter. The Directive has been criticized for its impact on freedom of expression, since it effectively forces platforms to use automated filters, which may be unable to reliably distinguish infringing material from lawful uses of protected content.

Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online aims to ensure the smooth functioning of the digital single market by addressing the misuse of hosting services for terrorist purposes. With this legislation, Europe seems to move towards a progressive delegation of true law enforcement powers to private companies, depriving Internet users (and hosting service providers themselves) of the legal and procedural safeguards that have applied to this kind of decision until now. Moreover, intermediary platforms may progressively be put in a position where cautiously overbroad removal decisions become the only way to avoid the high and somewhat vaguely defined responsibilities and penalties that may be imposed on them.

Council Regulation (EU) 2022/350 of 1 March 2022 concerning “restrictive measures in view of Russia’s actions destabilising the situation in Ukraine” prohibits broadcasting, or facilitating the broadcasting of, any content by the State-owned and controlled Russian media outlets concerned, “including through transmission or distribution by any means such as cable, satellite, IP-TV, internet service providers, internet video-sharing platforms or applications”. This is very problematic ad-hoc legislation for a variety of reasons: it bypasses the competence of national independent audiovisual regulators in this field; it rests on a very broad and general assessment of the information provided by the outlets concerned rather than on specific, properly analyzed pieces of content; and it was adopted as an emergency measure, without proper consultation and participation.

The Digital Services Act, or Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC, represents an overhaul of EU law governing intermediaries’ handling of user content. It builds on the pre-existing eCommerce Directive from 2000 and preserves key ideas and legal structures from that law. The DSA applies to numerous Internet intermediary services and provides both immunities and obligations. Many of its specific rules apply only to services in specific categories (access, caching, hosting, and marketplace providers, for example). The DSA asserts significant jurisdiction over companies based outside the EU, reaching services “directed” to EU Member States. It allows enforcers to impose extremely steep fines, in principle reaching up to 6% of annual worldwide turnover. It also sets up major new regulatory powers within the European Commission. The DSA contains problematic provisions regarding freedom of expression, including a broad definition of “illegal content” (article 3(h)); notice-and-action mechanisms without sufficient safeguards for the free speech rights of third parties (article 16); general obligations for platforms to act upon suspicion of criminal activities (article 18); obligations to detect broadly formulated “systemic risks” and to adopt mitigation measures, which cover not only illegal but also harmful content (articles 34 and 35); and a so-called “crisis mechanism” that would put in the hands of the Commission significant powers to control online speech (article 36).


Co-regulatory instruments

The EU Code of Conduct on Countering Illegal Hate Speech Online was originally agreed in May 2016 between the European Commission and Facebook, Microsoft, Twitter and YouTube; other tech companies have joined since. The Code follows the definition of illegal hate speech established by Framework Decision 2008/913/JHA of 28 November 2008. It aims at providing IT companies with criteria and instruments to support the European Commission and EU Member States in the effort to ensure that online platforms do not offer opportunities for illegal online hate speech to spread virally. The implementation of the Code of Conduct is evaluated through a regular monitoring exercise set up in collaboration with a network of organisations located in the different EU countries. The enforcement of this Code might be problematic in terms of freedom of expression, since it advocates the use of content moderation tools (generally used to tackle legal but harmful content) to tackle illegal content, which can create confusion, uncertainty and fair-process issues for users. Users may not have clear means to dispute legal interpretations or raise defenses based on European free expression guarantees. The Code may also ignore nuances in the definition of illegal content online across different Member States. In addition, the regular evaluations of the implementation of the Code focus on quantitative aspects (numbers of reports and removals) instead of providing a more granular and substantive analysis of the corresponding decisions, which can clearly stimulate over-removals.

The 2022 Code of Practice on Disinformation is the result of efforts by major online platforms, emerging and specialised platforms, players in the advertising industry, fact-checkers, research and civil society organisations to deliver a strengthened and improved version of the 2018 Code. Signatories committed to take action in several domains, such as demonetising the dissemination of disinformation; ensuring the transparency of political advertising; empowering users; enhancing cooperation with fact-checkers; and providing researchers with better access to data. It is important to note that the new Code will become part of a broader regulatory framework, in combination with the legislation on Transparency and Targeting of Political Advertising and the Digital Services Act. For signatories that are Very Large Online Platforms, the Code aims to become a mitigation measure and a Code of Conduct recognised under the co-regulatory framework of the DSA. The use of the Code can have a direct or indirect impact on the right to freedom of expression, since disinformation does not constitute an illegal activity as such in most Member States or under international human rights standards. This can particularly be the case when, by virtue of the mentioned legislation, the Code becomes a co-regulatory instrument which in practice may create a framework to legitimize and impose speech restrictions beyond the limits of legality.


Case law

Republic of Poland v. Parliament and Council. Case number C-401/19. Judgement of 26 April 2022. The Court concluded that the obligation on online content-sharing service providers, under article 17 of the Copyright Directive, to review content that users wish to upload to their platforms prior to its dissemination to the public is accompanied by the necessary safeguards to ensure that it is compatible with freedom of expression and information. However, article 17 had been widely criticized for obliging platforms to use automated filters to monitor users’ speech, which would lead to unnecessary removals that are difficult to challenge ex post.

Google LLC v. National Commission on Informatics and Liberty (CNIL). Case number C-507/17. Judgement of 24 September 2019. The Court established that there is currently no obligation under EU law for a search engine operator who grants a request for de-referencing made by a data subject, following an injunction from a supervisory or judicial authority of a Member State, to carry out such de-referencing on all versions of its search engine. However, EU law requires a search engine operator to carry out such de-referencing on the versions of its search engine corresponding to all the Member States and to take sufficiently effective measures to ensure the effective protection of the data subject’s fundamental rights. Furthermore, the Court validated the adoption of measures which effectively prevent or seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access, via the list of results displayed following that search, through a version of that search engine outside the EU, to the links which are the subject of the request for de-referencing. This decision presumes a non-existent uniformity in the balance between freedom of information and privacy protection across Member States. It also uses very ambiguous criteria regarding the possibility of applying de-referencing requests beyond the limits of the EU, thus creating a potential risk of misuse or abuse of the so-called “right to be forgotten” in other jurisdictions.

Glawischnig-Piesczek v. Facebook Ireland Limited. Case number C-18/18. Judgement of 3 October 2019. In this decision, the Court established that the E-Commerce Directive does not preclude a court of a Member State from ordering a host provider to remove information which it stores, the content of which is identical or equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information, provided that the monitoring of and search for the information concerned are limited to information conveying a message the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality. Such an order may extend to removing the information covered by the injunction, or blocking access to it, worldwide within the framework of the relevant international law, which it is up to Member States to take into account. This decision may trigger very important challenges in terms of practical implementation due to the use of ambiguous terms such as information “equivalent” to illegal content. It also endorses the possible creation of a general monitoring obligation and the use of automated filters in certain cases, as well as the possible extraterritorial application of European limits to freedom of expression.
