By Joan Barata
As part of their research on the topic, CELE has characterized a threat of regulation as “any kind of public or private utterance or action by public officials who hold regulatory power over others in which they express, suggest, or imply, clearly or veiledly, their desire to see their subject’s conduct move in a particular direction”.
In the United States, this conduct is usually called jawboning. As Ramiro Álvarez Ugarte described in a recent piece, this is an extremely complex area of law, in part because assessing when public officials cross the legal line requires a nuanced understanding of both the scope and instruments of the existing legal framework, as well as of the official's actual intention to obtain a specific result that is not necessarily pre-determined or explicitly pursued by the law.
In the European Union (EU), the Digital Services Act (DSA) establishes a series of fundamental rules and principles regarding how intermediaries participate in the publication and distribution of online content. It focuses especially on content hosting and sharing platforms, such as Facebook, TikTok, Twitter, and YouTube. It also introduces important new rights for users and obligations for service providers (particularly the so-called very large online platforms, or VLOPs) in areas such as terms and conditions, transparency requirements, statements of reasons, advertising, protection of minors, complaint-handling systems, and out-of-court dispute settlement, among many others.
The DSA may become, and in some respects has already become, a regulatory area where the possible exertion of informal pressures deserves to be properly studied and analysed. First, because the DSA explores uncharted regulatory territory on the basis, as will be shown, of general legal principles and provisions that must be enforced primarily by private actors (intermediaries). Second, because the complexity of the DSA's enforcement and monitoring system requires the combined action of national regulators in each member state and the European Commission when it comes to the legal obligations applicable to big tech platforms that provide services across all European states and globally. This choice by the legislator (justified by national regulators' possible lack of capacity to deal properly with such actors) has opened the door to concerns about the political profile of the Commission, as well as about the adoption of monitoring techniques closer to political statements than to administrative procedures with safeguards and guarantees. Recent public acts by the commissioner hitherto in charge of DSA implementation have in fact confirmed such concerns.
This post examines some of the provisions of the DSA that relevant authorities may use to push intermediaries to adopt decisions affecting users' content in ways that are not properly accountable.
___
Read the full piece at CELE – Centro de Estudios en Libertad de Expresión y Acceso a la Información | Center for Studies on Freedom of Expression and Access to Information
Joan Barata is a Senior Legal Fellow for The Future of Free Speech. He works on freedom of expression, media regulation, and intermediary liability issues.