The Digital Services Act (DSA), which aims to create a safer online environment in Europe, addresses the lack of transparency in content moderation by online platforms. To this end, the DSA imposes several new due diligence obligations. This article explores the implications of these transparency obligations for the spread of disinformation, in particular on the Very Large Online Platforms (VLOPs) that will be subject to additional scrutiny. The article highlights the potential benefits of the new regulatory framework, which grants vetted researchers access to platforms’ data, empowers users by reducing information asymmetry, and mitigates certain risks. However, questions remain regarding the information overload facing regulators and the effectiveness of future DSA enforcement. In view of these possible enforcement issues, the article proposes going further, for example by adding a general principle of transparency (beyond the list of due diligence obligations) and by strengthening the co-regulatory and multistakeholder model of regulation (beyond what the DSA helpfully provides).
This working paper discusses the legal framework governing algorithmic transparency in the European Union. By reviewing the shortcomings and gaps of the legislation currently in force across the regulatory spectrum, it identifies the key challenges which forthcoming regulation should consider and build on. In particular, this working paper analyses the proposed Digital Services Act (“DSA”), unveiled by the European Commission in December 2020. It suggests amendments to the current draft of the DSA to ensure a more robust framework for enhancing the transparency of very large online platforms. The paper focuses on improving the DSA’s rules on auditing, data access, regulatory scrutiny of very large online platforms’ systemic risks, and user explanation rights, while calling for the necessary strengthening of platform accountability. Moreover, it outlines a transparency initiative beyond the DSA that pursues a more holistic and systemic approach to transparency, through a scaled duty of transparency for online content platforms and a broader oversight framework.
Price personalisation raises four policy concerns: building trust, fostering competitiveness, increasing access, and avoiding exploitation. The Modernisation Directive introduces an information requirement for personalised prices. This research explains how that information requirement can and should be used to make price personalisation pro-competitive and pro-consumer. The analysis falls into two main parts. First, disclosing the impersonal price is a simple and effective way to reap the benefits of price personalisation while counteracting its negative effects. Second, the legal grounds in EU law for the right to know the impersonal price as well are identified. After explaining that consumers have a right to be offered a personalised price, it is shown that the principles of transparency and effectiveness in EU consumer law, together with the right granted by Article 22(3) GDPR, imply that consumers also have the right to know the impersonal price. This right is a critical piece of the puzzle of how best to govern digital markets in the European Union.
The information paradigm is a core theoretical framework underlying both consumer and data protection legislation. It postulates that providing key information about a transaction or a data processing operation protects the rights and interests of the weaker party. However, very few users pay attention to the information given to them online. In this paper, we argue that there are alternative ways to raise consumers’ awareness. An interdisciplinary effort combining behavioural insights and best practices from information design could address the alleged failure of the information paradigm. To this end, the contribution presents the “Be.aware” app, a legal design solution developed within a research project on the sharing economy in Brussels.
This contribution examines how to regulate the spread of “disinformation” and “fake reviews” on digital platforms. These two forms of information disorder, which affect social media platforms and collaborative-economy platforms respectively, result from new forms of online expression and manifestation. Any attempt at regulation requires a careful delimitation of the most abusive and deceptive practices, which alone warrant legal attention and regulation. The contribution examines the initiatives and legislation adopted by the European Union and the Member States (Germany, Belgium, France, the United Kingdom) to limit these phenomena. Transparency, in particular about the source of advertising funding or of reviews, can help limit information disorders. The contribution instead proposes modifications to platform architecture that increase the cost of propagating low-quality information.