Abstract

This article examines the relationship between free speech and content restrictions in the context of the EU Digital Services Act (DSA) and the Digital Markets Act (DMA). It analyses the duties of 'gatekeepers' under the DMA, the challenges of content moderation, and the fine line between encouraging open communication and limiting harmful material. The article seeks to contribute to the ongoing discussion on regulating digital platforms, focusing on the gatekeeper designation under the DMA. It argues that this designation, shaped by the political objectives of the European Commission and the legal interpretations of the CJEU, can help strike a middle ground between improving freedom of speech and avoiding excessive compliance and censorship.
Keywords: Digital Services Act, Digital Markets Act, gatekeepers, content moderation,
freedom of speech, European Union.
I. Introduction
The growth of internet platforms in the digital era has greatly transformed how we communicate, access information, and engage with society. Social media platforms and content-sharing websites now serve as modern public squares, hosting a wide range of conversations and interactions. This newly acquired authority, however, carries with it the duty of content moderation, a crucial and intricate responsibility1.
Content moderation consists of supervising and handling user-generated content to ensure adherence to the platform's rules and legal responsibilities. While essential for upholding a secure and courteous online environment, it also raises notable concerns about freedom of speech.2 The difficulty lies in striking a balance between encouraging open discussion and stopping harmful or illegal content. Too much moderation can limit free speech and amount to censorship, while too little allows harmful content to spread3. To tackle these challenges, the European Union has adopted two important regulations: the Digital Services Act (DSA) and the Digital Markets Act (DMA)4. The DSA aims to establish a more secure online environment, with defined duties for internet platforms, whereas the DMA aims to guarantee fair and contestable digital markets. One important aspect of the DMA is the designation of certain major platforms as 'gatekeepers', which attracts increased regulation because of their substantial influence on the market5.
This article suggests that the DMA's gatekeeper designation, shaped by the European Commission's political goals and the CJEU's legal interpretations, strikes a balance between boosting free speech by preventing arbitrary content moderation and the danger of platforms over-complying and censoring lawful speech. The goal is to examine this equilibrium, investigating how the Gatekeeper Rules, freedom of speech, and the larger aims of the DSA and DMA interact.

1
Caitlin Chin-Rothmann, Taylar Rajic, and Evan Brown, “A New Chapter in Content Moderation: Unpacking the UK Online
Safety Bill” CSIS (2023) available at https://www.csis.org/analysis/new-chapter-content-moderation-unpacking-uk-online-
safety-bill
2
UK Parliament, “Digital Technology and the Resurrection of Trust Contents: Accountability and the technology
platforms” Parliament Publications (2020) available at
https://publications.parliament.uk/pa/ld5801/ldselect/lddemdigi/77/7707.htm
3
Clifford Chance, “Content Moderation and Online Platforms: An impossible problem? Regulators and legislators look to
new laws” Clifford Chance (2020) available at https://www.cliffordchance.com/insights/resources/blogs/talking-tech/
en/articles/2020/07/content-moderation-and-online-platforms--an-impossible-problem--.html
4
European Commission, “The Digital Services Act package” Digital Strategy available at https://digital-
strategy.ec.europa.eu/en/policies/digital-services-act-package#:~:text=The%20Digital%20Services%20Act%20and,level
%20playing%20field%20for%20businesses.&text=Follow%20the%20latest%20progress%20and%20learn%20more
%20about%20getting%20involved.
5
Ibid
Following this introduction, the next section explores the emergence of digital platforms and the significance of content moderation. The subsequent part covers the conflict between freedom of speech and content moderation, followed by an examination of the DSA and the DMA. The fifth part outlines the research hypothesis and the goal of the article. The final part provides a summary of the findings and their significance.
Striking a balance between freedom of speech and content moderation is a complex issue that requires careful consideration. This article aims to add to the current conversation regarding the significant role played by the DMA's gatekeeper regime. As digital platforms evolve, our understanding and oversight of these environments must evolve with them. This analysis is intended to offer valuable perspectives on the equilibrium between freedom of speech and content moderation within the EU's DSA and DMA.
II. The EU's Approach to Online Content Moderation
The Evolving Landscape of Free Speech and Content Moderation Online
The digital realm has significantly changed the landscape for freedom of speech. Through social media platforms and online forums, individuals now have unprecedented reach to a global audience. Sharing information and ideas has become more accessible, enabling previously unheard or suppressed voices to be heard6. However, the digital world has also introduced new challenges to the fundamental principles of freedom of speech7. The convergence of online anonymity and the extensive reach of digital platforms has fuelled the spread of hate speech, misinformation, and other damaging content8. Given these challenges, the European Union (EU) has put in place measures to regulate online content, with a particular focus on material posted on digital platforms. Two legal developments have been particularly influential: the German Federal Cartel Office's decision against Facebook (2019) and the judgment of the Court of Justice of the European Union (CJEU) in Frank Peterson v Google and YouTube (Joined Cases C-682/18 and C-683/18).
The German Facebook decision (2019) marked a major development in the oversight of digital services, especially in safeguarding user data. The German Federal Cartel Office (FCO) examined how Facebook's terms of service allowed it to collect and combine user data, and found that the company had exploited its dominant position in the social networking market9. The FCO prohibited the
6
Taha Yasseri, “From Print to Pixels: The Changing Landscape of the Public Sphere in the Digital Age” School of Sociology,
University College Dublin (December 2023) DOI:10.13140/RG.2.2.20652.44166
7
Huebler, John, "Free Speech and the Internet" (2020). Student Research Submissions. 338.
https://scholar.umw.edu/student_research/338
8
Carr Center For Human Rights Policy Harvard Kennedy School, “Freedom of Speech and Media:Reimagining Rights and
Responsibilities in the US” (February 2021) available at
https://carrcenter.hks.harvard.edu/files/cchr/files/free_speech.pdf
9
Kyriakos Fountoukakos, Marcel Nuys, Juliana Penz and Peter Rowland, “The German FCO’s decision against Facebook: a
first step towards the creation of digital house rules?” The Competition Law Journal (18)(2)55-65 available at
https://awards.concurrences.com/IMG/pdf/14._the_german_fco_s_decision_against_facebook_k_fountoukakos_m_nuys_j_penz_p_rowland_comp_law_journal_july_2019_.pdf?55716/90b472e3bdb19a8366d83f872d6ca3c466422e732e175fa9f995cf5e1c97d6b5
data processing practices detailed in Facebook's terms of service. This decision highlighted the importance of protecting user data and drew attention to potentially abusive behaviour by major online platforms10. The FCO found that Facebook's terms of service violated the GDPR, depriving users of control over their own data11. It also held that the conduct had negative effects both on competitors in the same market and on adjacent markets. The case demonstrated that national authorities can constrain how dominant platforms treat users and their data, signifying a major advancement in the oversight of digital services.
The Peterson v Google and YouTube ruling provided clarification on the responsibilities of online platforms in relation to copyrighted material12. The litigation centred on YouTube users uploading multiple phonograms, which music producer Frank Peterson claimed he owned, without his permission13. Drawing on its earlier GS Media hyperlinking case law, the CJEU reasoned that before determining whether making a work available online without the rights holder's consent constitutes a communication to the public, it must be established whether the content was shared without pursuit of financial gain by a person who did not know, and could not reasonably have known, that the publication was unauthorized14. This ruling has significant consequences for content moderation, as it obliges platforms to stop the sharing of copyrighted material without permission. It indicates that platforms can face legal liability for copyright infringement if they are aware of a violation and fail to act to stop it. The decision emphasizes the significance of effective content moderation systems in safeguarding the rights of copyright owners, while also pointing out the difficulties platforms encounter in monitoring user-created content15.
The German Facebook decision and the Peterson v Google and YouTube ruling are both important milestones in the oversight of digital platforms. They emphasize the significance of safeguarding user data and moderating content effectively, while drawing attention to the difficulties platforms encounter in balancing these duties against the desire to uphold open and competitive digital markets. Due to the ever-changing nature of the

10
Kerber, W., Zolna, K.K. The German Facebook case: the law and economics of the relationship between competition
and data protection law. Eur J Law Econ 54, 217–250 (2022). https://doi.org/10.1007/s10657-022-09727-8
11
Thomas Thiede, Laura Herzog (Spieker & Jaeger), “The German Facebook Antitrust Case – A Legal Opera” Kluwer
Competition Law Blog (February 2021) available at
https://competitionlawblog.kluwercompetitionlaw.com/2021/02/11/the-german-facebook-antitrust-case-a-legal-opera/
12
Opinion Of Advocate General Saugmandsgaard ØE delivered on 16 July 2020 (1) Joined Cases C-682/18 and C-683/18
Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and
Elsevier Inc. v Cyando AG (C-683/18) (Requests for a preliminary ruling from the Bundesgerichtshof (Federal Court of
Justice, Germany)) available at CURIA - Documents (europa.eu)
13
Zoi Krokida (Stirling University), “AG’s opinion on Peterson/ YouTube: Clarifying the liability of online intermediaries for
the violation of copyright-protected works?” Kluwer Copyright Blog (January 2021) available at AG’s opinion on Peterson/
YouTube: Clarifying the liability of online intermediaries for the violation of copyright-protected works? - Kluwer
Copyright Blog (kluweriplaw.com)
14
Ibid
15
Graham Smith,” CJEU's GS Media copyright linking decision draws a line: ordinary internet user or commercial
website?” Bird&Bird (2016) available at CJEU's GS Media copyright linking decision draws a line: ordinary internet user or
commercial website? - Bird & Bird (twobirds.com)
digital environment, it is crucial to regularly assess and revise regulations to ensure they stay relevant and effective16.
While the internet allows for unlimited self-expression and global connection, it has also become a central hub for the dissemination of hate speech and misinformation. Of growing concern is language that demeans others on the basis of race, ethnicity, gender, sexuality, nationality, or religion, and that frequently targets marginalized groups in society. This form of interaction can lead to isolation, social tension, and, in extreme cases, violence17.
Misinformation, especially fake news and conspiracy theories, poses a significant challenge as well. It can spread rapidly through online platforms, reaching large numbers of people in a short time18. The impacts of false information are far-reaching: it can erode trust in institutions, worsen social divides, and influence election results. In a global health emergency, health misinformation can have serious consequences, including possible loss of life. Content moderation tools, such as algorithms that identify and delete harmful content, are essential in addressing these issues19.
Nevertheless, these tools frequently struggle to identify and remove such content accurately. The challenges are numerous. Algorithms may fail to grasp the subtleties of human language, making it hard to distinguish legitimate criticism from hate speech. They can also find it difficult to verify the accuracy of information, complicating the detection and removal of false information20.
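
To make this limitation concrete, the sketch below implements a deliberately naive keyword filter of the kind these criticisms target. Everything in it is hypothetical: the blocked-word list and the sample posts are invented for illustration, not drawn from any real platform policy. The point is that a filter matching words without context flags a post quoting a slur in order to criticise it just as readily as a post using the slur.

```python
# A deliberately naive keyword-based filter: a hypothetical illustration
# of why context-blind moderation tools struggle with nuance.

BLOCKED_TERMS = {"vermin"}  # invented placeholder for a blocked-word list


def flag_post(text: str) -> bool:
    """Flag a post if any blocked term appears, ignoring all context."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return bool(words & BLOCKED_TERMS)


posts = [
    "Those people are vermin.",                             # hateful use
    'Calling refugees "vermin" is dehumanising rhetoric.',  # critical mention
]

for post in posts:
    print(flag_post(post), "-", post)

# Both posts are flagged (True), although the second is legitimate
# criticism: the filter cannot tell use from mention, which is exactly
# the nuance problem described above.
```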
Furthermore, due to the speedy nature of digital communication, harmful content
can easily circulate before being identified and taken down. Even if the content is
eliminated, it can be easily reposted or shared, turning content moderation into a never-
ending task similar to playing whack-a-mole21. These obstacles emphasize the importance
of strong and detailed content moderation policies. These policies must be clear,
consistently implemented, and uphold users' rights to express themselves freely. They
must also be flexible, able to adjust to the changing digital environment. This involves

16
Adam Satariano, “Facebook Loses Antitrust Decision in Germany Over Data Collection” New York Times (June 2020)
https://www.nytimes.com/2020/06/23/technology/facebook-antitrust-germany.html
17
Théophile Lenoir, “Challenges of Content Moderation” Instituit Montaigne (2019) available at
https://www.institutmontaigne.org/en/expressions/challenges-content-moderation
18
European Union Agency for Fundamental Rights, “Online Content Moderation – Current challenges in detecting hate
speech” FRA (2023) available at https://fra.europa.eu/sites/default/files/fra_uploads/fra-2023-online-content-
moderation_en.pdf
19
Zachary Laub, “Hate Speech on Social Media: Global Comparisons” Council on Foreign Relations (June 2019) available
at https://www.cfr.org/backgrounder/hate-speech-social-media-global-comparisons
20
Wilson, Richard A. and Land, Molly, "Hate Speech on Social Media: Content Moderation in Context" (2021). Faculty
Articles and Papers. 535. https://opencommons.uconn.edu/law_papers/535
21
NoToHate Fact Sheets, “ HATE SPEECH, MIS- AND DISINFORMATION” UN available at
https://www.un.org/sites/un2.un.org/files/notohate_fact_sheets_en.pdf
creating advanced content moderation tools, setting clearer user guidelines, and putting in
place systems for user appeals and complaints22.

Finally, despite the significant difficulties posed by the rise of hate speech and misinformation, these challenges can be overcome. With comprehensive and nuanced measures, the online environment can remain a platform for free expression and worldwide networking, while also safeguarding individuals from harmful material. Progress can be achieved through combined effort and cooperation from all involved, despite the complexity of the task at hand.

The Digital Services Act (DSA): Focus on Content Regulation

The European Commission introduced the Digital Services Act (DSA) to establish a
comprehensive legal framework for regulating digital services within the European Union.
It addresses various topics in the digital age, including the spread of illicit content and
damaging behaviours on the internet23. The DSA's approach to addressing illegal content and harmful behaviour is founded on several essential principles. Above all, it creates specific rules for removing illegal content. Under Article 14 of the DSA, digital service providers must establish procedures to quickly remove or block illegal content24. Providers must promptly remove or limit access to illegal content once they are informed of it. They must also disclose their content moderation practices, such as the number and categories of notices received and the steps taken in response. This reporting obligation is aimed at guaranteeing transparency and accountability in the way providers manage illegal content25.
The DSA also includes measures to improve transparency and to support users in contesting content decisions. As stated in Article 15, digital service providers must offer clear and readily accessible information about their content moderation policies and procedures26. This includes guidance on how to submit notices and complaints, the actions providers can take on their services, and the legal remedies available to users. Users who disagree with a content moderation decision can use an internal complaints system and seek resolution through an alternative dispute resolution process. This is designed to give users an accessible way to challenge content moderation decisions and to seek redress if they believe their rights have been infringed27.
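
To illustrate the mechanisms just described, the following sketch models the record a provider might keep for each notice it receives: the decision taken, the statement of reasons shown to the user, whether an internal complaint was lodged, and the per-category counts that would feed a transparency report. All names, fields, and statuses here are hypothetical illustrations of the duties described above, not terms taken from the DSA's text.

```python
# Hypothetical sketch of a notice-and-action record and the aggregate
# reporting it supports, illustrating the DSA duties described above.

from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional


class Decision(Enum):
    REMOVED = "removed"
    ACCESS_RESTRICTED = "access_restricted"
    NO_ACTION = "no_action"


@dataclass
class Notice:
    content_id: str
    category: str                       # e.g. "hate_speech", "copyright"
    received_at: datetime
    decision: Optional[Decision] = None
    reasons: str = ""                   # statement of reasons shown to the user
    appealed: bool = False              # internal complaint lodged by the user

    def decide(self, decision: Decision, reasons: str) -> None:
        """Record the moderation decision and the reasons given to the user."""
        self.decision = decision
        self.reasons = reasons


def transparency_summary(notices: list[Notice]) -> dict[str, int]:
    """Count notices per category, supporting the reporting duty above."""
    summary: dict[str, int] = {}
    for n in notices:
        summary[n.category] = summary.get(n.category, 0) + 1
    return summary
```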
22
UNESCO, “Addressing hate speech on social media: contemporary challenges” UNESCO (2021) available at
https://unesdoc.unesco.org/ark:/48223/pf0000379177
23
European Commission, “Digital Services Act Overview” European Commission available at
https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-
act_en#:~:text=The%20DSA%20regulates%20online%20intermediaries,and%20the%20spread%20of%20disinformation.
24
The Digital Services Act, Article 14
25
European Commission, “Questions and answers on the Digital Services Act” available at
https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348
26
The Digital Services Act, Article 15
27
Aina Turillazzi, Mariarosaria Taddeo, Luciano Floridi & Federico Casolari (2023) The digital services act: an analysis of its
ethical, legal, and social implications, Law, Innovation and Technology, 15:1, 83-106, DOI: 10.1080/17579961.2023.2184136
Nevertheless, the DSA has limitations when it comes to certain types of content, such as hate speech. Although the DSA requires digital service providers to address illegal hate speech, it fails to offer a precise definition of hate speech28. This may result in uneven enforcement of the rules across platforms and jurisdictions. Moreover, the DSA does not squarely address legal yet harmful content, such as misinformation and cyberbullying. This gap is important because such content can have serious effects on society even though it is not illegal29.

To sum up, although the DSA offers a strong framework for addressing illegal content and damaging behaviour on the internet, there are still areas that can be improved. It will be crucial to regularly assess and adjust the DSA to keep it effective and up to date as the digital environment changes. This includes addressing the challenges posed by specific content such as hate speech and legal yet harmful content, and ensuring that users' rights are protected. The DSA marks a major milestone in overseeing digital services, yet it is just the beginning of an ongoing effort to guarantee a secure and equitable digital space for every user.

The Relevance of Delfi AS v. Estonia (2015) ECtHR 64569/09

The 2015 ECtHR judgment in Delfi AS v. Estonia (application no. 64569/09) sets a benchmark for intermediary liability and content supervision in today's technological age. Delfi, a popular Estonian internet news site, was held accountable for offensive comments made by its users. The court ruled that Estonia did not breach Article 10 of the ECHR by finding Delfi accountable for defamatory remarks in the comments section of its articles30.

The EU's strategy for content moderation takes into account the ECtHR's reasoning in the Delfi case, which offers valuable guidance on establishing intermediary responsibility. Relevant factors include the platform's ability to identify illegal content and its efforts to remove it31. The court applied a three-step assessment to determine whether the interference with Delfi's rights was justified. First, it accepted that holding Delfi civilly liable for the defamatory comments interfered with its freedom of expression32. Second, it found that the interference was prescribed by legal

28
Joan Barata, Oliver Budzinski, Mark Cole, Alexandre de Streel, Michèle Ledger, Tarlach McGonagle, Katie Pentney,
Eleonora Rosati, “Unravelling the Digital Services Act package” European Audiovisual Observatory https://su.diva-
portal.org/smash/get/diva2:1605131/FULLTEXT01.pdf
29
TheLegal500, “User Content Moderation Under the Digital Services Act – 10 Key Takeaways” available at
https://www.legal500.com/developments/thought-leadership/user-content-moderation-under-the-digital-services-act-
10-key-takeaways/
30
See CASE OF DELFI AS v. ESTONIA (Application no. 64569/09) JUDGMENT STRASBOURG 16 June 2015, available at
https://www.echr.coe.int/documents/d/echr/Press_Q_A_Delfi_AS_ENG
31
Global Freedom of Expression, “ Delfi v Estonia” https://globalfreedomofexpression.columbia.edu/cases/delfi-as-v-
estonia/
32
Ibid n(29)
standards, namely the Estonian Civil Code Act and Obligations Act33. Third, the court recognized that imposing civil liability on Delfi pursued the legitimate aim of protecting the reputation and rights of individuals.

The case could have major consequences for the extent of liability platforms face under the Digital Services Act (DSA). The DSA seeks to balance the duties of online platforms with their role as crucial intermediaries and important providers and influencers of information.34 It acknowledges the important role that online platforms play in enabling access to information and communication in the current digital age. Nevertheless, it also recognizes that these platforms must ensure that their services are not used for sharing illegal content or engaging in harmful activities35.

To weigh these considerations, the DSA introduces new, asymmetric responsibilities for online platforms. These obligations are scaled to the size and influence of the platform, so that bigger platforms must meet stricter requirements. For instance, the largest platforms must perform risk assessments, put mitigation measures in place, and designate compliance officers36. At the same time, the DSA maintains the liability protection for hosting providers established in the eCommerce Directive. Online platforms are thus generally not held accountable for illegal content posted by users, as long as they are unaware of the illegal activity and quickly remove it or restrict access once informed37.

Nonetheless, the DSA draws a clearer distinction between the liability of online platforms as intermediaries and their accountability under consumer law. It specifies that online platforms, such as marketplaces, remain accountable under consumer law if they give an 'average consumer' the impression that the information, product, or service being transacted is offered by the platform itself or by someone under its supervision. This clause aims to stop online platforms from avoiding responsibility by claiming they are mere intermediaries. The Delfi case and the DSA mark important advancements in the areas of intermediary liability and content moderation. They emphasize the importance of a fair strategy that safeguards both individual rights and the common good, while acknowledging the duties and influence of online platforms in today's digital era.

As the digital environment keeps changing, it is crucial to regularly reassess and revise the DSA to ensure its efficacy and relevance are maintained. This entails dealing
33
Ibid
34
FuturefreeSpeech, “DELFI v. ESTONIA” available at https://futurefreespeech.org/delfi-v-estonia/
35
Oskar Josef Gstrein, “Case analysis of the ECtHR judgment in Delfi AS v. Estonia (app. No. 64569/09), The difficulties of
information management for intermediaries” available at https://jean-monnet-saar.eu/?p=881
36
Human Rights Law Center, “European Court of Human Rights examines the entitlement to freedom of speech Delfi AS
v Estonia [2013] ECHR, Application No. 64569/09 (10 October 2013) https://www.hrlc.org.au/human-rights-case-
summaries/european-court-of-human-rights-examines-the-entitlement-to-freedom-of-speech
37
Neville Cox, “Delfi v. Estonia: Privacy Protection and Chilling Effect” https://verfassungsblog.de/delfi-v-estonia-privacy-
protection-and-chilling-effect/
with the difficulties presented by particular types of content like hate speech and legal but
harmful content, while also guaranteeing that users' rights are properly safeguarded. The
DSA marks a substantial advancement in overseeing digital services, yet it is just the start
of a continuous effort to guarantee a secure and equitable digital space for every user38.

Considering the Delfi case, the DSA has to find a way to balance intermediary liability with protecting freedom of expression. Although the DSA retains the defences of "hosting", "caching", and "mere conduit" from the eCommerce Directive, it also introduces substantial new responsibilities for digital service providers39. These include maintaining a public "contact point" for communication with authorities and users, and following the established "notice and takedown" procedure40. These intermediary service providers also benefit from a new "Good Samaritan" clause, ensuring they do not forfeit these protections even if they conduct independent investigations to detect and delete unlawful material. The Delfi case and the DSA are therefore important advancements in intermediary liability and content moderation. They emphasize the importance of maintaining a fair strategy that safeguards the rights of people and the common good, while acknowledging the duties and functions of online platforms in today's digital era41.

III. The DMA's Gatekeeper Rules and their Impact on Free Speech

38
Faculty of Law, University of Oslo, “Intermediary Liability for Copyright Infringement in the EU’s Digital Single Market
Looking at the Proposed Copyright Directive’s Compliance with the current EU framework on Intermediary Liability”
https://www.duo.uio.no/bitstream/handle/10852/60876/1/Thesis-Intermediary-Liability-for-Copyright-Infringement-in-
the-EU-s-DSM.pdf
39
E&G Economides LLC, “NAVIGATING THE DIGITAL SERVICES ACT: INNOVATION AND ACCOUNTABILITY” available at
https://www.economideslegal.com/media/documents/Navigating_the_Digital_Services_Act.pdf
40
Ceyhun Pehlivan , Peter Church “EU - The DSA: A new era for online harms and intermediary liability” Linklaters (2022)
available at https://www.linklaters.com/en/insights/blogs/digilinks/2022/july/eu-the-dsa-a-new-era-for-online-harms-
and-intermediary-liability
41
Neville Cox, “Delfi v. Estonia: Privacy Protection and Chilling Effect” https://verfassungsblog.de/delfi-v-estonia-privacy-
protection-and-chilling-effect/
Definition and Designation of Gatekeepers under the DMA.

The DMA introduces the concept of "gatekeepers" into the digital market landscape. Gatekeepers are identified by several criteria, each of which reflects a platform's power and influence in the digital market across the European Union (EU)42. First and foremost, a platform should possess a substantial market presence in the EU, identifiable through its user base or its function as an access point to other online services. A platform with a significant number of users can influence the digital market, and a platform acting as an entry point can regulate access to various online services, ultimately shaping market dynamics43. In addition, the platform must occupy a firmly entrenched position, creating obstacles for potential new rivals entering the market. The extent of entrenchment can be evaluated by factors such as how long the platform has been around, how much its users depend on it, and the absence of other feasible options44. A well-established platform can set market norms and practices, frequently at the expense of competition and innovation.

Furthermore, the platform should be capable of controlling the access of businesses and users to its services, and potentially to the broader digital market. This control can take different forms, such as the platform's terms of use, its algorithms for ranking and recommending content, and its policies on data access and portability45. By managing access, the platform can influence which companies thrive or fail, which content is visible or hidden, and how personal data is used and shared46.
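
The qualitative criteria above are operationalised in the DMA through quantitative presumptions in Article 3(2): broadly, an annual EEA turnover of at least EUR 7.5 billion or a market capitalisation of at least EUR 75 billion, together with at least 45 million monthly active end users and at least 10,000 yearly active business users in the Union, sustained over the last three financial years. The sketch below encodes these presumptions in simplified form; it is an illustration only and omits the regulation's full test and the rebuttal procedure.

```python
# Simplified sketch of the quantitative gatekeeper presumptions in
# Article 3(2) DMA. Illustration only: the regulation's full test and
# the rebuttal procedure are omitted.

from dataclasses import dataclass


@dataclass
class Platform:
    eea_turnover_eur: float             # annual turnover in the EEA
    market_cap_eur: float
    monthly_end_users_eu: int           # monthly active end users in the Union
    yearly_business_users_eu: int       # yearly active business users
    years_meeting_user_thresholds: int  # consecutive financial years


def presumed_gatekeeper(p: Platform) -> bool:
    """Apply the three cumulative presumptions of Article 3(2) DMA."""
    significant_impact = (p.eea_turnover_eur >= 7.5e9
                          or p.market_cap_eur >= 75e9)
    important_gateway = (p.monthly_end_users_eu >= 45_000_000
                         and p.yearly_business_users_eu >= 10_000)
    entrenched_and_durable = p.years_meeting_user_thresholds >= 3
    return significant_impact and important_gateway and entrenched_and_durable
```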
It is the European Commission's responsibility to designate gatekeepers, and several factors can affect this decision-making process. Competition authorities in the EU share information on market shares and platform dominance with the Commission to help evaluate a platform's market presence and strength47. The Commission is also expected to conduct public consultations to gather feedback from platforms, user groups, and civil society organizations. These consultations can provide important perspectives on how a platform affects third parties and the broader online marketplace48.
Nevertheless, even though the selection process should remain impartial, there is a risk of
political interference. Political agendas can influence the Commission's focus and
understanding of the DMA, specifically in relation to how content moderation is carried
out. For example, worries about fake news and offensive language may result in
gatekeepers having to adhere to stricter rules for filtering content, whereas concerns
42
Digital Markets Act (Regulation (EU) 2022/1925), Articles 2 and 3
43
European Commission, “Digital Markets Act: Commission designates six gatekeepers” Press Release 6 September 2023
available at https://ec.europa.eu/commission/presscorner/detail/en/ip_23_4328
44
Ibid
45
Christophe Carugati, “The difficulty of designating gatekeepers under the EU Digital Markets Act” Bruegel 2023
available at https://www.bruegel.org/blog-post/difficulty-designating-gatekeepers-under-eu-digital-markets-act
46
European Commission, “About the Digital Markets Act” https://digital-markets-act.ec.europa.eu/about-dma_en
47
Ibid
48
Van Doorne, “The Digital Markets Act (DMA): rules for digital gatekeepers to ensure open markets”
https://www.lexology.com/library/detail.aspx?g=dff5cdb3-7e2f-4ec2-9b7b-095bd156e7cd
about censorship and free expression may result in more relaxed rules. Therefore,
appointing gatekeepers is not only a technical procedure but also a political one that
mirrors the EU's wider goals for the digital market.

The rules for identifying gatekeepers under the DMA are well grounded in EU law on market share and entrenched positions. Article 102 of the Treaty on the Functioning of the European Union (TFEU) prohibits the abuse of a dominant position within the internal market, while Article 101 TFEU prohibits agreements that prevent, restrict, or distort competition49. These provisions have played a key role in shaping the European Commission's approach to dominant market positions.

United Brands Co. v Commission (Case 27/76 [1978] ECR 207) set a significant precedent by establishing a framework for evaluating dominance in the European Union50. The Court ruled that dominance is determined by a company's share of the market, the level of competition, and the options available to consumers. United Brands was found to have abused its dominant position in the banana market, in breach of what is now Article 102 TFEU51. The case highlights the significance of market share in establishing dominance and supports the DMA's reliance on market position as a factor for recognizing gatekeepers52.

In a similar manner, in Microsoft Corp. v Commission (Case T-201/04 [2007] ECR II-3601) the Commission successfully challenged Microsoft's tying of its media player to its operating system53. The Commission claimed that this behaviour solidified Microsoft's control of the market and limited competition54. The Court of First Instance upheld the Commission's ruling, confirming its power to tackle conduct that entrenches dominance. This instance bolsters the DMA's emphasis on entrenchment as a key factor in identifying gatekeepers.

The legal question becomes more intricate when considering the impact on third parties. Although Article 56 TFEU guarantees the freedom to provide services, the DMA focuses on instances where platforms serve as "gatekeepers" and unfairly limit other businesses' access to the internal market. In cases such as Post Danmark (Case C-209/10), the CJEU examined when the conduct of a dominant operator unjustifiably impedes competition. This

49
The Treaty on the Functioning of the European Union, Articles 101 and 102
50
Judgment of the Court of 14 February 1978. United Brands Company and United Brands Continentaal BV v Commission
of the European Communities. Chiquita Bananas. Case 27/76. Available at
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A61976CJ0027
51
Margaret H. FitzPatrick, United Brands Company v. Commission of the European Communities: Window to Price
Discrimination Law in the European Economic Community, 1 Nw. J. Int'l L. & Bus. 338 (1979)
52
Joseph J. Norton, The European Court of Justice Judgement in United Brands: Extraterritorial Jurisdiction and Abuse of
Dominant Position, 8 Denv. J. Int'l L. & Pol'y 379 (1979).
53
JUDGMENT OF THE COURT OF FIRST INSTANCE (Grand Chamber) 17 September 2007 * In Case T-201/04,
54
Ibid at para 759
example shows the readiness of EU competition law to step in when a dominant player's actions unfairly affect others.

Nevertheless, it is important to distinguish between proper platform curation (such as eliminating violent content) and excessive influence on third-party access. The recent litigation between Epic Games, Inc. and Apple Inc. (N.D. Cal. 2021) is instructive55. Although the judgment was made in the US legal system, it brings attention to the continuing discussion on platform practices, such as required in-app purchases, which are believed by some to hinder competition and innovation. In this case, Epic Games disputed Apple's rule mandating that all in-app transactions on iOS devices be processed through Apple's payment system, with Apple receiving a cut56. The court sided with Apple on the majority of the points, but also mandated that Apple permit developers to guide users to different payment methods. This situation highlights the importance of finding a delicate equilibrium between enabling platforms to curate their services and stopping them from excessively controlling market access.

Transparency and accountability principles outlined in Article 15 TFEU are essential to the appointment of gatekeepers under the DMA57. Public consultations that take into account viewpoints from various stakeholders are in line with these principles. They make sure that the decisions made by the European Commission are not solely based on internal factors or political interests, but are informed by a wide variety of perspectives.

The Commission's experience shows that public consultations in competition cases, like the Google Android investigation, can be highly beneficial. In that instance, the Commission asked for input from different parties, such as rivals, customers, and scholars, to help with its inquiry into Google's alleged anti-competitive behaviour58. This comprehensive approach not only boosted the credibility of the Commission's decision but also guaranteed that it was based on a thorough grasp of the market dynamics.

Nonetheless, the worry about political interference in the designation process persists. Although Article 17(3) TEU underlines the Commission's independence, there is potential for political influence, especially in relation to content moderation strategies. The Commission needs to navigate these challenges while upholding its dedication to making
55
Epic Games, Inc. v. Apple Inc. 20-cv-05640-YGR
56
Jonathan Stempel, “Epic Games says Apple violated App Store injunction, seeks contempt order” Reuters (2024)
available at https://www.reuters.com/legal/epic-games-accuses-apple-violating-app-store-injunction-2024-03-13/

57
Article 15 of the TFEU
58
Dieter Paemen, The Google Android European Court Judgment and its Wider Implications, Clifford Chance available
at https://www.cliffordchance.com/insights/resources/blogs/talking-tech/en/articles/2022/09/the-google-android-
european-court-judgment-and-its-wider-implica.html
decisions based on facts and evidence. Safeguards are in place to prevent bias and maintain the Commission's autonomy. The CJEU has consistently supported the Commission's autonomy in competition matters, as seen in France v Commission ([1991] ECR I-659)59. In that instance, the CJEU dismissed France's argument that the Commission had gone beyond its authority, upholding the Commission's independence and its ability to implement competition regulations.

Gatekeeper Rules and their Potential to Enhance Free Speech

The objective of these rules is to prevent any one platform from dominating and to encourage a variety of content by ensuring interoperability, fair ranking, and transparent content moderation. Interoperability is a crucial requirement for gatekeepers under the DMA: gatekeepers may need to permit users to move their data and engage with different platforms. Through promoting interoperability, the DMA can prevent platform lock-in, which traps users in one specific ecosystem60. This can promote a broader variety of voices and perspectives by allowing users to access content and viewpoints beyond just one platform. Interoperability can also promote competition and innovation by allowing new and smaller platforms to enter the market and engage with established platforms more seamlessly61.

Fair ranking is another crucial requirement for gatekeepers. Under the DMA, gatekeepers can be banned from unfairly favouring their own services or content over others in search results and platform features62. This can create fairer competition for smaller content providers, making sure that a variety of voices can be heard by users. By stopping gatekeepers from manipulating rankings to boost their own content, the DMA can encourage a more varied and lively digital landscape63. Transparency in content moderation is also a crucial component of the Gatekeeper Rules in the DMA.

Gatekeepers must offer more transparent content moderation policies and procedures. This enables users to understand the reasons for content removal and, where appropriate, to dispute unjust decisions. Transparency in content moderation can also foster trust

59
APPLICATION for annulment of Commission Decision C(2006) 659 final of 1 March 2006 seeking payment of penalty
payments due in compliance with the judgment of the Court of Justice in Case C-304/02 Commission v France [2005] ECR
I-6263,
60
European Commission - Press release, “Designated gatekeepers must now comply with all obligations under the Digital
Markets Act Brussels, 7 March 2024” https://europa.eu/newsroom/ecpc-failover/pdf/ip-24-1342_en.pdf
61
Global Partners Digital, “The EU Digital Markets Act: is interoperability the way forward?” available at https://www.gp-
digital.org/the-eu-digital-markets-act-is-interoperability-the-way-forward/
62
Hacker P, Cordes J, Rochon J. Regulating Gatekeeper Artificial Intelligence and Data: Transparency, Access and Fairness
under the Digital Markets Act, the General Data Protection Regulation and Beyond. European Journal of Risk Regulation.
2024;15(1):49-86. doi:10.1017/err.2023.81
63
Ibid
between platforms and users by providing users with a better grasp of the rules governing
their online interactions64.

These regulations may also motivate platforms to adopt more impartial content moderation procedures. Platforms currently face the difficulty of maintaining a balance between allowing free speech and addressing concerns about harmful content. By fostering a competitive environment, the DMA could discourage platforms from excessively moderating content or catering only to specific user groups65. This may result in a fairer and more nuanced strategy for managing content, ensuring that lawful speech is preserved while harmful content is dealt with efficiently.

In summary, the DMA's Gatekeeper Rules have the potential to greatly improve freedom of expression by curbing platform dominance, championing a variety of content, and encouraging transparency and fairness in content moderation. Smaller content providers and new market entrants can also benefit from these regulations, leading to a more dynamic and competitive digital market. Nevertheless, careful oversight will be necessary to ensure that these rules effectively encourage free speech and competition without stifling innovation or placing excessive burdens on platforms.

Risks of Over-Compliance and Potential for Suppressing Free Speech

Although the Gatekeeper Rules in the Digital Markets Act (DMA) are intended to boost free speech and encourage a variety of content, there is a danger that excessive compliance will stifle free speech66. This danger arises when platforms become too cautious in their content moderation practices in order to avoid penalties for non-compliance67. Excessive caution may lead platforms to remove legal content for fear of sanctions, discouraging users from expressing themselves freely and hindering genuine discussion of sensitive subjects. Users might censor themselves by not sharing content that is legal but could spark controversy, for fear of it being deleted. This may result in online discussions becoming more uniform, as only the least contentious opinions are shared68.

64
Aurelien Portuese, “The Digital Markets Act: European Precautionary Antitrust” Information technology and
Innovation Foundation (2021) https://itif.org/publications/2021/05/24/digital-markets-act-european-precautionary-
antitrust/
65
Ibid
66
European Commission, “Digital Markets Act: Commission designates six gatekeepers” Press Release 6 September 2023
available at https://ec.europa.eu/commission/presscorner/detail/en/ip_23_4328
67
Gregory Day, Monopolizing Free Speech, 88 Fordham L. Rev. 1315 (2020). Available at:
https://ir.lawnet.fordham.edu/flr/vol88/iss4/3
68
Gregory Day, Monopolizing Free Speech, 88 Fordham L. Rev. 1315 (2020). Available at:
https://ir.lawnet.fordham.edu/flr/vol88/iss4/3
Platforms may also prioritize technical efficiency over nuanced content moderation because of their emphasis on compliance. To avoid penalties, platforms may rely on basic content moderation algorithms that can quickly review large volumes of content but lack the subtlety and contextual understanding of human decision-making69. This may lead to the deletion of contentious yet important material, such as political disagreement or cultural critique, further limiting freedom of expression. The potential for legal liability may also deter innovation: platforms may hesitate to introduce new features or functionalities that could increase their exposure70. This could impede the development of new ways of connecting and exchanging information, restricting the digital realm's capacity to nurture innovation, collaboration, and societal transformation. Ultimately, although the DMA's Gatekeeper Rules could boost free speech and encourage varied content, excessive adherence risks constraining free speech and hindering innovation. These regulations must be implemented and enforced so as to maintain an equilibrium between encouraging competition and safeguarding freedom of speech. This will require continuous monitoring and fine-tuning, along with a commitment to the principles of transparency, accountability, and proportionality.

IV. Balancing Open Markets and Free Speech

Balancing the promotion of open digital markets with the protection of fundamental rights such as free speech is a complicated issue at the heart of the DSA and the DMA. The DMA aims to create a fairer digital market with more competition by placing extra duties on large online platforms designated as "gatekeepers", while the DSA aims to safeguard the basic rights of users, such as the right to freedom of expression. Achieving equilibrium between these two goals demands thorough reflection and precision.

One of the primary challenges in maintaining this fragile equilibrium is the tension between clarity and flexibility. Clearer instructions for platforms on differentiating legal from illegal content can help maintain consistent and effective content moderation processes. For example, the DSA mandates that platforms have systems in place to quickly remove or block access to unlawful content. Nevertheless, overly rigid rules may inhibit innovation and the ability to respond to changing online risks. The online environment is ever-evolving, with new harmful content and behaviours continually emerging71. Hence, the DSA must grant sufficient leeway for platforms to
69
Niam Yaraghi, “Regulating free speech on social media is dangerous and futile” (2018)
https://www.brookings.edu/articles/regulating-free-speech-on-social-media-is-dangerous-and-futile/
70
Freedom and the Media 2019, “Media Freedom: A Downward Spiral” Freedom House available at
https://freedomhouse.org/report/freedom-and-media/2019/media-freedom-downward-spiral

71
Francesco Vogelezang, “Illegal vs Harmful Online Content:Some reflections on the upcoming Digital Services Act
package.” Institute for Internet and Just Society (2020) https://www.internetjustsociety.org/illegal-vs-harmful-online-content
adjust their methods of content moderation in response to these changing risks, all while
maintaining specific guidelines to uphold fairness and consistency.

Independent supervision is also vital to maintaining an equilibrium between open markets and freedom of speech. Independent oversight bodies have an important role in making sure that the DSA's obligations and the DMA's Gatekeeper Rules are enforced properly and do not result in excessive limitations on freedom of speech72. These entities can offer a fair assessment of platforms' content moderation techniques, making sure they adhere to the legislation and do not misuse their authority to regulate content or stifle freedom of speech. They can also offer a forum for users to challenge content moderation rulings, adding an extra level of accountability.

Moreover, it is crucial to promote communication among platforms, regulators, and civil society groups in order to maintain a balance between open markets and freedom of speech. This dialogue can help identify possible issues in the DSA's rules and establish effective methods for moderating content that balance safety and freedom of speech. For example, platforms can share their experiences and obstacles in applying the DSA's regulations, regulators can offer advice and clarification on these regulations, and civil society organizations can give feedback on how the regulations affect users' rights and freedoms73. This dialogue can help guarantee that the DSA is implemented in a manner that supports competitive markets and protects freedom of expression.

It is a difficult challenge to balance promoting open digital markets and protecting basic rights such as freedom of speech. Nevertheless, there are possible remedies and best practices that can assist in attaining this equilibrium. To begin with, independent oversight bodies can play an extremely important role in this process74. These organizations can conduct unbiased evaluations of how platforms moderate content, ensuring that they follow the DSA and do not exploit their authority to censor or limit free speech. They can also offer users a forum to challenge content moderation choices, adding another layer of accountability. Autonomous monitoring organizations can carry out audits and inquiries into platforms' actions, ensuring transparency and responsibility, and can advise platforms on how to enhance their protocols in order to safeguard users' rights more effectively75.
72
European Parliament, “Online Platforms' Moderation of Illegal Content Online”
https://www.europarl.europa.eu/RegData/etudes/STUD/2020/652718/IPOL_STU(2020)652718_EN.pdf
73
Franklin De Vrieze, “Independent oversight institutions and regulatory agencies, and their relationship to parliament
Outline of assessment framework” Westminster Foundation for Democracy (2019)
https://www.wfd.org/sites/default/files/2021-12/WEB_INDEPENDENT-OVERSIGHT-INS.pdf
74
Ibid
75
STAROŇOVÁ, Katarína. “Comparing the Roles of Regulatory Oversight Bodies in Central and Eastern European
Countries.” European Journal of Risk Regulation, vol. 8, no. 4, 2017, pp. 723–42. JSTOR,
https://www.jstor.org/stable/26363845. Accessed 23 Mar. 2024.
Secondly, more explicit instructions for platforms on differentiating legal from illegal content would be advantageous. Such guidelines offer platforms a solid framework for content moderation, minimizing the risk of arbitrary or unfair decisions. Nevertheless, these guidelines must remain flexible enough to adjust to the changing digital environment: regular review and updating is necessary to incorporate changes in technology, societal norms, and legal standards. Clear guidelines can also help platforms improve their content moderation systems by offering a precise standard for training algorithms and human moderators, resulting in more effective and efficient systems.

In summary, finding a middle ground between encouraging open digital markets and protecting freedom of speech within the DSA is a challenging endeavour that demands meticulous examination of multiple factors. Independent monitoring organizations, better-defined regulations for platforms, and communication among all parties are essential components of this process. Through a thorough examination of these elements, the DSA can support an impartial and competitive online market that safeguards the fundamental rights of users. Nevertheless, this is an ongoing process that demands regular assessment and adjustment to the constantly changing digital environment. The Digital Services Act is just the start, not the final solution, in maintaining an equilibrium between open markets and free speech in the digital era.

Conclusion

Ultimately, the DSA and the DMA mark important progress in regulating digital services within the European Union. They strive to establish a digital marketplace that is both just and competitive, while also safeguarding users' basic rights such as freedom of speech. A central finding of this article is how the Gatekeeper Rules can improve freedom of speech: the DMA's new requirements seek to prevent large online platforms designated as "gatekeepers" from abusing their authority to censor content or stifle expression. Nevertheless, there is a risk of over-compliance, as platforms might lean towards being overly careful and remove more content than required in order to avoid penalties. This may result in unjust limitations on freedom of expression and suppress the free flow of ideas necessary for a functioning democracy.
It is crucial to find a balance between open markets and freedom of speech. Establishing more precise instructions for platforms, implementing unbiased monitoring, and fostering communication between all parties are essential to reaching this equilibrium. These measures must, however, be able to adjust to the changing digital environment and must uphold the rights and freedoms of users. Another important theme of this article is the evolving regulation of content moderation in the EU. In light of ongoing changes in the digital environment, it will be crucial to regularly assess and revise these rules to guarantee their effectiveness and relevance. The DSA and DMA mark the start of this process, with further developments expected as new challenges and opportunities emerge.

There are numerous possible avenues for further research on this subject. One is to evaluate individual instances in which the Commission has designated gatekeepers and the rationale for those decisions. This could shed light on the meaning of the "impact on third parties" and other crucial concepts in the DMA. The role of national courts in reviewing the Commission's gatekeeper designations is another important area: analysing it could yield a more detailed legal understanding and help guarantee the DMA's consistent and fair application throughout the EU.

Another crucial area for further research is the DMA's interaction with other EU instruments, such as the DSA. This could help guarantee a consistent and successful regulatory structure for the digital market. A further consideration is the potential harm to the Commission's reputation if its designation process is viewed as unfair or opaque, which might invite heightened scrutiny and affect upcoming policy choices.

Finally, examining collective redress lawsuits involving gatekeepers could offer useful perspectives on the Commission's understanding of the impact on third parties in the designation process. This could help ensure that the DMA is applied in a just and effective manner that respects the rights and freedoms of users. In conclusion, the DSA and the DMA play a crucial role in overseeing digital services in the EU, but they present complex issues that require careful consideration and ongoing examination. By consistently addressing these issues and engaging in conversation with various stakeholders, the EU can move toward a digital marketplace that is just, competitive, and respectful of the fundamental rights of every user.

References
Adam Satariano, “Facebook Loses Antitrust Decision in Germany Over Data Collection”
New York Times (June 2020) https://www.nytimes.com/2020/06/23/technology/facebook-
antitrust-germany.html

Aina Turillazzi, Mariarosaria Taddeo, Luciano Floridi & Federico Casolari (2023) The digital
services act: an analysis of its ethical, legal, and social implications, Law, Innovation and
Technology, 15:1, 83-106, DOI: 10.1080/17579961.2023.2184136

APPLICATION for annulment of Commission Decision C(2006) 659 final of 1 March 2006
seeking payment of penalty payments due in compliance with the judgment of the Court
of Justice in Case C-304/02 Commission v France [2005] ECR I-6263,

Aurelien Portuese, “The Digital Markets Act: European Precautionary Antitrust” Information Technology and Innovation Foundation (2021) https://itif.org/publications/2021/05/24/digital-markets-act-european-precautionary-antitrust/

Caitlin Chin-Rothmann, Taylar Rajic, and Evan Brown, “A New Chapter in Content
Moderation: Unpacking the UK Online Safety Bill” CSIS (2023) available at
https://www.csis.org/analysis/new-chapter-content-moderation-unpacking-uk-online-
safety-bill

Carr Center for Human Rights Policy, Harvard Kennedy School, “Freedom of Speech and Media: Reimagining Rights and Responsibilities in the US” (February 2021) available at https://carrcenter.hks.harvard.edu/files/cchr/files/free_speech.pdf

Ceyhun Pehlivan, Peter Church, “EU - The DSA: A new era for online harms and intermediary liability” Linklaters (2022) available at https://www.linklaters.com/en/insights/blogs/digilinks/2022/july/eu-the-dsa-a-new-era-for-online-harms-and-intermediary-liability

Christophe Carugati, “The difficulty of designating gatekeepers under the EU Digital Markets Act” Bruegel (2023) available at https://www.bruegel.org/blog-post/difficulty-designating-gatekeepers-under-eu-digital-markets-act

Clifford Chance, “Content Moderation and Online Platforms: An impossible problem? Regulators and legislators look to new laws” Clifford Chance (2020) available at https://www.cliffordchance.com/insights/resources/blogs/talking-tech/en/articles/2020/07/content-moderation-and-online-platforms--an-impossible-problem--.html

Dieter Paemen, “The Google Android European Court Judgment and its Wider Implications” Clifford Chance available at https://www.cliffordchance.com/insights/resources/blogs/talking-tech/en/articles/2022/09/the-google-android-european-court-judgment-and-its-wider-implica.html

Digital Markets Act, Regulation 2

E&G Economides LLC, “Navigating the Digital Services Act: Innovation and Accountability” available at https://www.economideslegal.com/media/documents/Navigating_the_Digital_Services_Act.pdf

Elsevier Inc. v Cyando AG (C-683/18) (Requests for a preliminary ruling from the Bundesgerichtshof (Federal Court of Justice, Germany)) available at curia.europa.eu

Epic Games, Inc. v. Apple Inc., No. 20-cv-05640-YGR (N.D. Cal.)

European Commission, “Designated gatekeepers must now comply with all obligations under the Digital Markets Act” Press Release, Brussels, 7 March 2024, available at https://europa.eu/newsroom/ecpc-failover/pdf/ip-24-1342_en.pdf

European Commission, “About the Digital Markets Act” available at https://digital-markets-act.ec.europa.eu/about-dma_en

European Commission, “Digital Markets Act: Commission designates six gatekeepers” Press Release, 6 September 2023, available at https://ec.europa.eu/commission/presscorner/detail/en/ip_23_4328

European Commission, “Digital Services Act Overview” European Commission available at https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en

European Commission, “Questions and answers on the Digital Services Act” available at
https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348

European Commission, “The Digital Services Act package” Digital Strategy available at https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package

European Parliament, “Online Platforms' Moderation of Illegal Content Online” available at https://www.europarl.europa.eu/RegData/etudes/STUD/2020/652718/IPOL_STU(2020)652718_EN.pdf

European Union Agency for Fundamental Rights, “Online Content Moderation – Current challenges in detecting hate speech” FRA (2023) available at https://fra.europa.eu/sites/default/files/fra_uploads/fra-2023-online-content-moderation_en.pdf

Faculty of Law, University of Oslo, “Intermediary Liability for Copyright Infringement in the EU's Digital Single Market: Looking at the Proposed Copyright Directive's Compliance with the current EU framework on Intermediary Liability” available at https://www.duo.uio.no/bitstream/handle/10852/60876/1/Thesis-Intermediary-Liability-for-Copyright-Infringement-in-the-EU-s-DSM.pdf

Francesco Vogelezang, “Illegal vs Harmful Online Content: Some reflections on the upcoming Digital Services Act package” Institute for Internet and Just Society (2020) available at https://www.internetjustsociety.org/illegal-vs-harmful-online-content

Franklin De Vrieze, “Independent oversight institutions and regulatory agencies, and their relationship to parliament: Outline of assessment framework” Westminster Foundation for Democracy (2019) available at https://www.wfd.org/sites/default/files/2021-12/WEB_INDEPENDENT-OVERSIGHT-INS.pdf

Freedom and the Media 2019, “Media Freedom: A Downward Spiral” Freedom House
available at https://freedomhouse.org/report/freedom-and-media/2019/media-freedom-
downward-spiral

Global Freedom of Expression, “Delfi AS v. Estonia” available at https://globalfreedomofexpression.columbia.edu/cases/delfi-as-v-estonia/

Global Partners Digital, “The EU Digital Markets Act: is interoperability the way forward?”
available at https://www.gp-digital.org/the-eu-digital-markets-act-is-interoperability-the-
way-forward/

Graham Smith, “CJEU's GS Media copyright linking decision draws a line: ordinary internet user or commercial website?” Bird & Bird (2016) available at twobirds.com

Gregory Day, Monopolizing Free Speech, 88 Fordham L. Rev. 1315 (2020). Available at:
https://ir.lawnet.fordham.edu/flr/vol88/iss4/3

P. Hacker, J. Cordes and J. Rochon, “Regulating Gatekeeper Artificial Intelligence and Data: Transparency, Access and Fairness under the Digital Markets Act, the General Data Protection Regulation and Beyond” European Journal of Risk Regulation 15(1) (2024) 49-86, doi:10.1017/err.2023.81

Huebler, John, "Free Speech and the Internet" (2020). Student Research Submissions. 338.
https://scholar.umw.edu/student_research/338

Human Rights Law Centre, “European Court of Human Rights examines the entitlement to freedom of speech: Delfi AS v Estonia [2013] ECHR, Application No. 64569/09 (10 October 2013)” available at https://www.hrlc.org.au/human-rights-case-summaries/european-court-of-human-rights-examines-the-entitlement-to-freedom-of-speech

Joan Barata, Oliver Budzinski, Mark Cole, Alexandre de Streel, Michèle Ledger, Tarlach McGonagle, Katie Pentney, Eleonora Rosati, “Unravelling the Digital Services Act package” European Audiovisual Observatory available at https://su.diva-portal.org/smash/get/diva2:1605131/FULLTEXT01.pdf

Jonathan Stempel, “Epic Games says Apple violated App Store injunction, seeks contempt order” Reuters (2024) available at https://www.reuters.com/legal/epic-games-accuses-apple-violating-app-store-injunction-2024-03-13/

Joseph J. Norton, “The European Court of Justice Judgement in United Brands: Extraterritorial Jurisdiction and Abuse of Dominant Position” 8 Denv. J. Int'l L. & Pol'y 379 (1979)

Judgment of the Court of 14 February 1978, United Brands Company and United Brands Continentaal BV v Commission of the European Communities (Chiquita Bananas), Case 27/76, available at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A61976CJ0027

Judgment of the Court of First Instance (Grand Chamber) of 17 September 2007 in Case T-201/04, Microsoft Corp. v Commission

Kerber, W., Zolna, K.K. The German Facebook case: the law and economics of the
relationship between competition and data protection law. Eur J Law Econ 54, 217–250
(2022). https://doi.org/10.1007/s10657-022-09727-8

Kyriakos Fountoukakos, Marcel Nuys, Juliana Penz and Peter Rowland, “The German FCO's decision against Facebook: a first step towards the creation of digital house rules?” The Competition Law Journal 18(2) (2019) 55-65 available at https://awards.concurrences.com/IMG/pdf/14._the_german_fco_s_decision_against_facebook_k_fountoukakos_m_nuys_j_penz_p_rowland_comp_law_journal_july_2019_.pdf?55716/90b472e3bdb19a8366d83f872d6ca3c466422e732e175fa9f995cf5e1c97d6b5

Margaret H. FitzPatrick, “United Brands Company v. Commission of the European Communities: Window to Price Discrimination Law in the European Economic Community” 1 Nw. J. Int'l L. & Bus. 338 (1979)

Neville Cox, “Delfi v. Estonia: Privacy Protection and Chilling Effect” Verfassungsblog available at https://verfassungsblog.de/delfi-v-estonia-privacy-protection-and-chilling-effect/

Niam Yaraghi, “Regulating free speech on social media is dangerous and futile” Brookings (2018) available at https://www.brookings.edu/articles/regulating-free-speech-on-social-media-is-dangerous-and-futile/

NoToHate Fact Sheets, “Hate Speech, Mis- and Disinformation” UN available at https://www.un.org/sites/un2.un.org/files/notohate_fact_sheets_en.pdf

Opinion of Advocate General Saugmandsgaard Øe delivered on 16 July 2020, Joined Cases C-682/18 and C-683/18, Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando AG (C-683/18)

Oskar Josef Gstrein, “Case analysis of the ECtHR judgment in Delfi AS v. Estonia (app. no. 64569/09): The difficulties of information management for intermediaries” available at https://jean-monnet-saar.eu/?p=881

Delfi AS v. Estonia (Application no. 64569/09), Judgment, Strasbourg, 16 June 2015, available at https://www.echr.coe.int/documents/d/echr/Press_Q_A_Delfi_AS_ENG

Katarína Staroňová, “Comparing the Roles of Regulatory Oversight Bodies in Central and Eastern European Countries” European Journal of Risk Regulation 8(4) (2017) 723-742, available at https://www.jstor.org/stable/26363845

Taha Yasseri, “From Print to Pixels: The Changing Landscape of the Public Sphere in the
Digital Age” School of Sociology, University College Dublin (December 2023)
DOI:10.13140/RG.2.2.20652.44166

The Digital Services Act, Article 14

The Digital Services Act, Article 15

The Treaty on the Functioning of the European Union, Article 101

The Legal 500, “User Content Moderation Under the Digital Services Act – 10 Key Takeaways” available at https://www.legal500.com/developments/thought-leadership/user-content-moderation-under-the-digital-services-act-10-key-takeaways/

Théophile Lenoir, “Challenges of Content Moderation” Institut Montaigne (2019) available at https://www.institutmontaigne.org/en/expressions/challenges-content-moderation

Thomas Thiede, Laura Herzog (Spieker & Jaeger), “The German Facebook Antitrust Case – A Legal Opera” Kluwer Competition Law Blog (February 2021) available at https://competitionlawblog.kluwercompetitionlaw.com/2021/02/11/the-german-facebook-antitrust-case-a-legal-opera/

UK Parliament, “Digital Technology and the Resurrection of Trust: Accountability and the technology platforms” Parliament Publications (2020) available at https://publications.parliament.uk/pa/ld5801/ldselect/lddemdigi/77/7707.htm

UNESCO, “Addressing hate speech on social media: contemporary challenges” UNESCO (2021) available at https://unesdoc.unesco.org/ark:/48223/pf0000379177

Van Doorne, “The Digital Markets Act (DMA): rules for digital gatekeepers to ensure open
markets” https://www.lexology.com/library/detail.aspx?g=dff5cdb3-7e2f-4ec2-9b7b-
095bd156e7cd

Wilson, Richard A. and Land, Molly, "Hate Speech on Social Media: Content Moderation in
Context" (2021). Faculty Articles and Papers. 535.
https://opencommons.uconn.edu/law_papers/535

Zachary Laub, “Hate Speech on Social Media: Global Comparisons” Council on Foreign
Relations (June 2019) available at https://www.cfr.org/backgrounder/hate-speech-social-
media-global-comparisons

Zoi Krokida (Stirling University), “AG's opinion on Peterson/YouTube: Clarifying the liability of online intermediaries for the violation of copyright-protected works?” Kluwer Copyright Blog (January 2021) available at kluweriplaw.com
