The EU is in the midst of creating entirely new rules for content regulation on digital platforms in the form of the Digital Services Act (DSA). The DSA proposal, an update of the current e-Commerce Directive (ECD), started a process that will produce legislation with potentially far-reaching consequences not only for the EU’s digital environment but likely far beyond its borders. The outcome is, of course, not yet known. Judging from the proposal, however, the DSA will make platforms’ content distribution and moderation practices more transparent, give users more power and protection, and create a regulatory framework to oversee it all.
However, earth-shaking as this sounds, at least in the context of the seemingly never-ending platform regulation debates, it is not entirely new. For one segment of content – audiovisual – something like this has already been legislated for. At this very moment, we are witnessing its introduction into practice via the implementation of the provisions on video-sharing platforms (VSPs) in the Audiovisual Media Services Directive (AVMSD).
The AVMSD is arguably the first piece of legislation that recognises the essential role of content platforms in distributing and organising the content they carry, while also appreciating that their position is distinct from that of the other forms of media regulated through its provisions.
This presentation will examine the new systemic approach to content regulation introduced by the AVMSD and taken up by the DSA proposal, and how the new nexus of rights it creates moves us from the media-regulator dichotomy to a far more user-oriented user-platform-regulator triangle.
According to the UK government’s response to the Online Harms White Paper consultation,[1] ‘online harms’ encapsulates a broad variety of content published online that can be harmful to certain groups within society, and/or to society and the public sphere at large. The content that falls within the scope of the Draft Online Safety Bill includes ‘illegal’ content, such as hate speech,[2] and legal yet ‘harmful’ content, such as misinformation and disinformation.[3] The increasing role that online platforms play in our daily lives, and their responsibility for preventing the spread of illegal and harmful speech on their sites, has been a source of debate for some time – a debate intensified by the imminent publication of the Online Safety Bill in the UK and by similar legislative developments in other jurisdictions.[4]
When we think about this developing online harms regime (both in the UK and elsewhere), we can be forgiven for thinking only in terms of how laws placing responsibility on social media platforms to prevent the spread of false news will benefit society and the public sphere. In this paper, I will argue that despite this benefit, the imposition of these laws, and the responsibilities they place on social media platforms, can have insidious implications for free speech. Specifically, my paper will make the claim that the increased prospect of liability, and the heavy fines that may result from breaching the duty of care in what will, once enacted, be the UK’s Online Safety Act, could result in social media platforms censoring more speech – but not necessarily illegal or harmful speech – or using the imposed ‘responsibility’ as an excuse to censor speech that does not conform to their own ideological telos or align with their commercial objectives. Thus, in passing laws to protect the public sphere from certain content, we may have inadvertently given social media platforms a statutory justification to take even more ‘control of the message’.
The German Network Enforcement Law (2017) is the first of a number of similar European initiatives that require major platforms to moderate content in response to user takedown notices. This paper argues that its approach represents a compromise between American and European speech traditions. Like all compromises, NetzDG has left both sides – US platforms and First Amendment advocates, on one side, and European constitutional stakeholders, on the other – feeling ‘compromised’, and not illegitimately so. Nevertheless, the mechanism NetzDG adopts, namely a public framing of private censorship, is uniquely adept at simultaneously assuaging the primary European fear about the absence of effective speech controls and the primary American fear about the presence of governmental censorship. Both fears are grounded in their respective conceptions of an empowered citizenry, and are as legitimate in their own right as they are opposed to each other. Still, NetzDG’s central process of a public framework for private censorship creates a bridge between these speech traditions by addressing the structural differences that go to the heart of the legitimacy concerns. Whilst this bridge is fragile and imperfect, it is good enough to meet the core objections from either side.
János Kornai, the best-known Hungarian economist, published his theory of the different coordination mechanisms at the beginning of the 1980s (The Socialist System, Princeton, 1993). According to the theory, the allocation and control of resources and people is performed by a few major types of coordination mechanisms, of which bureaucracy and the market are the two most important. Although these two mechanisms are often contrasted as each other’s opposites (partly by Kornai, but also in Mises’ theory), they are in fact complementary in modern (industrial) capitalism, because the capitalist enterprise uses bureaucratic control to organise its internal functioning and to process inputs (like consumer demand) from the outside world, as Beniger argues very convincingly in his famous book (The Control Revolution, HUP, 1989).
The presentation offers a simple idea of the platform as a new type of coordination mechanism, partly using and further developing Lessig’s theory of algorithms as means of behavioral control, but also building on Julie Cohen’s theory of the platform as the major infrastructure of informational capitalism. The platform, in this framework, is a new and very effective coordination mechanism with three features (partly following van Dijck’s theory): datafication, membership, and algorithmic control. Platform coordination is in many respects so effective that it acts as an “invasive species” – wherever it appears, it very rapidly displaces or disrupts the other two mechanisms. Though the disappearance of bureaucracy does not sound too threatening, the same cannot be said for the mechanisms of the free market. The presentation will give some examples of this disruption.
I will argue in my presentation that this disruptive nature is the reason why the law sometimes seems confused about the regulation of platforms: law is designed to regulate the acts of legal entities such as companies and/or people, and does not think in terms of mechanisms. To put it very simply, it is not the particular company that is the cause of the problem, but the mechanism itself.
This contribution discusses how innovations in information technology are deployed in and affect work. Technology’s impact on work has a lengthy history, commencing with mechanisation in the first industrial revolution and leading up to the present Industry 4.0 (or Industry 5.0, as the European Commission views it). Information technology continues efforts to maximise commercial efficiencies (by increasing production whilst minimising costs). Information technology has additionally enlarged and intensified a neoliberal approach to labour. The prevalence and entrenchment of information technology in work suggest a trajectory that has not been properly considered. The present is a chance to pause and consider next steps. Innovations in information technology have driven many of the workplace changes undertaken in advancing a neoliberal reconceptualisation of labour. This reconceptualisation of work evidences the triumph of ambitions over realities, where decisions are made in advance of there being an infrastructure through which to implement them. Furthermore, we see a diminishing of humanity in the workplace. There is a denigration of routine work as something that should be automated in order to render activity more efficient, without considering the place of routine work within the context of a job (or contemplating what it would mean to remove the routine). Moreover, there is tremendous potential for further entrenching existing inequalities (for example, disparities of gender and race, as well as rural and urban differences). This is a threshold moment and, unlike at any point previously, information technology has made answering the following question unavoidable and imperative: what is the role of work in our lives?
It took a long time, a very long time. For two decades, legislators observed in a rather passive role the continuing rise of online platforms, not least as facilitators of online content communication and distribution. Even though there have been discussions about the inadequacy of pre-existing legal frameworks applicable to the online sphere – in light of an economic success that has led to the dominance and unavoidability of a number of key platform providers – the core regulation remained unchanged, at least at the EU level. After preparatory steps and a slow but progressive unpacking of the E-Commerce Directive, the arrival of the proposals for a Digital Services Act and a Digital Markets Act marks a (potentially) significant change. This restart towards an adequate regulation of the platform economy can also be seen in the work of the Council of Europe’s Information Society Department, (potentially) leading to a new standard-setting Recommendation on Principles for Media and Communication Governance in 2022. The presentation will give an overview of the key changes that would result from these approaches for content distribution online, while providing an evaluation of the suitability and future-orientation of the proposals.
The proposed intervention will offer a look at the complex normative landscape of online governance, discussing the challenges posed by multistakeholder governance and by applying international law in seemingly borderless cyberspace. It combines a detailed technical look at network infrastructure with foresight into the geopolitical future beyond the Paris Call, the UN GGE, or the OEWG’s first report. The author discusses cyber-sovereignty and possible “splinternet” scenarios with specific references to national efforts aimed at securing “national cyberspaces”. She argues that these are not technically valid political perspectives. Both approaches, whether the norm-making efforts of the UN or the technical efforts of the NATO community, disregard the decentralised, multistakeholder-based design of the network. Only taking that design into account as one of the strategic factors can offer effective measures to “control” or “govern” internet infrastructure and tackle the challenges identified as the “absence of great power cooperation” and the “lack of incentives for internalizing norms” (Ruhl et al., 2020). The challenge in identifying and enforcing “norms” online, including those focused on protecting national critical infrastructures, lies in the way norms and standards for the day-to-day operations of the internet are currently created and enforced. These processes necessitate the collaboration of state and non-state actors, particularly the technical sector. Ensuring their effective cooperation based on a comprehensive set of norms is the greatest cybersecurity challenge of the 21st century.
The mix of public and private exercise (and abuse) of powers shows how the rule of law is under pressure from multiple sides. Within this framework, the primary question is how constitutional law can limit the exercise of discretionary powers in the digital age. In this context, we argue that European constitutional law provides a path to mitigate the challenges to the principle of the rule of law. The rise of European digital constitutionalism has been a primary example of how the Union has reacted to the challenges raised by the consolidation of private digital powers. The primary goal of this work is to underline the evolution of the principle of the rule of law in the digital age while defining the potential remedies to safeguard this principle. We argue that European constitutional law already provides instruments to address this situation. More precisely, we focus on the horizontal effect of fundamental rights as a solution to the abuse of powers by private actors and, more broadly, on the positive obligation of public actors to protect human and fundamental rights, which is another primary instrument for protecting the principle of the rule of law against the consolidation of unaccountable powers.
The first part of this work examines the evolving landscape of the rule of law in the digital environment. Specifically, this part focuses on two examples, looking at the challenges for the rule of law in the digital public sphere and in relation to the processing of personal data. The second part outlines the rise of European digital constitutionalism. The third part analyses the potential constitutional remedies and approaches which can ensure the protection of the principle of the rule of law in the algorithmic society.
[1] Department for Digital, Culture, Media and Sport, Online Harms White Paper: Full government response to the consultation, 15th December 2020: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response.
[2] Ibid. According to the response, a ‘limited number of priority categories of harmful content, posing the greatest risk to users, will be set out in secondary legislation’, which will include ‘hate crime’. Furthermore, ‘hate content’ is one of the ‘priority categories’ that will be set out by the government in secondary legislation. See [2.3] and [2.29].
[3] Ibid. Where disinformation or misinformation is unlikely to cause ‘significant harm to an individual’, it will not fall within the scope of regulation. Department for Digital, Culture, Media and Sport, Online Harms White Paper: Full government response to the consultation, 15th December 2020, 23, [2.76]-[2.78], [2.80]-[2.88]. Pursuant to clause 98 of the Bill, Ofcom is required to establish and maintain a committee to advise it on the prevention and handling of disinformation and misinformation online.
[4] For example, the European Commission’s ongoing development of the Digital Services Act, Ireland’s Online Safety and Media Regulation Bill, and Germany’s network enforcement law, the Netzwerkdurchsetzungsgesetz or ‘NetzDG’.