
When Platforms Judge: Delegated Jurisdiction and the Redistribution of Public Authority

March 12, 2026 | Developments

Fernanda Florentino Fernandez Jankov, PhD, is a legal scholar affiliated with the Faculty of Law of the University of São Paulo (USP). Her research focuses on jurisdictional transformation, delegated authority, and the constitutional reconfiguration of sovereignty.

When Meta suspends an account pursuant to statutory risk-assessment duties, or when a platform removes content under legally mandated harm-prevention frameworks, is the company merely moderating speech, or is it exercising a form of delegated jurisdiction?

The dominant narrative describes platform governance as regulation. This description is conceptually inadequate, however. Contemporary digital regulation does not simply impose external constraints on private actors (see, e.g., Jack Balkin, Free Speech in the Algorithmic Society, https://lawreview.law.ucdavis.edu/archives/51/3/free-speech-algorithmic-society-big-data-private-governance-and-new-school-speech). It increasingly embeds norm-application and enforcement functions within platform infrastructures themselves.

This shift is not semantic. It is structural.

In legal theory, jurisdiction denotes the authority to apply general norms to particular cases with binding effect. It entails interpretation, factual assessment, sanction, and—at least in principle—the availability of review. Historically, jurisdiction has been a core attribute of public authority, embedded in constitutional structures and territorially bounded institutional orders. Courts and administrative bodies exercise jurisdiction because they are institutionally authorized to operationalize law in concrete situations.

Digital governance unsettles this architecture.

Across jurisdictions, platforms are no longer required merely to react to unlawful content after notification. They are increasingly mandated to anticipate risks, evaluate systemic harms, implement mitigation strategies, and structure internal decision-making procedures. Canada’s proposed Online Harms Act (Bill C-63) (https://www.parl.ca/DocumentViewer/en/44-1/bill/C-63/first-reading), for example, establishes a statutory “duty to act responsibly” (Part 1, Clause 12), requiring regulated services to implement measures aimed at reducing the risk of exposure to harmful content on their platforms. In practice, corporate compliance teams and content-moderation bodies must interpret statutory standards, determine thresholds for removal or restriction, apply enforcement measures, and operate internal complaint procedures. The application of public norms thus occurs within private infrastructures before any court or regulator intervenes.

What is at stake is not platform power as such, but the structural relocation of jurisdictional functions beyond constitutionally recognized institutions.

The vocabulary of “regulation” or “compliance” fails to capture this development. Regulation presupposes an external public authority supervising a distinct private subject. What we observe instead is a redistribution of adjudicative functions from constitutionally recognized state institutions — courts and administrative bodies — to privately operated platform governance systems. Norm interpretation and enforcement increasingly occur within platform infrastructures before any public authority reviews the matter.

Canada’s proposed Online Harms Act (Bill C-63) illustrates this transformation with particular clarity. By imposing a statutory “duty to act responsibly” (Part 1, Clause 12), the Act requires regulated services to identify risks associated with harmful content—including hate speech, incitement to violence, non-consensual intimate image distribution, and large-scale disinformation—and to implement governance frameworks designed to mitigate such harms. These obligations place platforms in the position of operationalizing public norms that directly implicate fundamental constitutional values, including freedom of expression, equality, human dignity, and the structural conditions necessary for the preservation of a democratic rule-of-law order.

These are not merely reporting or transparency duties. They require platforms to make first-instance determinations about whether specific content falls within legally defined categories of harm and to impose binding consequences such as removal, restriction, or demotion. In doing so, platforms perform a function structurally analogous to adjudication: they interpret statutory standards, apply them to concrete cases, and enforce outcomes within their systems. The shift is therefore institutional, not semantic—it relocates the operational core of norm application from courts and public authorities to privately structured governance mechanisms.

While the Canadian model articulates an explicit statutory duty, the European Union framework structures comparable obligations through layered due diligence and risk mitigation mechanisms. The European Union’s Digital Services Act (Regulation (EU) 2022/2065) (https://eur-lex.europa.eu/eli/reg/2022/2065/oj) similarly mandates systemic risk assessments and due diligence obligations for very large online platforms, requiring them to evaluate risks related to the dissemination of illegal content, disinformation, and impacts on fundamental rights, and to adopt mitigation measures subject to oversight by national Digital Services Coordinators and the European Commission. The United States, by contrast, continues to rely primarily on constitutional speech protections and Section 230 of the Communications Decency Act (https://www.law.cornell.edu/uscode/text/47/230), which shields platforms from liability for third-party content while allowing them to engage in voluntary moderation. This model maintains a clearer institutional separation between adjudication by state courts and private content moderation decisions.

Yet across these models, platforms increasingly perform the functional core associated with the exercise of jurisdiction: they interpret norms, assess facts, impose consequences, and provide internal appeal pathways. Their decisions shape the boundaries of permissible expression and the distribution of digital rights in real time. The practical determination of legality and harm often occurs within platform systems before courts or regulators ever review a case.

This institutional transformation cannot be fully understood through the frameworks of private governance or corporate responsibility. What is emerging is a model of delegated jurisdiction embedded within global digital infrastructures. This delegation is not informal or merely practical; it is statutory. Legislatures increasingly require regulated platforms to establish internal systems for identifying harmful content, interpreting legal standards, applying removal or restriction measures, and operating complaint-handling mechanisms. The authority for this exercise does not derive from corporate policy alone, but from legislative mandates that condition market access and legal compliance on the implementation of these decision-making structures. Public institutions retain oversight and enforcement powers, but the initial application of legal norms to concrete cases occurs within platform systems themselves.

Delegation is not unfamiliar in constitutional systems. At the national level, administrative agencies routinely exercise delegated authority within constitutionally defined accountability structures, subject to judicial review and parliamentary oversight. Platform governance, however, introduces a distinct configuration. Under statutory mandates such as Canada’s Online Harms Act (Bill C-63, Part 1, Clause 12), private digital services are required to establish internal systems that identify harmful content, interpret statutory standards, decide on content removal or restriction, and operate complaint-handling mechanisms. When a platform determines whether a piece of content constitutes hate speech or harmful disinformation under legally defined categories, and removes or restricts it accordingly, it performs a function structurally analogous to adjudication. Yet these decisions are taken by private entities operating transnationally, outside traditional constitutional accountability architectures, with public authorities positioned primarily in supervisory or ex post review roles.

The resulting configuration produces a constitutional asymmetry. Platforms operationalize public norms, but they are not constitutionally constituted as public authorities. They exercise structured decision-making power, yet their legitimacy and review mechanisms remain primarily internal or regulatory rather than judicial in nature.

If jurisdiction can be redistributed through statutory design and infrastructural dependence, then the modalities through which sovereignty is exercised are undergoing transformation. Infrastructural dependence refers to the structural reliance of public discourse and democratic participation on privately operated digital platforms that function as the primary channels of communication and information exchange. Legislative mandates increasingly operate through these platforms, requiring them to embed norm-application mechanisms within their technical and organizational architecture. Authority is no longer monopolized by territorially bounded institutions. It is functionally dispersed across hybrid public-private architectures in which judging becomes infrastructural: decisions about the legality, permissibility, or harmfulness of content are made within platform systems through structured moderation protocols, risk assessments, and algorithmic prioritization before courts or regulators intervene.

The constitutional question of our time may not be how to regulate platforms more effectively. It may be how to conceptualize and constitutionalize delegated jurisdiction once the practice of applying and enforcing public norms is no longer institutionally confined to the State.

Understanding this shift requires moving beyond the language of compliance. It demands a reconceptualization of jurisdiction as a function that can be redistributed, embedded, and operationalized across institutional and infrastructural actors in the digital age. This task must be undertaken at the constitutional level, as part of a reassessment of the foundations of state sovereignty itself. Revisiting the classical distinctions between prescriptive, adjudicative, and enforcement jurisdiction is essential, particularly where statutory design embeds first-instance norm application within private digital systems operating transnationally. The question is no longer whether delegation occurs, but how sovereign authority is structured, limited, and rendered accountable when its operational core is exercised through hybrid public-private infrastructures. Addressing this transformation requires developing new constitutional frameworks capable of clarifying the conditions under which delegated jurisdiction remains compatible with democratic rule-of-law principles.

Suggested citation: Fernanda Florentino Fernandez Jankov, When Platforms Judge: Delegated Jurisdiction and the Redistribution of Public Authority, Int’l J. Const. L. Blog, Mar. 12, 2026, at: http://www.iconnectblog.com/when-platforms-judge-delegated-jurisdiction-and-the-redistribution-of-public-authority/
