August 2025
Algorithms of Sovereignty: How Technology Became the New Architect of Power
Digital platforms now extract rent, set rules, and shape politics like feudal estates — exploring how technology became the new architecture of sovereignty.
Angelina Zaitseva
In 2023, the U.S. Federal Trade Commission filed a major antitrust lawsuit against Amazon. In the course of the proceedings, it was revealed that between 2015 and 2019 the company had secretly deployed an algorithm codenamed "Project Nessie" — one that allowed it to test how high it could raise prices while remaining confident that competitors would automatically follow suit.

The algorithm analyzed behavioral patterns of other e-commerce platforms and predicted their reactions to Amazon's pricing changes, effectively transforming market competition into a coordinated pricing system.

As a result, Amazon generated over $1 billion in excess profits, forcing American consumers to overpay — and competitors to unconditionally accept the imposed price dynamics.
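What the complaint describes is, in essence, a probe-and-follow loop: raise the price a little, watch whether rivals match it, and keep the raise only if they do. A minimal toy sketch of that logic, in which every name and number is invented for illustration and nothing is taken from Amazon's actual system, might look like:

```python
import random

def competitor_follows(candidate_price):
    # Toy stand-in for rivals' automated repricers: in this sketch
    # they match any increase 90% of the time (purely illustrative).
    return random.random() < 0.9

def probe_price(base_price, step=0.05, rounds=10):
    """Repeatedly test a small price increase; keep it only while the
    simulated competitor follows, otherwise roll back to the baseline."""
    price = base_price
    for _ in range(rounds):
        candidate = price * (1 + step)
        if competitor_follows(candidate):
            price = candidate      # rival matched, so the raise sticks
        else:
            price = base_price     # rival undercut, so roll back
    return round(price, 2)

random.seed(0)  # deterministic demo
print(probe_price(100.0))
```

The point of the sketch is that no single step looks like collusion: each raise is a reversible experiment, and coordination emerges only from the feedback loop between the probe and the rivals' repricing bots.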
In 2024, the European Commission fined Apple €1.84 billion for abusing its dominant position in the music-streaming market through its App Store policies. It was found that the company had systematically prohibited app developers — Spotify among them — from informing users that subscriptions could be purchased more cheaply outside the App Store.

The so-called anti-steering policy meant that within an app, developers could not even include a link or a textual mention of an alternative payment method. This effectively compelled both sellers and buyers to route all transactions through Apple's payment system, which charged commissions of up to 30%.
None of this is incidental. These are symptoms of a new economic reality.

Economist and former Greek Minister of Finance Yanis Varoufakis has termed it technofeudalism.

In his view, classical capitalism is already behind us. The primary source of power today is no longer markets and competition, but digital platforms that operate as private feudal estates. Rather than generating wealth through production or trade, they extract rent for access to their infrastructure.

Amazon, Google, or Apple are not simply companies operating within a market — they are owners of "digital land," where everyone else is forced to play by their rules.
And these rules cannot be negotiated; one can only refuse to use the platform altogether, which, under current market conditions, would amount to commercial failure and, consequently, the death of one's product.

The concept of technofeudalism made considerable waves in public discourse. And, naturally, it was met with a torrent of criticism.

For instance, writer and technology researcher Evgeny Morozov argues that while the idea of a "new feudalism" is compelling, most of what it describes still fits comfortably within capitalist logic — platform monopolies, data exploitation, and financial rents remain part of the same regime.

Meanwhile, researcher Arif Novianto insists that technofeudalism is more of a metaphorical image than a coherent socioeconomic reality. Many of the phenomena it points to — the concentration of power, control over infrastructure, dependence on cloud services — serve to augment capitalism rather than replace it.

Yet the resonance this idea produced is profoundly significant in its own right. Varoufakis clearly struck a nerve.
That nerve is a deep-seated anxiety about how modern technologies are reshaping our relationship with sovereignty.
Factory-States of the Future
It is no coincidence that in every legal case mentioned above, it was the state that initiated proceedings.

Advanced technology and political sovereignty have been inextricably linked throughout history.
For state power, technology has always been a critical instrument — one it sought, at a minimum, to control, and ideally, to monopolize entirely.

This is necessary, first, in order to achieve specific military, economic, and social objectives. In the sixteenth century, England and France controlled gunpowder production, turning it into an instrument of power centralization. In the nineteenth and early twentieth centuries, Russia and the United States brought railroads and the telegraph under state control as the backbone of military command and commerce. And in the twentieth century, the U.S. and the USSR were the first to monopolize nuclear technology, making it the principal resource of the global balance of power and of diplomatic leverage.

But most importantly, technologies become instruments for shaping collective imagination.

In her book Dreamscapes of Modernity, scholar Sheila Jasanoff (1) calls these "sociotechnical imaginaries" — shared visions of what the future should look like and what role technology is assigned within it. In her view, those who control technologies effectively shape such visions for entire societies and, as a consequence, can impose specific understandings of how the world works and what our place in it is.

Historically, this was more often the domain of governments, carried out through full-scale state programs. "Atoms for Peace" presented nuclear energy to American citizens as a path to abundance and peace. Soviet five-year plans produced industrial utopias in which new factories, cities, and infrastructure became symbols of progress and the technological dominance of Soviet society. And the French nuclear program of the 1960s–70s promoted visions of a future in which the French would be an energy-independent and technologically advanced nation.

(1) Sheila Jasanoff, Sang-Hyun Kim (eds.), Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power (2015)
Cloud Sovereignty
But in recent years, something has shifted.

States no longer produce convincing — or even seriously debated — scenarios of a shared future tied to technology. The primary "futurological center" has gradually relocated to Silicon Valley and Texas.

It is there that the defining narratives of our collective technological future are now being produced — narratives that form the core of public discourse on the trajectories of human development at large.

The pace of technological progress is so rapid that states no longer have time even to develop functioning regulatory mechanisms, let alone attempt to establish monopoly control over these technologies.

This is precisely why many scholars agree that AI, digital platforms, and algorithms have inaugurated a new chapter in humanity's relationship with technology.

In this chapter, technologies cease to be mere instruments in the hands of sovereignty — they become full-fledged actors that themselves possess sovereignty.

For example, they have the capacity to shape the social, economic, and political structures of society: through recommendation algorithms that determine information bubbles and political preferences; through messaging interfaces that have given rise to new forms of social behavior (from ghosting to Gatsby-ing); through platform services and freelance marketplaces that have normalized the precarity of labor.

In his book The Stack, philosopher of technology Benjamin Bratton (2) describes the contemporary technosphere as a "new architecture of sovereignty." In his account, all layers of modern infrastructure converge into something akin to a global "stack" that itself begins to assume the functions of a state: establishing rules, regulating behavior, distributing resources.

(2) Benjamin Bratton, The Stack: On Software and Sovereignty (2015)

Here are just a few examples: the Regulatory Intelligence Office in the UAE, where artificial intelligence analyzes legislation and proposes amendments to the government. Diella, the AI minister to whom the Albanian government recently delegated oversight of public procurement tenders to curb corruption. Nepalese protesters who actively used ChatGPT to vet candidates for the transitional government. Increasingly, technologies are being embedded in the very core of political processes, creating a radically new architecture of sovereignty.
Everything Is Connected to Everything
"So, are we going to be ruled by ChatGPT soon?"
Not quite.

It is critically important to understand that there is no external techno-entity that can actually "arrive" and "enslave" us (or, at the very least, take our jobs).

The distinctive feature of modern technologies lies not in the fact that they govern social reality from some external vantage point, but in the fact that they imitate, reproduce, and optimize it from within — compelling us to look at a reflection of our own biases, unjust political mechanisms, and entrenched forms of social and labor inequality.

In The Eye of the Master, philosopher Matteo Pasquinelli (3) demonstrates that artificial intelligence is shaped not through the imitation of biological intelligence, but through the assimilation of "the intelligence of labor and social relations." AI imitates "the form of social relations and the organization of labor," learning not individual thought, but collective patterns of interaction.

Amazon's Project Nessie did not create market dominance from scratch — the algorithm merely imitated and amplified the company's already existing patterns of oligopolistic behavior in the market.

The App Store's terms of use do not impose rules alien to humanity — they simply codify and automate the relations of dependency and control that are already familiar to us.
When MidJourney, in response to the prompt "Black African doctors providing care for white suffering children," persistently generated images of a white doctor and African children, it did not invent new prejudices. It reproduced the "white savior" social stereotype that was already entrenched in society.

It is not by chance that Bratton calls AI an "existential technology" — one capable of revealing certain aspects of ourselves with such precision that it can easily trigger an existential crisis.

(3) Matteo Pasquinelli, The Eye of the Master: A Social History of Artificial Intelligence (2023)
The Uncanny Valley, but in Reverse
When a neural network does something that resembles us, what frightens us is not how it imitates us, but how it reflects us. Because we may be encountering, for the first time, a real embodiment of our own social relations, our ways of working, thinking, speaking, and doing politics.

And it turns out that much of it is not particularly pleasant to confront.