What Davidson & Rees-Mogg Missed Entirely: Surveillance Capitalism
In 2019, Shoshana Zuboff published *The Age of Surveillance Capitalism*, and in doing so she named the threat that Davidson and Rees-Mogg never saw coming. Their 1997 book predicted, with remarkable accuracy, that digital technology would undermine state power. What they did not predict — what their framework was structurally incapable of predicting — was that the same technology would create a new form of private power more intimate, more pervasive, and in many ways more consequential than anything the state had managed.
This is not a minor oversight. It is the central failure of *The Sovereign Individual* as a guide to the world we actually inhabit. The book assumed that the digital economy would be liberating. It is, in many respects, extractive. And the extraction operates at a level that the authors could not have imagined: not your money, not your labor, but your behavior itself.
The Original Argument
Davidson and Rees-Mogg’s framework rested on a specific assumption about the digital economy: that information technology would shift power from institutions to individuals because it would make individuals harder to monitor, tax, and control. Encrypted communications would make surveillance difficult. Digital money would make taxation impractical. Remote work would make geographic captivity obsolete. The net effect would be a massive transfer of power from states to the skilled individuals who could navigate the new landscape.
This assumption was not unreasonable in 1997. The internet was decentralized, largely unmonitored, and governed by a culture of openness and pseudonymity. The cypherpunks — the community of programmers and activists who would later produce Bitcoin — were building tools for encrypted communication and anonymous transactions. The trajectory seemed clear: technology was making privacy easier and surveillance harder.
What actually happened was almost exactly the opposite. The dominant business model of the internet turned out to be not the sale of encrypted tools to privacy-conscious individuals, but the harvesting of behavioral data from billions of users who had no meaningful understanding of what was being collected or how it was being used. Google, Facebook, Amazon, and their successors did not liberate individuals from institutional surveillance. They built surveillance systems more comprehensive than any government had achieved, and they did it with the enthusiastic, if uninformed, cooperation of the people being surveilled.
Zuboff documented this process with the rigor of a scholar and the urgency of a whistle-blower. Her central argument is that surveillance capitalism represents a new economic logic — not just an extension of existing capitalism but a mutation of it. Traditional capitalism profits from the production and sale of goods and services. Surveillance capitalism profits from the prediction and modification of human behavior. The raw material is not labor or natural resources but human experience itself, rendered as data, processed into predictions, and sold to business customers who want to know — and influence — what you will do next.
Why It Matters Now
The concept Zuboff introduced that matters most for the sovereignty question is “behavioral surplus.” When you use a search engine, a social media platform, or a mapping application, some of the data you generate is used to improve the service you are using. This is the fair exchange — you get a useful product, the company gets data to make the product better. But the platforms discovered that the data generated by user behavior vastly exceeded what was needed to improve the product. The excess — the behavioral surplus — could be processed into predictions about future behavior and sold to advertisers, political campaigns, insurance companies, and anyone else willing to pay for the ability to anticipate and shape what people do.
This surplus extraction operates at a scale and granularity that would have been inconceivable in 1997. Your search history reveals your fears, your desires, your medical conditions, and your political inclinations. Your location data reveals where you sleep, where you work, who you visit, and how often. Your purchase history reveals your income level, your dietary choices, your addictions, and your aspirations. Your social media activity reveals your emotional state, your relationship stability, your susceptibility to various forms of persuasion, and the precise moments when you are most vulnerable to commercial or political messaging.
The asymmetry is the critical point for anyone concerned with sovereignty. You are transparent to the platform; the platform is opaque to you. You do not know what data is being collected, how it is being processed, what predictions are being generated, or who is purchasing those predictions. The platform knows more about your behavior than you know yourself — it can predict what you will click, what you will buy, what you will believe, before you have consciously made the decision. This is not a metaphor. It is the documented operational reality of the major technology platforms in 2026.
For most people, in most of their daily lives, this form of private surveillance is a greater threat to personal sovereignty than anything the state does. The state can tax you, regulate you, and in extreme cases imprison you. But the state, in most democratic countries, operates under legal constraints — due process, warrants, judicial oversight — that limit its capacity for arbitrary intrusion. The platforms operate under no comparable constraints. Their terms of service, which no one reads and everyone accepts, grant them authority over your behavioral data that would be constitutionally impermissible if claimed by a government.
Davidson and Rees-Mogg’s framework has no language for this threat because their framework assumes that the relevant axis of power runs between individuals and states. It does not account for a world in which the most consequential power over individual behavior is exercised by private corporations whose business model depends on knowing you better than you know yourself. This is not a refinement of their thesis. It is a refutation of their most fundamental assumption about the direction of technological change.
The Practical Extension
The convergence risk makes the situation more urgent than either state surveillance or corporate surveillance considered in isolation. In 2026, the boundary between government surveillance and corporate surveillance is increasingly blurred. Governments purchase behavioral data from data brokers rather than collecting it through traditional intelligence channels, thereby circumventing the legal constraints that apply to government surveillance. Platforms cooperate with government requests for user data, sometimes under legal compulsion and sometimes voluntarily. The surveillance infrastructure built by private companies becomes, in practice, available to state actors — precisely the outcome that the cypherpunks feared and that Davidson and Rees-Mogg assumed technology would prevent.
The practical countermeasures are not difficult to understand, but they require discipline and trade-offs that most people are unwilling to make. This is itself a sovereignty test: the willingness to accept inconvenience in exchange for autonomy.
Data minimalism is the foundation. Every service you use, every account you create, every app you install generates data that is harvested, aggregated, and sold. The sovereign response is to minimize the surface area. Use fewer services. Create fewer accounts. Install fewer apps. When you must use a service, provide the minimum information required. Do not link accounts. Do not use social login. Do not accept default privacy settings; change every setting to the most restrictive option available. This is tedious, but tedium is the price of sovereignty in an extractive economy.
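The audit behind data minimalism can be made concrete. A minimal sketch of a personal account audit — the `Account` fields, the 180-day staleness threshold, and the flagging rules are all illustrative assumptions, not a prescribed method:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Account:
    name: str
    last_used: date          # when you last actually needed the service
    uses_social_login: bool  # identity linked to a platform account
    optional_fields: int     # profile fields beyond the required minimum

def audit(accounts, today, stale_days=180):
    """Flag accounts that enlarge your data surface: stale accounts,
    linked logins, and profiles carrying more data than required."""
    flagged = []
    for a in accounts:
        reasons = []
        if (today - a.last_used).days > stale_days:
            reasons.append("unused: close it")
        if a.uses_social_login:
            reasons.append("social login: unlink it")
        if a.optional_fields > 0:
            reasons.append("excess profile data: delete it")
        if reasons:
            flagged.append((a.name, reasons))
    return flagged
```

The value of writing it down, even this crudely, is that the audit becomes a repeatable routine rather than a one-time resolution.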
Platform independence is the next layer. If your business depends on a platform — if your customers find you through Google, your audience reaches you through social media, your revenue flows through a payment processor controlled by a technology company — then you are not sovereign with respect to that platform. You are a tenant, and your tenancy can be revoked without notice, without explanation, and without recourse. The sovereign response is to build owned infrastructure: your own website on your own domain, your own email list on your own server, your own payment processing through systems you control. This is more expensive and more complicated than using platforms, and that is the point. The cost is the price of not being a tenant.
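The simplest form of owned infrastructure is a static site: plain files you can host anywhere and move without anyone's permission. A sketch of the idea (the template and filenames are illustrative, not a recommended toolchain):

```python
from pathlib import Path

TEMPLATE = """<!DOCTYPE html>
<html><head><title>{title}</title></head>
<body><h1>{title}</h1>{body}</body></html>"""

def render(title, paragraphs):
    """Render one page from plain text: no platform, no tracker, no feed."""
    body = "".join(f"<p>{p}</p>" for p in paragraphs)
    return TEMPLATE.format(title=title, body=body)

def build(pages, out_dir):
    """Write each page under out_dir; upload the result to any host you control."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for slug, (title, paragraphs) in pages.items():
        (out / f"{slug}.html").write_text(render(title, paragraphs))
```

The point is not the tooling but the dependency structure: content that lives as files on a domain you own can change hosts without breaking, which is exactly what platform tenancy cannot offer.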
Owned infrastructure extends to communication. Email is a federated protocol; you can run your own server. Messaging can be conducted through end-to-end encrypted services that do not monetize your metadata. File storage can be local or self-hosted rather than cloud-based. Each of these choices involves trade-offs in convenience, and each of them represents a genuine increase in sovereignty over your information environment.
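Self-hosted storage carries one obligation the cloud handles for you: knowing your copies are intact. A sketch of an integrity manifest using only the standard library — the manifest format and directory layout are illustrative assumptions:

```python
import hashlib
from pathlib import Path

def fingerprint(path):
    """SHA-256 of a file, read in chunks so large files are handled."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def manifest(root):
    """Map every file under root to its hash; store this beside the backup."""
    root = Path(root)
    return {str(p.relative_to(root)): fingerprint(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def verify(root, recorded):
    """Return files that changed or appeared since the manifest was taken."""
    current = manifest(root)
    changed = {k for k in recorded if current.get(k) != recorded[k]}
    added = set(current) - set(recorded)
    return changed, added
```

Run against a backup periodically; a nonempty `changed` set means silent corruption or tampering, which is the failure mode self-hosting must catch on its own.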
The deeper practice is attention sovereignty — the recognition that the platforms are not neutral conduits of information but active manipulators of your attention, your emotions, and your behavior. The feed is not showing you what you need to see; it is showing you what will keep you engaged, because engagement generates behavioral surplus. The sovereign response is to control your information inputs as deliberately as you control your financial inputs. Subscribe to specific sources rather than consuming algorithmically curated feeds. Read books and long-form writing rather than scrolling. Set time limits on platform usage and enforce them. Treat your attention as the scarce resource it is, because the platforms are treating it as raw material.
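Subscribing to specific sources has a concrete mechanism: RSS, where the publisher orders the items and no algorithm ranks them. A sketch of a feed reader using only the standard library (the sample feed and URLs are fabricated for illustration):

```python
import xml.etree.ElementTree as ET

def read_feed(rss_xml, limit=10):
    """Return (title, link) pairs in the order the publisher chose --
    chronological, not engagement-ranked."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        items.append((title, link))
    return items[:limit]

SAMPLE = """<rss version="2.0"><channel>
<title>A source you chose</title>
<item><title>First essay</title><link>https://example.com/1</link></item>
<item><title>Second essay</title><link>https://example.com/2</link></item>
</channel></rss>"""
```

In practice you would fetch the XML over HTTPS; the structural point is that both the selection of sources and the ordering of items are yours, which is the opposite of an algorithmic feed.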
The Lineage
Zuboff’s work stands in a tradition of scholarship that examines how economic systems shape human experience in ways that are not immediately visible to the people living within them. Marx analyzed how industrial capitalism alienated workers from the products of their labor. Zuboff analyzes how surveillance capitalism alienates individuals from their own behavioral data — their own experience — by rendering it as raw material for someone else’s profit.
The parallel is instructive but not exact. Marx’s workers knew they were working in factories; they could see the alienation, even if they lacked the theoretical framework to name it. Zuboff’s subjects often do not know they are being surveilled, or if they know, they do not understand the scope or consequences. The extraction is invisible, which makes resistance harder to organize and sustain.
Davidson and Rees-Mogg’s intellectual ancestors — the Austrian economists, the public choice theorists, the megapolitical historians — were focused on the state as the primary threat to individual freedom. This was reasonable in the twentieth century, when states were the dominant institutions and their capacity for surveillance and coercion was unmatched. But the twenty-first century has produced institutions of a different kind, and the intellectual frameworks that were adequate for analyzing state power are not adequate for analyzing platform power.
Zuboff provides the corrective, and it is a corrective that any serious sovereignty practice must incorporate. The sovereign individual who protects their wealth from state seizure but surrenders their behavioral data to private platforms has achieved a partial and unstable form of sovereignty. They have locked the front door while leaving the windows open.
The Stoic tradition, as always, provides the deeper ground. Epictetus taught that freedom begins with understanding what is within your control and what is not. In the context of surveillance capitalism, what is within your control is your own behavior: what you share, what platforms you use, what defaults you accept, what attention you grant. What is not within your control is the behavior of the platforms themselves. The sovereign response is not to rage against the platforms — they will do what their business model requires — but to arrange your own affairs so that their extraction is minimized and their influence on your decisions is reduced.
This is harder than buying Bitcoin. It is less dramatic than relocating to a tax haven. It does not produce the adrenaline rush of joining a movement. But it is, in 2026, the most important sovereignty practice available to an ordinary person, because the surveillance economy touches every aspect of daily life in ways that state power, for most people in most countries, does not.
Davidson and Rees-Mogg got the direction of technological change wrong. The digital economy did not liberate individuals from institutional power. It created new institutions with new forms of power. Zuboff named those institutions and described their operations. The task for the sovereign individual in 2026 is to take both analyses seriously — to protect against state overreach and corporate extraction — and to build a life that is genuinely self-directed rather than merely relocated from one form of dependency to another.
This article is part of the Sovereign Individual Thesis series at SovereignCML. Related reading: *The Sovereign Individual: What the Book Actually Argues*, *The Dark Side: When Sovereign Individualism Becomes Antisocial*, *The 2026 Sovereign Individual Thesis: A Synthesis*