The Business Model Is the Problem (Not the Technology)

The conversation about digital privacy tends to fixate on specific technologies — cookies, tracking pixels, facial recognition cameras, fingerprinting scripts — as though the problem were a collection of discrete tools that could be blocked, banned, or legislated away one at a time. This is understandable. Specific technologies are tangible; you can point at a cookie banner and feel like you are doing something when you click “reject.” But the fixation on individual technologies obscures the structural reality: the economic incentive to extract behavioral surplus exists independent of any single technology. When you block one vector, the business model finds another. The problem is not the instrument. The problem is the incentive.

Shoshana Zuboff documented this dynamic with precision in The Age of Surveillance Capitalism, particularly in her account of Google’s evolution from a company that used behavioral data to improve search results to a company that used behavioral data to build and sell prediction products. The technology changed — from cookies to device fingerprinting to cross-app tracking to on-device inference — but the economic logic remained constant. Understanding that logic is the prerequisite for any response that lasts longer than the next browser update.

The Advertising-Funded Internet and Its Structural Incentive

The original sin, if we are going to use that language, was the decision to fund the consumer internet primarily through advertising. This was not inevitable. Early internet services experimented with subscription models, micropayments, and direct sales. But advertising won — partly because consumers demonstrated a clear preference for free services, and partly because advertising revenue scaled in ways that subscriptions did not.

The consequence was structural: once a company depends on advertising revenue, its primary economic relationship is with advertisers, not users. Users become the means of production. Their attention is the inventory being sold; their behavioral data is the intelligence that makes the inventory valuable. Every design decision, every feature, every interface choice is shaped by the need to generate more behavioral data and more attention — because that is what the actual customers, the advertisers, are paying for.

Google did not set out to build a surveillance apparatus. Zuboff documents, in chapter three of her book, that Google’s founders initially resisted advertising as a business model; a 1998 paper by Sergey Brin and Larry Page explicitly warned that advertising-funded search engines would be biased toward advertisers rather than users. The pivot to advertising happened under financial pressure, and once it happened, the extraction logic followed with the inevitability of water running downhill. The behavioral surplus was there. The market for prediction products was there. The technology to connect them was there. The business model assembled itself.

We should be clear: blocking cookies, using a VPN, running a privacy-focused browser — these are not pointless. They reduce the surface area available for extraction. A tracker blocker that prevents Facebook’s pixel from loading on third-party websites genuinely reduces the behavioral data Meta can collect about your browsing habits. A VPN genuinely prevents your internet service provider from logging the specific sites you visit. These tools have real value, and you should use them.

But they address symptoms. When Apple introduced its App Tracking Transparency framework in 2021, requiring iOS apps to request explicit permission before tracking users across other apps and websites, Meta projected a roughly $10 billion hit to its advertising revenue for 2022 alone. That is a significant number. It is also a number that Meta responded to not by abandoning behavioral extraction but by pivoting to on-platform behavioral prediction — building models that infer your interests and intentions from your behavior within Meta’s own apps, where Apple’s framework does not apply. The extraction did not stop. It relocated.

This pattern repeats with every countermeasure. Browser-based cookie blocking led to device fingerprinting. Third-party cookie deprecation led to first-party data strategies, where companies build their own walled gardens of behavioral data. Ad blockers led to native advertising and sponsored content that is structurally indistinguishable from editorial. Each countermeasure is a patch on a specific wound; the underlying condition — the economic incentive to extract behavioral surplus — generates new wounds faster than patches can be applied.

This does not mean the patches are worthless. It means that a strategy composed entirely of patches is a strategy that never gains ground. You run to stay in place.

The Difference Between Selling Products and Selling Predictions

The clearest way to understand the structural problem is to distinguish between two kinds of companies. The first kind sells you a product or service. You pay money; the company delivers the thing. The economic relationship is direct, and the incentive structure is aligned: the company profits when you are satisfied with the product. Apple sells phones, laptops, and services. Basecamp sells project management software. Fastmail sells email hosting. When these companies collect data, the primary use is to improve the product you are paying for.

The second kind of company sells predictions about you to third parties. You do not pay, or you pay a nominal amount; the company’s actual revenue comes from advertisers, data brokers, or other entities willing to pay for behavioral prediction products. Google, Meta, and the vast ecosystem of ad-tech companies operate this way. When these companies collect data, the primary use is not to improve your experience — it is to improve the prediction products sold to their actual customers.

This distinction is not absolute. Apple collects behavioral data for its own advertising business, which has grown significantly since 2021. Amazon sells products directly but also operates one of the largest advertising platforms in the world, using your purchase and browsing history to sell prediction products to third-party sellers. The lines blur. But the fundamental question remains useful: when this company collects data about me, who is the primary beneficiary — me or someone buying predictions about me?

The Oversimplification of “You Are the Product”

The phrase “if you’re not paying, you’re the product” has become a kind of folk wisdom about the internet economy. It captures something real — the economic dynamic described above — but it also misleads in an important way. Paying customers get surveilled too.

Amazon knows your purchase history, your browsing history, your Alexa voice recordings, your Kindle reading habits, and your Ring doorbell footage. You are paying for all of these services. Samsung smart televisions — devices you purchased — have been documented collecting viewing data and transmitting it to third parties for advertising purposes. Your phone carrier, to whom you pay a monthly bill, collects and in many cases sells your location data. The relationship between payment and privacy is weaker than the folk wisdom suggests.

The more accurate framing is this: any company whose revenue depends significantly on behavioral prediction products has a structural incentive to extract behavioral surplus from you, regardless of whether you are also paying them. The question is not “am I paying?” The question is “what percentage of this company’s revenue comes from selling predictions about people like me?”

Cory Doctorow and the Enshittification Lens

Cory Doctorow’s framework of platform enshittification complements Zuboff’s analysis by describing how the extraction escalates over time. Doctorow observes that platforms follow a predictable lifecycle: first, the platform is good to users in order to attract them; then, the platform degrades the user experience to extract value for business customers (advertisers, sellers); finally, the platform extracts maximum value from everyone — users and business customers alike — to enrich shareholders.

This lifecycle is driven by the same structural incentive Zuboff identifies. Once a platform’s revenue depends on behavioral surplus extraction, every subsequent decision is shaped by the need to extract more. The degradation is not accidental. It is not the result of bad management or losing touch with users. It is the business model operating as designed. Facebook’s organic reach — the percentage of your followers who see your posts without paid promotion — declined from roughly sixteen percent in 2012 to less than two percent by 2024. That decline was not a bug. It was the platform optimizing for advertising revenue at the expense of user experience, exactly as the business model incentivizes.

The Proportional Response: Understand the Incentive, Then Choose

If the problem is the business model rather than any specific technology, then the proportional response is structural rather than tactical. This does not mean abandoning every tool and technique for blocking surveillance. It means building your strategy on a foundation that addresses the incentive structure, not just its latest manifestation.

In practice, this means three things. First, you understand which companies in your digital life derive their revenue primarily from behavioral prediction products and which derive their revenue from selling you things directly. This is not a binary; it is a spectrum, and knowing where each service falls helps you allocate your attention and your data proportionally.
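The spectrum idea above can be made concrete with a back-of-the-envelope calculation: divide a company's advertising and data-derived revenue by its total revenue. The sketch below illustrates the heuristic only; the revenue figures are placeholder numbers, not audited financials, and the service names are hypothetical. Real inputs would come from each company's own annual reports.

```python
# A rough way to place a service on the prediction-revenue spectrum.
# All figures below are illustrative placeholders, NOT audited numbers;
# substitute values from each company's published financial reports.

def extraction_incentive(ad_and_data_revenue: float, total_revenue: float) -> float:
    """Fraction of revenue derived from selling predictions and attention.

    Closer to 1.0: the structural incentive to extract behavioral surplus
    dominates. Closer to 0.0: revenue comes mostly from selling you
    products or services directly.
    """
    if total_revenue <= 0:
        raise ValueError("total_revenue must be positive")
    return ad_and_data_revenue / total_revenue

# Hypothetical services with placeholder (ad revenue, total revenue) pairs,
# in billions USD, chosen only to illustrate the three broad positions
# described in the text.
services = {
    "ad_platform_a":     (113.6, 116.6),  # nearly all revenue from ads
    "mixed_platform_b":  (47.0, 307.0),   # large ad arm inside a retail business
    "hardware_vendor_c": (4.0, 383.0),    # ads a small slice of device revenue
}

# Rank services from highest to lowest extraction incentive.
for name, (ads, total) in sorted(
    services.items(),
    key=lambda kv: extraction_incentive(*kv[1]),
    reverse=True,
):
    share = extraction_incentive(ads, total)
    print(f"{name}: {share:.0%} of revenue from prediction products")
```

The point of the exercise is not precision — it is that the ratio, even roughly estimated, tells you far more about a company's incentives than whether you happen to pay it money.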

Second, you make deliberate choices about where to withdraw your behavioral surplus. You do not need to withdraw from everything; that is the paranoia trap, and it costs more in capability than it saves in privacy. But you identify the two or three highest-value behavioral data streams you currently generate — your primary email, your phone’s location data, your DNS traffic — and you move those to services whose business model does not depend on selling predictions about you.

Third, you build on infrastructure you control wherever the cost-benefit ratio supports it. Your content lives on a platform you own. Your audience relationship runs through an email list you operate. Your most important communications happen on services built for communication, not for behavioral extraction. This is not comprehensive privacy. It is proportional sovereignty — reducing your dependence on the behavioral surplus economy in the areas where that dependence costs you the most.

The technologies will keep changing. The business model will keep adapting. Cookie blockers will become obsolete and be replaced by something else. The structural response — understanding the incentive, choosing where to withdraw, building on infrastructure you control — does not become obsolete, because it addresses the economic logic rather than its temporary instruments.

That is the difference between chasing symptoms and treating the condition.


This article is part of the Surveillance Capitalism & The Proportional Response series at SovereignCML.

Related reading: What Shoshana Zuboff Actually Said (And What She Didn’t), What’s Documented vs. What’s Assumed, The Enforcement Gap: Laws That Exist but Don’t Protect You