The Enforcement Gap: Laws That Exist but Don't Protect You

Privacy law, in theory, is a wall between your behavioral data and the companies that want to extract it. In practice, the wall has the structural integrity of a suggestion. The European Union’s General Data Protection Regulation, California’s Consumer Privacy Act and its successor the California Privacy Rights Act, and a patchwork of state and national regulations around the world represent genuine legislative effort — years of drafting, negotiating, and political compromise. The laws exist. The rights they describe are real. And the gap between those rights and your actual protection is wide enough to drive a behavioral futures market through.

Understanding this gap is essential for the sovereign individual, not because the laws are worthless — they are not — but because relying on them as your primary defense is a strategy that assumes enforcement agencies will protect your interests with a consistency and vigor that the evidence does not support. Legal rights are a floor, not a ceiling. Sovereignty means building your own protections above the legal baseline, because the baseline is lower than the statute text would have you believe.

GDPR: The Headline Fines and the Underlying Math

The General Data Protection Regulation, effective since May 2018, is the most ambitious privacy law in force anywhere in the world. It grants European residents the right to access their data, the right to have it deleted, the right to data portability, and the right to object to processing. It imposes obligations on data controllers and processors, requires consent for many forms of data collection, and authorizes fines of up to four percent of global annual revenue for violations. On paper, GDPR has teeth.

In practice, the teeth are duller than the statute suggests. Consider Meta’s record GDPR fine: in May 2023, the Irish Data Protection Commission imposed a fine of 1.2 billion euros on Meta for transferring European users’ data to the United States in violation of the GDPR’s data transfer provisions. The number sounds enormous. Against Meta’s quarterly revenue at the time — approximately $32 billion in Q2 2023 — the fine represents roughly 3.75 percent of a single quarter’s revenue. Meta appealed. The data transfers that triggered the fine continued under modified legal frameworks while the appeal proceeded. The fine was a cost of doing business, absorbed into the quarterly financials like any other regulatory expense.

This pattern recurs. Amazon’s 746 million euro GDPR fine in 2021, the largest at that time, amounted to less than one percent of the company’s annual revenue. Google’s various GDPR fines across multiple jurisdictions collectively represent a rounding error on Alphabet’s balance sheet. The fines are designed to be proportional to revenue, and by that measure they achieve their arithmetic. But the behavioral change they are designed to incentivize — fundamental alteration of data extraction practices — has not materialized. The companies pay the fines, adjust their compliance language, and continue operating extraction-based business models that generate revenue orders of magnitude larger than the penalties.
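
To make the proportions concrete, here is a minimal sketch that restates the fines above as percentages of the revenue bases the text cites. The figures are the approximations already quoted, with euro amounts treated at rough dollar parity purely for illustration.

```typescript
// Headline GDPR fines as a share of the revenue bases cited above.
// Amounts are approximations from the text; euros are treated at
// rough parity with dollars for illustration only.

interface FineCase {
  company: string;
  fineUSD: number;      // fine, approximated in USD
  revenueUSD: number;   // comparison revenue base, in USD
  revenueBasis: string; // what the revenue figure covers
}

const cases: FineCase[] = [
  { company: "Meta (2023)", fineUSD: 1.2e9, revenueUSD: 32e9, revenueBasis: "a single quarter (Q2 2023)" },
  { company: "Amazon (2021)", fineUSD: 0.75e9, revenueUSD: 470e9, revenueBasis: "full-year 2021 revenue" },
];

for (const c of cases) {
  const pct = ((c.fineUSD / c.revenueUSD) * 100).toFixed(2);
  console.log(`${c.company}: ${pct}% of ${c.revenueBasis}`);
}
// Meta (2023): 3.75% of a single quarter (Q2 2023)
// Amazon (2021): 0.16% of full-year 2021 revenue
```

Run the same arithmetic against any of these companies’ actual filings and the conclusion holds: the penalty scales like an operating expense, not a deterrent.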

The enforcement bottleneck is structural. GDPR enforcement is delegated to national Data Protection Authorities, many of which are understaffed and underfunded relative to the scale of the companies they regulate. Ireland’s Data Protection Commission, which has jurisdiction over Meta, Google, Apple, and most other major US tech companies because their European headquarters are located in Ireland, has faced persistent criticism for slow investigations and outcomes that favor the companies under review. A 2021 analysis by the Irish Council for Civil Liberties found that the Irish DPC had resolved only a fraction of its open cases and that major cross-border complaints routinely took years to process.

CCPA/CPRA: The Right to Opt Out and the Friction to Exercise It

California’s Consumer Privacy Act, effective January 2020, and its successor the California Privacy Rights Act, effective January 2023, grant California residents the right to know what personal information companies collect about them, the right to delete it, and the right to opt out of the sale of their personal information. These are meaningful rights. The problem is the mechanism of exercising them.

To opt out of the sale of your personal information under CCPA/CPRA, you must submit a separate request to each company that collects your data. There is no centralized opt-out mechanism. Each company can design its own opt-out process, and many have designed processes that are technically compliant while being practically burdensome — requiring multiple steps, account verification, and sometimes repeated requests as companies interpret “sale” of data narrowly. The Global Privacy Control, a browser-based signal intended to automate opt-out requests, is recognized under the CPRA, but compliance is inconsistent and enforcement of GPC-based opt-outs is still developing.
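
The signal itself is technically trivial, which sharpens the compliance point. Per the published GPC specification, the preference arrives as the HTTP request header Sec-GPC: 1 and as a boolean navigator.globalPrivacyControl exposed to page scripts. A minimal sketch of honoring both, assuming a site that actually wants to comply:

```typescript
// Honoring the Global Privacy Control (GPC) signal, per the public
// GPC specification: the header `Sec-GPC: 1` on requests, and the
// boolean `navigator.globalPrivacyControl` available to scripts.

// Server side: treat a GPC-flagged request as an opt-out of sale/sharing.
function gpcOptOutRequested(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}

// Client side: suppress third-party trackers when GPC is set.
// (The property is not yet in TypeScript's built-in DOM types,
// hence the cast.)
function userSignaledOptOut(): boolean {
  return (navigator as any).globalPrivacyControl === true;
}

if (typeof navigator !== "undefined" && userSignaledOptOut()) {
  console.log("GPC detected: not selling or sharing this session's data.");
}
```

The gap, in other words, is not technical capability. Detecting the signal is a one-line check; honoring it is a business decision.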

The burden falls entirely on the individual. You must know which companies have your data, navigate each company’s opt-out process, and monitor whether the opt-out was actually honored. In a landscape with thousands of data brokers and countless apps and services collecting behavioral data, this is not a manageable task for most people. The right exists. The practical ability to exercise it at scale does not.

Cookie Banners: Consent Theater at Scale

We should talk about cookie banners, because they are the most visible artifact of the enforcement gap in daily digital life. GDPR requires informed consent for most forms of tracking. The implementation of this requirement, across millions of websites, has produced what privacy researchers have accurately described as consent theater.

The typical cookie banner presents two options: “Accept All,” rendered as a prominent, easy-to-click button; and “Manage Preferences,” rendered as a smaller, less visible link that leads to a multi-step configuration process. Rejecting all cookies often requires navigating through subcategories of tracking — functional, analytical, advertising, social media — and toggling each category individually. Some implementations require you to reject cookies from individual vendors, of which there may be dozens or hundreds listed.
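
The asymmetry is easy to quantify. A hedged sketch, with hypothetical category names, of the consent state behind a typical banner and the interaction count each path requires:

```typescript
// Illustrative consent model behind a typical banner. Category names
// are hypothetical; the point is the interaction count per path.

type ConsentState = Record<string, boolean>;

const categories = ["functional", "analytical", "advertising", "socialMedia"];

// "Accept All": one click sets every category at once.
function acceptAll(): ConsentState {
  const state: ConsentState = {};
  for (const c of categories) state[c] = true;
  return state;
}

// Rejecting typically means opening "Manage Preferences", toggling
// each category (or each of dozens of listed vendors) off, and then
// confirming: one interaction per step.
function rejectAllManually(): { state: ConsentState; interactions: number } {
  const state: ConsentState = {};
  let interactions = 1; // open the preferences panel
  for (const c of categories) {
    state[c] = false;
    interactions += 1; // one toggle per category
  }
  interactions += 1; // confirm the choices
  return { state, interactions };
}

console.log(`Accept all: 1 interaction`);
console.log(`Reject all: ${rejectAllManually().interactions} interactions`);
```

One click versus six, before vendor-level toggles are even counted.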

The design is not accidental. Companies have a financial incentive to maximize consent rates, and the cookie banner design reflects that incentive. Studies by researchers at Ruhr University Bochum and others have found that the vast majority of users click “Accept All” — not because they genuinely consent to comprehensive tracking, but because the alternative is sufficiently burdensome that acceptance becomes the path of least resistance. The legal requirement for consent is technically satisfied. The spirit of informed, freely given consent is systematically undermined.

European regulators have begun issuing guidance against the most egregious “dark pattern” cookie implementations, and some enforcement actions have targeted specific practices — the French data protection authority CNIL fined Google and Facebook in early 2022 for making cookie rejection harder than acceptance. But the fundamental dynamic persists: the companies designing the consent interfaces are the same companies whose revenue depends on maximizing the data collected through those interfaces.

The Data Broker Regulation Gap

The data broker industry represents perhaps the starkest illustration of the enforcement gap. Over four thousand data brokers operate in the United States. State-level data broker registration laws exist in a handful of states — California, Vermont, Oregon, and Texas among them — but registration requirements are minimal and do not meaningfully constrain data collection or sale practices. There is no comprehensive federal data broker regulation in the United States as of early 2026.

The practical consequence is this: a data broker can aggregate your location data, purchase history, browsing behavior, public records, and inferred characteristics into a detailed profile, sell that profile to advertisers, employers, landlords, law enforcement agencies, or private investigators, and face essentially no regulatory consequence for doing so — provided the broker complies with the narrow requirements of whatever state registration laws apply to its jurisdiction of incorporation.

Senator Ron Wyden’s investigations have documented cases in which data brokers sold location data to federal agencies, effectively providing warrantless surveillance capability through a commercial transaction. The legal theory is the third-party doctrine: data voluntarily shared with a third party (the app on your phone) loses Fourth Amendment protection — a doctrine derived from the Supreme Court’s pre-digital-era precedent, partially narrowed by the 2018 Carpenter v. United States ruling but far from overturned. The enforcement gap here is not just administrative; it is constitutional.

The FTC: Limited Budget, Limited Capacity

The Federal Trade Commission is the primary federal agency responsible for enforcing consumer privacy in the United States. The FTC’s enforcement authority derives largely from Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices.” The FTC does not have dedicated privacy legislation to enforce; it must shoehorn privacy violations into this general framework.

The FTC’s budget, as of fiscal year 2025, is approximately $400 million — for all enforcement activities, not just privacy. Compare this to the combined annual revenue of the companies whose privacy practices the FTC is supposed to police: Alphabet, Meta, Amazon, Apple, and Microsoft alone generate over $1.5 trillion in annual revenue. The asymmetry between the regulator’s resources and the regulated entities’ resources is not subtle.

FTC enforcement actions in the privacy space have produced settlements that are, by the standards of the companies involved, modest. The FTC’s landmark $5 billion settlement with Facebook in 2019 — the largest privacy-related fine in US history — amounted to roughly one month of Facebook’s revenue at the time. The settlement required Facebook to establish a privacy committee on its board and submit to third-party audits, but the subsequent Haugen disclosures in 2021 — internal documents leaked by former Facebook employee Frances Haugen — suggested that these structural requirements had not fundamentally altered the company’s approach to behavioral data extraction.

The FTC’s consent decree model — under which companies agree to “do better” in exchange for avoiding litigation — has been the primary enforcement mechanism for decades. The decrees impose obligations that are difficult to monitor, slow to enforce, and modest in their demands relative to the scale of the violations. The FTC commissioners themselves have acknowledged these limitations publicly.

International Gaps: Data Without Borders

Your data does not respect jurisdictions. A photograph uploaded to a US-based platform may be processed by servers in Ireland, analyzed by algorithms trained on data from Singapore, and used to generate prediction products sold to advertisers in Brazil. The privacy law that applies depends on where the data is stored, where it is processed, where you are located, and where the company is incorporated — and these four jurisdictions may have four entirely different legal frameworks, or no framework at all.

GDPR’s data transfer provisions attempt to address this problem by restricting the transfer of European residents’ data to countries without “adequate” data protection. The practical effect has been a series of legal frameworks — Safe Harbor, then Privacy Shield, then the EU-US Data Privacy Framework — each of which was challenged, struck down, or renegotiated in turn. The current framework, the EU-US Data Privacy Framework adopted in 2023, faces ongoing legal challenges from European privacy advocates who argue it does not provide adequate protection against US government surveillance.

The result is a regulatory environment in which your legal protections are only as strong as the weakest link in the chain of jurisdictions through which your data passes. A profile protected under GDPR in Frankfurt can be replicated and processed in a jurisdiction with no equivalent protection, and the enforcement mechanisms for preventing this are slow, fragmented, and reactive.

What This Means for the Individual

We could spend the rest of this article cataloging additional enforcement gaps — the inadequacy of COPPA for protecting children, the inconsistency of state-level privacy laws, the near-total absence of privacy regulation in most of the world. But the pattern is clear enough. The laws exist. The enforcement lags. The penalties, when imposed, are absorbed as operating costs. The burden of protecting your behavioral data falls, in practice, on you.

This is not a reason for nihilism. The laws provide a baseline — imperfect, inconsistent, but real. GDPR data access requests give you genuine visibility into what companies hold on you. CCPA opt-out and deletion rights, burdensome as they are, do halt data sales and remove data when exercised. FTC enforcement actions, modest as they are, do impose constraints. The legal floor exists, and it is higher than no floor at all.

But sovereignty means building above the floor. It means recognizing that the enforcement gap is structural — a product of the asymmetry between the resources of regulators and the resources of the companies they regulate — and that this asymmetry is unlikely to be resolved in your favor on any timeline relevant to your life. The proportional response is not to abandon the legal system. It is to use the legal tools available while simultaneously reducing your dependence on institutional protection.

You migrate your email to a provider whose business model does not depend on reading it. You choose a DNS resolver that does not log your queries. You build your content on infrastructure you control. You reduce the behavioral surplus you generate in the areas where the extraction is most valuable to the extractors and most consequential to you. You do these things not because the law has failed — though in many cases it has — but because the sovereign individual does not outsource the protection of essential interests to institutions that have demonstrated, repeatedly, that their capacity to protect those interests is limited.
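
To make one of those steps concrete: DNS can be queried over encrypted HTTPS directly, without touching the ISP’s default resolver. Here is a minimal sketch using Cloudflare’s public DoH JSON endpoint as an example; whether any particular resolver actually refrains from logging is a policy claim you must evaluate yourself, not something the protocol enforces.

```typescript
// Resolving a name over DNS-over-HTTPS (DoH) instead of the system's
// default resolver. Uses Cloudflare's public DoH JSON endpoint as an
// example; substitute any resolver whose logging policy you trust.
// Encryption hides the query from on-path observers, but the resolver
// operator still sees it; that is why the choice of resolver matters.

async function resolveOverHTTPS(name: string, type = "A"): Promise<string[]> {
  const url = `https://cloudflare-dns.com/dns-query?name=${encodeURIComponent(name)}&type=${type}`;
  const res = await fetch(url, { headers: { accept: "application/dns-json" } });
  if (!res.ok) throw new Error(`DoH query failed: HTTP ${res.status}`);
  const body = await res.json();
  // body.Answer is an array of { name, type, TTL, data } records.
  return (body.Answer ?? []).map((r: { data: string }) => r.data);
}

resolveOverHTTPS("example.com").then((addrs) => console.log(addrs));
```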

The law is a floor. You build the walls yourself.


This article is part of the Surveillance Capitalism & The Proportional Response series at SovereignCML.

Related reading: What Shoshana Zuboff Actually Said (And What She Didn’t), The Business Model Is the Problem (Not the Technology), What’s Documented vs. What’s Assumed
