Surveillance Capitalism and Children: What Parents Actually Need to Know

Children generate behavioral surplus before they can spell the word. A six-year-old watching YouTube Kids is producing watch history, engagement data, content preference signals, and session duration metrics — all of it fed into prediction models that will follow that child into adulthood. This is not speculation. It is the documented business model that Shoshana Zuboff described in The Age of Surveillance Capitalism, applied to the demographic least equipped to understand it and least protected by the laws that nominally exist to shield them.

The parental instinct here runs hot, and understandably so. But this is a space where proportional response matters more than usual, because the alternative — raising digitally avoidant children in a digitally saturated world — creates its own vulnerabilities. The goal is not to build a Faraday cage around your household. The goal is to raise humans who understand that their attention and their data are resources, and who learn to steward those resources deliberately. That is sovereignty made generational.

What COPPA Actually Covers

The Children’s Online Privacy Protection Act is the primary federal law governing data collection on minors in the United States, and it is narrower than most parents assume. COPPA applies to children under thirteen. It requires websites and online services directed at children — or services that knowingly collect data from children under thirteen — to obtain verifiable parental consent before collecting personal information. On paper, this sounds like meaningful protection. In practice, the enforcement is sparse and the workarounds are trivial.

The Federal Trade Commission enforces COPPA, and the FTC has limited budget, limited technical capacity, and a complaint-driven enforcement model that places the burden on parents to identify violations and file reports. The most prominent COPPA enforcement action in recent memory was the 2019 settlement with Google and YouTube, which resulted in a $170 million fine for collecting children’s data without parental consent. Google paid the fine and continued operating YouTube Kids with what critics described as minimal structural changes. The fine represented a fraction of a single quarter’s advertising revenue. As a deterrent, it was theater.

More importantly, COPPA’s age threshold creates a cliff. A twelve-year-old is protected. A thirteen-year-old is, for legal purposes, exposed to the full apparatus of adult surveillance capitalism. There is no graduated protection, no acknowledgment that a thirteen-year-old’s capacity to understand behavioral surplus extraction is meaningfully different from a twenty-five-year-old’s. The law treats the thirteenth birthday as a switch that flips from “protected minor” to “consenting participant.” The behavioral futures market does not recognize that distinction.

What Schools Share

The data exposure that parents can see — social media, YouTube, games — is only part of the picture. The exposure they often cannot see runs through the school system. Educational technology platforms have become standard infrastructure in American schools, and those platforms collect behavioral data on students as a condition of participation.

Google Classroom, Canvas, Seesaw, and similar platforms log student activity: what they access, when, for how long, how they interact with assignments, what they search for within the platform. The data governance of this information varies wildly by school district. Some districts have robust data-sharing agreements that restrict how ed-tech companies can use student data. Others sign vendor agreements with minimal scrutiny, granting broad permissions that most parents never see and would not understand if they did.

The practical reality is that if your child uses a school-issued Chromebook — and in many districts, they must — their behavioral data flows through Google’s infrastructure. Google’s student data policies are more restrictive than their consumer policies, but the data still exists, the infrastructure still processes it, and the long-term retention and use of that data are governed by policies that change at Google’s discretion, not yours. Opting your child out of school-issued technology, where that option even exists, often means opting them out of the educational experience itself. This is the bind.

The Teenager Problem

Between thirteen and seventeen, young people occupy a legal gray zone that surveillance capitalism exploits with precision. They are old enough to create accounts without parental consent on most platforms. They are young enough that their behavioral data is especially valuable — a teenager’s content preferences, social connections, purchasing inclinations, and attention patterns, tracked over years, produce prediction models of extraordinary commercial value. The behavioral surplus generated by a teenager today will inform advertising targeting models that follow them for decades.

Social media age verification is effectively non-existent as of early 2026. Platforms require users to enter a birthdate, and no mainstream platform employs verification technology that could meaningfully prevent a twelve-year-old from claiming to be sixteen. This is not an oversight. Platforms benefit from younger users’ long-term data accumulation. A user who joins at thirteen and stays through thirty has generated seventeen years of behavioral surplus — a dataset of immense predictive value. The economic incentive to verify age rigorously does not exist.

Several states have passed or proposed laws requiring age verification for social media access. The implementation challenges are significant: effective age verification requires either government ID submission (a privacy problem of its own) or biometric estimation technology (unproven at scale and raising separate surveillance concerns). The solutions, in other words, may create new extraction vectors while attempting to close existing ones.

What Parents Can Actually Do

The proportional parental response operates on two tracks: technical controls and cultural education. Neither alone is sufficient. Technical controls without education produce children who learn to circumvent restrictions but not to think about why those restrictions exist. Education without any technical support asks children to exercise judgment in an environment specifically engineered to overwhelm judgment.

On the technical track, the highest-impact intervention is device-level DNS filtering. Configuring your home network to use a privacy-respecting, ad-filtering DNS resolver — NextDNS and similar services allow for customizable filtering — reduces tracking and inappropriate content exposure across every device in the household without requiring per-device configuration. This is layer-one infrastructure, not surveillance of your children. You are filtering the network, not monitoring their activity.
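The decision logic inside a filtering resolver is simple enough to sketch. The blocklist entries and domain names below are invented for illustration, and this is a conceptual sketch only; services like NextDNS maintain curated lists and answer blocked queries at the DNS protocol level, not in application code like this.

```python
# Conceptual sketch of how a filtering DNS resolver decides whether to
# answer a query. The blocklist and all domain names are hypothetical.

BLOCKLIST = {
    "tracker.example",   # stand-in for an analytics domain
    "ads.example",       # stand-in for an ad-serving domain
}

def is_blocked(hostname: str) -> bool:
    """Return True if the hostname or any parent domain is blocklisted."""
    labels = hostname.lower().rstrip(".").split(".")
    # Check every suffix, so "pixel.tracker.example" matches "tracker.example".
    for i in range(len(labels)):
        if ".".join(labels[i:]) in BLOCKLIST:
            return True
    return False

def resolve(hostname: str):
    # A filtering resolver refuses to answer for blocked names and
    # forwards everything else upstream; the upstream step is stubbed here.
    if is_blocked(hostname):
        return None        # query dropped for every device on the network
    return "upstream"      # placeholder for a normal DNS answer

print(resolve("pixel.tracker.example"))  # blocked
print(resolve("school.example"))         # resolved normally
```

Because the filtering happens at the resolver, the block applies to every phone, tablet, and laptop on the home network at once, which is what makes this a one-time infrastructure change rather than a per-device chore.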

Delayed social media access is the second high-impact intervention. The specific age at which you allow social media access is a family decision, but the principle is straightforward: every year of delay is a year during which your child is not generating behavioral surplus for prediction markets and not training their attention patterns on algorithmically curated feeds. This is not about moral panic. It is about the documented mechanics of the attention economy — the same mechanics we covered earlier in this series.

Beyond technical controls, the most durable intervention is conversation. Children who understand that their attention is a resource — that companies compete for it, that the competition is designed to be invisible, that the products they enjoy are often funded by selling predictions about their behavior — are better equipped to navigate the digital landscape than children who have been shielded from it without explanation. You are not trying to frighten them. You are trying to make the invisible visible.

What Parents Cannot Do

Honesty requires acknowledging the limits. If your child uses any school-issued device, some data collection is unavoidable. If your child has friends with smartphones — and by middle school, most do — they will encounter social media and messaging platforms on other people’s devices, beyond your technical controls. If your child participates in any organized activity that uses digital communication or scheduling tools, some behavioral data will be generated. Total prevention is not a realistic goal. It is not even a desirable one, because the skills your child needs are navigation skills, not avoidance skills.

The parent who tries to prevent all data collection faces the same trap we described in the previous article on paranoia: the security apparatus becomes the project, consuming the energy and attention that should be directed toward actually raising the child. Your child needs to learn to swim in this water, not to live in a house without plumbing.

The Generational Sovereignty Position

Sovereignty is generational. The deliberate practices you model — the way you manage your own attention, the choices you make about platforms, the conversations you have about why you use Signal instead of the default messaging app, why the family DNS is configured a certain way, why you read books instead of scrolling before bed — these are the most powerful transmission mechanism available to you. Children learn sovereignty the way they learn everything else: by watching someone practice it.

The proportional stance with children is the same proportional stance we advocate everywhere in this series, adjusted for the fact that you are responsible for someone who cannot yet calibrate their own response. Protect where protection is practical. Educate where education is possible. Model the deliberate digital life you want them to eventually build for themselves. And accept that you are raising a sovereign person, not a perfectly shielded one — which means they will eventually make their own choices about what to share, with whom, and why.

That is not a failure of your parenting. That is the point of it.


This article is part of the Surveillance Capitalism & The Proportional Response series at SovereignCML.

Related reading: The Attention Economy: How Your Focus Became a Commodity, The Paranoia Trap: When Privacy Becomes Paralysis, Your Surveillance Capitalism Response Plan