It's a Config Change

Part 4 of the Age Verification series · ← Previous · Next →


California’s Digital Age Assurance Act (DAAA) requires operating system providers to collect the user’s age at account setup, before the device is usable. Today that means a self-reported birthdate. The App Store Accountability Act (ASAA), which passed the House Energy and Commerce Committee 26-23 on March 5, requires app stores to verify age through “commercially reasonable methods.” The direction is the same in both bills: gate the infrastructure, not the platform.

We proposed a privacy-preserving verification architecture earlier in this series: a federal API that answers “is this person over 18?” with a signed boolean and zero personal data in transit. That solves the verification problem. This post asks a different question: do you need to verify age at all if the operating system can enforce access restrictions on child accounts directly?

The operating system already has a simpler tool. It does not require identity documents, biometric databases, or a federal API. It has existed since RFC 882 was published in November 1983.

It is called DNS.


How a platform reaches a child

Every app on a child’s phone resolves a domain name before it does anything. Instagram calls graph.instagram.com. TikTok calls api.tiktok.com. Before a single byte of content is served, before a single ad is rendered, before a single data point is collected, the device asks a resolver: what is the IP address for this domain?

If the resolver does not answer, the app does not work: no content, no ads, no data collection, no revenue. The platform is not blocked by a firewall or filtered by a proxy. It simply does not exist. The device asked for directions to the bar, and the road did not answer.
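The resolve-first dependency can be sketched in a few lines of Python. This is a toy model, not any real resolver's behavior: the blocklist, zone table, and addresses (documentation-range IPs) are all illustrative.

```python
# Toy model of a filtering resolver: if the name is on the blocklist,
# the resolver returns nothing, and the app never gets an address to connect to.
BLOCKLIST = {"graph.instagram.com", "api.tiktok.com"}  # illustrative entries

def resolve(hostname, zone):
    """Return an IP for hostname, or None (the equivalent of NXDOMAIN) if blocked."""
    if hostname in BLOCKLIST:
        return None  # the resolver does not answer; no IP, no connection
    return zone.get(hostname)

# A toy zone table standing in for the real DNS (TEST-NET addresses).
ZONE = {
    "example-school-app.com": "203.0.113.10",
    "graph.instagram.com": "203.0.113.99",
}

assert resolve("example-school-app.com", ZONE) == "203.0.113.10"  # road open
assert resolve("graph.instagram.com", ZONE) is None               # bar closed
```

The point of the sketch is the ordering: the connection step depends on the resolution step, so filtering the first starves the second.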

This is not theoretical. DNS filtering is a solved problem everywhere else. Enterprise networks, schools, and libraries have filtered at the resolver for decades. Hundreds of thousands of homes run Pi-hole, a web interface over dnsmasq (resolver software that has shipped on Linux since 2001), on a $35 Raspberry Pi.

On mobile, Apple’s Screen Time and Google’s Family Link already block specific domains at the OS level. Google manages millions of school Chromebooks through Workspace for Education with full admin controls, including DNS configuration. The mechanism exists on every device children actually touch: phones, tablets, and Chromebooks. The solutions are readily available. They are just not the default on child accounts.


Platform-specific, not road-specific

This is the critical distinction the DAAA misses.

The DAAA gates the device. At account setup, the operating system collects the user’s age and transmits it to every app. The ASAA gates the store: no download without parental consent. Between them, a 14-year-old’s access to every app on the phone (school apps, messaging, maps, the browser) is mediated by infrastructure the platform did not build and does not manage. The gate is on the road.

DNS blocking gates the platform. The child uses the device for everything it was designed to do: homework, communication, navigation, creativity. The domains that the parent (or the law) has designated as age-restricted simply do not resolve. The road is open. The bar is closed.

One approach requires building a national identity verification infrastructure. The other requires a list of domain names.


What Meta is actually lobbying for

We documented Meta’s record $26.29 million federal lobbying spend. The TBOTE Project has since documented the broader operation: $2 billion in dark money grants distributed through nonprofit shells across 45 states, all to shift age verification to app stores and device manufacturers. The strategic payoff is a single, unfederated gate that Meta does not manage and that reframes the question from “should my child use Instagram” to “should my child go online.”

DNS blocking breaks that strategy entirely. It is federated (each child account has its own access policy), platform-specific (it blocks Instagram without blocking the internet), free (no vendor, no SDK, no government API), and it already exists on every device children actually use.

Meta is not lobbying for device-level age verification because the technology does not exist. The technology exists. Meta is lobbying to make device-level age verification a legal obligation on Apple and Google, so that the regulatory liability lands on the device manufacturer instead of the platform. If the law required Meta to verify age on its own platforms, Meta would have to build a door. If the law requires Apple to verify age on the device, Meta gets the unfederated gate we described in Part 3.

The parental controls that already ship with every phone in the world are not the solution Meta wants. They are the solution Meta’s $2 billion lobbying operation is designed to make sure legislators never notice.


This already exists

Family-safe DNS resolvers are not a proposal. They are products you can configure today: OpenDNS FamilyShield has been running since 2010, CleanBrowsing filters age-restricted content at the resolver, and Cloudflare’s 1.1.1.3 does the same. Change one setting on a child’s device and age-restricted platforms are filtered before the first byte loads, with no app, no SDK, and no identity verification required.

The only thing missing is a default. Device manufacturers already ship the parental control frameworks (Screen Time, Family Link) and the DNS enforcement mechanisms. Connecting them so that child accounts use a family-safe resolver by default would require, at most, a minor update to the parental control framework. No new infrastructure. No biometric verification system. A config change.
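The "config change" is small enough to sketch. The resolver addresses below are the real public family-filter and standard resolvers from Cloudflare mentioned above; the account model is hypothetical, since each OS represents child accounts differently.

```python
# Sketch of the missing default: child accounts get a family-filter resolver,
# everyone else keeps the standard one. Account shape is hypothetical.
FAMILY_RESOLVERS = ["1.1.1.3", "1.0.0.3"]    # Cloudflare for Families (filters adult content)
DEFAULT_RESOLVERS = ["1.1.1.1", "1.0.0.1"]   # Cloudflare's standard resolver

def resolvers_for(account):
    """Pick the DNS resolvers a device should use, based on account type."""
    return FAMILY_RESOLVERS if account.get("is_child") else DEFAULT_RESOLVERS

assert resolvers_for({"is_child": True}) == ["1.1.1.3", "1.0.0.3"]
assert resolvers_for({"is_child": False}) == ["1.1.1.1", "1.0.0.1"]
```

One conditional, wired into the parental control framework that already knows which accounts are child accounts. That is the scale of the change being described.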

Every modern phone already has biometric authentication, child account management, and configurable DNS. These capabilities exist on the same device. The engineering work to connect them is real but bounded. It is a product decision, not a research problem. And the decision has a price tag.

Apple and Google take up to 30% commission on app store revenue. They profit from every app a child downloads, every in-app purchase, every subscription. The same companies that ship the parental control frameworks, that control the DNS configuration, that manage the child accounts, also collect a cut every time a child opens Instagram, subscribes to TikTok Live, or buys Robux. They have every technical capability to enforce child safety at the OS level. Enabling it by default would reduce the number of apps children use and the revenue those apps generate. The incentives are misaligned by design.

Meta profits from children on its platforms. Apple and Google profit from children using those platforms through their stores. Nobody in this system is incentivized to build the default that would protect children, because every party benefits from the status quo.

More than twenty-five states have passed age verification laws, and Meta has spent $2 billion lobbying to redirect them. The App Store Accountability Act is heading to the House floor while Utah’s version takes effect May 6. The solution is a DNS configuration that device manufacturers could ship as the default on child accounts by tweaking settings in the free software they already use.


DNS is the first layer, not the last

DNS blocking is not a complete solution on its own. An app could hardcode IP addresses, route through CDN redirects, or serve content from domains not on the blocklist. A child could use a VPN, change their resolver, or open a browser. These are real gaps. Every one of them is addressable at the OS level with tools that already exist.

App-level controls. The OS already knows what apps are installed. Screen Time and Family Link already restrict which apps a child account can run. If Instagram is not in the approved list, it does not launch. No DNS required.
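The allowlist check is as simple as it sounds. A minimal sketch, with hypothetical app names and a hypothetical account flag:

```python
# Sketch of an app allowlist for a child account (names hypothetical).
APPROVED = {"Messages", "Maps", "SchoolPortal"}

def can_launch(app_name, is_child_account):
    """Adults launch anything; child accounts only run approved apps."""
    if not is_child_account:
        return True
    return app_name in APPROVED

assert can_launch("Instagram", is_child_account=True) is False
assert can_launch("SchoolPortal", is_child_account=True) is True
assert can_launch("Instagram", is_child_account=False) is True
```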

Network filtering. Meta, TikTok, and Snapchat operate on well-documented IP ranges (registered ASNs). The OS can block outbound connections to those ranges at the network layer, catching any app that attempts to bypass DNS with hardcoded addresses.
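A range check at the network layer is a containment test, which Python's standard `ipaddress` module expresses directly. The prefixes below are documentation/TEST-NET ranges standing in for real ASN announcements, not Meta's actual address space:

```python
import ipaddress

# Illustrative blocked prefixes (RFC 5737 documentation ranges, not real ASNs).
BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def outbound_allowed(dest_ip):
    """Allow an outbound connection only if the destination is outside
    every blocked prefix, catching apps that hardcode IPs to skip DNS."""
    addr = ipaddress.ip_address(dest_ip)
    return not any(addr in net for net in BLOCKED_RANGES)

assert outbound_allowed("203.0.113.7") is False  # hardcoded IP in a blocked range
assert outbound_allowed("192.0.2.1") is True
```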

TLS/SNI enforcement. Modern HTTPS requires Server Name Indication (SNI): the client sends the hostname during the TLS handshake so the server knows which certificate to present. Hit a raw IP without a valid hostname and the connection fails with a certificate mismatch. Instagram, Facebook, and WhatsApp all rely on SNI. A child typing an IP address into a browser gets an error page, not a feed. On a child account, the OS can require SNI on all outbound HTTPS connections, refuse handshakes to restricted hostnames regardless of how the IP was resolved, and prevent apps from disabling certificate validation. The bypass surface shrinks to unencrypted HTTP, which the OS can block entirely.
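The SNI policy described above reduces to two rules: no handshake without a hostname, and no handshake to a restricted hostname or its subdomains. A minimal sketch of that decision, with an illustrative restricted list:

```python
# Sketch of an SNI policy for a child-account network stack (list illustrative).
RESTRICTED = {"instagram.com", "facebook.com", "whatsapp.com"}

def handshake_allowed(sni_hostname):
    """Refuse TLS handshakes that carry no SNI, or whose SNI names a
    restricted domain or subdomain, regardless of which IP was dialed."""
    if sni_hostname is None:
        return False  # require SNI on every outbound HTTPS connection
    host = sni_hostname.lower().rstrip(".")
    return not any(host == d or host.endswith("." + d) for d in RESTRICTED)

assert handshake_allowed(None) is False                  # raw-IP bypass attempt
assert handshake_allowed("graph.instagram.com") is False # subdomain still caught
assert handshake_allowed("en.wikipedia.org") is True
```

Because the check runs on the handshake rather than on the DNS answer, it holds even when the IP address came from somewhere other than the filtered resolver.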

Browser restrictions. Safari and Chrome both support managed configurations that restrict which domains the browser can reach. Screen Time and Family Link already offer this. A child navigating to instagram.com in a browser hits the same filter as the app.

VPN lockdown. Both iOS and Android support managed profiles that prevent the user from installing or configuring a VPN. The child cannot tunnel around the controls if the OS does not allow VPN configuration on the account.

Encrypted Client Hello (ECH) is worth addressing since it will hide SNI during the TLS handshake, which would defeat SNI-based filtering. But ECH requires a DNS lookup to fetch its configuration before the handshake begins. If DNS is filtered, ECH never bootstraps. The layers reinforce each other.
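The bootstrap dependency can be made concrete with a toy model: the ECH configuration lives in a DNS HTTPS record, so a filtered lookup means the client falls back to a visible SNI, which the previous layer then catches. Hostnames and record contents below are invented for illustration:

```python
# Toy model: ECH config is fetched from DNS before the handshake,
# so a filtering resolver prevents ECH from ever bootstrapping.
HTTPS_RECORDS = {"social.example": b"toy-ech-config"}  # invented record
BLOCKED = {"social.example"}

def fetch_ech_config(hostname):
    """Return the ECH config from the DNS HTTPS record, or None if the
    filtering resolver refuses the lookup."""
    if hostname in BLOCKED:
        return None  # lookup filtered: client must send plaintext SNI
    return HTTPS_RECORDS.get(hostname)

# No ECH config for the blocked host, so its SNI stays visible
# and the SNI filter from the previous layer applies.
assert fetch_ech_config("social.example") is None
```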

None of these are new technologies. They are OS-level enforcement mechanisms built on free software that device manufacturers already ship. The software does not need to change. The defaults do. No single mechanism is unbreakable, but the architecture works because it is orchestrated: each bypass requires defeating the next layer, and every layer is controlled by the same operating system.

The enforcement is self-reinforcing. The features that make these devices appealing to children (touch screens, app ecosystems, biometric login) are the same features that enable the controls.

This is where regulation matters. The entities that put the device and the content in a child’s lap are the ones who should be required to enable these protections by default. Legislation should target the manufacturers and platforms that profit from the status quo, not the underlying free software that already provides the flexibility to solve the problem. Regulate the deployment, not the code.


The question no one is asking

The entire age verification debate accepts a premise no one has challenged: that the platform’s content is inherently unsafe, and the only solution is to keep children away from it.

A grocery store does not check your age at the door. It checks your age when you buy beer. The store is safe by default. The age-restricted products are gated at the point of sale, not at the entrance.

Platforms could work the same way. Filter explicit content, algorithmic amplification of harmful material, and targeted advertising for minors by default. Not as an option. Not as a parental control. As the default state of the product. If the feed is safe for children by default, age verification shrinks from a national infrastructure problem to a narrow policy at the edges.

The age verification debate has produced credential wallets, biometric pipelines, identity provider networks, and attestation APIs. We proposed our own: a federal hash-and-boolean architecture. DNS blocking is simpler than all of them. But the simplest solution of all is to require the platform to stop serving content that harms children. Meta would rather spend $2 billion reshaping legislation than change what it serves.

It is a config change. It ships with every phone. The question is whether legislators will notice before Meta’s lobbyists write another bill that makes sure they never have to.




For the investigative record behind Meta’s lobbying operation, see the TBOTE Project. For the technical architecture of a privacy-preserving federal age verification API, see Age Verification Is a Boolean. For why Meta’s products are platforms, not apps, see You Don’t Ban Kids from the Road.