The Monopoly Report – Episode 65: The Problem with Age Verification
Host: Alan Chapell
Guest: Professor Jess Meyers (University of Akron School of Law)
Date: February 11, 2026
Episode Overview
This episode delves into the growing trend of mandatory age verification on digital platforms, especially as governments seek to "protect kids" from perceived harms of social media. Host Alan Chapell and guest Professor Jess Meyers, an expert on online intermediary liability and technology law, dissect policies surfacing in Australia, Europe, the UK, and the U.S. They discuss why age verification is fast becoming the default global response and why that solution raises complex legal, privacy, technical, and societal challenges, often failing to address root harms while creating new issues for adults, children, platforms, and advertisers alike.
Key Discussion Points & Insights
Jess Meyers’ Background and Perspective
- From policy roles at Google and the Chamber of Progress to academia.
- Experience advocating in legislative settings and now operating with greater freedom as a law professor.
- Highlights the complexities (and pitfalls) of speaking candidly in public policy vs. corporate roles.
- Quote:
“I’m, you know, kind of free to say what I want now. ...You’re going to see a lot more, my personality sort of shine through, I guess.” (03:00)
Why Age Verification Is Trending
- Governments globally are seeking quick fixes to multifaceted “kids and social media” concerns.
- The “protect the kids” narrative often carries enormous political appeal but oversimplifies complex problems.
- Chapell and Meyers agree:
- Children’s brains and needs are distinct from those of adults.
- Tech platforms have real influence on minors (sometimes unsupervised by parents).
- Parental supervision is hard to sustain, despite best intentions.
- Quote (Chapell):
“It’s really, really hard to be standing over them all the time.” (08:17)
The Nuance of “Harm”
- Evidence linking social media use directly to youth harm (e.g., suicide, depression, addiction) is complex and not always causal.
- “There isn’t one singular cause. ...I want to be careful about drawing a direct causal link because it’s a more complicated question.” (09:20 – Meyers)
- Many factors influence youth mental health: individual struggles, community resources, broader environment.
Global Age Verification Policies
Australia (12:19–15:05)
- Australia became the first major economy to ban social media platforms outright for kids under 16.
- Prompted mass account removals and noticeable backlash from minors who lost online community spaces.
- Early evidence suggests:
- Kids circumvent bans using VPNs.
- Some adults locked out due to false positives from biometric age checks.
- Quote:
“You're seeing sort of this mass removal of people, minors and I would argue also adults perhaps on accident from services, from communication services that they once had access to...” (14:19 – Meyers)
Europe and UK (15:26–17:42)
- Focus on “age assurance” rather than outright bans:
- Restricting features (e.g., DMs, content feeds) for suspected minors.
- Efforts to segregate minors from adults in online spaces.
- Guidelines unclear, leading many platforms to err on the side of de facto bans.
- Companies like Spotify and Microsoft introducing age gates broadly, even where not strictly required.
- Quote:
“It’s more of a guideline structure... that is more translating into, in my opinion, more like a ban...” (16:51 – Meyers)
United States (17:50–23:35)
- Stymied by First Amendment/free speech concerns.
- Two waves of state laws:
- Early attempts required platforms to verify age and bar minors from certain features—largely struck down as unconstitutional.
- Newer, indirect regulations mirror the Europe/UK approach: they prohibit features "targeted at minors," but still prompt platforms to verify age and err on the side of exclusion.
- No national uniformity—laws differ state by state, making compliance challenging.
- The cumulative effect: platforms move toward broad exclusion of minors to minimize litigation risk.
- “It’s probably just easier to ban [minors] all.” (18:08 – Meyers)
The Problems With Age Verification (23:35–29:32)
1. Chilling Effect on Anonymous Speech
- Age verification undermines privacy and anonymity—affecting not just minors but all users.
- The only reliable verification methods are government IDs or biometric scans.
- Quote:
“Anonymous speech is incredibly important, especially online... What age verification effectively requires is that for you to be able to continue to engage online... your identity is going to be inherently tied to that use or that expression.” (24:34 – Meyers)
2. Expanded Honeypots for Hackers
- Companies must collect—and retain—massive troves of sensitive data (IDs, biometrics)—prime targets for data breaches.
- Past breaches (e.g., the Tea dating app) have resulted in real-world harms.
- Quote:
“You know, we're sort of left to ask ourselves, like, how much do we trust these companies that we have for so long said are terrible about our information, especially in a country where we don't have a federal privacy law in place...” (26:46 – Meyers)
- No guarantee user data is purged upon turning 18; corporate data retention practices are inconsistent.
3. Parental Complexity & Unintended Collection
- Parents must turn over their own IDs/biometrics to verify children, compounding privacy intrusions across entire households.
4. Technical and Societal Edge Cases
- Biometric/behavioral “soft” verification solutions misclassify adults as minors, and vice versa.
- Potential to disenfranchise adults participating in “youth-coded” spaces (e.g., fandoms) and to out LGBTQ+ minors or those with sensitive online identities.
Alternative Approaches & Core Dilemmas
“Lighter” Forms of Age Assurance (29:32–33:27)
- Behavioral/AI approaches (like TikTok’s “safety net”) create false positives/negatives and can't stand up to the US litigation climate.
- Risk-averse platforms will oversuppress, locking out many legitimate users.
Value of Online Community for Youth
- Blanket bans harm vulnerable children—community access is a key protective factor against mental health crises.
- “Access to community is one of the strongest protective factors for youth. And so removing their access to community is probably one of the worst things that you can do...” (34:09 – Meyers)
Meyers’ Main Prescription
- Address root causes:
- 1) Educate parents on platform tools and controls (many underused).
- 2) Teach kids digital/media literacy—how to navigate and critically process online information.
- Policy solutions must extend beyond simply keeping kids offline.
- Main obstacle: Education and community support cost money—and there is little political will to fund such programs.
Implications for the Ad Tech and Media Industry (37:25–42:56)
- Historical clarity under COPPA (for more than 20 years): no targeting of users under 13, with "directed to children" as a clear bright line.
- Emerging state laws threaten to raise the age bar to 16 or 18 and shift legal standards to “knew or should have known” and potentially “harmful to any minor,” making compliance nebulous.
- Advertisers’ Dilemma:
- Unclear liability: Is it the platform, the advertiser, the app store, or all three responsible for compliance?
- If platforms exclude minors, advertisers get certainty but at the cost of a smaller, older addressable audience.
- Fragmented US regulation could mean 50 different regimes, leading to a “lowest common denominator” for ad targeting.
- Other Sector Examples:
- Smart TVs add complexity: Multiple viewers, household-level IDs—raises challenges for targeting and liability.
- Advertisers risk overcorrecting, as seen by prior mass pauses on platforms like Twitter (now X).
- Quote (Chapell):
“I would love to see the list not of the advertisers who left, but for the percentage of the advertisers who left who came back within... three months.” (40:49)
- Quote (Meyers):
“I would not want to be an advertiser today.” (42:22)
Notable Quotes
- On oversimplifying the issue:
“Children are very different from adults in a lot of really important ways.” (07:33, Meyers)
- On effectiveness and risk of age verification:
“We’re telling the technology companies to consume more information about children... They’re going to have sort of these honey pots of kids...and the reality is that that information is a massive target for hackers.” (25:54, Meyers)
- On the importance of online community for youth:
“Access to community is one of the strongest protective factors for youth.” (34:09, Meyers)
- On the regulatory and commercial quandary:
“You're going to see this sort of anxiety up and down the stack of who is the gatekeeper here, who is legally liable, who is actually in charge of this?” (39:26, Meyers)
Timestamps for Key Segments
| Timestamp | Segment Description |
|--------------|------------------------------------------------------------------|
| 03:00 | Jess Meyers discusses corporate v. academic freedom |
| 07:33-11:46 | Why age verification is being pursued; difference between kids/adults, harm nuance |
| 12:19-15:05 | Australia’s blanket ban and early reactions |
| 15:26-17:42 | Europe & UK: “Age assurance,” industry responses |
| 17:50-23:35 | The U.S.: First Amendment, legal whack-a-mole, regulatory patchwork |
| 23:35-29:32 | Technical and legal pitfalls of age verification |
| 29:32-33:27 | Problems with “soft” assurance; LGBTQ+/anonymity concerns |
| 33:50-37:07 | Solutions: digital literacy, parental/child education barriers |
| 37:25-42:56 | Ad tech implications and fragmentation of compliance responsibilities |
Memorable Moments
- Meyers recounts the story, which went viral, of being publicly criticized during California legislative testimony, leading to her pivot into academia (03:00).
- Highlighting the paradox of forcing tech platforms to collect more sensitive data in the name of “child safety,” despite widespread skepticism of their data stewardship (25:54).
- Discussion of the existential risk to advertisers as the regulatory regime shifts from COPPA’s bright lines to a confused, patchwork reality (40:47).
Final Thoughts
- Age verification is becoming, in practice, the default response to “kids online” concerns—but at great legal, operational, and privacy cost.
- The harms ascribed to social media are real but deeply entangled with larger societal issues.
- Blanket age bans and strict verification risk stifling anonymous expression and shutting out youth from key support communities.
- There’s a pressing need for government investment in education and smart, nuanced policy—but political and financial obstacles persist.
- For the ad and media world, these regulatory shifts portend a period of uncertainty, fragmentation, and possibly overcorrection.
Find Professor Jess Meyers at controlaltdescent.com, BlueSky, and other non-X social platforms.
