Unpacking the SECURE Data Act
Tech Policy Press Podcast – April 26, 2026
Host: Justin Hendricks
Guest: Eric Null, Director, Privacy and Data Program, Center for Democracy and Technology (CDT)
Overview
This episode of the Tech Policy Press Podcast takes a deep dive into the proposed SECURE Data Act, a new federal privacy bill recently introduced by House Republicans. Justin Hendricks (Host) is joined by Eric Null (CDT) to critically examine the bill, compare it to recent legislative efforts, and discuss broader privacy trends shaped by rapid AI adoption, state-level laws, and emerging civil rights concerns.
Key Discussion Points & Insights
1. Context: The State of U.S. Federal Privacy Law
[00:13–05:17]
- Ongoing Absence of Federal Privacy Law
AI is dramatically increasing data use, yet the U.S. lacks comprehensive privacy legislation at the federal level.
- Legislative History
- ADPPA (2022): Bipartisan; advanced out of committee but never received a floor vote.
- APRA (2024): Incorporated newer thinking (especially on advertising) but never made it out of committee.
- SECURE Data Act (2026): Solely Republican-led, not bipartisan—framed as a party's privacy platform.
- Why Previous Bills Failed:
Conflicting stakeholder interests, the bills' broad scope, and the difficulty of reaching consensus all contributed: "It's much easier to stop something in Congress than it is to get it passed." (Eric Null, [04:18])
- Patchwork at the State Level
States (CA, VA, CT, etc.) are leading the way, but the growing divergence among state laws carries increasing compliance ramifications.
2. The AI-Privacy Challenge
[05:17–07:30]
- AI is a Data Issue
"AI is inherently a data issue...The general wisdom so far has been that more data is necessary to make AI better. I don't actually think that's necessarily true." (Eric Null, [05:41])
- Issues with unlimited data collection, including data overreach by tech giants (Meta cited as an example) and the diminishing returns of "feeding more unorganized data."
- The risks of poor data hygiene (e.g., AI hallucinations, inaccurate outputs) are noted.
3. The SECURE Data Act—A Step Backwards?
[07:30–12:32]
- Industry-Friendly Approach and Weakening Protections
Eric Null calls SECURE "a major step backward" ([07:48]):
- Narrower definition of sensitive data (health focus limited to actual diagnoses, excludes communications, neural data, some types of financial and biometric data).
- Omits requirements found in most state laws (e.g., impact assessments, coverage of smart TV data).
- Data minimization standard simply codifies existing business practices:
"As long as they disclose what they do in a privacy policy, they're allowed to keep doing it." ([07:48])
- Exemptions Riddle the Bill:
- Service provision exemptions could gut protections (“companies could make an argument that a lot of the data that they collect is to provide service to an individual, and therefore they’re essentially exempt from the bill” [09:06])
- Research exemption likely gives AI training total cover.
- Contract exemption could allow privacy policies themselves to nullify the bill.
On Consent & Dark Patterns
[10:20–12:32]
- Opt-in consent required for (narrowly-defined) sensitive data, but:
"There are no protections against what we call dark patterns..." ([10:48])
- Cookie pop-ups, interface manipulations, and user fatigue mean most consumers “end up being forced to do it [consent]” ([11:54])
4. Civil Rights, Discrimination, and Federal Preemption
[12:32–18:15]
- Insufficient Civil Rights Protections
Null and others (e.g., Alejandra Montoya Boyer, Leadership Conference’s Center for Civil Rights and Technology) argue the bill falls short in addressing bias and discrimination:
"Privacy rights are civil rights…we don't see the types of protections against bias and discrimination." ([12:32])
- Transparency Gaps in Automated Systems
Lack of auditing/reporting requirements means it’s difficult, if not impossible, to prove discrimination—especially in AI-adjudicated decisions.
- Preemption Threatens Strong State Laws ([16:04–18:15])
SECURE’s language could:
- Wipe out state civil rights and leading biometric privacy laws (e.g., Illinois BIPA)
- Use "relates to" preemption (broader than direct conflict or field preemption) to undercut all laws overlapping with federal provisions.
5. The Private Right of Action—Gone
[18:15–20:23]
- SECURE omits any meaningful private right of action, a key line dividing partisan approaches:
- Republicans “tend to disfavor them” ([18:27])
- Prior bipartisan bills had carved out compromise (allowing injunctive relief).
- "Our FTC [and] state AGs...are chronically under resourced...the default...is that we let people enforce their rights in court. And for some reason privacy is an exception to that rule." ([19:13])
6. State-Federal Tensions & Likelihood of Passage
[20:23–22:36]
- States With Stronger Laws Fight Back
"You definitely don't want your stronger privacy law to be preempted by a weaker federal law..." ([20:32])
- SECURE would force state AGs to enforce the law in federal court, undermining their authority.
- Political Prospects
The bill is unlikely to move quickly, especially in an election year:
“This year is particularly difficult to move something..." ([21:52])
SECURE is “so fundamentally weak” and should not be the baseline.
7. The Path Forward
[22:36–24:52]
- Null is hopeful for eventual comprehensive federal privacy law but more optimistic about states driving innovation for now.
- “Hope springs eternal...I remain hopeful that we can get something out of Congress at some point...” ([23:01])
- Ideal vision: "People just go online...and they just don't have to worry about their privacy because they know they have a strong federal privacy law protecting them." ([23:54])
- Current reality is "cookie banners and companies that do not care at all about your privacy..." ([24:45])
8. California’s BASED Act & Competition/Privacy Tradeoffs
[25:02–28:39]
- BASED Act & Forced Interoperability
- Would require large platforms to interoperate with any third party, without accompanying privacy protections.
- Risks include forced decryption of encrypted messages and reduced defense against overbroad government data requests.
- Broader International Comparison
Other jurisdictions (e.g., Japan, EU) have implemented interoperability/competition laws with stronger built-in privacy safeguards.
- "If you want expanded competition, we're all for it. We just want not to have serious unintended privacy consequences on the way." (Eric Null, [28:39])
Notable Quotes & Memorable Moments
- On the nature of federal privacy gridlock:
"It's really hard to get everyone to agree on the same vehicle...it's much easier to stop something in Congress than it is to get it passed." (Eric Null, [04:18])
- On the data minimization standard:
"That is not a meaningful change in privacy. That is basically just stating that whatever you're doing, you can continue to do...we can pretend that we actually protect your privacy when really we don't." (Eric Null, [08:35])
- On exemptions for AI training:
"There is another exemption for internal research to improve and develop new products, services, and technologies. And that, in my view, is basically the AI training data exemption." (Eric Null, [09:23])
- On forced interoperability and privacy harms:
"There are serious privacy issues with a competition bill like that, and I think they can be balanced. But in the BASED Act, we did not feel like they were properly balanced." (Eric Null, [27:39])
- On hope for meaningful privacy law:
"Hope springs eternal...I do have some hope for the states...At the federal level, it's a little harder to have hope, but I still do hope that we can come together." (Eric Null, [23:01])
Timestamps for Key Segments
- [00:13] – Introduction, context-setting on federal privacy gridlock
- [02:40] – Recap of ADPPA and APRA; why they failed
- [05:41] – AI’s impact on the privacy landscape
- [07:48] – Why SECURE represents a "step backwards"
- [09:23] – Exemptions, especially around AI/data research
- [10:48] – Data minimization, dark patterns, consent weaknesses
- [12:32] – Civil rights and discrimination gaps
- [16:04] – Preemption and state law threats
- [18:27] – The missing private right of action
- [20:32] – State-federal law tensions
- [21:52] – Political outlook, SECURE’s chances
- [23:01] – Hopes for the future of US privacy law
- [25:09] – The BASED Act and unintended privacy risks from forced interoperability
- [28:39] – Final thoughts on competition, privacy, and global comparisons
Conclusion
This episode offers a robust critique of the SECURE Data Act, highlighting its structural and substantive weaknesses, the risks it poses to existing state protections, and its place in the larger saga of U.S. privacy law. The conversation underscores the tension between industry interests, emerging technology (AI), and meaningful privacy and civil rights protections, leaving listeners informed and cautiously hopeful.