TFTC Podcast #589: AI That Won't Share Your Data, with OpenSecret
Host: Marty Bent
Guests: Mark & Tony (OpenSecret)
Date: February 24, 2025
Episode Overview
This episode centers on the intersection of Bitcoin, AI, and privacy. Marty Bent speaks with Mark and Tony from OpenSecret, a startup aiming to fundamentally transform how user data is stored and used in applications, particularly AI-powered ones. The discussion spans from Bitcoin privacy principles to the technical details of secure enclaves and the philosophy behind giving users real control over their data in the cloud. The episode also explores the journey from the Mutiny Wallet to OpenSecret, the rise of AI agent workflows, and broader implications for jobs, user data, and developer practices in the AI era.
Key Discussion Points & Insights
1. Bitcoin, Fiat Collapse, and Privacy Optimism
- Bitcoin as a Safe Haven: The episode opens with the panel discussing global monetary instability and Bitcoin's position. Marty, Mark, and Tony frame Bitcoin as "the victor" amid fiat chaos.
- “Central bankers are tripping over themselves to devalue their currency, Bitcoin wins.” — Tony (00:22)
- White Pilling and Future Outlook: Rather than succumb to doom and gloom, the guests advocate for a positive outlook. They argue that personal agency, a focus on self-improvement, and privacy-enabling technologies are vital for societal optimism.
- “If it’s all doom and gloom, no privacy in the future, why do you want to bring kids into this world?” — Mark (01:31)
2. AI Acceleration and Developer Workflows
- Adoption of AI in Daily Coding: Tony details his transition from distrust to full integration of AI in coding, using tools like Cursor, Claude, Perplexity, and o1 Pro to build, analyze, and review code.
- “I've gotten to the point now where... probably AI is like 90% of the code that's written now... you’re not just going to write a sentence and get what you want. Prompt engineering is 100% a real thing.” — Tony (09:54)
- Prompt Engineering: Diving deep, Tony and Mark share tips for high-quality AI prompting, including context structuring and meta-prompting (using one AI to write prompts for another).
- “Sometimes I will get Cursor to help me write my prompt that I’ll give to the reasoning model... It’s called meta prompting.” — Tony (20:29)
- AI as an Executive Assistant: Mark uses voice-to-text for daily task prioritization, leveraging AI for personal productivity while ensuring privacy via encrypted tools like Maple.
- “The thing that I love about doing it in Maple...I can get very personal. I can say anything I want to and know that it's private and encrypted to the GPU.” — Mark (23:31)
3. The OpenSecret Story: From Mutiny to Encrypted Cloud
- Mutiny Wallet Lessons: The guys recount building Mutiny, a privacy-focused Bitcoin/Lightning wallet, and the insurmountable UX challenges, especially around Lightning liquidity and user state management.
- “We hit our cap of being able to reach people, being able to have something solid that actually works 99.99% of the time, just like unmaintainable.” — Tony (32:19)
- Pivoting to OpenSecret: They leveraged the cryptographic primitives built for Mutiny (encrypted sync, seamless logins, multi-device support) and generalized them into a cloud data-privacy layer any app can use.
- “We just weren’t able to use anything as Mutiny devs... This is not accessible for just an everyday startup to start using a more secure way.” — Tony (60:19)
4. Secure Enclaves: Technical Deep Dive
- What are Secure Enclaves?
- Hardware-based isolated environments (on cloud or client devices) that prevent unauthorized access or tampering, even by server operators.
- Enclaves enable per-user encryption: data is encrypted in transit and at rest, only decrypted in verified, attested runtime environments.
- “You shove a bunch of code in [the enclave] and then you lock it and you can look inside the box and see the code... but all the data that passes through, you can’t see.” — Mark (86:28)
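The per-user encryption model described above can be sketched as a toy envelope scheme: each user's data is encrypted under its own data key, and that key is wrapped by a root key that, in the real system, would exist only inside the attested enclave. All names here are illustrative, not OpenSecret's actual API, and the HMAC-based stream cipher is a stand-in for real AEAD (e.g. AES-GCM), not production crypto.

```python
import hashlib
import hmac
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy CTR-style cipher: HMAC-SHA256 as a PRF generates a keystream.
    Stands in for a real AEAD cipher; do not use in production."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

class EnclaveKeyService:
    """Simulates key material that never leaves the enclave."""
    def __init__(self):
        self._root_key = secrets.token_bytes(32)  # exists only inside the enclave

    def wrap(self, data_key: bytes, nonce: bytes) -> bytes:
        return keystream_xor(self._root_key, nonce, data_key)

    def unwrap(self, wrapped: bytes, nonce: bytes) -> bytes:
        return keystream_xor(self._root_key, nonce, wrapped)

# Per-user flow: the server stores only ciphertext plus a wrapped key.
enclave = EnclaveKeyService()
data_key = secrets.token_bytes(32)  # fresh key for this user
nonce = secrets.token_bytes(16)
stored = {
    "ciphertext": keystream_xor(data_key, nonce, b"user journal entry"),
    "wrapped_key": enclave.wrap(data_key, nonce),
    "nonce": nonce,
}

# Decryption requires asking the enclave to unwrap the data key first.
plain = keystream_xor(
    enclave.unwrap(stored["wrapped_key"], stored["nonce"]),
    stored["nonce"],
    stored["ciphertext"],
)
assert plain == b"user journal entry"
```

The point of the design is that the server-side database holds nothing readable: without the enclave's root key, both the ciphertext and the wrapped data key are opaque, which matches the "encrypted in transit and at rest, decrypted only inside the enclave" model above.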
- Verification and Transparency:
- OpenSecret provides a “verified badge” for users to cryptographically check what code is running in the enclave, with links to open-source code.
- “You don’t have to be a software engineer to do that verification process...You can give it to ChatGPT o1 Pro and say, hey, did they insert a backdoor in this latest change?... It will run through all the code and can tell you...” — Mark (39:22)
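The "verified badge" flow described above can be sketched as a measurement check: the enclave reports a hash of the code image it is running, and anyone can rebuild the published open-source code and compare hashes. The function and field names below are hypothetical; real platforms (e.g. AWS Nitro Enclaves) additionally sign the attestation document with a platform key, a step this sketch omits.

```python
import hashlib

def measure(image: bytes) -> str:
    """Hash the enclave image, analogous to a platform measurement register."""
    return hashlib.sha384(image).hexdigest()

def verify_enclave(attested_measurement: str, open_source_build: bytes) -> bool:
    """Rebuild the published code and check it matches what the enclave reports.
    A real verifier would also validate the platform's signature over the
    attestation document; that is omitted here."""
    return attested_measurement == measure(open_source_build)

# The enclave reports its measurement; a user (or an AI reviewing the code)
# reproduces the build from the open-source repo and compares.
published_build = b"reproducible enclave image bytes"
attestation = {"measurement": measure(published_build)}

assert verify_enclave(attestation["measurement"], published_build)
assert not verify_enclave(attestation["measurement"], published_build + b"backdoor")
```

Any change to the code, such as an inserted backdoor, changes the measurement, so a mismatch is immediately visible to anyone who reproduces the build.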
- Cloud Provider & Developer Workflow:
- AWS, Google Cloud, and Nvidia GPUs offer enclave support. Thanks to OpenSecret, these tools are no longer limited to Apple-scale enterprises; any developer can use them.
- “The reason we built out Maple...as we were talking to more and more developers, developers naturally are very keen on AI...if we had a private AI—[that’s] one of the apps we could build on top...” — Tony (75:02)
5. Impact & Implications for App Developers
- Changing the Developer-User Relationship:
- In the legacy cloud, backend developers (and third parties) have “God mode”: they can impersonate users and access or sell user data. OpenSecret’s system makes this cryptographically impossible.
- “There’s no way to have an impersonation mode...it’s all client-to-server, end-to-end encrypted.” — Tony (46:40)
- Explicit user consent becomes the only way to access or share data, reshaping advertising/business models.
- “Maybe companies can still monetize user data, but they just have to have the user participate in the act of sharing.” — Marty (58:18)
- Liability & Regulation:
- Data breaches are a massive liability for companies. Securing user data by default, and being able to say “I don’t have access” is a major incentive, especially for sensitive categories (finance, health, journals).
- “I don’t want a knock on my door at 3 a.m. from some government agency saying... ‘hand it over.’ Well, I don’t have the keys.” — Mark (73:07)
- Accessible Disaster Recovery:
- Developers can export encrypted user data and the open-source code at any time; migration is always possible, so they are never locked into a single vendor.
6. AI, Privacy, and Emerging Standards
- Why Secure AI Matters:
- Most “private AI” offerings are merely proxies; they do not prevent back-end providers or GPU operators from accessing user data.
- OpenSecret plus enclave-hosted LLMs (like Llama 3) ensure the full request chain (user to GPU) is cryptographically private and verifiable.
- “We have introduced a new category of AI. Now it has the power of the cloud, it has the privacy of your home laptop, and you can verify it cryptographically that we are not logging.” — Mark (71:20)
- Potential for Industry Shift:
- Mark draws parallels to HTTP to HTTPS adoption: secure enclaves should become the default, especially as risks of plain-text data in the cloud become clear.
- “It feels like we’re at the beginning stages where it should be recognized this should be the standard way to do data collection.” — Marty (59:12)
- User Experience and Onboarding:
- End users can choose among sign-in methods (Google, email, passkey) that gate enclave access, and can optionally export their cryptographic recovery phrase; because the export isn’t forced by default, onboarding friction stays low.
7. Future Visions—User-Owned Data, Interoperability
- User Profiles Across Apps:
- OpenSecret could allow users to create a data profile used (with explicit consent) across apps—user-centric interoperability, but with strict privacy.
- “If enough developers adopt [OpenSecret], you could have a similar [multi-app] experience but have assurance only you have access.” — Marty (83:53)
- Standardization & Developer Excitement:
- Maple AI, a privacy-first ChatGPT alternative, serves as a proof-of-concept. Developers are excited by both the technical accessibility and privacy guarantees, lighting the path toward broad adoption.
- “Maple has just been this awesome tool to hand them…light bulb moment.” — Mark (88:04)
Notable Quotes & Memorable Moments
- “If you’re not paying attention, you probably should be.” — Marty (00:31)
- “The joke is prompt engineering as a job role, but… it’s 100% a real thing.” — Tony (09:54)
- “I can get very personal… and know that it’s private and encrypted to the GPU.” — Mark (23:31)
- “We’re not going to win by selling users on privacy… We’re going to win by just making the… easiest to use product.” — Mark (36:13)
- “You can verify it cryptographically that we are not logging, we’re not keeping track of anything…It opens up a whole new world.” — Mark (71:20)
- “It feels like a paradox, but a more open Internet needs to have strong privacy.” — Mark (92:56)
Important Timestamps
- AI’s impact on coding/professional workflows – (02:38–10:47)
- Prompt engineering and meta-prompting – (16:48–21:31)
- Maple AI as a practical example of encrypted AI – (23:31, 75:02–80:44)
- The pivot from Mutiny to OpenSecret – (30:41–36:13)
- Technical deep dive: Secure enclaves & verification – (38:35–42:05, 47:56–52:17, 86:28)
- Implications for end users, developers, and business models – (54:13–58:18, 80:44–83:33)
- Looking ahead: User-controlled profiles, data, advertising – (83:53–87:47)
- Developer reaction to Maple AI launch – (88:04–89:23)
- Closing thoughts on OpenSecret’s mission and vision – (92:56–93:07)
Closing
Marty wraps up affirming that OpenSecret's approach could be pivotal in reshaping not only app development but broader societal expectations around privacy, fiduciary responsibility, and the integrity of personal data. He urges listeners—especially developers and privacy advocates—to check out Maple AI and OpenSecret as blueprints for the future of secure, user-centric technology.
Learn more & try it:
Follow the team on X (formerly Twitter), Nostr, and other social platforms for updates.
