Tech Brew Ride Home – "Get Paid To Train AI On Your Phone Calls?"
Host: Brian McCullough
Date: September 25, 2025
Podcast: Tech Brew Ride Home by Morning Brew
Overview
This episode unpacks several fast-moving tech headlines: xAI’s escalating legal battle against OpenAI, Intel looking for a lifeline from Apple, Spotify’s ramped-up fight against AI music spam, a dramatic shakeup in prominent open source communities, and the controversial surge of the Neon app, which pays users to help train AI with their phone calls. The host threads together themes of AI’s rapid incursion into business, privacy, and even personal communication, highlighting the tensions playing out across the industry.
Key Discussion Points & Insights
1. xAI Lawsuit Against OpenAI and Apple (00:34)
- xAI is suing OpenAI in California, alleging trade secret theft through the targeted hiring of key employees with knowledge of xAI’s technologies and business plans.
- xAI further claims OpenAI induced those employees to breach confidentiality.
- Notably, former engineer Xuechen Li is accused in a separate legal action of taking xAI information to OpenAI.
- xAI has also sued Apple, alleging it conspired with OpenAI to suppress competition.
- The ongoing tangle: Musk is separately suing OpenAI over its conversion to a for-profit structure, while OpenAI has countersued Musk for harassment.
“OpenAI is targeting those individuals with knowledge of xAI’s key technologies and business plans... then inducing those employees to breach their confidentiality and other obligations to xAI through unlawful means.”
—Brian McCullough, quoting Reuters (00:34)
2. Intel’s Approach to Apple for Investment (02:04)
- Intel reportedly approached Apple about a potential investment as part of Intel’s turnaround bid.
- Apple, which once relied on Intel chips for Macs, now uses in-house silicon manufactured by TSMC.
- Apple touts its US investments, including a major pledge to Corning and a push for domestic component production.
- Apple CEO Tim Cook expressed openness to Intel’s resurgence, saying competition is welcome.
“‘We’d love to see Intel come back,’ Cook said.”
—Brian McCullough, quoting Tim Cook (03:22)
3. Spotify Cracks Down on AI-Generated and Spammy Music (04:11)
- Spotify is updating its AI policies to adopt the DDEX standard for identifying and labeling AI-generated music.
- A new music spam filter aims to identify and remove fraudulent or spammy tracks, including those uploaded to the wrong artist profiles.
- Unauthorized AI voice clones and deepfakes will be removed, but Spotify reaffirms support for non-fraudulent AI use.
“Spotify clarified its policies around AI-enabled personalization, stating directly that unauthorized AI voice clones, deepfakes, and any other form of vocal replicas or impersonation are not allowed…”
—Brian McCullough (05:13)
4. Circle Looks at Reversible Stablecoin Transactions (06:03)
- Circle, the stablecoin issuer, is exploring reversibility for its tokens, possibly allowing refunds for fraud or disputes—an idea borrowed from mainstream finance.
- Circle would not unwind transactions on-chain, but could introduce a “counterpayment” or refund process.
- The concept is contentious in crypto because it challenges the priority placed on blockchain immutability.
“There's an inherent tension there between being able to transfer something immediately but having it be irrevocable.”
—Circle President Heath Tarbert, quoted by Brian McCullough (06:51)
5. Microsoft Expands AI Models in Copilot (10:09)
- Microsoft is bringing Anthropic’s Claude Sonnet 4 and Claude Opus 4.1 models to Microsoft 365 Copilot, alongside OpenAI’s models.
- Users can now mix and match AI models in Microsoft’s Copilot Studio and Researcher agent via an easy toggle.
- Anthropic’s models are still hosted on AWS rather than Azure, but the integration gives users more flexibility.
- Reports indicate Microsoft is favoring Anthropic models over OpenAI’s in certain features, such as Visual Studio Code’s auto-selection and, soon, Excel and PowerPoint.
“With this launch you can build, orchestrate and manage agents powered by Anthropic models for deep reasoning, workflow automation and flexible agentic tasks…”
—Charles Lamanna, Microsoft (11:29)
6. Open Source Drama: Ruby Central and Shopify (12:24)
- Open source developer Joel Draper alleges Ruby Central took over major Ruby projects from community maintainers under pressure from Shopify.
- The backdrop: the loss of a major sponsor following conference drama led to increased reliance on Shopify.
- Control was shifted via repository renames and permission changes; maintainers contested Ruby Central’s claim to the codebases themselves.
- Ruby Central argued it needed to secure the software supply chain, but critics saw it as conflating code ownership with service security.
“Critics...argue Ruby Central conflated its right to secure the service infrastructure with ownership of the open source code bases.”
—Brian McCullough (13:36)
Main Feature: Neon App Pays Users to Train AI on Phone Calls (14:44)
- Neon Mobile pays users to record their phone calls: 30 cents per minute for calls to other Neon users, capped at $30 per day for calls to non-users.
- The app leapt from #476 to #2 in the US App Store’s social apps chart, demonstrating market appetite for data monetization.
- Neon claims to record only your side of a call (unless both parties use Neon), sidestepping some wiretap laws, but legal experts question the reliability and safety of its privacy protections.
- Users grant Neon a broad license to their data, which it can sell “for developing, training, testing and improving machine learning models” and related technology.
- Significant concerns remain about the potential for voice fraud, weak anonymization, data resale, and undisclosed “AI partners.”
“The fact that such an app exists and is permitted on the app stores is an indication of how far AI has encroached into users’ lives and areas once thought of as private.”
—Brian McCullough (15:53)
“Once your voice is over there, it can be used for fraud.”
—Peter Jackson, cybersecurity attorney, quoted (16:52)
“Now this company has your phone number and...recordings of your voice which could be used to create an impersonation of you and do all sorts of fraud.”
—Peter Jackson (17:00)
“Well, I did bang the drum for years about somebody giving us the ability to make money off of our data, though I don’t know that I would do this one.”
—Brian McCullough (17:48)
Notable Quotes & Memorable Moments
- On Apple’s US investments:
“‘We'd love to see Intel come back,’ Cook said.” (03:22)
- On Spotify’s AI crackdown:
“Unauthorized AI voice clones, deepfakes and any other form of vocal replicas or impersonation are not allowed and will be removed from the platform.” (05:13)
- Crypto and reversibility:
“There's an inherent tension there between being able to transfer something immediately but having it be irrevocable.” (06:51)
- AI in your voice calls:
“Once your voice is over there, it can be used for fraud.” — Peter Jackson (16:52)
“I did bang the drum for years about somebody giving us the ability to make money off our data, though I don't know that I would do this one.” — Brian McCullough (17:48)
Timestamps for Major Segments
- xAI/OpenAI Lawsuit & Apple Litigation: 00:34–02:04
- Intel Seeks Apple Investment: 02:04–04:11
- Spotify DDEX, AI Policy and Spam: 04:11–06:03
- Circle Stablecoin Reversibility: 06:03–07:46
- Microsoft/Anthropic AI Integration: 10:09–12:24
- Ruby Central, Open Source Drama: 12:24–14:44
- Neon App Paying for Phone Call Data: 14:44–End
Overall Tone
Brian McCullough maintains an accessible, lightly skeptical tone—often blending analysis with direct quotations and a sense of subdued alarm about the privacy and ownership shifts driven by AI and industry consolidation.
This episode is a brisk yet deep dive into the intersections of AI, privacy, law, and open source, with the Neon phone call story as a particularly striking illustration of how the boundaries of personal data are rapidly changing.
