How I AI – Episode Summary
Podcast: How I AI
Host: Claire Vo
Guest: Terry Lynn, Product Manager and creator of "Copper's Corner"
Episode Title: How I built an Apple Watch workout app using Cursor and Xcode (with zero mobile-app experience)
Date: September 15, 2025
Overview
In this episode, Claire Vo sits down with Terry Lynn, a product manager and self-taught mobile app builder, to explore how he leveraged AI-driven tools (notably Cursor and Xcode) to create "Copper's Corner," an Apple Watch and iPhone app for gym workout tracking—all without prior mobile experience. The discussion provides a detailed breakdown of Terry’s workflow, from ideation to code refactoring, and highlights practical ways anyone can use AI to supercharge their own side projects or professional development. Terry’s approach is accessible, process-driven, and illuminating for developers, PMs, or hobbyists interested in AI-assisted software creation.
Key Discussion Points & Insights
1. Identifying the Problem and Ideating the Solution
Timestamps: 00:06–05:43
- Origin Story: Terry describes his frustration with traditional fitness tracking apps (account setup, manual data entry) and the difficulty of staying consistent at the gym.
- AI Inspiration:
"I started using the GPT mobile app as like speech to text... why can't a workout app do this and then tag the data for me? Make it like a structured data set with analytics."
(Terry, 00:06) - From Hack to App: He initially used the Apple Watch’s voicemail feature, processed audio files with a script and GPT-4o to output workout logs in Excel, then iterated to a full-featured app.
- App Overview: Copper’s Corner lets users speak their workouts on either Apple Watch or iPhone, automatically transcribing and structuring the data, including analytics and history across both devices.
2. Demo: How Copper’s Corner Works
Timestamps: 04:23–06:42
- Authentication made simple: "I'm using sign in with Apple... one of the first things you'll notice is when you log into the phone here, it logs into the Apple Watch." (Terry, 04:43)
- Multimodal Logging:
"You could record it from your phone or you could do it from the Apple Watch... It can log your workout pretty much with no work."
(Terry, 05:08) - Real-Time Structured Output: The app processes spoken workout details (e.g., weights, exercises, reps) into clean, timestamped structured logs, with matching visuals.
- Analytics Views: Users can view workout consistency, top exercises, and progress via scatter plots and historical breakdowns.
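To make the demo concrete, here is a minimal Swift sketch of how a dictated set could be decoded into structured, timestamped data. Type and field names are illustrative, not Copper's Corner's actual model:

```swift
import Foundation

// Illustrative shape for one logged set; names and fields are assumptions,
// not the app's real data model.
struct LoggedSet: Codable {
    let exercise: String
    let weightLbs: Double
    let reps: Int
    let timestamp: Date
}

// Example of the structured JSON an LLM might be asked to return for the
// utterance "bench press, one thirty-five for eight reps".
let sampleJSON = """
[{"exercise": "Bench Press", "weightLbs": 135, "reps": 8,
  "timestamp": "2025-09-15T18:04:00Z"}]
"""

let decoder = JSONDecoder()
decoder.dateDecodingStrategy = .iso8601
if let sets = try? decoder.decode([LoggedSet].self, from: Data(sampleJSON.utf8)) {
    // e.g. ["Bench Press: 135.0 lb x 8"]
    print(sets.map { "\($0.exercise): \($0.weightLbs) lb x \($0.reps)" })
}
```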
3. Building the App with Cursor and Xcode
Timestamps: 07:23–09:07
- Dual-Workflow Development:
- Code in Cursor.ai, build/debug in Xcode (Apple’s IDE) for best mobile integration and error handling.
- Apple Watch and phone builds are run separately due to platform differences.
"I do something called dual wielding here... Cursor will do the coding and then I will do the building and the debugging in Xcode." (Terry, 07:23)
- Testing Realism: Using iOS simulators is useful, but real-device and in-gym testing is critical for user experience validation.
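One reason the watch and phone builds stay separate is that shared Swift code usually branches per platform. A minimal sketch, assuming a hypothetical shared SwiftUI view rather than anything from the actual app:

```swift
import SwiftUI

// Hypothetical shared view: the watch target gets a compact layout, the
// iPhone target gets the fuller one. Conditional compilation like this is
// part of why each platform is built and debugged as its own target in Xcode.
struct WorkoutSummaryView: View {
    let summary: String

    var body: some View {
        #if os(watchOS)
        Text(summary)
            .font(.caption)
        #else
        VStack(alignment: .leading, spacing: 8) {
            Text("Today's workout").font(.headline)
            Text(summary)
        }
        .padding()
        #endif
    }
}
```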
4. Iterative Prototyping and Workflow Evolution
Timestamps: 09:22–10:53
- Starting Point: Used Apple Watch voice notes, processed with Python and GPT-4o, and output the results to spreadsheets.
- The early solution lacked structured data and scalability.
- Transition to Apps:
"Then you got to put it into a database where it's actually structured data. You have, like, the foreign keys. You can actually manipulate it better."
(Terry, 10:00) - API Backend: Migrated from basic automation to structured, database-driven backend using Cursor.
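Terry doesn't detail the backend stack, but the move from spreadsheet rows to "structured data with foreign keys" can be sketched with two hypothetical record types (names and fields assumed for illustration):

```swift
import Foundation

// Hypothetical relational shape, not the app's actual schema: each set row
// carries a foreign key back to its workout, which is what makes the data
// queryable in a way free-form spreadsheet rows are not.
struct WorkoutRecord: Codable, Identifiable {
    let id: UUID
    let date: Date
}

struct SetRecord: Codable, Identifiable {
    let id: UUID
    let workoutID: UUID   // foreign key to WorkoutRecord.id
    let exercise: String
    let weightLbs: Double
    let reps: Int
}

// With structure in place, analytics like "top exercises" reduce to a query,
// e.g. Dictionary(grouping: allSets, by: \.exercise).mapValues(\.count)
```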
5. AI-Driven Product Development Process
Timestamps: 11:28–13:13
- Three-Step Cursor Workflow:
- PRD Create: Generate a Product Requirements Document from the issue/task.
- PRD Review: Sanity-check PRDs, ask the model to self-rate (0–10) for completeness and clarity.
- "If another model were to take this plan, how would you rate this out of 10 if they had no context and it had to execute on this?" (Terry, 12:39)
- PRD Execute: Checklist-driven implementation, with explicit task breakdowns and code references.
- Gherkin User Stories: Used for scenario definition: "given [context], when [event], then [outcome]."
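A made-up scenario in the given/when/then format Terry describes, applied to the voice-logging flow (not taken from his actual PRDs):

```gherkin
Feature: Voice workout logging

  # Hypothetical scenario written in the style described above.
  Scenario: Logging a set by voice from the watch
    Given I am signed in on my Apple Watch
    When I dictate "bench press, 135 for 8 reps"
    Then a structured set with exercise "Bench Press", weight 135, and reps 8 is added to today's workout
```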
6. Structuring for the LLM and Refactoring
Timestamps: 14:30–24:08
- Learning Curve: Early struggles with AI making up directories, endpoints, or code led Terry to systematize context and break rules/templates into smaller, manageable pieces.
- "My rules were super verbose, maybe 800 lines long... now no more than 200 lines long." (Terry, 14:51)
- Token/Efficiency Awareness:
- Limits rule/token size for better LLM performance.
- Commits code before/after each Cursor phase for easy rollback and error mitigation.
- "I have a git commit before and after each phase and ask it to pause. So I don't just have it one shot everything." (Terry, 16:56)
- Vibe Refactoring: Dedicated 'refactor' rules keep the codebase maintainable, improving model comprehension and reducing tech debt.
- "I almost suggest to engineers... build the refactoring as a known cost of your AI implementation." (Claire, 21:36)
7. Collaborating with the Model: Rubber Ducking for Learning
Timestamps: 25:29–28:12
- Rubber Duck Rule:
- Uses the LLM as a patient tutor to explain and quiz on code, promoting deeper understanding.
- "Can you just take this file and explain it to me?... I'll just eat, it'll give me a function, I'll think about it, and I'll kind of use this to Rubber Duck as a partner." (Terry, 25:29)
- Claire praises this for supporting true technical skill development and mitigating knowledge gaps in AI-driven development.
8. Design and Prototyping with Index Cards and AI
Timestamps: 29:01–31:21
- Offline Ideation: Draws UI/UX sketches on index cards during subway rides, then uploads them to ChatGPT/GPT-4o for fast upscaling.
- AI-Assisted Mockups:
- Uses generative tools (e.g., uxpilot, Figma, Apple UI kits) to produce and quickly iterate high-fidelity prototypes.
- "You can send this picture into GPT4 on your mobile device and just tell it, hey, this is a mock up. Can you help me upscale this?" (Terry, 29:30)
- Design Limitations: Last 10% of polish remains challenging—an area where human design value endures.
Notable Quotes & Memorable Moments
- On the AI-Driven Development Shift:
"If you get good at [working with the model], I think after this stuff is just kind of table stakes. It's just executing." (Terry, 19:47)
- On Code Quality Anxiety:
"Yes, AI does have the ability to generate lots of code and maybe not the highest quality code... But what I have experienced is that it's actually quite good at refactoring." (Claire, 21:36)
- On Rubber Duck Learning:
"Rubber ducking vibe coding is essentially like a reverse rubber ducking... building this muscle actually helps you build faster over time because you're learning how to debug stuff with the LLM as your tutor." (Terry, 27:43)
- On Design Prototyping:
"Sometimes in New York when you're in the subway, you run into these dead spots where you don't have reception... So I started using these little index cards where I would just draw out the UX... then send this picture into GPT-4 on your mobile device..." (Terry, 29:01)
- On MVP vs. Finishing Touches:
"It's like just looking at this app, there's things that annoy me and I'm like, oh, this is like the last 10%, but it works. I should do other things because [I] just gotta keep shipping. Right? As one person." (Terry, 31:53)
Lightning Round (Listener Takeaways)
Timestamps: 32:28–35:41
- AI Coding Wish List:
- Terry lamented the lack of a 'network tab' in Xcode for easier network-traffic debugging.
- Language Support in LLMs:
- Mostly positive for mobile; occasional friction from outdated libraries is offset by the LLM's access to updated docs.
- Handling Model Deviations:
- "I will use a lot of git commits and then that's my fallback... every three tests, there's a git commit before, after, and that's how I know I can let it rip." (Terry, 34:26)
- Claire jokes about her YOLO approach to code commits, emphasizing the different personalities and approaches in the AI coding community.
Practical Tips & Replicable Workflows
- Start with low-effort, voice-driven logs to rapidly prototype before building a full app.
- Use split workflows: AI/editor for generation, native environment for debugging/build/testing.
- Systematize tasks with three-step AI rules (PRD Create → Review → Execute).
- Manage AI context windows and token counts—keep rules, files, and directives concise.
- “Rubber duck” with your AI: regularly ask it to explain code and quiz you to close knowledge/skill gaps.
- Prototype offline—sketch on index cards, digitize and upscale with AI image generators, then iterate with design tools.
Final Thoughts
Terry Lynn’s journey from zero mobile experience to delivering a robust Apple Watch fitness tracker epitomizes the power and accessibility of modern AI-assisted development workflows. His rigorous, modular approach—combining classic product and engineering best practices with AI-first tools—offers a blueprint for anyone looking to bridge the gap between idea and shipped app. The episode serves not just as a case study but as a mini masterclass on structuring your processes, collaborating productively with AI, and learning along the way.
Where to Find Terry
- LinkedIn and X (Twitter): @meTerryLynn
For more resources and past episodes, visit howiaipod.com.
