Podcast Summary: Hands-On Apple 208: Eye Tracking on iPhone
Host: Micah Sargent
Date: November 13, 2025
Episode Overview
In this episode of Hands-On Apple, host Micah Sargent explores the new Eye Tracking feature in iOS, which lets users navigate and operate their iPhone using only their eye movements. The episode covers the setup process, practical use cases, hands-on testing, the feature's significance for accessibility, and related options such as head tracking. The discussion is especially relevant for listeners interested in, or reliant on, Apple’s accessibility tools.
Key Discussion Points & Insights
Introduction to Eye Tracking
- Accessibility First: Micah underscores that eye tracking and related features are not “hidden hacks” or “Easter eggs” for techies to discover, but thoughtfully designed tools that empower users with disabilities and enhance device accessibility for everyone.
- Quote [01:38]:
“As I always talk about accessibility features, I think it's important to remember that these aren't hidden features … but instead are features that can empower you and other users to make use of their devices in ways that they may not otherwise be able to.”
- Personal Spark: The segment was inspired by a social media post Micah saw promoting the feature, prompting an in-depth test and explanation for listeners.
Setting Up Eye Tracking
- Location in iOS:
- Navigate to Settings > Accessibility > Physical and Motor > Eye Tracking ([02:18]).
- Prep for Setup:
- Device should be within one foot of the face and on a stable surface for optimal accuracy ([03:10]).
- Micah uses a tripod for hands-free stability during the demonstration.
- Feature Options ([03:35]):
- Smoothing: Adjusts pointer movement fluidity.
- Snap to Item: The pointer automatically jumps to the closest interactive element.
- Zoom on Keyboard Keys: Zooms in on keys for easier selection while typing.
- Auto Hide: Hides the pointer overlay when inactive.
- Dwell Control: Triggers an action by holding your gaze on a target area (see the sketch after this list).
- Show Face Guidance: Offers on-screen prompts for best tracking setup.
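Apple doesn’t publish how Eye Tracking implements these options, but Smoothing and Dwell Control are well-understood gaze-pointer techniques. The Swift sketch below is a hypothetical illustration, not Apple’s code: it exponentially smooths raw gaze samples and fires a dwell “tap” when the pointer holds near one spot. Every name and value (GazePointer, dwellRadius, and so on) is invented for the example.

```swift
import Foundation
import CoreGraphics

// Hypothetical sketch of a gaze pointer with "Smoothing" and "Dwell Control".
// Apple does not publish Eye Tracking's internals; all names and values here
// are invented for illustration.
final class GazePointer {
    private(set) var position: CGPoint = .zero
    private var dwellAnchor: CGPoint = .zero
    private var dwellStart: Date?

    /// Smoothing factor in (0, 1]; smaller values = smoother, laggier pointer.
    var smoothing: CGFloat = 0.2
    /// How far (in points) the gaze may drift and still count as dwelling.
    var dwellRadius: CGFloat = 30
    /// How long the gaze must hold inside that radius to trigger an action.
    var dwellDuration: TimeInterval = 1.0

    /// Feed one raw gaze sample per frame; returns true when a dwell
    /// "tap" should fire on whatever element is under `position`.
    func update(rawGaze: CGPoint, at now: Date = Date()) -> Bool {
        // Exponential smoothing: blend each new sample into the pointer.
        position.x += smoothing * (rawGaze.x - position.x)
        position.y += smoothing * (rawGaze.y - position.y)

        // Dwell detection: fire once if the pointer holds near one spot.
        let dx = position.x - dwellAnchor.x
        let dy = position.y - dwellAnchor.y
        if dx * dx + dy * dy > dwellRadius * dwellRadius {
            dwellAnchor = position          // gaze moved; restart the timer
            dwellStart = now
        } else if let start = dwellStart,
                  now.timeIntervalSince(start) >= dwellDuration {
            dwellStart = nil                // disarm until the gaze moves again
            return true
        }
        return false
    }
}
```

A caller would feed update(rawGaze:) one sample per frame (say, at 60 Hz) and perform the action on whatever element sits under the pointer whenever the method returns true.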
Live Demo & Experience
- Training the System:
- Setup asks you to “follow the dot with your eyes” as it moves across the screen ([05:04]; see the gaze-signal sketch after this section).
- Micah describes the alignment, device placement, and facial positioning needed for reliable tracking.
- Challenges: Micah noted the system's varying responsiveness and the need to keep the head very still ([06:38]).
- Quote [07:14]:
“Now, as you can see, this isn’t super accurate. And so we might go ahead and turn this off … you may find that you need to retrain it if you run into issues.”
- Retraining & Improvements:
- After initial wobbles, retraining (this time taking extra care not to move his head) noticeably improved tracking reliability ([08:12]).
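Apple doesn’t document what signal the “follow the dot” calibration consumes, but ARKit’s public face tracking hints at the raw data available on TrueDepth-equipped iPhones. This minimal sketch, offered as an illustration of the underlying signal rather than how iOS actually implements the feature, simply logs ARFaceAnchor.lookAtPoint each frame; mapping that 3D gaze vector onto screen coordinates is exactly what a dot-following calibration would need to learn.

```swift
import ARKit

// Illustration only: ARKit's face tracking exposes a per-frame gaze vector
// (ARFaceAnchor.lookAtPoint). A "follow the dot" calibration could learn a
// mapping from this vector to screen coordinates; how iOS actually does it
// is not documented. Requires a TrueDepth camera and camera permission.
final class GazeSampler: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking is not supported on this device.")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this every frame with the updated face anchor.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is where the two eyes converge, in face-anchor
            // space (meters); this is the raw signal to calibrate against.
            let gaze = face.lookAtPoint
            print("gaze:", gaze.x, gaze.y, gaze.z)
        }
    }
}
```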
Related Accessibility: Head Tracking
- Quick Overview:
- Head tracking is a complementary tool in the same Accessibility > Physical and Motor menu ([11:15]).
- It recognizes specific facial expressions (e.g., raising the eyebrows, smiling, sticking out the tongue).
- Various iOS actions can be assigned to these gestures, as sketched below.
- Quote [11:15]:
“So a smile, for example, could open a specific menu, launch your camera, do a double tap, take you back to the home screen.”
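There is no public API for assigning expressions to actions (that mapping lives in Settings), but ARKit’s blend shapes illustrate how a smile or raised eyebrows surface as coefficients between 0 and 1 that software can threshold. The ExpressionMapper type, the 0.7 threshold, and the action names below are all assumptions made for this sketch.

```swift
import ARKit

// Illustration only: expression-to-action mapping is configured in Settings,
// not through a public API. ARKit's blend shapes show how a smile, raised
// eyebrows, or a stuck-out tongue surface as 0...1 coefficients; the
// ExpressionMapper type, threshold, and actions here are assumptions.
enum GestureAction { case openMenu, launchCamera, doubleTap, goHome }

struct ExpressionMapper {
    /// How strong an expression must be before it counts as a gesture.
    var threshold: Float = 0.7
    var mapping: [ARFaceAnchor.BlendShapeLocation: GestureAction] = [
        .mouthSmileLeft: .launchCamera, // a smile could launch the camera
        .browInnerUp: .openMenu,        // raised eyebrows could open a menu
        .tongueOut: .goHome             // tongue out could go home
    ]

    /// Returns the action for the strongest expression above the threshold.
    func action(for face: ARFaceAnchor) -> GestureAction? {
        var best: (value: Float, action: GestureAction)?
        for (shape, action) in mapping {
            guard let value = face.blendShapes[shape]?.floatValue,
                  value >= threshold else { continue }
            if best == nil || value > best!.value {
                best = (value, action)
            }
        }
        return best?.action
    }
}
```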
Use Cases & Broader Significance
- Who Benefits Most?:
- The feature is aimed primarily at users with motor impairments, but it also demonstrates Apple’s larger commitment to “devices for everyone.”
- Encouragement to Explore:
- Micah encourages listeners to explore and champion accessibility features; even those who don’t need them personally can help others discover empowering tools.
Notable Quotes & Memorable Moments
- Accessibility Philosophy
- [01:38] Micah Sargent:
“These aren't hidden features … but features that can empower users to make use of their devices in ways they may not otherwise be able to.”
- Setup Tips
- [03:10] Micah Sargent:
“You actually do need to have your iPhone within one foot … and on a stable surface.”
- Initial Frustration, Honest Impressions
- [07:14] Micah Sargent:
“This isn’t super accurate. ... you may find you need to retrain it if you run into issues.”
- Highlighting Head Tracking's Range
- [11:15] Micah Sargent:
“A smile, for example, could open a specific menu, launch your camera, do a double tap, take you back to the home screen.”
Key Timestamps for Segments
- 01:38 — Introduction to eye tracking and accessibility ethos
- 02:18 — Menu navigation and where to find eye tracking in iOS settings
- 03:35 — Explanation of key eye tracking features/configurations
- 05:04 — Onboarding and “following the dot” for initial setup
- 07:14 — Challenges and the need to retrain system for accuracy
- 08:12 — Demonstrable improvement with retraining and correct device setup
- 11:15 — Head tracking options and mapping gestures to actions
- 12:10 — Suggestion that users explore voice control as an alternative
Takeaways
- Eye tracking on iOS is a powerful accessibility tool with steadily improving reliability, especially when setup conditions are ideal.
- Head tracking and voice control add to the robust set of customizable accessibility features in iOS.
- Micah’s candid walkthrough offers both encouragement and realism: the technology still has quirks, but its empowering potential is clear and worth exploring.
For anyone curious about Apple’s latest accessibility innovations or seeking practical guidance on setting up these features, this episode is a concise, hands-on resource filled with actionable tips and real-world feedback.