Podcast Summary: Relatable with Allie Beth Stuckey
Episode: Ep 1292 | Ben Gillenwater | Cybersecurity Expert Reveals Shocking Truth About Parental Controls
Date: January 23, 2026
Host: Allie Beth Stuckey
Guest: Ben Gillenwater (Family IT Guy, Cybersecurity Expert)
Overview
This episode tackles the urgent, complex realities of technology, social media, and AI in the lives of children through a conversation with Ben Gillenwater—known as the “Family IT Guy” and a seasoned cybersecurity expert. Ben shares his personal journey as a parent, exposes the pitfalls of supposed “safe” tech for kids, illuminates the failures of parental control tools, and describes the grave dangers children face online, including addiction, grooming, sextortion, and the direct role played by AI. The conversation is frank and practical, offering parents both a sobering warning and concrete principles to safeguard their families.
Key Discussion Points & Insights
1. Ben’s Origin Story: The iPad Mistake
[00:01–05:12]
- Ben gave his 5-year-old son an iPad, believing YouTube Kids and parental controls would provide sufficient protection.
- Within days, his son encountered deeply disturbing content, including videos with sexual undertones, violence, nightmarish characters (e.g., Huggy Wuggy), and addictive mechanics—even in “kids” apps.
- Ben observed severe, lasting behavioral impacts—including nightmares and addictive tendencies—prompting him to remove the device entirely.
“He was seeing all kinds of inappropriate things… YouTube took him down some immediate rabbit holes with sexual undertones, violent undertones, things that are just addictive.”
– Ben, [02:18]
2. The Business Model of Addiction
[05:15–11:17]
- Major platforms like YouTube and Facebook are primarily advertising companies, not simply “tech” companies.
- Their goal is to maximize “eyeball time”—addiction is the business model.
- Even with deep industry knowledge, Ben fell into the trap because the system is designed to capture all ages, especially vulnerable, developing minds.
“We know that Google is an advertising company because their annual SEC reports show that 76% of their revenue is from selling ads… and they're in the business of addiction.” – Ben, [06:47]
- The “currency” exchanged is not money, but attention—a commodity children do not understand how to value or protect.
3. Behavioral Impacts of High-Stimulus Content
[12:45–16:46]
- Ben recounts how his son became fixated (“programmed”) on disturbing characters, to the point of uncharacteristic aggression (e.g., trying to take a Huggy Wuggy toy from another child).
- Kid-targeted content like Cocomelon intentionally maxes out stimulus (scene changes, colors, rapid editing) to maximize addictiveness.
“Cocomelon… it’s like cartoons on crack. Why? Because it works. It’s addictive. They get more eyeball time.”
– Ben, [14:18]
- Parents are encouraged to analyze and limit exposure to “high-stimulus” media, favoring older, slower-paced content or low-stimulus media.
4. The Dark Side: Grooming, Sextortion, and Suicide
[20:05–25:00; 27:08–37:01; 56:32–63:17]
- Ben introduces the concept of grooming: normalizing exposure to darkness or chaos until children’s boundaries are eroded.
- Present-day grooming and sextortion are increasingly automated, global, and innovative, using AI and chat functions in games and social media.
- Gangs like Nigeria’s “Yahoo Boys” systematically target teenage boys for sextortion. Tactics often begin with fake profiles (AI-generated or built from real girls’ photos), escalate to exchanging explicit images, and culminate in blackmail.
- The threat of exposure drives many kids to suicide—Ben shares real-world cases (Jordan DeMay, Adam Raine).
“The same people that used to do the Nigerian prince scams… now it’s very well worded and well informed AI-powered hunting programs.”
– Ben, [29:05]
- Statistics shared:
- Suicide rates among youth aged 10–24 have soared; by 2019, 1 in 5 deaths in this age group was from suicide, with sharp increases tracking directly with social media adoption ([22:23]).
- Reports of adults sexually exploiting children online in the US: 187,000 (2023), 546,000 (2024), over 1 million (2025)—including 100,000 AI-generated cases ([57:55–59:00]).
5. Ineffectiveness of Parental Controls
[44:00–44:35]
- Built-in tools (Apple Screen Time, Google Family Link) are so convoluted that even an expert like Ben had to create an 82-page guide to help parents use them.
- The market’s best alternatives are expensive, third-party, whitelist-based devices (Bark, Pinwheel, etc.).
- Many schools, including Christian ones, now rely heavily on EdTech with little research to show positive benefits and significant concerns about cognitive impact.
6. Principles for Parental Technology Management
[42:41–47:41; 65:50–68:36]
- The “whitelist approach”: only allow specifically approved sites/tools, rather than blocking individual bad actors.
- Red flag features: bottomless feeds (scrolling never ends), online chat (often cannot be fully disabled).
- Devices belong only in public/family spaces—bedrooms and bathrooms are highest-risk times and places ([67:24]).
- No unsupervised AI usage for kids—ever.
- Parents must do the hard work of enforcing boundaries, modeling healthy digital habits, and cultivating “get out of jail free” conversations where kids know they won’t lose parental love if they make a mistake.
“You teach your kids how to defend their values, in the process showing them what those values are… not in the words that you say, but in the actions you take and what you say NO to.” – Ben, [62:04]
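The whitelist principle above can be sketched in a few lines of code. This is a minimal illustration, not any actual parental-control product: the approved domains below are hypothetical placeholders, and a real filter would sit at the network or device level rather than in application code.

```python
# Whitelist (allowlist) filtering: instead of blocking known-bad sites,
# which bad actors can always outpace, only explicitly approved
# destinations are permitted. Everything else is denied by default.
from urllib.parse import urlparse

# Hypothetical example list; a real family would curate their own.
APPROVED_DOMAINS = {
    "en.wikipedia.org",
    "khanacademy.org",
}

def is_allowed(url: str) -> bool:
    """Allow a URL only if its host matches an approved domain
    (or is a subdomain of one); deny everything else by default."""
    host = urlparse(url).hostname or ""
    return any(
        host == domain or host.endswith("." + domain)
        for domain in APPROVED_DOMAINS
    )

print(is_allowed("https://en.wikipedia.org/wiki/Internet_safety"))  # True
print(is_allowed("https://randomgame.example/chat"))                # False
```

The design point mirrors Ben’s argument: a blocklist must enumerate every bad actor (an endless, losing race), while an allowlist fails safe, so anything unvetted is simply unreachable.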
Notable Quotes & Memorable Moments
- On technology as pacifier:
“You’re exchanging their long-term betterment for your short-term quiet.”
– Allie, [44:59]
- On EdTech and adoption cycles:
“Kids should never be early adopters of technology… and EdTech is all early adoption.”
– Ben, [48:42]
- On the myth of kids getting left out:
“There has not been a single exception to there being only positive outcomes [from removing social media]… In fact, the social life of the kid improved.”
– Ben, [41:09]
- On parental modeling:
“How many times a day do your kids experience seeing the back of your phone instead of your face?”
– Ben, [68:19]
Five Things Ben Would Never Let His Kids Do
[65:50–68:36]
- No social media / bottomless feeds (e.g., Instagram, TikTok).
- No online chat in apps/games, unless fully and securely disabled.
- No AI usage unsupervised. Ever.
- No devices in private areas (bedrooms, bathrooms) especially late at night.
- No neglecting parental self-regulation: parents must actively model healthy tech relationships themselves.
Resources & Next Steps
- Ben Gillenwater’s Website: familyitguy.com — Articles, digital safety guides, app, and upcoming book Skills, Not Rules.
- NCMEC Tipline for Sextortion Victims: Call 1-800-THE-LOST or visit the National Center for Missing & Exploited Children.
- Further Reading:
- Blog post: “Digital Danger Zone” (Ben’s statistics and chart correlating suicide rates and social media rise)
- Articles on identifying low-stimulus content and other safety guides.
Timestamps for Major Segments
- Ben’s Story & Parental Awakening: 00:01–06:00
- Online Content & Addiction Mechanisms: 06:00–11:00
- Behavioral Impacts of Stimulus/Content: 12:45–16:46
- Grooming, Sextortion, & Suicide Trends: 20:05–25:00; 27:08–37:01; 56:32–63:17
- Parental Controls & EdTech Failure: 44:00–47:41
- Concrete Principles & “Top 5 Never Do” Rules: 65:50–68:36
- Resources & How to Follow Ben: 68:42–69:57
Tone and Language
The tone is urgent, persuasive, but empathetic and practical—Ben and Allie both stress that this is a “human” problem, not merely a tech issue, and focus on empowering parents rather than shaming them. Both draw from personal mistakes and lessons.
This episode is a must-listen for parents and educators concerned about the real, evolving dangers faced by children in today’s tech-driven world, offering both a wake-up call and actionable steps to reclaim family life and safety.
