
A
Today we are joined by Melissa McKay, President of the Digital Childhood Institute and a force to be reckoned with in the fight for online child safety. Just this past month, Melissa made headlines when her LinkedIn post about Google emailing her 12 year old son instructions on how to disable parental controls went viral, reaching over 2 million views and prompting Google to reverse their policy within days, which is a win. But this is far from her first victory. Melissa has been advocating for children's digital safety for nearly a decade, from testifying before Congress to filing complaints with the FTC and shaping legislation like Utah's App Store Accountability Act. Today, we'll be diving into her recent battles with Google and Roblox over their practice of automatically removing parental controls at age 13, what this means for families, and what parents can do to protect their kids in the meantime. Melissa, I am so stoked to have you on this podcast. Thank you for joining us.
B
It's so great to be here. Thanks for inviting me.
A
So for anyone who doesn't know you yet, can you share a bit about your background and what led you to become such a passionate advocate for online child safety?
B
Sure. So I've been doing this for about 10 years, but it was really my kids that brought me into it. I have five kids, four of them are boys. So that is like a ton of tech parenting maintenance, right?
A
Yes.
B
So it started way back in like 2017, back when devices were starting to flood into schools. My oldest son, who was in fourth grade at the time, was exposed to porn on a school-issued device, and then my nephew, other family members, neighbors. I'm just like, what is going on with devices in school?
A
Schools? 100%.
B
Why? Like, it felt like I was supervising so carefully at home, only to send them to school and have it be like the complete Wild West. So the very first bill I worked on was a school education bill, and then the next year was a cell phone policy bill. And then every year I've just come back for a little something new and something more.
A
I can't think of a more powerful force in this world than a mother. We get stuff done. We get stuff done and just, wow, five kids. That's amazing. I have one and I'm still treading water. And he's 17, so I don't know
B
how you do it.
A
So thank you for that background. And let's talk about the Digital Childhood Institute now. What's your mission? What kinds of work are you doing to protect kids online?
B
So I formed Digital Childhood Institute because once you've been in this for a little while, you realize it's the same 25 people that show up for every piece of legislation. You know, all of the exposés, it's always the same like 25 or 30 people. So this is kind of more of a policy think tank and education think tank. I wanted to bring together the experts in policy in particular, because after you've done this for 10 years, you just realize that these companies are so self-interested that unless you regulate them, no amount of public shame, no amount of letter writing really moves the needle. Regulation or lawsuits are the two things that actually make these companies flinch, as we like to call it. Yeah, if you go super viral with a shame campaign, you can do similar things, but beyond those three things, I haven't found anything that was effective. So we wanted to bring together the best of the best policy writers and, you know, figure out where the holes are in the current system, what needs to be regulated, and then put together enough manpower to do that
A
nationally. And their pockets are so deep, the lobbyists that they can afford, the campaigns they're able to support, the donations they're able to make, unfortunately influence legislation that could save children's lives.
B
Yes.
A
And that's just not okay. So let's talk about what happened with Google. There have been plenty of viral posts about things that platforms have done to kids, and they don't care, they don't pay attention, they don't take action. Google took action. That's a big deal. So walk us through what you discovered when your son received the email about removing parental controls.
B
All of the above. Yeah. Great. I mean, it's a great story. So Google knows me because I'm running the App Store Accountability Act and I wrote an FTC complaint against them. So they know that there's no, like, BSing me. Right? A lot of times they'll put out a polished PR campaign, or some of their friends came onto my LinkedIn post and basically said, oh no, this is a COPPA requirement, they have to remove parental controls, COPPA says that. And I'm like, COPPA doesn't say that
A
and you can do better.
B
Right, right. So I think that it was a combination of the fact that the post went viral, which I did not expect at all, and that they know that I know what they're doing. So anyway, my youngest son. So I have five, but my youngest just turned 13. Okay. So I get this email from Google, and they do send an email to the parent, but the parent email is completely deceptive. It says, your son is turning 13, you're still going to have access to parental controls, and here's this link to international age cutoffs, you might want to go talk to him. It was only because I had written this FTC complaint that I'm like, I know they're going to strip him of my supervision. Previously, I did not know that they emailed the child. Like, I had read that they emailed the parent, but I thought there's no way they had the arrogance to email a child to encourage them to go around their parent, to give them instructions. So when I got that email, my heart was just beating so fast because I thought, has my son already read his email? When did his email come? What does the email say? So I go running upstairs and I was like, open your email, open your email. And the crazy thing is, adults, we get 100 emails a day. I get like five emails from Google a week. Kids, when they get an email from Google, they are reading that email, you know, they are reading every word and they're following the links. So I had him forward it to me, and I was just so horrified at the depths they go to. When you click on that hyperlink, it explains to the kid: click on this, go here, do this, and then we're going to remove the parental controls without your parent's consent. You are empowered. Your parent cannot control this. So I open my phone and I pop off a quick LinkedIn post. I think it took me 15 minutes to write it. I was shaking, I was livid, yeah, livid when I wrote this. And when I went to bed that night, I think it had like 4,000 views.
And when I woke up it had 100,000. I struck a nerve. I think every parent just feels like tech being able to email their 12 year old child should be out of bounds, right?
A
Absolutely. Like, in no world does that make sense. And neither you nor I nor anybody who is rational and educated is saying children shouldn't develop some semblance of autonomy and digital citizenship. That's not what we're saying. But you do not have a right as a company to go directly to my child and decide unilaterally that they are now fully capable of making adult decisions online. Just not okay. And so. Go ahead.
B
Oh, I was just going to say, every child is developmentally different. You know, I have a son on the autism spectrum. They can't say your 13th birthday is when you're ready to remove parental controls. Even developmentally, it doesn't even make sense that they should be able to make that decision. So the whole thing just got to my mama heart. And it blew up, and it just kept blowing up. It blew up on LinkedIn, and then it blew up on Twitter, and then it blew up on Instagram.
A
Good, good. As it should have. And according to our research, your post about Google went viral and reached over 2 million people across all the platforms. And within days, this is the best part, Google announced they were reversing their policy. How did that feel? And were you surprised by how quickly they responded?
B
Yeah, I mean, we had been talking with them about this particular policy for six months and, you know, no movement. And I think they announced they were reversing it within three days and then made the actual change within like 12; they had updated their website. So I was totally shocked, obviously thrilled and relieved. The issue is kids who had been allowed to remove the parental controls before this policy changed. My son still got the email that says you can now remove them. So it wasn't retroactively protecting kids. It was a little bit of a bittersweet win. I mean, obviously my son didn't elect to do that, but I just kind of felt sad for the millions of families who've had to deal with that already.
A
100%, my son and so many others. Right. It's just not okay. And it'd be one thing if they didn't know, if they were like, oh, we didn't know. But they know, and they still were doing it. But I want to say, good job, Google. Thank you. Thank you for listening to what moms and parents want and erring on the side of let's do what's best for kids. That was a good choice. Thank you. More platforms need to do that. You've described this practice as grooming minors for profit, which is so relevant now, especially with the trials happening in Los Angeles of the big platforms. And you said it was one of the most predatory corporate practices you've seen. Can you expand on why this was so concerning, beyond just the parental control aspect?
B
Sure. I mean, it's kind of what I talked about before. Companies generally don't go around parents. Even Roblox, when we get into that, they emailed the parent and just removed it by default. But you think about schools. How would you feel if a school counselor or a teacher started emailing your 12 year old kid and basically said, you know, I know that your mom doesn't want you to go on this field trip or doesn't want you to stay after school, but you actually are empowered to do whatever you want? I think we just recognize that nobody does that anywhere. And so for me it just felt like a corporate practice that was even worse than almost anything else I've seen. So, I mean, the Roblox thing, when we get into that, I feel like that's really bad, but not as bad as what Google was doing.
A
Okay, all right, well, let's talk about that then. You know, Roblox is doing something similar. I've been keeping an eye on the people who have been sharing that the new facial recognition age verification thing, which parents are doing because that's what you have to do, is mislabeling their children. I personally know a 15 year old who it said was 18, and that's not okay. Right. But anyway, Roblox is doing something similar. What happened there, and how is it different from or similar to what Google was doing?
B
Sure. And we're trying to figure out, I mean I was pinging Chris McKenna this morning wondering if they just started this policy change once they started the face scan. Like they're like, okay, now that we're sure that you're 13, we're actually just going to revoke your parents control. So we're trying to figure out when this started.
A
Right.
B
But they basically. So I got a similar email the day before my son turned 13 that basically said, we already removed these parental controls. And it was similar to the Google one in that it was hidden in the email a few paragraphs down. It kind of started as a, you know, feature update at the top, and the bullet points weren't concerning. It's only if you kept reading all the way down that it comes out that we've actually removed your control over his spending, over his ability to contact strangers, over the ability to control his experiences. So basically they're kidnapping my child with my credit card.
A
Right.
B
They're not even saying we're removing his credit card access. They're saying we're removing your parental controls, but we're keeping his access to your credit card. And if he spends anything, you're not going to get a notification anymore.
A
So.
B
So, I mean, wild. Right?
A
It's just, there's common sense, and then there's gray area. This is not a gray area. And the practice of even sending those emails that are just a wall of text that they know nobody's gonna read. Right. For the average parent, it might even go to spam. It's just sad. It's really sad. Now, you mentioned that what Roblox is doing wasn't as bad as what Google was doing. I believe that's what I heard. Can you expand on that?
B
Well, I mean it, it's bad in a different way really. At that point, my only choice is to delete his account, which is what we did.
A
Yeah.
B
Because there was no ability for me to opt back into parental controls. I mean, I guess I could have changed his age, but now he has a face scan on file. And they're face scanning without parental consent, I think.
A
Right.
B
So I feel like that was worse in the fact that they just did all this without me even having a choice to say, don't do this. I mean, at least with Google, I could have deleted his email and hoped he didn't ever see it. But with Google, it's like, he has a Gmail account that he has to use for the rest of his life. With Roblox, I could just be like, delete.
A
Right.
B
So that's what we did.
A
And that's, I mean, I know personally how difficult that can be if your child spent a lot of time and money on that platform. You filed a complaint with the FTC back in October about these practices. Can you tell us more about that and what you're asking the FTC to do? And actually, sorry, before you answer that, you had mentioned Chris McKenna, and I want to give a shout out to Chris McKenna, ProtectYoungEyes.com, amazing human, dad, and advocate for online safety. So shout out to Chris McKenna, ProtectYoungEyes.com. But back to the FTC. What are you asking them to do?
B
Sure. So we talked about this kind of 10 year journey I've been on in advocacy. And the longer I'm in it, the more I've played whack-a-mole. I mean, for a couple years I went only after Instagram, and then I was going after Snapchat, and then I realized that now it's chatbots, and we're always playing whack-a-mole as advocates. And that's one of the most frustrating things, because you think you have something under control, and then they just get worse or there's a new threat. So pretty early on I realized that if Apple and Google just got their stuff together, it would fix everything. Look what happened with Tumblr when they were overrun with CSAM: they pulled Tumblr out of the App Store, and within two weeks Tumblr had nuked all of the CSAM and fixed their safety problems. So in my mind, I'm thinking Apple and Google have actual knowledge about all the exploitation that's happening on these platforms.
A
Oh yeah.
B
And they're facilitating a rating system that allows these apps to be rated safe for children. You know, you look at Snapchat, rated safe for teens or even preteens before their rating update. And Instagram, rated safe for teens. There's chatbot companions on Google Play rated safe for four year olds. Oh gosh. So Grok, even though it was producing massive amounts of pornography and non-consensual intimate imagery, was rated safe for 13 year olds until like two weeks ago, and then it was rated safe for 16 year olds. So it's not even accurate now. So I think, as an advocate, I'm thinking, if Apple and Google were actually gatekeepers, imagine that they had pulled Instagram out, you know, five or eight years ago, with all of these reports of child sexual exploitation and sextortion, and they were just like, hey, take a three week break, get your stuff together. All of this downstream harm would never have happened. So I did a deep dive. My FTC complaints were a deep dive into all the legal requirements that a gatekeeper should have. You know, the deceptive marketing: are you promoting app age ratings that are inaccurate? Are you complying with COPPA, meaning that you actually know how old the user is because they entered it in the device? Are you letting the developer know that you know this user is underage? The terms of service contracts that kids sign when they download an app that gives the app access to their camera, to their exact location. Are children being appropriately dealt with with these 50 page contracts that are indemnifying these companies, or is there an exploitative trade practice going on with the way that they're handling contracting? Each of the complaints is longer than 50 pages. So, I mean, they're a hard read, but they're a really good read into all of the things that the app stores, Apple and Google, are doing wrong that are causing this downstream exploitation. That's the short answer, but it was good.
It is my defense of our App Store Accountability Act and why we wrote it and why it's needed.
A
I'm so thankful for the tireless work that you and others in this space are doing, because it's life changing. I know I'm preaching to the choir, but this is negatively impacting our children's physical and mental health. Children are dying because of what they're encountering online and who they're encountering online. And this has got to change. So you mentioned COPPA, the Children's Online Privacy Protection Act, which hasn't been updated in, what, over 20 years?
B
Yeah, 25 years.
A
Yeah. So that alone is problem A. But both Google and Roblox cite COPPA, the law that sets 13 as the age threshold for certain privacy protections, as justification for their policies. What's your response to that?
B
Yeah. So COPPA, again, is a 25 year old law that basically said that if you're under 13, you need parental consent to gather data for, like, advertising. Right. So, to consent to give away your data for advertising. That's completely different. It's not even the age of Internet adulthood. A lot of people have unfortunately conflated the idea that COPPA made 13 year olds adults. That's not what it says. It doesn't say kids can sign contracts. It doesn't say that you can remove parental controls. But because it does cite the age of 13, companies have conveniently used that for 25 years as a weapon, not a protection. And it's really taken advocates who are well versed on legislation to call them out on that for the first time, which we're doing this year.
A
So yes, let's talk about that. You've been fighting for better App Store accountability and parental controls for years, even before this recent Google and Roblox incident. Please tell us more about that work. Like the Fix App Ratings movement, Utah's App Store Accountability act and anything else that you've been working on.
B
Yeah, great question. Fix App Ratings was the very first national movement I did. I did it with Chris McKenna and the National Center on Sexual Exploitation.
A
Yes, NCOSE. I love NCOSE.
B
Back in 2019 is when we launched it, just with the disgust and disbelief that apps rated 13 had access to pornography. And, you know, at the time, with the whole OnlyFans thing, Snapchat was kind of the early OnlyFans. Right. And we were just kind of horrified setting up these tester accounts and realizing that it was a combination of horrible sexualized news stories, OnlyFans, and then this sexting, sextortion. And the app was rated safe for 12 year olds.
A
And I want to interrupt real quick and give you a shout out because, you know, we were talking before we started hitting record, you know, for those of you who don't know Bark Technologies, we went undercover as an 11 year old on Instagram back in like 2019 and within minutes had very inappropriate photos and grown men looking to talk with what they thought was a fifth grader. We did a whole thing, it went viral. You can look it up. But Melissa did that even before us. And so just thank you for being on the forefront of that. Please continue.
B
Yeah, so we were setting up, like you said, tester accounts on Snapchat, on Instagram, and just realizing these apps are adult apps. They were adult playgrounds created by adults for adults. And most of the parents that I talked to didn't understand this. So I tried to educate as many people as I could. And finally, I just thought, education is not the answer. I can't educate away a deceptive app age rating. I mean, could you imagine if there was a video game that everybody was buying that said it was safe for kids, and it had, like, nudity four screens in, you know?
A
Because it exists. Right?
B
So we started the Fix App Ratings campaign. I mean, we got state resolutions passed, federal resolutions passed, and these companies did nothing. They don't care. So after that we were like, well, let's talk to them about not only fixing app ratings, but also: your parental controls are terrible, it takes 30 steps to set them up, and they're still full of loopholes. So we started to kind of combine Fix App Ratings into a default-to-safety movement. And then, like, three years later, I was like, I'm going to write Fix App Ratings and default to safety into a piece of legislation, and we're going to call it the App Store Accountability Act. So the App Store Accountability Act is really, really simple. There's three parts to it. Age verification at the device level; basically, all good child safety legislation hinges on knowing who is an adult and who is a child. So there's a lot of argument about where we should do this. Should we require every app to scan a child's face, like Roblox is? That's kind of the alternative. Or do we take the age that's already on the device, that the parent has attested to, use that age, and just have the device send an anonymous signal out to all the apps? And most people in child advocacy that I work with, even Jonathan Haidt, are like, that's the way; it's more privacy preserving. I think the biometrics thing is just a disaster and a conversation for another day.
A
Well, yeah, I mean, we're seeing that with Roblox right now.
B
Yeah. So: age verification at the device level; parent consent for every download and in-app purchase, because we're talking about contracts, and if children are signing contracts, there should be a parent involved, right; and the third part is accurate app age ratings. So, very elegant, three parts. And tech companies have lost their minds over those three pieces.
A
Expand on that.
B
Yeah. Okay. So I got it written in Utah. I mean, after spending two years on it: it started as a PowerPoint presentation, got turned into a policy paper by IFS, and EPPC hired an attorney to write it, and we introduced it in Utah. And I thought, nobody's going to fight this. Who's going to argue with these three pieces? Right. And then, the day I walked out of our first committee, where it passed unanimously, I got cornered by a lobbyist just losing their mind. And you would think companies like Walmart, AT&T, banks, credit unions would be neutral on this bill. No. Anybody who has an app does not want to be regulated. They do not want to know when they're dealing with a child, because if they know they're dealing with a child, they have to follow COPPA, they have to protect the child. And the way that it works now in app stores allows them to pretend everybody is an adult unless they, you know, have age verification at the app level, which, who even does that? Or it's done sloppily. Right. So, I mean, I've been working on this bill. It's been introduced in 25 states now. And every single day I wake up to horrible poison pill amendments, Google and Apple lying to our sponsors, you know, their lobbyists lying to our sponsors. They started a whole 501(c)(4) to attack us personally. And the bill, what's the name of it? I'd have to look it up again.
A
It's fine.
B
But everybody in Alabama started getting these texts, like: do you care about kids? Yes. Then vote against the App Store Accountability Act. Which is insane. I mean, the stuff I deal with every day, the amount of evil that I have seen from these companies to avoid being regulated. Even the lawsuits that they have filed against this bill are just conflating stuff and lying about stuff. Like, oh, you're going to restrict kids from accessing things. I mean, their argument is the App Store is basically exactly like a bookstore, so you can't regulate it. And I'm like, the App Store is not a bookstore. We're talking about apps that track kids around. If there was a bookstore selling a book that did what apps are doing to kids, you had better believe we'd be regulating them. Right. But I could probably write a whole series about the kind of dark, manipulative tactics that Apple and Google have used. Oh, they put out a whole campaign in Texas to say that porn companies were behind the bill. Like, literally, a $250,000 Instagram campaign to say that porn companies were behind this bill.
A
It's so hard to believe, but it's true. You know, if you are to be believed, which I believe you are. Right. I haven't done that particular research, but time and time again, I have seen this sort of behavior anecdotally. And I know that there are people at Google and Apple that care about kids. There are parents there. But it doesn't seem like those are the ones making the decisions about how to keep kids safer online. And I didn't ask you about this beforehand, but you may have thoughts you want to share. You know, a few years ago, Apple was going to work to remove CSAM from iCloud, potentially flag people who were uploading it, you know, to help law enforcement, and they walked that back. And that was such a blow, such a blow to anyone who is a survivor. And I just, I get it. Apple prioritizes privacy over child safety. But I don't think a predator or a child abuser deserves privacy over a child's right to not be abused, digitally or otherwise. And not enough parents know that when they decide to buy an Apple phone or watch for their child's first device. So do you have any thoughts you want to add there?
B
Yeah, absolutely. What you find in these companies, like you said, is that there are good people, they're parents. But there is a totem pole of priorities in a company, with shareholders at the top, basically, and then policy people right under that; the trust and safety departments are generally at the bottom of that priority totem pole. So even if there's good people in these departments. You know, I personally know that there were people who had worked on that CSAM policy for years and were so devastated when Apple walked it back, because this is the worst stuff in the world. Why not protect against it? Right? So I would just say it goes back to the fact that outside of litigation, regulation, and shame campaigns, you're running into these very dense, complicated structures within every one of these companies, where safety always ends up at the bottom.
A
And they move too slow. They move way too slow. Every time I get, like, a Google alert or something about such and such platform has rolled out new parental controls, I'm like, oh, click. And then I read them and I'm like, that is nothing. That means nothing. That is fluffy. That is smoke and mirrors. How did somebody think that we wouldn't see right through that? But here we are. Yeah, it's really frustrating. And what's also frustrating is that, you know, it's no secret that this podcast is sponsored by Bark Technologies, a for-profit company that makes safer tech for kids. But I will say, if we at Bark, a company of a little over 100 people that's been around for 10 and a half years and helps to protect over seven and a half million children across the nation, are able to launch a safer smartphone that doesn't even allow a child to take a nude photo or video and have it saved to the device, what are you doing, Apple? What are you doing, every other smartphone company? You know, we give parents the ability to turn the camera off or on from their own phone, contact approval, time limits, alerts when a predator is talking to your child or your child is experiencing suicidal ideation. If we can do it, then with their resources and their engineering teams, they should too. And they're not. They're not prioritizing child safety. And even Apple's latest communication safety feature, of we're going to blur nudes, but we're still going to tell you it's a nude and give you the option to look at it if you want. Again, it's just fluffy. So anyway. And also, while I've got the mic and I'm on a rant: we were talking about COPPA a few minutes ago and how it's 25 years old. TikTok didn't exist, you know, ChatGPT didn't exist. The laws meant to protect children and consumers online are so old, and we have got to do better by our constituents.
If there was a cereal or a car seat out there putting children at risk of serious injury or death, it would be recalled, it would be pulled from the market. But because of our laws in this nation, those same rules don't apply to the places where children are spending the most time, upwards of eight hours a day. So something needs to change, and people like you, Melissa, are actually making that happen. And so thank you. I'm going to ask you now, just tactically, strategically, bringing it back to: okay, any parent who's listening is probably, like, overwhelmed. What do I do? So what can parents do right now to protect their kids from having controls automatically removed when they turn 13? And are there specific settings or conversations families should be having?
B
I'm just a huge fan of the dumber smartphones like Bark, because it is the Wild West right now, because regulation has not caught up with the technology, because it's evolving so quickly. I would consider myself a tech expert, and it is so hard to protect my own kids. And my husband is, like, a tech expert, and even the two of us together struggle. So I would just say, for the average family, for the average parent: opt out. Opt out of everything you can opt out of. Get your kid a Bark phone. Keep them off of Roblox. I think that those of us who are tech experts sometimes think, well, I can supervise this, it's really important for them. But then they remove parental controls, and most parents wouldn't even have known. So, I mean, give it 10 years. Us advocates on the front lines, we're fighting this battle every day. We're fighting for safer regulation for online spaces. I'm doing ed tech bills. We're fighting for safer schools. We're fighting for safer chatbots. But it's going to take time. We're just hitting the tipping point where Congress has appetite. But in the meantime, don't let your kids be carnage. Just opt them out.
A
I sincerely appreciate you saying that. I wish I had the confidence and the knowledge 10 years ago, when my son was 7, to just say no, to delay. As Chris says, delay is the way. Even though all his friends had access, and, you know, I wanted him to fit in. I didn't want him to be left out as an only child. In hindsight, I wish he was left out of the bullying, the ability to talk to strangers, the exposure to graphic, violent, sexual content, mature themes that his heart and mind weren't ready for. And I worked in this space. If anybody should have known, it was me. And I still made mistakes. And it's just not fair. It's not fair. Parenting is already so hard. And children, as you said, are the carnage, the collateral damage in this. They deserve to have a protected childhood, and they don't right now. So it's up to people like you, me, Chris McKenna, Nicky Reisberg of Scrolling to Death, Family IT Guy. There's so many people in this space that are working hard. And actually, speaking of Family IT Guy, we had him on the show a few weeks ago, and he has created a guide to help parents turn on parental controls if their kids have an iPhone. It's like 83 pages long. Who has time for that?
B
I saw that going viral on Facebook. I'm like 83 pages and like 220 screenshots or something. Ridiculous.
A
Nope.
B
Like, and then, I mean, on top of that, all of these companies are pretending parental controls are the answer. So you've got Apple's parental controls, and then Roblox's parental controls, and then TikTok's parental controls, and Instagram's parental controls, and the Chromebook's parental controls. And everyone's saying, just be a parent. Like, I get so many trolls who tell me, just be a parent, why do you expect the government to do this? Or companies? I'm like, are you kidding me? Do you know what we're up against? Like, literally 60 different sets of parental controls, none of which really work. And we're told to just parent better. Like, it's crazy.
A
Okay, yeah, that's like expecting us to be a certified physician, dentist, sports coach, you know, algebra expert, all the things we're expected to do, right? And now you have to be an IT expert, an online safety expert, be ahead of trends with AI and chatbots. It's just too much. We need help. It's not a parent's job solely. It's not the tech companies' job solely. It's a group effort between tech, parents, legislators, educators, mental health care, and other healthcare professionals. It's a collective effort. So anyway, it's going to take time, but in the meantime, yes, delay, opt out, educate yourself, reach out to your local legislator and advocate for what you believe is the right thing for your children. And another free, easy tip that I love is just keep connected tech out of your children's bedrooms and overnight. They don't need a smartphone for an alarm clock or to listen to music. Protect those spaces where you wouldn't invite strangers into anyway, right? Like, would you invite a 30 year old dude into your child's bedroom to play a game with him? No. So why are you letting that happen digitally? Don't do that.
B
Yeah, 100%. And the thing that we do so much of is try to help people understand this by bringing the physical world into the digital world. Like, would you let your kid go to a bank and sign a 50 page contract with a foreign adversary? Would that bank be in trouble? Well, then why isn't Apple in trouble? You know? And I mean, what if somebody was wildly allergic to peanuts and Walmart was labeling all their cookies as "this doesn't have peanuts" when they knew it did? Why are we allowing Apple to rate Snapchat as safe for 13 year olds when they know it's not? So I think anytime we can make a parallel between what we're used to in the physical world and the digital world, that's where we help people understand how crazy what we're experiencing actually is.
A
Yes. Bring it back to common sense. Like, cars have seat belts now because of the people that died. Children need to wear helmets when they ride a bike. Like, it just makes sense. So looking ahead, do you have hope? Do you have any hope? And if so, what gives you hope in this fight? And how can our listeners and viewers and followers support the work that you're doing at the Digital Childhood Institute?
B
Yeah, I mean, I do have a ton of hope, and there have been times when I have not had hope, so I'm not just saying that. I feel like five years ago there was no appetite for anyone to change. It was the golden era of tech. Everyone was in a bromance with the tech bros, right? You know, you look at Congress and they're just like, oh yeah, self regulation is the way. And it was like it didn't matter what we did. When we did our very first congressional hearing, the one Chris McKenna testified at in 2019, and we're talking about all of the kids being extorted on Instagram and the horrible things, I mean, the legislators were like, well, I don't know that we need legislation for that, we'll let the tech companies know. So it has been a complete paradigm shift. Now states cannot regulate fast enough. I mean, every state's coming out with like 30, 40 new bills every year regulating tech. So there has been a shift, a good shift. So I would say to your listeners, all of your moms and dads at home: go get involved in politics.
A
Yes.
B
I'm a stay at home mom of five kids, right? I graduated in nursing, for crying out loud.
A
And you made Google change policy.
B
Yes.
A
We can do anything.
B
Moms, particularly, like, there is a savageness about moms because we are protecting our baby cubs. And legislators do not want to get in the way of a mom. They don't. They would rather get in the way of a tech lobbyist who's been sending them Christmas presents for 10 years than a mom. Right.
A
We.
B
We bring this level of visceral anger that they are scared of, and they want to help us. There's this shared humanity when you're talking to a mom that you do not get from anybody else. So I would hope that all of your listeners turn into activists, because never underestimate your power. When I did that LinkedIn post on Google, I had less than 500 followers. So.
A
Dang.
B
Wow.
A
See, this gives me hope, right? That moms can change the world. You don't have to be the president of a country or, you know, a billionaire, which you might be, I don't know, to make things happen. It really, really is amazing and commendable, and you're helping to protect childhood. And that's what we need to do every day, those of us who know better. And there are sometimes days I wish I didn't know everything that I know about the Internet, but here we are. You're very humble, but I want to make sure that we are supporting you and what you're doing. In addition to driving people to reach out to their local legislators and get involved in politics to advocate for safer childhood policies, how can we support your organization?
B
Like, I mean, this is something that I think you would do at Bark: if you can keep spreading the message, posting, retweeting Chris, me, anybody. Or, you know, if there's a hearing, if there are local people willing to come testify, even if you have no experience, we can help you with draft testimony of what you need to say. So I would say, let's continue this collective bond. Keep interested, keep watching everyone's stuff on social media. That's really the best way to support us right now.
A
Okay, let me pull this up. Is digitalchildhoodinstitute.org the best place to begin connecting and following your work?
B
Yes, absolutely. Or follow me on LinkedIn. That's kind of my hottest.
A
Great.
B
I've got the Roblox post kind of going semi-viral right now. I think I have 150,000 views?
A
Heck, yeah. Heck, yeah.
B
Yeah. Follow me on LinkedIn: Melissa McKay, Digital Childhood Institute.
A
Amazing. All right, well, we'll be sure to link to your website and your LinkedIn profile. And just thank you so much. Thank you for coming on. Thank you for what you're doing. And if I personally or we at Bark or our community of over 600,000 parents on Facebook alone can help you, do not hesitate to ask.
B
I will.
A
Melissa, is there anything that we didn't cover today that you think we could or should?
B
No, I think we covered everything very well, and I appreciate the opportunity to be on. I've been a huge fan of Bark and yours for, I don't know, what, 10 years now? You're just doing things that other people aren't doing. Our family uses Bark. My kids' school uses Bark. So what you are doing is changing lives, even of the activists, those of us on the front lines. You're protecting our kids.
A
Okay, well, I wasn't planning on crying this afternoon, and I'm going to try not to do that right now, because makeup. But I really appreciate your kind words. And we have such an amazing team. Our team is putting children first and really trying to help parents, because it is so hard. And we're going to keep fighting as long as we can. So thank you, Melissa.
Episode Title: Melissa McKay on Taking On Big Tech and Fighting for Kids' Digital Safety
Host: Titania Jordan (Bark Technologies)
Guest: Melissa McKay (President, Digital Childhood Institute)
Release Date: February 20, 2026
This episode dives deep into the current state of children’s online safety, as veteran advocate Melissa McKay joins Titania Jordan to discuss her recent viral confrontation with Google over parental controls, troubling trends with platforms like Roblox, and what families and advocates can do to push for real accountability and safer digital spaces for kids.
McKay shares how tech companies’ practices often undermine parental authority and child safety, the limitations of outdated laws, the tactics she and other advocates are using to drive change, and practical advice for overwhelmed families. The conversation is candid, urgent, and full of actionable insights for parents, policymakers, and anyone invested in protecting children online.
[01:11–02:17]
[02:39–04:00]
[04:20–08:53]
Google emailed her 12-year-old son instructions to disable parental controls—directly encouraging circumvention of parental supervision.
The parent-facing email was deceptive, suggesting controls would remain.
The child-facing email explicitly showed how to disable controls without parental involvement.
McKay’s LinkedIn post exposing this practice went viral (2 million+ views), leading to Google reversing its policy within days.
Quote [05:19]:
“When you click on that hyperlink to explain to the kid, click on this, go here, do this. And then we're going to remove the parental controls without your parents consent. You are empowered. Your parent cannot control this.” —Melissa McKay
Quote [07:50]:
“I think every parent just feels like tech being able to email their 12 year old child should be out of bounds, right?” —Melissa McKay
[10:58–11:48]
Most companies don’t directly contact minors about bypassing parents—compares it to a school counselor telling a child to ignore parental wishes.
Deems it “one of the most predatory corporate practices” she’s seen.
Quote [10:58]: “You think about schools, how would you feel if a school counselor or a teacher started emailing your 12-year-old kid and basically said... you are empowered to do whatever you want?” —Melissa McKay
[11:48–14:56]
Roblox now automatically lifts parental controls at age 13, hidden in a generic update email to parents.
Parents lose restriction privileges, but kids retain any payment method linked to their account—without parent notification of spending.
The only recourse is deleting the child’s account.
Raises concerns over Roblox’s new face scan age-verification, which is already being gamed and doesn’t require parental consent.
Quote [13:18]: “They're kidnapping my child with my credit card.” —Melissa McKay
[15:37–18:51]
McKay filed lengthy FTC complaints, focusing on the platforms' deceptive age ratings and lack of accurate child protections.
Platforms often rate apps as safe for kids when they feature explicit or predatory content (e.g., Grok, Snapchat).
Apple/Google have power but shirk responsibility as true ‘gatekeepers.’
Argues if they just acted on their existing knowledge, most major risks and exploitation would be dramatically reduced.
Quote [16:36]:
“Apple and Google have actual knowledge about all the exploitation that's happening on these platforms… and they're facilitating a rating system that allows these apps to be rated safe for children.” —Melissa McKay
[19:31–20:40]
COPPA (the Children’s Online Privacy Protection Act), now 25 years old, sets 13 as the age threshold below which collecting data requires parental consent.
Tech companies misapply this as a rationale to remove all parental controls at 13—confusing ‘data collection’ with ‘online adulthood.’
Companies weaponize age 13 to evade responsibility, contrary to the law’s actual intent and scope.
Quote [19:47]:
“COPPA made 13-year-olds adults. That's not what it says. It doesn't say kids can sign contracts. It doesn't say that you can remove parental controls.” —Melissa McKay
[20:40–27:58]
Fix App Ratings: Exposed how apps rated as safe for young teens featured adult content; ran mock child accounts and viral investigations.
Legislation efforts: the Utah App Store Accountability Act.
Tech industry heavily opposes, deploying extensive lobbying and disinformation, sometimes painting the legislation as driven by “porn companies” to generate opposition.
Quote [26:41]: “Anybody who has an app does not want to be regulated. They don't want to know when they're dealing with a child.” —Melissa McKay
[29:31–30:36]
Even ‘good’ insiders in tech companies struggle—trust and safety roles are last in policy priority behind shareholders and legal staff.
Positive, well-researched initiatives (like Apple’s abandoned CSAM detection) are often scrapped when privacy or profit is threatened.
Companies present parental controls as progress, but they are often superficial.
Quote [30:36]: “I personally know that there were people who had worked on that CSAM policy for years and were so devastated when Apple walked it back...” —Melissa McKay
[33:49–35:08]
“Dumber” smartphones (locked-down, child-focused phones like Bark’s) are safer until regulation catches up.
Even tech experts struggle with controls—average parents should ‘opt out’ where possible to keep kids safe.
Delay tech access wherever possible (“delay is the way”).
Recognizes that parental controls are convoluted, inconsistent, and often ineffective; broader systemic change is needed.
Quote [34:08]: “Don’t let your kids be carnage. Just opt them out.” —Melissa McKay
[36:36–39:22]
Parental controls are fragmented across dozens of platforms—no parent can realistically manage them all.
Keep connected devices out of bedrooms and overnight.
Draw analogies to physical world safety regulations (peanut allergies, seatbelts) to highlight the absurdity of unchecked digital risks.
Quote [38:35]:
“Would you invite a 30-year-old dude into your child's bedroom to play a game with him? Like, no. So why are you letting that happen digitally?” —Titania Jordan
[39:47–44:18]
Major culture shift: Five years ago, there was “no appetite” for regulation—now every state is moving urgently.
Moms and parents have unique power in advocating for safer childhoods—legislators listen to “mama bear” energy more than corporate lobbyists.
Quote [41:13]:
“Moms, particularly, like, there is a savageness about moms because we are protecting our baby cubs. And legislators do not want to get in the way of a mom.” —Melissa McKay
Grassroots efforts—posting, sharing, testifying—truly matter, as shown by Google’s rapid policy reversal after one viral post.
[43:16–44:18]
On industry self-interest:
“Unless you regulate them... No amount of public shame, no amount of letter writing really moves the needle.” —Melissa McKay [03:20]
On parental controls being revoked:
“They’re kidnapping my child with my credit card.” —Melissa McKay [13:18]
On true industry motivation:
“Anybody who has an app does not want to be regulated. They don't want to know when they're dealing with a child.” —Melissa McKay [26:41]
On the power of advocacy:
“When I did that LinkedIn post on Google, I had less than 500 followers.” —Melissa McKay [42:09]
For more, follow Melissa McKay on LinkedIn and visit digitalchildhoodinstitute.org to get involved or stay updated on legislation and campaigns.