
A
We just released Bark's 2025 annual report, and the findings are both eye-opening and, frankly, concerning this year. We analyzed a record-breaking 11.1 billion online activities across texts, emails, and more than 30 apps and social media platforms, making this the largest publicly available data set on children's digital safety in the industry and across the globe. What we discovered reveals a troubling paradox. While predator alerts in monitored spaces have dropped 60% since 2021, national sextortion cases are surging. The danger hasn't disappeared. It's just migrated to places where parents and safety tools can't see it. At the same time, we're seeing an unprecedented mental health crisis among our youngest users, with tween engagement in content about disordered eating skyrocketing by 650% in just four years. To be clear, so everybody's on the same page: tweens are children ages 8 to 12. Today, we're breaking down what these numbers really mean, where the biggest risks are hiding, and what parents can do to protect their kids in 2026.
B
All right, I love when our annual report comes out. I know it's sometimes hard to digest everything that we uncover and everything that we're sharing, but it's such important information, and in a weird way I look forward every year to sharing it with my friends who are now starting to have kids. And I'm like, hey, just some light reading when you have some time. Yeah, let me know if you have questions. But this year, 11.1 billion activities analyzed. I think in 2024, it was around 7.4 billion. What does processing that amount of data tell us that smaller data sets couldn't possibly?
A
I love that you surfaced this, and I share the same sentiment. When our annual report comes out, it's like I get a shot of adrenaline and I just can't stop working, because I'm like, everybody has to know about this. There is nothing more important than the rate at which all our children are being harmed by these places and spaces and people. So let's share. To your question, though, the real value of a data set this large is that it turns questions and concerns into clarity. As we know, people can manipulate data, right? Data can be manipulated and shaped to tell a certain story depending on the narrative that's wished to be the outcome. But when you're processing billions of interactions, patterns stop being anecdotes and they start being reality. Small data sets tell stories, but large data sets like ours show you what's actually happening. And it lets us detect risks early, understand trends as they emerge, and give parents advice that's based on real behavior, not guesses. And we might go into this later, but I just want to jump to the fact that Instagram, Meta, WhatsApp, Snapchat, TikTok, Reddit, Roblox, all of these platforms have data sets like this. They know how many children are on their platform, children under 13. They know how often children are searching for or encountering harmful content and harmful people. And they do not have to disclose what's happening, nor will they, unless pressured by the law or subpoena.
B
Yeah.
A
And how can we help our children if we aren't transparent about what they're encountering? I think it's a moral issue, and I'm just thankful that we are able to do this. And actually, I was speaking at a school last night with Ben Halpert of Savvy Cyber Kids. Shout out, Ben, you're awesome. And he was saying that the waiting period right now for a subpoena for Snap is 18 months.
B
Oh my God.
A
So unless your child is chained up in a basement and about to be murdered, your request does not get escalated to law enforcement. You gotta wait a year and a half to figure out what happened regarding a lot of things. And that's just on Snap. And that's not okay. So.
B
Yeah.
A
18 months.
B
Yeah. I'm actually speechless.
A
I know. It's just not okay. But that's why, that's why Snap settled. Yeah.
B
Exactly, the recent lawsuit. Yeah. No matter, you know, being subpoenaed, being pressed by the law, they still don't really care. They figure out a way around it. They figure out other things to kind of throw out there that might look nice.
A
Like their family safety center.
B
Yeah. Or, like, you know, some new features that Instagram's gonna roll out to alert parents to kids searching for self-harm or suicidal ideation-related content. Okay, so how are they working? What are they actually doing? Tell us more.
A
Why didn't you do that ten and a half years ago when we did it? You could have. Why is it limited to frequent searches? Why is it only happening if it's in a teen account? And that's if Instagram even knows your child and that you're connected to your child. Yep. Because who are they going to alert if your child just signs up and it's not connected to you? Like, anyway. And also, Adam Mosseri and Mark Zuckerberg were just, you know, in court having to testify as to the why behind some of their design and policy choices. So it's no surprise that they rolled that out this week. I mean, they're in the hot seat.
B
Yeah. Perfect timing as always. They find a way to just drop it when they're under a serious lamp, being exposed by so many people. And they're like, our only way out is to make things up and roll out features that are going to help protect kids. And it's like, you've had years to do this. Why now?
A
Yeah, you should be proactive.
B
Yeah, 100%. I know we could rabbit hole about that all day long. But one of the most shocking findings is what we've called the predation paradox. You know, alerts have dropped 60% in monitored spaces. That's amazing. But sextortion cases are still rising nationally. So what's really happening here? Can you help kind of uncover that for parents?
A
Yeah. So, yes, predation alerts have dropped, but please keep in mind that's through platforms and places and spaces online that we're able to monitor. That means there are a lot of places that are not monitored, that don't have safety precautions and parental controls implemented, that kids are able to go to. And if kids are there, predators are there. And one of the signs indicating that predation hasn't dropped or stopped, it's just not being seen yet, is the rise in sextortion cases. Go Google it. The FBI has reported an increase in sextortion. It is a huge problem. And predators are smart. There are entire crime rings. They're using AI to make their crimes go further, faster, and impact even more children. So parents should be concerned, because children are being targeted in places that some parents have never even heard of or spent any time on whatsoever.
B
Yep. And I think, to your point, and this is something we talk about often, predators will go where the kids go, and predators will try to outsmart tools like Bark, just the way that a kid might try and get around things by using different slang terms or emoji. And luckily and thankfully for us, Bark's algorithm knows those things. But still, you know, I'm of the belief that if a criminal wants to perform a criminal act, they're going to. And that's what predators are.
A
I am so glad that you made that point, because I was actually talking to a lawyer who specializes in online harassment and, you know, restraining orders and all kinds of stuff, and he was like, you can have the best gated neighborhood with the highest fence and the sharpest spikes and an alligator-filled moat, but bad people are going to find a way in. And we have to think about that when it comes to digital spaces, too. It's not a matter of if, but when, your child will encounter what's called a tricky person online. And so they need to be protected and prepared for when that happens. Protected meaning no connected tech in the bedrooms or behind closed doors. If it has a screen and it can access the Internet, it is not for a child's bedroom. It's just not. Back to Ben. We just spoke together last night, so it's fresh in my mind. He was like, the only exception to this is something like a Kindle e-reader, where they can read a book but not get on the Internet, if they really have to have a screen in their room. And I was like, okay, that's fair. But they need an alarm clock, they don't need a smartphone, right? So, no connected tech behind closed doors or in the bedrooms. Make sure that they just don't have full, unfettered access to all the apps, all the tech. Even on gaming platforms and consoles, make sure they can't just play with anybody who's online at the same time. You've got to be proactive ahead of time to set those controls, filters, time limits, access. Just like you lock your front door, hopefully, right? But then also it's the conversations. It doesn't matter if you lock your front door if your kid opens it when somebody knocks on it, right? So make sure that you talk to them about the fact that, unfortunately, not everybody in this world has good intentions, even if they seem really nice.
And so here's our plan for if and when somebody asks you something that makes you uncomfortable or sends you a picture of their body parts or, fill in the blank. You didn't do anything wrong. There's no shame, you're not in trouble. But we've gotta have a plan. Just like we have a plan for if you catch on fire: you stop, drop, and roll. Somebody's contacting you online that you don't know? Here's what we do. And again, you're not in trouble. But this is how we protect you. This is how we protect your younger brothers and sisters. This is how we protect your friends whose parents maybe haven't had this conversation with them yet.
B
Yeah, I love it. I want to go back to something really quick. You mentioned, you know, these predators are going to different places, different apps and platforms that might not have any parental controls at all built into them, or, you know, places that just don't want to work with companies like us at Bark. Do you have any examples of what those platforms might be, or just kind of a general idea of, like, these are the types of places these people are going?
A
So first, I'm going to start with 2025, back to the annual report, the reason we're filming this particular episode right now. The top five apps that Bark flagged for grooming are: Number one, Snapchat. So even though Snapchat won't work with us, we're still able to monitor it. So take that, Snapchat. And that's the number one place where grooming is happening. Confirmed grooming, not "this might be" grooming. Number two, Instagram, right behind. And again, Instagram revoked Bark's access not too long ago because, well, we know why. You know, they think they can maybe do a better job of it, but clearly they haven't. And we can still monitor it. So take that. And yeah, number two, Instagram, flagged for grooming, followed by Discord. Discord's number three. How many parents listening to this, watching this, haven't even heard of Discord, much less spent time on Discord? That's an example of a place where parents really aren't spending time, but the predators are. Number four, GroupMe. And then number five, Reddit. Reddit continues to blow up. It's super entertaining and helpful as an adult, but predators are there and kids don't need to be there. So those are the known places where grooming is happening. So, Sam, don't you love it when people, like, say your name and you're like, what?
B
Yeah, it's kind of like when you hear your parents say your name and then they do, like, the full name. So sometimes it's a little jarring, right? In this case, I'm good.
A
But I'm about to get real serious right now, because this is so important. One of the biggest misconceptions is that grooming happens on just a few apps. In reality, we see predators start conversations anywhere that kids can communicate: gaming platforms (hello, Roblox), live stream comments, and social apps. And then they move those conversations into private messaging. It's not about the platform. It's about access to kids. Predators go where the kids go. So they might be on, you know, YouTube or Twitch or something, and they get a comment, and then that comment encourages them to go to WhatsApp or Kik or Signal or Telegram. And a lot of these platforms claim to be end-to-end encrypted, which means nobody can get access, not even law enforcement, unless, you know, through major, major subpoenas. It's so convoluted.
B
Right.
A
So again, outside of the top five places where we know predation is happening, because of the alerts we have sent to parents about predators talking to their kids: any game that your child plays that has in-game chat, text or voice, that offers the ability for your child to have private messaging between players. And it can be so initially, seemingly benign, with friend requests and team play. Roblox, Fortnite, Minecraft, Call of Duty. Gaming platforms are one of the most overlooked places that predators meet kids, because parents think of them as a toy, but they're really social networks. Then the next thing, and I know I touched on it, but I just want to go a little bit deeper: live streaming and video platforms. TikTok, YouTube, Twitch. Predators will use comments and private messaging to initiate contact and, again, take kids to other places. The third thing I wanted to mention: encrypted or private messaging apps. These are often used after initial contact is made elsewhere, where they really try to get the kids to move to Telegram, WhatsApp, Signal, and Kik, because it is so hard for law enforcement and parental monitoring apps to get access to those platforms. Finally, just friend-finding and chat apps. Any app that's marketed as designed specifically to help you find friends or connect with strangers or help you find people with similar interests, that's a big, big red flag. You know, there are so many of them, like Wizz, Yubo, Monkey, OmeTV, if I'm even saying that right. But this is one of the reasons why your child should not have the ability to just download any app from the App Store or Google Play Store without your approval. It's why, if you get your kid a Bark Phone, you have to approve every single app download, so that you can be that first line of defense.
B
Which is important, because, to your point, I'm sure you listed off a handful of apps where parents are like, hold on, what?
A
Yeah.
B
It's like you're speaking a different language.
A
Yeah. And if we haven't made you sick to your stomach enough yet, these are also some more overlooked places parents don't expect. School collaboration tools, right? You may have protected your kid up, down, left, and right, but if they have been given a school-issued device or account and it's not locked down, and a lot of them aren't, school collaboration tools that have chat features can be where it starts. We have alerted schools to predation happening on school-issued devices and accounts. Also Pinterest messaging, Spotify's shared playlist and messaging features, and then other online forums, fan communities, hobby groups. We just have to be ever present and ever vigilant.
B
Yeah. Now is not the time to pump the brakes. It's time to go full speed ahead, focus on these things, and educate yourself. I was actually telling someone the other day, just in the three years alone that I've been at Bark, the shift that I've seen in parents, and in the way that we speak to parents, is huge, because there's so much more awareness around the fact that this is something you need to be mindful of and educated on. I feel like we have to speak a lot less to the parents who are like, be free, do whatever you want. There's so much more of, okay, I know these issues exist, I don't really know much about them, but I want to be as educated as I can. And as much as I hate that that has to be the case, I'm happy that it is, because it just means that more and more kids are going to be protected online.
A
Yeah, totally.
B
Let's talk about Sammy's Law. And I'll let you, for the viewers that don't know, I'll let you give them a quick overview of it. But how would legislation like Sammy's Law help solve the visibility problem that parents are facing right now?
A
Yeah. So I'll start with just, what is Sammy's Law? It's named after Sammy Chapman, who died from a fentanyl poisoning arranged via Snap. May his sweet soul rest in peace. This proposed federal legislation requires large social media companies to provide APIs to third-party tools like Bark and others, allowing parents to monitor their children's interactions, for children under 17, for, among other things, threats of suicide, eating disorders, sexual exploitation, sextortion, trafficking, and then, of course, drugs and alcohol. So, I mean, any parent who hears about this, I would think, would be like, well, yeah, why hasn't this been pushed through yet?
B
Right.
A
And it's because big tech's lobbying efforts are strong and have a lot of money behind them. But I really think that the truth will prevail and the right thing will be done for parents. And so, more behind the why behind Sammy's Law: you know, it'd be one thing if social media companies and big tech in general could be held liable and accountable for the harms happening to children on their platforms, much like, you know, a recall of a bike or a cereal. It's not that way yet. So the laws are outdated. Children are not protected under these laws that are 20-something years old. Also, the parental controls and safety measures and content moderation and blah, blah, blah that these platforms have are not adequate. They've made cuts to their content moderation and trust and safety teams, not beefed them up. It's so sad. It's infuriating, and it's not right. And so one of the biggest challenges that parents face today is that so much of what kids see online, and who they are surfaced to connect with online, is driven by algorithms that parents can't see or control. Sammy's Law will help to address that visibility gap by pushing for more transparency around how content is recommended. You know, parents can't guide what they can't see, and this legislation will be critical for opening up that black box. I liken it to a strip mall where every storefront is just something bad. Store one: prostitutes, you know, luring your children and asking them to send nudes and engage in sexual acts. Store two: drug dealers trying to get your kids to try pills and vapes and all kinds of stuff, many of them laced with fentanyl. Store three is just everybody your kid cares about bullying them, saying terrible things about them. Store four is essentially a shop encouraging you to not eat, restrict your calories, take these pills to speed up your metabolism, and yada, yada, yada.
Store five is like all the beauty hacks that your kid doesn't need, selling them lies about how to be perfect in four weeks or less. And it's just supplements and vitamins and crazy, crazy town. I don't know what store we're on now. Seven, maybe? I've lost count. But store seven: how to harm yourself, how to hide signs of harming yourself. Like, the worst stuff.
B
Yep.
A
No parent would drive up and drop their kid off at that store, that strip mall. They would not do that. But that's what we're doing when we hand our kid unregulated, unmonitored, unsafe tech. And Sammy's Law would help to at least turn the lights on in every one of those stores.
B
Yes.
A
At least let us see what's happening and decide if that's right for our child. Because the social media companies have not made good choices. They have had the opportunity to say, hey, here's who and what we're gonna allow on our platform and who and what we're gonna allow to interact with children. They have not done a good job of that. They've done a terrible job of that. We let parents protect their children. That's what we're on this planet to do.
B
Right. And it's actually, when you make it as simple as that and you think about it from the perspective of the strip mall analogy, you have to sit back and wonder, like, why are you so resistant to that?
A
Right.
B
Big tech, like, why don't you want us to do that? I would love to know why, because the answer seems pretty simple. And all of these big tech founders are parents. They have kids, you know, the majority of them, which is wild. And I can't remember who it was. I want to say it was Evan Spiegel. Yeah, I want to say it was him. He ran that stupid campaign that Snapchat is not social media, which is just total.
A
The dumbest.
B
Yeah.
A
Do they have, like, Razzies for ads? I mean, like, whatever. Like, the dumbest.
B
Yeah.
A
Ad campaign. Yeah.
B
Like, can I just flag it as false information every single time I see it?
A
So dumb.
B
But he, I can't remember if it was him or his wife, said in an interview, we don't let our kids use social media, but they're allowed to use Snapchat. Put aside the part where they let them use Snapchat, whatever. Maybe they have special accounts because their dad created it. I have no idea. But you don't allow your children to use Snapchat, yet you're still sitting in court testifying that you don't need to allow us to have access to these things. And you're saying, tough luck, more kids are just, unfortunately, going to die by suicide or die by drug overdoses, whatever it might be. They have the ability to very quickly give that ability to the parent. Because, mind you, if I understand Sammy's Law correctly, and tell me if I'm wrong, parents still have to opt into it. It's not like an automatic thing, right?
A
No, it's not automatic. It's opt-in. It's, at least make it available, right? It's like.
B
And it just seems so simple.
A
Yeah. Like, we're not asking the platforms to take on more work, right? We're asking them to let parents protect. But they will lose users if kids get a whiff that their parents will have insight into some of the stuff that isn't good for them, right? They're not going to want to be there, and the platforms don't want to lose those users, which sucks for them,
B
but it means safer kids. And I just can't wrap my head around the concept of turning a blind eye to it and just being like, well, profit. Profit over
A
protecting kids. Profit, engagement, user numbers over children's lives. It's crazy.
B
Yep.
A
Yeah.
B
One of the stores that you talked about had to do with disordered eating-related things. You know, we saw a staggering, and I want to say this was from 2020 to 2025's findings, a 650% increase in tween engagement with disordered eating-related content. That's insane.
A
Horrific.
B
If that number doesn't make your ears perk up, I don't know what number does. But, you know, what's driving this, and why are we seeing it in younger and younger kids?
A
Okay, so this issue does not get enough attention in the media. You hear the term eating disorder or disordered eating or bulimia or anorexia and you just, you think, gosh, that's really sad. And that's probably affecting a small number of people. Not my kid, not me. Moving on. It's a big deal. It's a big deal. When you are not getting enough calories or you are throwing up the calories you are consuming. You are affecting every major organ and system in your body. You are potentially contributing to lifelong health issues, including death.
B
Right.
A
So it's a public health issue first and foremost. Next is just the confidence piece. If your child, during some of their most awkward phases and stages, has extra pressure on them to live up to an ideal that isn't even realistic, that's a no-win situation. I mean, I remember going from sixth to eighth grade, and things changed fast. And there was fat in places that I never knew could have fat, acne, my nose changed shape, all kinds of stuff, right? Yet I was looking through, I think, YM and Teen or some magazine, Cosmo, on, like, how to have that thigh gap and how to have that six-pack in time for spring break. And it's like, no. And that's the only stuff I had in my ear. If you have a device that goes with you everywhere and an algorithmic feed that preys upon your insecurity, and we know it does because of data. We know it does. There have been many reports showing that Meta, amongst others, knew that this sort of content harmed kids and still served it to them anyway, 'cause it kept them in the app longer. It's even worse. It's exacerbating it. You know, if everybody's using filters and making their waist look thinner and their cheekbones look higher and their skin look smoother and their teeth whiter, it's so stupid. It's so wrong, and it's all a lie. So, yeah, it's not good. And our children are struggling. And it's not just girls, it's boys, too. And we've got to pay attention to this. We've got to help build up our children's sense of self-worth so it isn't tied to their appearance. It's tied to their intellect, their kindness, their heart, their character, their interests. Our culture has always been obsessed, you know, with beauty. And I fall for it, right? I mean, there's a reason why I brushed my hair and my teeth and put on makeup and a nice outfit instead of my pajamas for this podcast. But there's a balance, right? And our children do not have that balance anymore when they're inundated.
And it's really heartbreaking. So ask any pediatrician what they're seeing and they will tell you it's not good.
B
Yeah. And I think, you know, as much as I love content creators and influencers, even as a 33-year-old, I fall victim to the comparison trap. Like, how do they all look like that? Why don't I look like that? And it's, one, they have all sorts of money and lights and professional makeup artists. There's a content creator that I follow. I love her content, and I've had to be really intentional about not comparing myself to her.
A
Yeah.
B
But like an average day in the life is like getting a lymphatic massage at home for like three hours. And I'm like, well, no wonder I don't look like her. Like, I'm not doing that for three hours.
A
You know, comment "shred" below and I'll send you my exact six-step guide for getting sexy in six weeks.
B
Yeah, it's just, I feel the pressure as a, I would like to think, mostly developed, functioning adult. So I can't imagine teen girls and teen boys, I mean, tween girls and tween boys, feeling that pressure because of everything they see, hello, on social media.
A
Totally. And, like, when we had spring break, we had those disposable cameras, right? Like, we didn't even see what we looked like in our bathing suits until we got back.
B
Yeah. And it still took two weeks for them to develop. So you're like, oh, okay, got it.
A
Now it's like, look at me. It's the look at me culture all the time, and they're everywhere, and they're all over the Internet, and it's just like, make it stop.
B
Yep.
A
Yeah.
B
Okay, okay. I could go down a rabbit hole on that one, too. Same. So, as someone who grew up in what I dub the Tumblr era: it emerged as the number one platform for severe suicidal ideation and depression this year. What's happening there? Like, what's going on with Tumblr? I'll let you think on your answer here, and I'll give a little story. When I was a big Tumblr user, it was kind of like an online diary or journal, if you will, where you felt like you could share the things you were feeling, kind of using cryptic images, you know. And I don't know if it's still this way, but it was anonymous then, so people kind of just figured out who you were based on your username or based on the things you were saying. But it's kind of one of those places that you might not necessarily think of as social media. You know, earlier we talked about platforms like Reddit or Kik or WhatsApp; those are still social forums. So it's a bit of a tangent, but again, my question is, what's happening on Tumblr, and why did we see the shift to flagging for suicidal ideation and depression this year?
A
I'm really glad we're talking about this, because, you know, Tumblr doesn't just host mental health content. It clusters vulnerable kids together. And what makes Tumblr stand out is the density of vulnerable users. When a struggling teen lands there, they don't just see one or two concerning posts. They can enter an entire ecosystem built around depression and suicidal thinking. And that concentration effect is what makes it especially concerning. We see kids whose entire online identity becomes centered around being depressed or struggling, and you can't heal in that sort of environment. And instead of interrupting harmful content, Tumblr can sometimes reinforce it. Yeah, Tumblr doesn't just show you content. It helps you find your emotional tribe, which can be risky when the tribe is built around despair.
B
Yes.
A
Like, to back up, because it's important that parents get this: Tumblr has had a long and strong concentration of mental health content and communities, which can sometimes be supportive in some cases, right? Like, I remember 15 years ago stumbling across Tumblr and being like, wow, people are really, really laying it all out here.
B
Yeah. You know, opening up and sharing a lot.
A
Yeah. But it creates environments where distress becomes normalized and amplified. And research has found that Tumblr posts frequently contain themes like self-loathing, loneliness, self-harm, and suicide, with one analysis finding that 82% of sampled posts related to depression, suicide, or self-harm. That's not a place where we want our kids hanging out.
B
No.
A
There's a long-standing mental health subculture. Discussions of depression and self-harm are deeply embedded across communities. And because Tumblr allows relatively unfiltered expression, some communities end up romanticizing or reinforcing mental illness rather than helping users recover, which creates an echo chamber effect. So the TL;DR is: Tumblr is not for kids.
B
Right.
A
It's just not.
B
Yep. It's not at all. I almost kind of wish I could go back and see my posts from high school, because I remember it feeling very much like there was a community for that.
A
Yeah.
B
Sadness.
A
It felt safe and it felt dark.
B
Right. Yeah. Which is an odd combination, but it makes perfect sense.
A
Yeah.
B
So for the first time, Slack, which we're very familiar with.
A
Yes, we are.
B
That knock brush sound lives in my head. The brushing, like a knock.
A
Oh, to me it's a knock knock.
B
Yeah.
A
Okay.
B
Okay. I think they call it, like, a knock brush, probably.
A
That's.
B
I don't know. Yeah. Anyway, welcome to my world. So it was ranked number three this year for depression alerts, which is shocking. I mean, when I first saw it, I was like, wait a second, Slack? Right? But to your earlier point, you know, there are school-issued devices, and kids are using tools like GroupMe and Slack. But why do you think kids are going to those places to talk about things like mental health issues?
A
Yeah. I mean, when you're sitting in school all day and you have tech in front of you and it allows you to communicate with your peers, school communication tools are going to become social networks, right? And we've already talked about how not good social networks are for kids. Students are using Slack for group projects, clubs, robotics teams, student government, sports coordination. But it can feel like a safer, more private space, you know, and so kids feel like they can open up there. And when kids feel overwhelmed, they talk where it's easy and familiar. And increasingly, that means school communication platforms. Thankfully, Bark can monitor Slack, but parents really need to take a cold, hard look at what their child's school is letting them access and how they're monitoring and managing it. And if they have any questions, please reach out to us, because Bark offers our tech to every school in the nation for free. Right.
B
Which is incredible. Yeah. We've talked a lot today about Snapchat and Instagram, as we do most times we're doing a podcast episode. They're often at the top of the list for quite a few categories. You know, like, we already touched on earlier grooming, risky contacts. But, you know, is there anything else you can add that would explain why these two platforms continue to be at the top of lists year after year?
A
I mean, everybody knows about them, right? Snapchat and Instagram stay at the top because they combine massive teen adoption with private messaging and easy stranger access. And when a platform has all three, it naturally becomes the primary place for grooming and risky contact. Yeah, I mean, that's where they are.
B
Yep.
A
And their features play a role, too. For example, on Snap, there's Quick Add. Us adults, right? Have you been on Facebook and gotten surfaced something that looks like somebody friend requested you, but actually they're just showing you somebody they think you might want to connect with? Then you hit connect and you're like, oh, no, I just sent them a friend request. It's tricky, right? But the social media platform's goal is to keep you on it as long as possible and grow your social graph. The more connections are made, the more time you'll spend in the app, the more engagement will take place, and the more interesting it'll be to you. So say you're a predator. You go on Snap, you start friending as many kids as you can. Some of them are not wise and just quickly accept the adds. So now you're connected to enough kids, and then their friends see, oh, my friend Jimmy is connected to this person, Joe. They must be a cool kid. I'll accept their Quick Add. It just happens like that. It's by design. These apps are not designed to keep kids safe. They're designed to optimize connection and engagement, point blank, period.
B
All right, as we kind of wrap things up here, you're looking at all this data together: the predator migration, the mental health crisis, the rise of all this toxic content on platforms like Tumblr. If you could put it into one big takeaway for parents listening right now, what would that be?
A
Yeah, first I'd say take a deep breath.
B
Right?
A
So take a deep breath. It is not easy to be a parent in this sort of world right now, with all this connected tech and access. Now, per our data, the pattern across all of it shows that risk follows access. Wherever kids gather and communicate, the same risks eventually appear. So it's not specific to a certain platform; it's simply whether they have access, period. Being a good parent and trying to keep your kid as safe as possible online means staying involved, which you are, because you're listening to this. So good job. Share this with a friend, please. And then putting the right protections in place wherever your kids go, so safer tech when it's time for them to have tech. Delay, delay, delay. Our friend Chris McKenna of ProtectYoungEyes.com says delay is the way. It is okay if your 8th grader does not have Snapchat. It's okay if your 9th grader doesn't have Instagram. Listen to the wise words of Jonathan Haidt: delay social media until 16, and your children will be healthier and happier for it. They'll be mad at you, there will be tears, they will not think you are their favorite person, but they will thank you. We have heard so many parents say, my kid did not like this, but they got older and they came to me and said, thank you for making this decision for me. It was the best thing for me, and I'm so glad I didn't have this. They're going to say, hey, I'm missing out on such and such, or I'm not getting invited to prom or parties or included on these things because I don't have this app. And yes, they are going to miss out on some things, but they'll also be left out of the predation, the suicidal content, the dark mental health rabbit holes. And you want them to be left out of that. That's actually a good parenting choice. So don't be afraid to delay. Will social media become safer for kids? We will not hold our breath. The answer is delay.
And then when you have decided it's time for them, please don't just give it to them. Please use tools like Bark to monitor it. No connected tech in the bedrooms, behind closed doors. And please optimize for in-real-life encounters and experiences. I mean, there are so many kids coming out of college now, going to interview for jobs, jobs at Bark, and they can't even look you in the eye. Let's work on these social and emotional interpersonal skills before they're sucked into the world that we've all been sucked into.
B
Yeah. All right. You kind of answered this question, but I'll give it to you just in case there's anything else you've thought of, and I won't even give it a specific number, just to be safe. What are some concrete things parents can do today to better protect their kids, based on what we found in our 2025 data?
A
Three things can make a real difference, besides no connected tech behind closed doors: reduce stranger access, create visibility into your child's digital life, and stay engaged. When parents can see what's happening and kids feel comfortable talking, problems get caught earlier. Now, I know some of that sounds fluffy, right? You're like, stay engaged, what does that mean? I don't have time. So let me give you some specifics. For the first part, reduce stranger access. That means turn off public profiles, restrict DMs to friends only, delay social media for younger kids, and remove friends they don't actually know. One of the fastest ways to reduce that risk is to reduce stranger access. And if you don't know how to do any of this, just reach out, drop a comment, send us a DM. Yeah, we're okay with DMs from strangers. We'll help you. Then, to put visibility in place, you've got to enable the parental controls. There are free, built-in parental controls for a lot of the connected tech your child already uses. And if you don't want to use Bark's guides for how to turn all those on, feel free to use an independent third-party expert like Chris McKenna at ProtectYoungEyes.com. He has parental control guides for all of the connected tech your child can access. You've got to monitor all the different places, and again, Bark can help you with that, so it's not you having to do it manually and directly every day. And then use the device-level protections. You can't protect what you can't see. So think about how your kid can access XYZ, and start zoomed out, from how the Internet even gets into your home, to what's inside your home that can connect to the Internet, and go from there. We can help you. It's not easy, but we can help you. Don't hesitate to reach out to us. And then to the stay engaged part, which is probably the fluffiest thing I've said all day.
You know, like, what does that mean? Well, ask kids where they communicate. Like, you know, I love Facebook as your, you know, boomer mom, but where are your friends talking? What's the most popular app? Normalize conversations about uncomfortable situations. Nobody wants to talk about porn with their kid, but you've got to talk about how it can negatively impact their ability to have a healthy, consensual relationship when it's time for them. If they're like, wow, my mom or dad is talking to me like a friend, like a caring, wise friend, not like a "don't have sex until you're married" lecture, it's a different conversation. Right. Make sure kids know how to report problems, too. When they are in certain spaces, be like, hey, if you see something that's off, here's how you block, here's how you report. If it's not CSAM, child sexual abuse material, meaning anything of a child that is nude, take a screenshot so that you have proof, and I will help you. It's really not a matter of if but when your child is going to encounter harmful content and harmful people. And you should probably just expect that your child is going to make a mistake, because good kids make bad choices. They need to know that you will be there for them and with them if and when they do make a mistake, because that's parenting: helping them navigate. Like, hey, we made a bad choice here, let's figure out how to make it right.
B
Right?
A
I love it.
B
Any other parting words or thoughts?
A
Dude, this is a lot. I mean, we went way over the time I thought we would record in, and it's because it's so much. But it's time. Yes, we are seeing the tide turn. We are seeing things change. There is hope. There is so much more hope now for parents of, let's say, second to fifth graders than there was even three years ago. So stay strong. Help is coming. You're not alone. And if you want to talk to other parents who are right in this with you in real time, check out the Parenting in a Tech World Facebook group. It has over 660,000 parents in it. And I get the irony, right? I'm talking about going to Facebook. But Facebook is not the devil. Facebook groups are pretty cool. They can be helpful. It's just not for kids. So check out the Facebook group, and you can always connect with me at Bark. Find us online. We're easy to find, and we're here to help. Awesome.
This episode of Parenting in a Tech World dives deep into Bark Technologies' 2025 Annual Report, analyzing over 11 billion children's online activities to uncover urgent trends in digital safety, predator migration, mental health, and tech parenting strategies. Host Titania Jordan and her co-host discuss the paradoxes of online safety in 2026, highlight key platforms of concern for grooming and harmful content, cover recent legislative efforts like Sammy’s Law, and offer concrete, actionable advice for parents navigating the rapidly evolving digital landscape.
“Delay is the way.” (A, 41:40; quoting Chris McKenna and Jonathan Haidt)
“It’s really not a matter of if but when your child is going to encounter harmful content and harmful people.” (A, 47:43)
| Timestamp | Topic/Segment |
|-----------|---------------|
| 00:00–02:06 | Introduction & significance of Bark’s 2025 report |
| 04:09–05:44 | Platform transparency, subpoena delays, and tech industry gaps |
| 06:53–09:11 | The predator paradox; risks moving to unmonitored spaces |
| 12:06–16:44 | Specific platforms & overlooked risk zones |
| 19:37–24:44 | Sammy’s Law – rationale & the need for legislative intervention |
| 27:20–32:52 | Mental health crisis: disordered eating, comparison culture |
| 34:13–36:39 | Tumblr & Slack as mental health hazards |
| 39:23–41:31 | Why certain apps attract persistent risk |
| 41:31–44:31 | Top takeaways for parents; the “Delay is the way” mantra |
| 44:53–48:36 | Concrete steps for parents: restrict access, enable visibility |
| 48:45–End | Parting hope and support; how to connect with the Bark community |
The episode closes with an optimistic message: Hope is on the horizon as awareness grows, tools improve, and legislation advances. Parents are not alone. Stay vigilant, stay engaged, and reach out for support. (A, 48:45)