
A
Paul Raffile has been on the front lines of cybercrime and online exploitation, exposing the ways social media platforms are constantly putting kids at risk. Today we're talking with him about the steps parents can take to protect their kids, both online and in real life. Alright, so before we bring Paul in, I just have to tell you how excited I am to have him here in the studio shortly. This man is so brave and so bold and so smart, and I want to tell you about something that happened which I probably won't talk about with him, but you need to have this backstory. So back in April of 2024, I actually signed up to attend a webinar to learn about the state of child exploitation and sextortion online. It was a webinar put together by some pretty prominent, trusted people, and Paul was one of the speakers. And what I didn't know, and probably a lot of people didn't know, is that he had been offered a job. Meta had extended a job offer to him to help fight sextortion and protect kids on their platform. So he had this job offer from Meta, and he was part of this webinar to help educate and empower the world to help protect kids online. And after that webinar wrapped, his job offer was rescinded. So I will let you draw your own conclusions, but that was really telling. And what I think is such a miss on Meta's part is they rescinded his job offer, but that didn't silence his efforts. In fact, even more children are going to be safer online now because of that. So they, I think, made a bad choice. They should have hired him. They should have kept that offer to help make their platform safer, because they're not. But everybody else in the world gets to benefit now from that loss on their end. So, yeah, that's the backstory. If you want to look into that, just Google his name and you can see a lot of things that have been published about it. It's a really bad look for them.
But anyway, I think we're all going to win from today's episode. So with that, we're going to welcome Paul into the studio. Paul, I am so excited that you're here. Thank you so much. Your work to help prevent some of the most horrific things that are happening to children online is truly saving lives. And for those who may not know you yet, can you tell everybody a bit more about yourself and your background?
B
Yeah, for sure. Thank you so much for having me here. You know, this is kind of a new space, a new area to me as well. I started out my career tracking ISIS on social media.
A
Okay, wow. Wow, heavy. Go ahead.
B
So yeah, I've always been focused on these transnational organized crime and terrorist threats, really big, heavy topics. And that all changed a couple of years ago when I started to focus on the surge of financially motivated sextortion, viewing it as organized crime, through a lens of harm to tens or hundreds of thousands of people at scale. So that kind of catapulted me into this space, and I'm super happy to be here for this chat today.
A
Wow. How did you even get into that? I mean, what did you study? Self-taught? What was your career path? How did you even get there?
B
Yeah, so I was always interested in national security studies and global affairs. So that's what I studied in undergrad, then moved into working on a government contract to track ISIS on social media and monitor their activity. Then I moved over into the corporate world. But things really changed two years ago when one of my friends actually became a victim and was targeted by a sextortion scam. And he came to me as the kind of online investigations guy of the group. And you know, I didn't know much about this crime or about this scam at all. And as we'll talk about with this scam, you can understand why the victims are very perplexed about what to do. They don't know if they can block the criminal, they don't know how to exit the situation. So that's when I came in to try to assist with that situation. And where that led me was I started to look online at who this was happening to. What were other victims saying? What's the advice that victims should take? And I was shocked and surprised by a few things that I saw. First, I was shocked by the ages of some of these victims, 13 and 14 year olds, saying that they were being sexualized, sexually exploited, blackmailed, and forced to pay money, gift cards. I was also shocked by the trauma that a lot of these victims had. I was seeing comments from victims saying that they were thinking of ending their life because they didn't see any other way to end this situation, this exploitation. And lastly, which I think was the most important piece, these victims were all saying the same thing. It was happening on the same platforms, it was happening using the same exact methods and tactics, even down to the same scripted messages.
A
Oh my gosh.
B
Where victims all across the nation, and multiple nations, were getting these threatening messages from criminals that were the exact same, down to the same typos. So I knew that this wasn't just a whole bunch of isolated incidents. I started to look at this like organized crime and approach it in an organized fashion. The only way to combat organized crime is to be organized. So that was my effort in that regard.
A
What apps and platforms are the worst when it comes to allowing the exploitation of children today? Where is this happening?
B
Yeah, so most of the financial sextortion that we are seeing today is happening on Instagram, owned by Meta, Snapchat and TikTok.
A
Even with the parental controls that they
B
have released, even with all these parental controls and safety features, these three apps are making up the vast majority of financial sextortion scams. And that's by design. A lot of folks tend to think, oh, this crime can happen on any platform, any gaming device. I'll tell you, I have not heard of a single financial sextortion case starting on a gaming platform.
A
Wow.
B
It's almost all happening on Instagram, Snapchat and TikTok, by design. So for example, you think about Instagram: the moment a teen accepts a scammer's friend request or follow request, their entire social network, their entire followers list and following list, is exposed to that criminal. And we know that is the primary source of leverage that these criminals are using. As soon as they convince the teen or the child into a compromising situation, they use those lists as leverage. Don't block me, or else I'll share these compromising images with all of your followers, all of your friends. So we see that over on Instagram. Over on Snapchat, of course, it's known for disappearing photos. So teens tend to think that if they're going to engage in risky behavior, it's safer to do it on Snapchat. But of course, the criminals are screenshotting and screen-capturing that content as well.
A
And just by design, we have seen errors in judgment, I believe optimizing for growth over child safety. Whether it's Snapchat having the Quick Add feature of people they think your child might know, and they just add, add, add, add, add. Of course predators and sextortionists are thrown into that ring. Or we've seen from whistleblowers that predators and pedophiles. I just made a new word, predophiles. We'll copyright that. They were being shown suggested accounts to follow, and those were children. Children being served up to them. Like, let's start at the design level to make these apps safer for kids. But of course, they haven't done that yet. What are some of the common misconceptions or misunderstandings that parents have about sextortion? And while you think about the answer to that, I'll lead with just the fact that they don't think it could happen to their child, or they don't realize how often it's happening, or they think my kid's too smart. But good kids make bad choices, and they're up against biology. So. But please expand on that.
B
Yeah. So I think when we talk about the word sextortion, this term used to be applied several years ago to an ex: a former boyfriend or girlfriend would blackmail the other one into staying with them or performing sexual activity under the threat of releasing those images. So it was purely a sex crime at that point. Now, scammers have found a way to make this a financial crime, and that's when this crime has really, really exploded in recent years. So scammers have found a way to make this crime happen very quickly and to get them money extremely quickly. Virtually all of the financial sextortion that's targeting minors today is coming from a cybercrime entity out of West Africa called the Yahoo Boys, and they're operating on these mainstream social media platforms advertising their wealth that they're getting from the scam, unfortunately.
A
Why don't the platforms just shut down the accounts out of that area?
B
Well, you know, Nigeria in particular is Africa's largest economy, so it is very difficult because these criminals are so widespread across the area. They're not in a scam call center, right? That would make things a little bit easier. So these criminals are very dispersed, and what they've done is, you know, they've created Facebook groups, for example, where they share the methods of how they conduct
A
scams, which is so, so depraved. I mean, there are literal manuals online for how to groom, exploit, and harm children, and convince them to do atrocious things. I don't understand the humans that participate in that. Do they even have a soul or a conscience? Please continue.
B
That's right. And you know, some of these Facebook groups had over 20,000 members in them.
A
Oh, my gosh, Paul.
B
Public groups. So it's not like they couldn't have just looked. Meta could have looked and seen this.
A
Okay. It's just so disturbing, and it's so hard for everybody to wrap their brains around, but it's happening so much that the FBI has released multiple warnings to the public. This is a problem to pay attention to. This isn't clickbait. This isn't headlines to try to get traffic to a media entity. This is a problem. This is a danger. It is a real and present danger. It is affecting your children.
B
And according to the FBI, this is the fastest growing crime targeting kids today. So it's a clear and present danger.
A
Yeah, you would think that the people who have a responsibility and duty of care to protect kids would be all over it. Why is this not being stopped? For example, I went to the Child Rescue Coalition headquarters around 2020, before COVID, like January or February. And they pulled up a map of known IP addresses that were distributing or receiving child sexual abuse material. And I saw these little red dots continue to pop up on a map in real time. And my first question then was, why is there not a law enforcement official showing up at every single door where there's a red dot, banging down that door, taking that tech, arresting that person and stopping this? And they said that there aren't enough law enforcement officials, and there aren't enough law enforcement officials trained specifically around this crime, to combat it at scale. Not only that, but the laws that are meant to protect children essentially are not strong enough. And for these predators, these offenders, there aren't stringent enough laws to prevent them from going down this path. So I guess the same question applies: there are things that can be done to put a dent in this and to keep kids safer online, and they're not being done. I don't know if you want to talk to what those are or why, but, like, we need more info.
B
Yeah, I mean, I would have thought with the increased awareness and publicity in the past year that maybe the crime would have started to go down a bit in terms of numbers.
A
Right.
B
Unfortunately, the 2025 data shows this year will exceed 50,000 reported cases here in the U.S., and that's just the reported cases.
A
Right.
B
We know that these victims are feeling shame, fear, confusion. They don't know what to do, who to report to. So when I say 50,000 reported cases, I'm probably thinking closer to 10x that, half a million unreported cases. And compare that: back in 2021, there were only 139 reported cases of this crime here in the U.S. Oh, my gosh. So 139 to 50,000 in the span of just four years shows just how unprepared the tech platforms and law enforcement are to deal with this crime. And there's so much more that the platforms can be doing. For example, Meta can simply hide those friends and followers lists for all teen accounts by default. And they didn't. Instead, what they did was say, we're going to hide the followers lists from accounts that we detect might be sextorters. First of all, why are you not just blocking or quarantining those accounts, so they can't friend anybody?
A
Yeah, why are they allowed to even use the platform?
B
Secondly. Right, then why not just make this the privacy default for all teens?
A
Right?
B
Because of course they're not going to detect all the accounts. And you know, we've already seen teen suicides this year, unfortunately, after Meta rolled out this so-called protection. Yet somehow the criminals still had the victim's entire followers list screenshotted to blackmail them with. So there's tons of things they could be doing. Another thing Meta recently rolled out to protect against sextortion, they said, was their nudity protection feature. Now, it sounds great. Nudity protection. So what it is, is Meta's AI is scanning images that are being sent to or from teen accounts. This is on by default for all teens. And if it detects a nude image trying to be sent to or from a teen account, instead of blocking that image, there will be a little pop-up that says, are you sure you want to send this?
A
Right.
B
And any recipient can just tap to unblur the image, presumably of CSAM.
A
How is that a protection?
B
It's not, exactly. It's probably even worse on them, because they're now knowingly engaging in the transmission of child sexual abuse material. Their AI knows it's CSAM and is delivering it anyway.
A
I am so angry, so floored. Like, who in the world on that team, at a decision-making level, could think this is going to be good? It's not good. I mean, if I'm 13 and somebody that I'm interested in sends me a nude, but it's blurred, and it's like, are you sure you want to see this? Of course I do. Absolutely. I'm 13. My brain isn't fully formed. Like, okay, yeah, anyway.
B
And it's like, if their AI has the ability to detect and blur potential CSAM, then it has the ability to detect and block it.
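The point being made here can be shown with a toy sketch. This is purely hypothetical illustration code, not any platform's real moderation pipeline; the function name, threshold, and action labels are invented. Once a classifier has already scored an image, returning "block" instead of "blur" is the same one-line decision:

```python
# Hypothetical sketch: once a classifier scores an image, blocking costs no
# more than blurring. All names and values here are invented for
# illustration; this is not Meta's actual moderation logic.

def moderation_action(nudity_score: float, involves_minor: bool,
                      threshold: float = 0.9) -> str:
    """Decide what to do with an image flagged by a nudity classifier."""
    if nudity_score < threshold:
        return "deliver"               # below threshold: pass through
    if involves_minor:
        return "block_and_escalate"    # withhold delivery, report for review
    return "blur_with_warning"         # adults: warn-and-blur interstitial

# The blur-only behavior described in the interview would instead return
# "blur_with_warning" even when involves_minor is True.
```

The hard part is the classifier itself, but as the conversation notes, that part already exists in the blur feature; only the policy branch differs.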
A
Oh my gosh. So why not just block it 100%? And don't even get me started on Apple and the smoke and mirrors they're putting out there. I mean, the Bark Phone, when a child goes to try to take a nude photo or video, can realize that's what's happening and not even let that photo be taken. That media won't even be saved to the camera roll of that Android device. Apple could do that. Meta, Snapchat, TikTok, all of these platforms have the ability to make their tech safer for kids. If Bark can do it with our team of 100 people, it's insane that they're not doing that. But we know why, right? They make money off of the time spent in their app and on their platform. And our children are the products. So. You have been part of multiple sting operations in an effort to catch sextortion scammers, which, thank you for that work, because I've been a part of a team that has done that work too. Not as frequently as you have, and it's grueling, it's gross. You need to take some time after doing things like that to just reset and have hope for humanity. But from your perspective, what has been the hardest part of that? And then on the other hand, what's been the most rewarding?
B
Yeah. Thank you. I had a really exciting opportunity earlier this year to work with a Channel 4 documentary in the UK. And what we did was we tried to essentially scam the scammers. We created several Instagram accounts as young, you know, teen and young adult men and boys, who are really the targets of this scam. Over 90% of victims reported to NCMEC are boys age 14 to 17. So they are really primarily the target here. And so we created these decoy accounts to try to see if we would get sextorted on Instagram. And of course we did. And I was able to partner with, out in the UK, Jordan Stephens, who is an amazing hip hop artist and a really brave soul. He was kind of the star of this documentary. He voluntarily got himself blackmailed for this sting operation.
A
Oh, my gosh.
B
And so, you know, the DMs were coming in from these pretty women and girls his age. And of course, the conversation starts out very normal, but then it quickly turns flirtatious. And the criminals, the scammers, will actually send a nude photo first and then request the victim, the target, to do the same.
A
Right.
B
And the moment that happens, as soon as that teen or young man sends a photo, that's when the blackmail begins. And exactly as we predicted, the criminals had screenshotted our Instagram followers, and they started to threaten to send it to all of them. The criminals even started to call Jordan via Instagram. Through the Instagram app itself, they started to call him.
A
Yes. And that's chilling.
B
It's chilling because, you know, he's threatening, he's yelling, do you want me to ruin your life? Send me $200 right now. And imagine a 13 year old victim in that scenario. They think they were just talking to Jesse from the other science class. Yeah, right. And now you have this full grown man with an accent yelling at you on the phone, saying he's gonna ruin
A
your life, kill your parents.
B
Yes.
A
Like show up and. Yeah, yeah.
B
I mean, these threats are terrible. They're saying they're gonna get you kicked off the sports team, your scholarship is gonna get revoked, they'll hack your school website and put the photos on there. Just really vile things. And at the end of the day, these scammers are after money. So what we did is we created a fake gift card redemption website, which we controlled, and we sent the gift card redemption code to the criminal. And of course, as soon as he entered that code, we had his exact GPS coordinates.
A
Good.
B
So again, like most of these criminals, he was based out of a remote village near Ibadan in Nigeria. And so the very next week, Jordan and the documentary crew fly all the way out from the UK to Nigeria and knock on this guy's door, to actually confront him about it.
A
Okay, will you send us the link to that? I wanna make sure. Is it live?
B
Yeah, yeah.
A
Okay, yeah, let's. We need to.
B
Y'all can include a clip in this. That would be great.
A
Yeah, everybody needs to see that. You need to show your kids this. You need to be like, hey, this is why I'm delaying your ability to use these apps that I know your friends have and you want. This is why I don't want you on a platform that allows for anonymous chat with anybody who can just create a persona. This is the why. These are the people who are coming after people like you. They need to see this. They need to know what's really happening. I have a very visual memory. Whatever, I guess everybody does. But what I mean by that is I can still see that LinkedIn post that you made that went viral, of very, very easy and tangible tips that the platforms could have implemented right away to make their platforms safer, and they didn't. I just asked ChatGPT, because we're talking and I'm trying to be present, but I was like, what are some of the key recommendations that Paul Raffile makes for making platforms safer? So I'm going to read them to you, and tell me if this is accurate. Okay, so one is to rethink platform design and age policies. Right now, Instagram basically acts as a massive directory of teens that predators exploit. So, you know, if the visibility of a youth profile makes targeting easier, let's make it less visible. For example, if you go try to find a teen on Instagram, their profile might be private, but their bio is public, and it's like, find me on Snapchat at this username, CHS class of '27. You can figure out where they are. Like, it doesn't have to be that way.
B
And the fact that most teens, if they get a friend request, they're going to accept it anyway just to see who it is.
A
100%.
B
The private account thing is really just a facade, right?
A
Exactly, exactly. The second thing was build proactive detection and response. It's like, oh, how about that? Yeah, beyond reactive content removal, maybe they should invest in, instead of firing, you know, teams that can handle proactive detection of sextortion flows and predatory behavior patterns. Use AI to detect signals that human exploitation analysts can use to spot abuse before it escalates into harm. How about that? Maybe, right?
B
And one of the most shocking things to me was the fact that these criminals are using the same scripted messages and it is so easy to detect. Like if that text matches this text, block it, block it.
A
Put it into a queue to review.
B
Exactly. Quarantine it until it's been approved.
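The kind of matching described here, quarantining DMs that are near-identical to known scam scripts, can be sketched in a few lines. This is a hypothetical illustration, not any platform's real detection system; the example "known script" text, the threshold, and the function names are all invented:

```python
# Hypothetical sketch of matching inbound DMs against known sextortion
# scripts. All names and example text are invented for illustration;
# real platform pipelines are far more involved.
import re

def normalize(text: str) -> str:
    # Lowercase, drop apostrophes, and collapse punctuation to spaces so
    # "Don't pay" and "dont pay!!" normalize identically.
    return re.sub(r"[^a-z0-9 ]+", " ", text.lower().replace("'", ""))

def similarity(a: str, b: str) -> float:
    # Jaccard overlap of word sets: 1.0 for identical scripts, near 0 for
    # unrelated chatter. Tolerates reordering and small edits.
    sa, sb = set(normalize(a).split()), set(normalize(b).split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# One invented example of the kind of scripted threat victims report
# receiving verbatim, typos and all.
KNOWN_SCRIPTS = [
    "i have screenshot all your followers and tags if you dont pay me "
    "i will send your photos to all of them",
]

def should_quarantine(message: str, threshold: float = 0.8) -> bool:
    # Quarantine (hold for human review) rather than silently delete,
    # as suggested in the conversation above.
    return any(similarity(message, s) >= threshold for s in KNOWN_SCRIPTS)
```

Detection at platform scale would need multilingual normalization, hashing techniques like MinHash or SimHash, and constant updates as scripts mutate; the point is only that verbatim-reused scripts are cheap to flag.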
A
Right. Like, if I take a picture of a bloody arm or a firearm and try to post it, there are platforms that will be like, this violates our content policies, and it won't let me. Why are we not doing that for protecting children?
B
Who knows?
A
Right?
B
We all know.
A
Well, we know, but. Right. Which takes me to what ChatGPT says is your number three, which is to improve transparency and accountability. These platforms are not transparent about their safety gaps. They will not own up to it, nor will they be clear about what actions they're taking, other than saying, oh, we have new parental controls that are going to go by the, you know, Motion Picture Association guidelines for PG-13. And even the Motion Picture Association of America is like, they didn't ask us about that. So it's just so silly. And it's smoke and mirrors. We don't need high-level reports, we need specifics. We can't address the problems if we don't have the data. And at Bark, every year we report on the rate at which children are encountering harms on platforms. We're sharing the data to help improve safety. They should be too. Four is to collaborate better with law enforcement and, you know, LEO specialists. Because correct me if I'm wrong, but it has been cumbersome in the past to get the info that law enforcement needed from these platforms. They were tight-lipped and not helpful in a fast manner.
B
Yeah, and to that I'll talk briefly about a case involving Jordan DeMay. Jordan was a 17 year old from Michigan. He was targeted at 11pm on Instagram by a sextortion scammer. Over the course of the next six hours, Jordan was groomed, coerced, and then blackmailed and threatened. He died by suicide within six hours of the initial contact. He was, you know, a star athlete. He had everything going for him. He was ready to go on the family vacation the next morning, bag packed and everything. He was found the next morning by his dad with a handgun in one hand and his phone, last open to Instagram, in the other. And when law enforcement came and investigated, they had a hunch that it could have been sextortion. And eventually that hunch turned into something more material, because Jordan's friends started to receive these images. So they knew exactly what it was. When law enforcement from Marquette, Michigan submitted an emergency request to Meta to shut down this account and provide the data about this suspicious account, Meta denied that request. And during that denial, during the period it took for law enforcement to come back with another request, an additional four kids were sextorted by that same account. So there's so much more that can be done during these exigent circumstances. Meta's reasoning at the time was that it didn't meet the threshold for an emergency data request, an imminent threat to human life. This account had already killed one kid and was going on to extort multiple others.
A
What is the threshold, Meta?
B
I mean, what is the threshold at that point?
A
There's a few more. Yeah, unless they miss some and there's more. You could.
B
Sure, sure.
A
The next one is to empower users and families with tools, actual tools that work, not the fluff parental controls that they say they have. You know, recognizing grooming and scam tactics to begin with, and surfacing where to report these threats. Right. Not being a blocker, but a facilitator to help the children that they're monetizing. The next one is to advocate for smarter regulation. I mean, one of the larger questions here is, in that instance, any reasonable human would think that Meta should be held accountable for that egregious lack of protection. But under our current laws, there's nothing that says that they have to do anything. And that needs to change. And this isn't just a spotlight on Meta. It's Snapchat, it's TikTok. It's not just Meta. The next thing is to prioritize harm reduction over engagement metrics. How about instead of prioritizing growth and engagement, we try to prioritize the reduction in the rate of sextortion and bullying and disordered-eating-related content, and content that teaches children how to cut and starve themselves, and misinformation. I mean, anyway, so this is what ChatGPT said you would say are some tactics to help make platforms safer. What did they miss, if anything?
B
Yeah, I think ChatGPT did a decent job this time around.
A
Okay.
B
Yeah, so that's really fascinating. I'll add one more in there. And you know, this example is from TikTok, actually. And if you had asked me this question a year ago, is TikTok in the top three, or a risk for sextortion? My answer would have been no.
A
Okay.
B
But this has changed in the past year.
A
Wow.
B
And really just in the past few months, actually. So several months ago, it was reported that TikTok was working on photo sharing within direct messages. Prior to this, TikTok would only allow you to share public content on the platform within direct messages.
A
Right.
B
Or you could send text messages through the direct messages. Now, employees internally were concerned about the risk of sextortion if they were to allow photos and videos to be shared in direct messages. And these employees went public, they went to the media about it, and there was a report from The Information about this whole risk and how employees were trying to stop it. Despite these concerns, TikTok rolled ahead with the planned feature anyway. So now anyone above 16 can send photos and videos in direct messages on TikTok.
A
Anyone that TikTok thinks is above 16. And we know children lie about their age.
B
Exactly. So if you actually look at the chart of sextortion incidents that are happening on TikTok: all through 2021 and '22, it's very low. '23, it's very low. 2024, it's also still very low. And this year it has just absolutely spiked. It was so predictable and foreseeable that when they rolled out this feature, there would be a tsunami of new harm coming with it. And at the time, the TikTok employees said, oh, of course we're going to have safeguards in place. Of course we're going to prevent this from happening. The data shows otherwise. So they didn't have the safeguards that they thought they had in place.
A
It's so frustrating. And not that there's a counterargument, but for example, people get hurt by cars. We don't pull the cars off the road, but we do employ seat belts and stop signs, stop lights, laws, driver's ed. You know, if you're driving under the influence, you lose your license. We implement some safety precautions so that we can use a really cool technological advancement to get places faster. Right.
B
And then if the car manufacturer lies about the safety features, then they can get sued.
A
Right, right. Right now, the largest social media companies do not have to legally comply with meaningful measures that would keep children safer on their platforms. And that needs to change. I'm not sure that they're watching, but they attended the webinar that you spoke at. People from Snap and Meta and others, they were there. There were over 350 people who attended that webinar to try to fight sextortion and human exploitation and trafficking across the globe. So maybe some are watching. And I would just implore you to please make the right choice for protecting humanity over growth and engagement and user metrics. I mean, children are dying. What excuse is there? There is no excuse. It's just common sense. Right? Unless I'm crazy. Am I crazy?
B
No, you're totally right. I think financially motivated sextortion is one of the only forms of child exploitation that is completely, completely preventable. These criminals are not after the photos or the videos. They're after $50. And you know, these are scammers. If we put enough hurdles in the way of them doing sextortion, they'll just move on to another scam. And I'm not saying those scams are okay to do, but it's fewer kids being groomed and exploited and killing themselves because of this. So if they want to move back into romance scams or crypto scams, so be it. Elder fraud is terrible too, but they've got to stop killing kids. And the platforms have to do more to prevent this from being the go-to scam for these criminals, because it's just so easy for them to do it. That's the only reason they're doing this: they're able to make money so fast, so quickly. They're creating so many fake social media accounts with no barriers in place. And, you know, they're getting more money now doing financial sextortion, getting $50 from a couple dozen kids every week, versus the long con of a romance scam, waiting several months before they finally ask, you know, an elderly victim for $3,000 or $10,000. They're making their money this way now.
A
So from my perspective, Meta really missed out by not adding you to their team. That could have been a real win for humanity and children. You know, we won't go into that. But because of their loss, other entities and organizations have been able to win, because of your time and expertise and thought leadership. Can you speak about what you're working on now, who you're working with, who you're open to working with? Everybody who has the ability to utilize your expertise to make the Internet safer for kids should do that.
B
Thank you. I appreciate that. Yeah, I've been working with financial institutions; they are taking this very seriously. I'm helping consult with a few of them on how to detect these payments, how to make their platforms safer, to make this not the go-to crime. What else? Additional awareness, I think, is key. So I've been working on several documentaries about this topic. You might see one here in the U.S. pretty soon, next year. But yeah, stay tuned. Keeping busy.
A
Okay. Okay, wonderful. Now, we've talked about what is happening, how it's happening, and what platforms could be doing, but aren't, to stop it from happening. I do want to give some hope to people. After everything you've seen, what gives you hope that we can make the Internet safer for kids?
B
Yeah. So legislation has been kind of a tricky battle here. There have been a few things, like the Take It Down Act, which criminalizes sextortion federally and adds criminal penalties. And actually, just last week, the Stop Sextortion Act was introduced in the Senate. So I'm excited about that. However, I think the biggest game changer here is the Kids Online Safety Act, known as KOSA. This would provide a duty of care for platforms to design their platforms in a safe way, even for the child users of those platforms. And it provides legal recourse if a platform is designed in an unsafe manner. And the Kids Online Safety Act passed the Senate last year, 91 to 3.
A
It was very exciting, bipartisan, very incredible.
B
Bipartisan. We had the majority of the states' attorneys general send a letter to Mike Johnson to pass it in the House, because it had already passed the Senate. Even Microsoft broke ranks from Big Tech to support KOSA. Snapchat broke ranks from Big Tech to support KOSA. And everyone thought it was going to pass in the House if it was brought up for a vote. But there's one person in the House of Representatives who can choose whether or not something even comes up for a vote, and that's the Speaker of the House, Mike Johnson, who's from Louisiana. Several weeks before the end of the congressional session last year, Meta announced a pretty hefty commitment, to the tune of $10 billion, to build a data center in, of all states, Louisiana.
A
How about that?
B
So while KOSA had this huge momentum and bipartisan support from both parties, it wasn't even brought up for a vote in the House of Representatives. It would have passed. And we see the timing of this $10 billion commitment from Meta to Louisiana just weeks before the end of the congressional session. It's not a coincidence.
A
Why is there not a massive petition being sent to Mike Johnson? Like, I mean, like, does he have kids?
B
I don't know. We should start one. Link in the description.
A
Yeah, I mean, like, how could you? How could you?
B
Right? I mean, that's just putting profit over people and safety, in my opinion.
A
I mean, Mike, if you want to come on the podcast and defend that, you're welcome to. But all I have right now are the facts. And the facts are: it did not come up for a vote, and this donation came to the state of Louisiana from Meta. So let's unpack that. Actually, I asked, what gives you hope?
B
Oh, okay. Okay. So I'll turn it into hope.
A
Is there hope?
B
Is there hope? Well, the Kids Online Safety Act has been reintroduced in the House again this year, but the piece that actually put liability on the social media platforms has been removed. So the House version of this legislation has no fangs anymore. But there's still the Senate version, and I think that's what we need to push for, because it actually has legal accountability and liability if the platforms don't perform their duty of care. So we can hope for the Senate version to pass.
A
Oh, my gosh. Oh, my gosh. Just do better. Do better. Children are being harmed every single day. If there was a cereal or a bicycle or a toy that hurt just one kid, and then maybe another kid, it would be recalled. Yet nothing is happening, and it is not okay. So to the people in positions of power to make it better: you need to do something, and you need to start yesterday. We're doing what we can at Bark. Paul is doing what he can to raise awareness, educate, and consult. But we're just two people, so we need help. For people who want to learn more about what you're doing and support your efforts, where should they go? What should they know?
B
Yeah. So, a final note for parents on the topic of sextortion. I think every parent has the conversation: never send a nude photo. Just don't do it.
A
Right.
B
But I think we need to have a part two of that conversation, which says: if you find yourself in a compromising situation, if it happens, come to us, let us know. Give the kids a soft landing space. They know they might have done something wrong. They know they might have gotten themselves into a little bit of trouble, but they're the victim in this situation. So give them that soft landing space so they don't feel like they're alone, because isolating them is exactly what the criminals want. It's part of the scripted messages: your parents will be ashamed of you, they would hate you. All these things are driving a wedge between the victim and the parents. So I think just preempt that by having this conversation and letting them know what the scam looks like. It's mostly targeting boys 14 to 17. The criminal is pretending to be a young girl and sending a nude photo first, so the victim thinks it's less risky for them to do the same. Those are clear signs. Once you know those signs, it's so obvious what to look out for. So have those conversations and let them know what this scam looks like. If you know how a scam works, you're 80% less likely to fall for it.
A
Yes.
B
So let's take some power in that.
A
Oh my gosh. As you were talking, I was thinking about some content we should create specifically to talk to the parents of boys ages 14 to 17, because a lot of them don't know how to talk to their kids about that sort of thing. It's awkward, right? It's not easy. So we'll be rolling that out. It's just too important. There are too many lives at risk. Paul Raffile, everyone. It's heavy. This is not a lighthearted topic, but you need to know what's happening. Paul is one of the top experts in the world that I turn to to keep tabs on what's happening and who needs to be held accountable. So, Paul Raffile, thank you so much for being here and for all that you do.
B
Thank you for having me.
Podcast: Parenting in a Tech World
Host: Titania Jordan, Bark Technologies
Guest: Paul Raffile (Cybercrime & Sextortion Expert)
Episode: Paul Raffile on Sextortion, Social Media Dangers, and Fighting for Kids Online
Date: February 12, 2026
This urgent and eye-opening episode explores the rampant dangers of sextortion targeting children on mainstream social media, the personal and systemic failures of tech platforms to sufficiently protect youth, and the practical ways parents and lawmakers can fight back. Guest expert Paul Raffile shares his journey from tracking terrorist activity online to combatting organized sextortion schemes, detailing the methods, impact, and tragic human costs of this epidemic crime. The episode closes with actionable advice for parents and a call to action for legislative change.
This episode is a sobering but vital listen for anyone parenting in a tech world. The data, the stories, and the shared expertise make it clear: these crimes are not inevitable, and the failure to protect children is a choice—one that must urgently be reversed through both personal vigilance and collective action.