John Strand
So much. So much ransomware.
Corey
Is the ransomware in the room with us right now?
John Strand
I believe it is.
Corey
The British are taking the stiff upper lip. No ransomware allowed.
John Strand
Banned. Yeah. So far I don't think we have any news stories that we're gonna get stuck on. AI hopefully.
Corey
Oh, no, I have a spicy one. Sorry. Okay. Actually, no, get stuck on. Maybe not, because it's kind of a. It's kind of a, like, not really that big of an article, but yeah, we do have an AI article. Getting spicy.
John Strand
First salmon in Northern California river in nearly 100 years. There's some positive news.
Corey
Did someone just buy it at Whole Foods and take it up there or what?
John Strand
Down. The bears are suspiciously suspicious.
Corey
The bears are like, dang, this is a chinook. This is the high quality salmon. Thank you.
John Strand
Yeah, this is good. This isn't dyed salmon. This isn't farm raised.
Corey
This isn't farm raised.
John Strand
Not that they care. While Derek appears. Yeah, like, Derek's going on in the background. Derek's got some good lights. I also really like Shecky's haircut, man. Looking sharp, sir.
Corey
Thank you. Aerodynamic.
John Strand
Did you ever meet Jack Daniel? Yeah, a number of times. So I remember a long time ago, I, like, shaved my head, and I think Paul did shortly around that same time, or it was really, really close. And Jack comes up to me at a conference and he looks at me and he goes, Jesus Christ, shaving your head. It's the comb-over for your generation.
Corey
I'm like.
John Strand
Ah, Jack. Ow, that hurt. That hurt bad. I haven't seen him.
Corey
So wait on chicken news.
John Strand
Chicken.
Corey
Apparently it's National Chicken Wing Day on July 29, which is tomorrow. That's totally newsworthy.
John Strand
That is newsworthy.
Corey
Yeah. Apparently multiple restaurants are giving away free wings. I mean, this is. This is why we do this show.
John Strand
Yeah. Specifically. I don't know. I must have missed the shows where we were completely caught up on chicken. And now chicken is a thing.
Corey
No, no, no. We never catch up. The chickens just keep hatching.
John Strand
They just keep coming. A chicken is truly a magical creature. Right?
Corey
It really is.
John Strand
It's like something that eats bugs and poops eggs. Like, it's just.
Corey
How can you do any better?
John Strand
That's. It's just so good.
Corey
That's why when people are eating, like, snakes and stuff, I'm like, dude, we have. We already solved this. We have, like, a magical problem. Yeah. What are you doing?
Derek
My wife wants to get chickens, which means she wants me to get chickens.
John Strand
Chickens. I don't know. I was originally really against chickens. Since we're doing the pre-show, the pre-show chicken talk first. The chickens. I was really against them because I thought it would just be messy. It wasn't that bad. They're actually kind of like fish. You kind of sit and watch them and they go around. They do their chicken things and lay eggs. And next time I'm out there, I. I was out there last week, Derek. I didn't call you. It was a rough week, but.
Derek
Well, I wasn't here last week, so that's.
John Strand
I can teach you how to hypnotize chickens.
Corey
You hold them upside down.
John Strand
No, no. That's when you chop their heads off.
Corey
Wait, what?
John Strand
No. The two ways that you hypnotize a chicken. One, you take their head and you put it underneath their wing and you rock them. And they will literally go to sleep in, like, 10 seconds. They'll just.
Corey
For those at home, imagine John Strand hypnotizing a chicken. If you're trying to fall asleep tonight.
John Strand
The other way to hypnotize a chicken is put them down on the ground, in like sand and dirt, and just draw a line from their beak straight out, just in the ground. They will not move. They're like, their minds are blown at that point. They just.
Corey
It's like just showing someone the raw Instagram reels feed.
John Strand
Yeah, it's like. It's like, you know, they're watching 2001: A Space Odyssey in 1969. They're like.
Corey
No, I think that's A Clockwork Orange.
John Strand
Yeah. No. Devil trombones and angel trumpets. You are invited. Righty right. Okay, so I maybe watch that movie too much. So. So, yeah, I don't know. So there you go. There's chickens. I think it's time to start the show.
Corey
Roll the finger. Hello and welcome to John Strand talks. Go.
John Strand
I was going to say, hello and welcome to the chicken podcast. Computer security has been solved, folks. AI has solved all of our issues. There haven't been breaches in months. Really. So we've decided that we're going to use this space in your podcast feed to talk about chickens. We've also discovered the more we talk about chickens, the more we horribly confuse the YouTube algorithms as far as what we're actually talking about here. So without further ado, with nothing to do with chickens, we're going to be talking about a startup selling data that's hacked from people's computers to freaking debt collectors. Like, that's satanic right there.
Corey
Who's the good guy in this story?
John Strand
There is no good guy in this. Second, Tea breach reveals users' DMs about abortions and cheating. You know, it's not that big of a deal.
Corey
The Tea app we were talking about. Twice.
John Strand
Yeah, we got a lawsuit that says Clorox hackers got the password simply by asking. And businesses banned from paying hackers ransom to target cybercrime. And Bad vibes: how an AI agent coded its way to disaster, which I really want to start with. I'm just going to throw this out there. As much as I'd like to get to the chicken universe where everything is solved and we sit around having conversations about how we're going to learn how to run backhoes because computers, I think we're a long ways from that, y'all. I, I, I think, I think we're, we're not there. Like we're not in the chicken universe at the moment.
Corey
So, okay, this is an AI article. We technically talked about this last week. Basically the story here is just some Elon Musk type person is just vibe coding and blogging about it and he made this really viral scenario happen.
John Strand
But he's also blogging about it. Vibe coding on production.
Corey
Correct. But this is just him, like, code bashing and making it newsworthy. I don't know. It's kind of a non-story. But yes, it is funny to look at. I get why it went viral. Like, if you look at the screenshot, it's like, yes, I deleted it. I'm sorry, by the time you asked, I had already deleted it. Like, it's pretty funny.
John Strand
Yes.
Corey
But it's also, like, I don't know, it's kind of like an invented scenario from my perspective. It's like, oh, I gave it access to everything and it did bad stuff. Who could have predicted this?
Derek
I don't, I don't understand why folks are, you know, surprised that large language models are producing insecure code when they were trained on insecure code. Like, why is this a surprise to anyone?
John Strand
I'm not surprised that people are mad that LLMs deleted production databases when.
Corey
Yeah, I mean, we all saw Silicon Valley. That was a highly efficient move to just delete all the code base. It's, it cuts like compile times. It's super efficient. It's really a good option.
John Strand
I, so I do think this is important, right? And the reason why I think this is important is we need more stories like this. Because once again, talking about the chicken universe that we desperately want to get to, when we are talking to executives, whenever you see people that are having conversations at different forums in places like Aspen or Chicago, they're all completely just drinking this flavor aid of AI, right? And it's just like, AI is going to solve your hunchback, AI is going to cure your rickets, AI is going to allow you to fire 90% of your developers. And I think there's a lot of promise with AI, but I think we need more cautionary tales. Right? Because right now the crap that these people are reading, there are very few cautionary tales. It's all about how much money you're going to save. And I've talked about it previously on the show. I want to keep bringing it up. I really feel like this AI thing is the same thing that was happening back in 2000 when we were outsourcing everything to India. It's like, sure, you could have a bunch of really expensive JavaScript developers in the United States, but you could have a whole bunch of cheap Indian developers do that code for like a tenth of the cost. Like, what was it, that book, The 4-Hour Workweek?
Corey
Yeah.
John Strand
Where basically the whole book was you could just outsource your whole life to India and only work four hours and see the world. I'm seeing the same type of trend with AI and I think we need these cautionary tales to hopefully break through to the money people. To say we should probably pause, reflect and move a little bit slower.
Corey
Yeah, I mean, I guess what I would say is, take Replit specifically as an example. If you go to their website, their tagline is make apps and sites with natural language prompts. So I guess if an executive approached me and was like, hey, I'm going to vibe code this website, I'd be like, okay, go ahead. You try. This is designed to be approachable by anyone. So you just vibe code the entire website yourself and we'll see where it ends up. And I think they would realize really quickly, like this person who created the situation that ended up with an article did, that it screws up horribly and very quickly. And then they'd be like, okay. These companies now are making it so easy to use AI that you can just immediately discover how weak it is in certain areas. Right? So I'm also, I don't know, maybe we are sliding down the Gartner hype cycle. I don't know. But I agree, it's nice to see an article that's not "AI can solve every problem."
Derek
Yeah.
John Strand
Derek, your thoughts? Because you're, like, in the middle of the AI stuff for BHIS, and you teach a class on this. And I think that, you know, if I can kind of put words in your mouth, you and Joff are like, AI is transformative. It absolutely is. But it's not a hammer that has turned everything into a nail. Or am I wrong? Is there, like, something we're missing here? Like, literally in six months, is it going to be, turns out vibe coding was the future, we're all out of a job?
Derek
Well, I mean I think that chickens.
John Strand
Yeah.
Derek
Just in the last year, I think my opinion on large language models' ability to code has changed a whole lot. I know Joff shares that opinion. It used to be kind of crappy. Now it's really good, especially for, like, one-off kinds of projects. Like, I have this thing that I would like to do, and I can write a very specific prompt for a task. That works pretty well. But when it gets more complicated and more complicated, then the AI tends to, well, get confused, just like humans do. Maybe that changes in the future. But I mean, what would be the difference between having a large language model like you vibe coded with, or just hiring some really inexperienced developers who can maybe write code but don't have, like, the overall knowledge of security and how systems work? In my opinion, it's more complicated to put something out in production, to have an app with users, than it is just to build the thing. There's more consideration.
Corey
Obviously. It would be like hiring someone really incompetent. I don't know, it's like someone who's super competent with the tools but has zero common sense. That's AI, right? It's like, you know how to build a database, but then you just deleted it on your own without me telling you to do that.
John Strand
I don't know here, I kind of disagree. Maybe, maybe I agree. Maybe. Let me give you an example, right? I was teaching last week at the pay-what-you-can live roadshow summer camp in Baltimore, right? And I had a student ask me in class. He said, should I learn the core fundamentals, like TCP/IP, Windows, Linux, the basics of coding, or should I just jump straight into prompt engineering and working with vibe coding and learning how to work with artificial intelligence? And I said, you need to start with the basics, core fundamentals. Don't try to jump to just telling AI what it is you want and having code being written, because it's going to end very badly for you. Understand the core fundamentals, like coding, understand the core fundamentals of operating systems and things of that nature. The reason why I said that is because, and I think this is where I maybe agree with you, Corey, is the concept of context, right? So if you look at all language, language is based on context and assumptions. And when humans are dealing with context and assumptions, we're context switching and assumption switching at an incredible rate, and we don't think anything of it. And whenever you give something to artificial intelligence, like, code me an app that sells cookies, right? Like, there's a whole bunch of context that is missing. And you're relying on artificial intelligence to basically make assumptions. And that's where you start to get into trouble. So the example that I gave my class was this, the concept of how human beings do context switching based on information that is fed to us. I said, okay, so here's a quick, short story, a narrative. My wife and I love walking through the woods. We love walking through trees. And I stopped right there and I said, everybody write down kind of what you think about this. And then I said, further, whenever we're walking through the trees, our favorite place when we're walking in the trees is on the beach. And I stopped. Then everybody kind of, like, changed things. Then I said, and the favorite beaches for us to walk down are the beaches in the Pacific Northwest, up around Seattle and Bremerton, in that area. And my point of that entire story is I wanted to get across the concept of context switches very quickly, right? So whenever I tell the story, my wife and I like walking through trees, what does that mean? Does that mean in winter? Does that mean deciduous, coniferous trees? Is that palm trees? What types of trees? Everybody is developing something within their brain as far as what that is, but it's very, very different. And then I say, we're going to a beach. A huge percentage of people automatically switch to palm trees at that point, or mangrove trees, right? They're context switching again. And they're trying to develop context in what I am talking about, as the language that I'm using is building out the rest of the narrative. And then finally I say, in the Pacific Northwest. Palm trees are out, mangroves are out, we're back to pine trees, right? But my point in all of that is if you do not understand the core fundamentals of what operating systems do, programming ideas, then you don't understand how you can give a prompt to something like AI so that it understands the context, so it can produce good code, so it can produce good things. And honestly, that's the way it's been for years.
Going back to what Corey was talking about, I used to have managers back in 2001. They'd be like, code me an app, right? And I would go do a whole bunch of different things, and they're like, all of this is wrong. You should feel bad. You're a moron. Here's what I really wanted. Really, they should have had better pseudocoding capabilities to break down and explain exactly what was asked of me. So whether you're talking about that from management to junior developers, or you're talking about prompt engineering, you still need to have a fundamental understanding of coding, development, underlying operating systems and protocol stacks to get the best quality work out of it on the other side.
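To make John's context point concrete, here's a minimal sketch with two hypothetical prompts. Neither comes from the show, and the stack details (Flask, SQLite, the schema) are invented for illustration: the first prompt leaves the model to assume everything, the second spells out the fundamentals he's talking about.

```python
# Two hypothetical "code me an app that sells cookies" prompts.
# Neither is from the episode; the stack details are illustrative only.

# Vague prompt: the model must guess the language, framework, data model,
# payment flow, and security requirements, i.e., all the missing context.
vague_prompt = "Code me an app that sells cookies."

# Context-rich prompt: the fundamentals are spelled out, so the model
# makes far fewer assumptions on your behalf.
contextual_prompt = """
Build a small Flask app that sells cookies.
- Python 3.12, Flask, SQLite via SQLAlchemy.
- Tables: products(id, name, price_cents) and orders(id, product_id, qty).
- Payments are out of scope: stub a pay() function that always succeeds.
- Parameterize every query; never interpolate user input into SQL.
- No destructive admin routes: no table drops, no bulk deletes.
"""

print(len(vague_prompt), len(contextual_prompt))  # context costs tokens, saves rework
```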
Corey
So basically what you're going to want to do is put "please don't delete everything" in every one of your prompts, and then you should be good.
John Strand
May not be a bad idea, right? I mean, just be sure. I want to make it very clear you're not supposed to delete anything in production, right?
Corey
Just, just put that in your prompt, hard coded in, and you should be good. You can vibe code your way through your next startup.
John Strand
Yeah.
Corey
While we're in AI corner, we have one more AI article that I think will be quick. I don't think this is, like, a crazy article, but I did find it kind of interesting. You never know. Basically, the article is posted in the MIT Technology Review, and it's essentially that the DataComp CommonPool data set, which is a huge data set of just images, that's all it is, just images that are scraped from the Internet, contains sensitive information, including passports, credit cards, birth certificates. I mean, I guess I'm like, duh. Because if you look at the scale of the data set, and I was actually looking at it before this podcast, the large data set is like 750 terabytes of images. So yeah, there's gonna be some bad stuff in there. It was scraped from the Internet. We've all used the Internet. We know what's on there, which is, I guess, everything. The thing they didn't do, which they're kind of catching flack for, is they didn't do any kind of filtering for sensitive information, like through OCR or other things. And I'm like, okay, you run OCR on a 750-terabyte data set and see how much that costs in a cloud environment. How much it costs to do that. Yeah, this is a 2023 data set. It's, you know, actively maintained and updated. It's massive. And I guess I'm like, I don't really see this as a big concern. Is this a concern for you, Derek? What do you think? I mean, to me, I'm like, yeah, data sets have bad data. That's part of it. I don't know.
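For a sense of what that filtering pass would even look like, here's a minimal sketch, assuming pytesseract and Pillow for OCR; the directory name is a placeholder. The Luhn checksum is the standard trick for weeding out random digit runs that merely look like card numbers, and running something like this over 750 terabytes of images is exactly the cloud bill Corey is pointing at.

```python
# Sketch of the PII filtering Corey describes: OCR each image, then flag
# credit-card-looking numbers. Assumes pytesseract and Pillow are installed.
import re
from pathlib import Path

from PIL import Image
import pytesseract

# 13-16 digits, optionally separated by spaces or dashes.
CARD_RE = re.compile(r"(?:\d[ -]?){13,16}")

def luhn_ok(digits: str) -> bool:
    """Luhn checksum: filters out most random digit runs."""
    nums = [int(d) for d in digits][::-1]
    total = sum(nums[0::2]) + sum(sum(divmod(2 * d, 10)) for d in nums[1::2])
    return total % 10 == 0

def flag_image(path: Path) -> list[str]:
    """OCR one image and return any Luhn-valid card-like numbers found."""
    text = pytesseract.image_to_string(Image.open(path))
    hits = []
    for match in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_ok(digits):
            hits.append(digits[:6] + "...")  # log a prefix, never the full number
    return hits

if __name__ == "__main__":
    # "dataset_shard" is a hypothetical directory of scraped images.
    for img in Path("dataset_shard").glob("*.jpg"):
        if flag_image(img):
            print(f"quarantine: {img}")
```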
Derek
I would say no. Because, I mean, if you go to one of these models and you say, you know, give me Jane Doe's credit card number, it's going to say, you know, I'm sorry, I can't do that. Even if you were able to jailbreak it, it's just not statistically relevant enough. I mean, you said it, it's such a large data set. We had a similar story on the podcast a while back talking about API keys. I mean, it's not like it's actually storing the data. It's storing, you know, data about the data, is maybe a way to look at it. And so I just don't think that it's ever going to bubble to the top. You know, the top-K results, what are they, the top 10 results, as, you know, statistics. Like, I just don't think that's going to happen. So no, I don't think this is a problem.
Corey
Yeah, I mean, you'll find this much data from almost anywhere. One interesting thing is they did say that they tried to blur people's faces, which I'm also sure is incomplete. There's no way they blurred every face in 750 terabytes of images.
Derek
Like, I mean, you have a lot of experience working with large data sets. Once you have so much data, it's.
Corey
Impossible to go get it all with 100% confidence.
John Strand
Yeah.
Corey
Yeah.
Derek
I mean, I think this is a different problem that's not related to AI. I should be more specific. I don't think you're going to get a large language model that's going to give you back, like, the right sensitive data. With such a small amount, there'd have to be a lot more of it in there.
Corey
I. Yeah.
John Strand
Mary Ellen, do you want to talk a little bit about what you just posted in the private chat?
Mary Ellen
Yeah, I mean, I was just highlighting that, you know, one of the articles I found said that the privacy filters failed. I mean, despite attempts to protect privacy, the curators didn't apply the filters for PII. And even when the faces were blurred, the feature was optional; it could be removed. So I thought that was interesting.
Corey
Yeah, I mean, to be fair, this is a couple years old. Like, this data set is, I think, 2022 or something like that. So maybe we're in a different spot now, three years later. But I guess I would again say, how are you even going to, what are privacy filters? Like, how are you going to define those to not overcorrect or undercorrect? Because, you know, it's an open data set, so it's going to be open data. Like, I don't know.
Derek
Yeah, I mean, if you're crawling the Internet, you're going to run into that kind of data, and I applaud them for trying with the filters, but it's just so much data. And then, you know, all machine learning and AI algorithms, from, you know, simple ones up to transformers, they're only as good as the data that they're trained on. And so I think the way that these things get improved is just better and better data sets. But yeah, you're always going to get stuff like this if you're crawling the Internet.
John Strand
Well, on to the question. Cheddar Bob was asking about the filters being tested, right, and how do you do role based access control? Like, let's take this away from this type of data set. What do we do if we're trying to just unleash AI on all the documents in SharePoint in an environment, and then we basically trust Copilot to do the filtering and the role based access control? Once again, going back to my context example. I love the joke, what was it? It's like, Claude, please give me a working API key. But as a joke, right? When we start talking about this, we start talking about AIs, like, lying and making mistakes or screwing up or just flat out being dishonest. How do we actually rectify the difference between what we're seeing in some of these problems with AI and large amounts of data, with the need for organizations to have role based access control for certain data sets? Like, how do you superimpose those two things together, and then how do you actually test those filters? That's something that, you know, I'm curious to see. Like, we don't have an OWASP for testing these filters as much as we need to. So, Derek, I don't know what your thoughts are on that.
Derek
Those are two different things. They're slightly different concerns. Like, the first part we were talking about, the training set and how you create the large language model, and then role based access controls on the other side, like when you give agency to a large language model to go access the data. And the second part, where you got into the role based access control and protecting your data, well, I think that's an unsolved problem at the moment.
John Strand
Put it at the front end on the training like as it's being.
Derek
I don't think it works that way. Right. Because the training process isn't, like, storing the data. It's basically training the model to give.
John Strand
I'm not talking about storing the data. I'm talking about, as it's analyzing the data, applying filters to certain keywords. Like, that should be part of the training model. It's like, if you see Social Security numbers, or numbers matching this.
Derek
Right, right, right, yeah. So data sanitization. Like, you know, the folks who are creating these large language models, they probably spend the vast majority of their time curating the data. And so I do think that it would be good, moving forward into the future, if there's more transparency and visibility on how that's happening.
Corey
Yeah, I mean, I guess what I would say is, this is a training data set, right? So when we're talking about this, we're talking about a training data set having sensitive information in it. And let's say you're training an AI whose goal is to identify sensitive information. Don't you need sensitive data to train a sensitive-data classifier? You know what I mean? It's chicken and egg. Shout out to chickens. Yeah, I mean, it genuinely is. Arguably, it depends on the purpose of the data set. I'm sure that there are data sets which claim to be sanitized, clean, free of personal information, and those have a great use, right? This is just meant to be, hey, do you want to see what the bare butt of the Internet looks like? Here's 750 terabytes of images. Good luck.
John Strand
And that gets into, like, you know, we're talking about getting Social Security numbers and passport numbers. How much do you want to bet that there's some illegal pornography in there as well?
Corey
Yeah, probably.
Derek
And it's not just security related, too. I mean, the same debate happens with copyrighted materials. And, you know, Llama was first put out there using a data set that had copyrighted materials in it. And Meta, to their credit, in later versions of Llama, removed that data set. So I think this is a problem for the folks who are making frontier models, on the training side. And we're talking about a small number of companies and people, and I think, again, more transparency in what they're doing would be a good idea.
Corey
Well, yeah. I mean, I guess the only thing, to close this out, the only thing that is kind of interesting about it is, yes, this data is out there for anyone to go download, but to get the full data set you would need to invest tens of thousands of dollars in storage capabilities and in analyzing that data. You can't just cat the images and grep for credit card numbers. You have to basically have an AI look for this stuff for you. Which is where it comes down to the role based access controls we're putting on the front end, or whatever you want to call it, of interacting with the AI. It should have guardrails to prevent you from doing things like, find me all the images of people's credit card numbers. It should say no. There should be guardrails, right? So this is the world we live in at this point, where, like, all the data is out there. It's just a question of who can analyze it, and we try to gatekeep what that is. I mean, ironically, we'll talk about that with the debt collection article, right? That data is public, with quotes around public. It's just a question of who has the capability to analyze it and put it in the right context. Right.
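A minimal sketch of the front-end guardrail Corey is describing: a check that runs before a request ever reaches the model. The deny patterns below are hypothetical placeholders; real guardrail stacks typically use trained classifiers on top of, or instead of, a regex list like this.

```python
# Minimal sketch of a front-end guardrail: screen the user's request before
# it reaches the model. The deny patterns are illustrative, not exhaustive.
import re

DENY_PATTERNS = [
    r"credit\s*card\s*numbers?",
    r"social\s*security\s*numbers?",
    r"passport\s*(numbers?|scans?)",
    r"driver'?s\s*licenses?",
]

def guardrail(prompt: str) -> str | None:
    """Return a refusal string, or None if the prompt may pass to the model."""
    for pattern in DENY_PATTERNS:
        if re.search(pattern, prompt, re.IGNORECASE):
            return "I can't help with requests that target personal data."
    return None  # hand the prompt through to the model

# Corey's exact example gets refused before the model ever sees it.
print(guardrail("find me all the images of people's credit card numbers"))
```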
John Strand
Well, and I want to pull up Brexy's point, because I think it's great. Brexy said: Copilot accessing internal data is not about role based access control. It is about proper data governance. If someone keeps sharing every critical file to everyone in the organization, then how is Copilot supposed to know that that file should not be shared? That's why end user training, and sensitivity labels through either manual or auto labeling to start governing data, are required. And I agree with everything in that statement.
Derek
You mean you shouldn't give Copilot access to the N drive that's been there since the 90s?
John Strand
Data governance and you drop AI in your environment, then you're probably going to have a bad time. Now the next question that I think we have to ask is how many organizations are about to have a very bad time?
Corey
Well, they use on-prem SharePoint, so they should be fine. Don't worry, it's not cloud based. It's not cloud based. It'll be fine. Yeah, let's move on. I mean, that was a good discussion, but people really want us to talk about the Tea app. I guess we can steer over into this corner for a second and then quickly steer out of it. So it's not really that big of an article. An app was breached. So for those that are out of the loop, there's this app called Tea, which is targeted at a female audience and is designed to be used basically as, like, stalker protection for women. I guess designed to, you know, allow women or whoever to talk about their potential date, to be like, am I going to get murdered if I go out with John Smith? Yes.
John Strand
Another date with John Smith, and he straight up murdered me.
Derek
And their licenses end up on the Internet, by the way.
John Strand
It really kind of feels like Ashley Madison and, God, what was that, Blacklisted Johns? I don't know if you guys.
Corey
Yes. Well, okay, so it's not that, it's not that sketchy. This is definitely a more of a mass market app than either of those. And it has. They, they posted the other week that they had 4 million users, which is definitely bigger than some of those sites.
Derek
A lot of users.
Corey
Yeah. I mean, I guess they got breached. They got breached twice. Clearly they don't have great security. It also brings into discussion, like, it's supposed to be anonymous, and of course nothing is really anonymous, right? It's supposed to be anonymous, but people are posting their own phone numbers and things that are easily identifiable to them. Basically, the company has terrible security. Maybe the need for the app exists, maybe this is a thing that needs to happen. That's for, you know, the users of the app to decide. But something this sensitive needs to have really good security. If the goal of it is, am I going to get murdered if I go out with John Smith, then it needs to have good security, because John Smith is going to try to see who's found out that he's murdering people. So yeah, it's a tough scenario, but, you know, anonymous and secure spaces aren't usually on the App Store. That's just the sad reality of the world. Like, if you're using an app like this, your data is probably not that private, and it's definitely not anonymous.
John Strand
Yes. Well, so, you know, tying back to the AI stuff, right? Like, I think that people use these apps, and there was the Sam Altman thing from ChatGPT, where he's basically like, yeah, if you're using this as a therapist, this is not protected information. Like, we are going to use this information in the future. So I kind of look at it as the same thing. You got to be careful. It's kind of like when you're training your children, you got to, you know, start at the very beginning and be like, hey, kids, don't send nudes of yourself to other people. The Internet is a bad place, and anything sensitive you put out there can come back to bite you. And this is just another reaffirmation of that same story.
Corey
What about sending your nudes to Google Maps? Wait, that's a different article. That's a different article.
John Strand
Going to do it. That's what you need to do.
Derek
What do you think the chances were that it was vibe coded?
Corey
Well, okay, so from what I understand, the first breach was, like, an exposed database on the Internet. Classic vibe coding move. Because, like, you didn't specifically ask me to make it secure, and you didn't ask me if I should delete the whole database either. So it's not my fault. But I think, like, my question is, these kinds of apps, I'm almost like, who is this made by? Is it made by the creepy guys? Like, who founded this? Is this just a, like, data collection honeypot? Like, what even is this app?
Derek
That's a really good point, Corey, because, I mean, why would you. You found an app on the App Store.
Corey
Like, why would you try it? It said it's for women's safety. I know, right?
Derek
Why would you trust it? Like, I guess maybe I'm just not a trusting person. But, like, I. I probably wouldn't upload my driver's license just to some random app that I, you know, on my phone.
Corey
Yeah, I mean, it definitely seems a little bit, I don't know. It's tough, because, I mean, this is a great example of why these apps should be approached with caution. Especially when you're talking about, from a security context, an app whose whole intention is to collect sensitive information. It probably shouldn't really exist.
John Strand
I still come back to, this goes into the advertisement thing that we get into, and it's like, any of the stuff that's being captured needs to be treated as PHI. Like, if people are using ChatGPT for therapy, then that data needs to be PHI. If you're collecting a whole bunch of, like, little indicators and trackers on all the websites I go to, to give me advertisements, that needs to be PHI. If you have an app where I'm literally talking about abortions and possible rape, all of that should absolutely be PHI. And I think that that's one of the big problems that we're running into. People can create these apps, they can do this ad tracking, they can do all of this stuff, but as long as it's not associated with a hospital, it doesn't need to be protected under HIPAA. So the restrictions, the guardrails, the penalties associated with this are far lower. PHI, not PII. Like, I'm saying PHI because, like, meaning subject to.
Corey
Regulations and all that stuff.
John Strand
Subject to the same regulation as HIPAA. Because if you know somebody's search history, if you know what websites they go to, you have a better understanding of their psyche than probably their therapist, if you're an app like this. And like I said, they're talking about some incredibly sensitive things that should be PHI. I think that solves a lot of problems really quickly, because now all of a sudden we can't market and make so much money with data brokers on all this information about us that is literally just used to get us to buy better toilet brushes. We need to get this stuff corralled so it's being treated properly. Because I'd be willing to bet, for a lot of these women, if you said, hey, do you want your data on Tea released to the public, or would you rather somebody have access to your medical records, I'm willing to bet they'd be like, yeah, my medical records, go ahead and do that.
Corey
Yeah, go ahead and publish those.
John Strand
It's a horrible choice, but I'm going to choose medical records in that situation.
Derek
I mean, if that passed, probably, you know, half of the App Store would be deleted, right?
John Strand
Yeah, that's fine with me.
Corey
Hell, let's go back.
Derek
There'd be an economic impact to those companies, is what I'm saying.
John Strand
Because God forbid some billionaires don't make another billion.
Derek
I'm not disagreeing with you I'm just saying what would happen.
John Strand
No, you aren't. Yeah. So this is never going to happen, right? This is never, ever going to happen.
Derek
Too much money involved.
John Strand
Yeah.
Corey
Yeah. And I mean, I guess people are, like, salty about the fact that it's an app that's designed to dox people. That's just every social media app, if we're being honest.
John Strand
Somebody in chat brought up, they said, didn't John Oliver buy the browsing histories of two members of Congress and then threaten to release them if they didn't make it illegal? I don't know if anything ever came of that, did it? I just think that was the end of the episode.
Corey
And yeah, this is too good of a segue. We have to take this segue into the article about the startup selling info stealer data to debt collectors. And I hope everyone has their bingo cards out, because we're here in info stealer corner, so we're.
John Strand
Hitting a lot of those markers out here real quick.
Corey
Yeah. So, okay, basically. Big sigh.
John Strand
You know it's bad when Corey has to do a deep sigh.
Corey
I'm sorry. Okay, so there's a startup, there's a startup where it's clearly an under-30 dude posting most of their marketing stuff. But anyway, basically their whole goal is, we're going to sell info stealer data to debt collectors. Which, I just can't think of how you could come up with a business model that is more predatory than that. It's like, okay, can you make it worse? Can you sell, like, guns to kids or something? Like, can you make it worse? This is, it's terrible. They're very, like, self-aware that it's terrible and seem to be leaning into it. But the logic here, from my perspective, is debt collectors, known as the nicest people in the world, constantly calling you, bothering you, whatever they do is bad. Now they're gonna have access to your browsing history, your cookies. It's mind-blowing to me.
John Strand
All of that stuff that John wants to make PHI, all the stuff.
Corey
Yeah, basically, all the stuff that John says should be PHI, they're trying to sell it to debt collectors. I will say, before the article, it's kind of not really an article, because I would bet money that a lot of these debt collectors already have access to this data and already are using it. Like the whole industry.
John Strand
Name of the company. Really?
Corey
Yes.
John Strand
Like from Futurama? Farnsworth?
Corey
Farnsworth Intelligence, started by a 23-year-old. Exactly what you would expect.
John Strand
We're all going to die.
Corey
You nailed it 100%, I'm pretty sure. So basically, I would guess the world of skip tracing is extremely skeevy. That's what they call it, skip tracing. It's basically debt collection. The world of it, it is regulated. I, like, had Claude write up a big report on how regulated it is. There are limits on how much they can call you and what data they can access. But they already have access to, like, all your financial records, your Social Security number, everything: where you lived, where you worked, your manager's, your supervisor's name, all that stuff. I also am just blown away by, like, the concept of someone calling you up and being like, hey, you bought a Call of Duty DLC last week, so you need to start paying your debts. Like, what are they going to use this data for? I don't know.
John Strand
It's just, I think that that gets to the point. It's like, the debt collectors want to use absolutely everything at their disposal. Because isn't it like, if they collect the debt, they get a percentage of it, like 20% or something ridiculous? Yeah.
Corey
And the data is from info stealers. Someone asked in Discord about info stealers, which, if you don't know, we have a blog about it. It's basically malware that runs on your computer and harvests your browser profile.
John Strand
Yay.
Corey
So yeah, it feels like the opposite of Have I Been Pwned. It's like if Troy Hunt just went rogue and all of a sudden just released.
John Strand
He's like, you know what? I like money, and lots of it. I mean, but, okay. So with the info stealer logs, like, look, Corey, whenever we're doing this for continuous pen testing, we're working with Flare. We know that this is kind of a gray area, right? We're using this data to emulate bad people, to try to protect our customers from what is out there, right? That's what we're trying to do. And we think that we're on the right side of that, because of the Department of Justice and their charging guidelines around, what is it, good faith computer security research, right? We feel like we're there, right? We're doing good faith computer security research. We're finding out what the info stealer logs have, working with people, gaining access to that to make the security better for our customers. Okay, so take that good faith security research Department of Justice memorandum, and now apply it to these guys. Right? There's no good faith here. They're buying info stealer logs. They're buying data that was acquired illegally. And there's zero good faith, unless we want to say that it's the good faith of the companies that are owed money, and that it's okay at that point to do whatever you want to debtors. We're getting into some really dark waters at this point.
Corey
I fully agree.
John Strand
Servitude.
Corey
Yeah, yeah. Really?
John Strand
Servitude at that point. Yeah. You have no rights until your debts are paid.
Corey
Well, so. And this is where, you know, like, I read this article and I was like, this should be illegal. And then I was like, well, that would make what we do illegal.
John Strand
I don't think that's.
Corey
Well, yeah, the good faith thing, but that's more about prosecution than legality. Right. But basically, this is super sketchy. I hope this company dies in a fire. And go to the Discord to see a bunch of hilarious jokes people are making about what a debt collector would say if they called you with your info stealer data.
John Strand
I think it should be illegal, just as long as you're not wearing really cool sunglasses and a big mullet. If you look like Dog the Bounty Hunter, then it's like, okay, this is okay, this is fine. Like, we gotta make sure that they look right.
Corey
The. The problem is.
John Strand
Mom, mom, don't come in here. I'm doing security research.
Corey
The problem, really, I would say, is that even if they make it illegal, well, debt collectors are supposed to use ethics, and they already have access to a massive amount of information. So this is unlikely to, like, it's not going to get banned. They already have access to way more than they should, and that's totally legal. If you say you're a skip tracing company, you can get access to, I mean, more or less everything about a person. I know this because we've looked into it.
John Strand
Just so we can get access to the same data sets, I mean. We're going to be one, but not a very good one, is what we're planning.
Corey
We just provide fake information to everyone that requests any information except for us.
John Strand
Yeah, well, we had that one vendor that we were using, and we're like, we're only using it for good purposes. And they're like, we're revoking your access. I'm like, what? Yeah, this is only supposed to be used for the skeeviest of stuff. Like, you guys can't be doing anything good.
Corey
It's like, yes, that's 100% exactly what happened.
John Strand
Yeah, I can give you the background on it, Derek, but that's almost how it went. Like, they're like, what are you guys using this for? I'm like, we're using it for the purposes of good, not evil. And they're, like, revoked.
Corey
Yes, correct. You can't do that. Yeah. So a couple of questions from Discord. Someone said, if debt collectors can get this data, who else can? Hackers. That's why we do it. Because anyone that can access the dark web can get access to this data. But yeah, if you're concerned about it, call up Flare. They'll help you.
John Strand
Yep.
Corey
Let's talk about.
John Strand
By the way, I want to do something completely, like, ADHD. Skip tracing is one thing. There's a great movie from the 90s called Skipjack, with Emilio Estevez, and Mick Jagger is in it. And I think Rene Russo. You should go watch that movie. It's a horrible movie, but it's so much fun. It has nothing to do with this at all, other than the fact that it has the word skip in it. So, ADHD moment complete. Corey, carry on.
Corey
Thanks, Skip. Yeah, I mean, I think we should talk about the Clorox lawsuit, because it is fascinating.
John Strand
Which one? Oh, the Clorox one. Yeah. Yeah.
Corey
So, okay, so this is a really interesting one that hits very close to home, because we've talked about help desk social engineering ad nauseam. It's almost up there with info stealers, right? This has been, like, a tactic that has been so close to us for many years. You know, Alice has given talks about it at Wild West. It's in our zine. There are, like, multiple articles about help desk social engineering. But yeah, so basically, this is a recent lawsuit. Clorox filed a $380 million lawsuit against Cognizant Technology Solutions, who I hadn't really heard of, but it turns out they're massive. They had $20 billion in revenue last year, and it says they have 336,000 employees, which is like, holy crap. Here's the scenario. They have Cognizant as a third party IT help desk. And here's, you know, this is the complaint, so this is the claim. Here's the conversation: I don't have a password, so I can't connect. Okay, here's a password. All right. Yeah, sure. And then they just provide the password. So basically the lawsuit is, Clorox hired this company to do their help desk, and the company led to a breach, because Scattered Spider called the help desk and the company was like, here's the passwords, here's the users, et cetera. And now it's going to go to court to see what the different agreements were. To me, it's like, if I purchase a service and they just hand out my credentials, you know, I'd be pretty upset too. But then it comes down to what the agreement in place was. I'm curious to see how it plays out. Like, were there SLAs or agreements that people weren't following, or processes and procedures in place? There has to be something, right? When you sign up for one of these services, they've got to ask you, like, how do you want us to validate users? I don't know.
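For what "validate users" could mean in practice, here's a minimal sketch of a reset flow with verification in front of it. Every helper here is a stand-in invented for illustration, not any real service desk product's API; the point is only that a reset never happens on a caller's say-so, and the real employee always finds out that one occurred.

```python
# Hypothetical help-desk reset flow illustrating the verification step the
# complaint says was skipped. All helpers are stand-ins, not a real product.
import secrets
import string

def verify_callback_number(employee_id: str) -> bool:
    """Stand-in: call back the number on file in HR, not the caller's number."""
    return False  # default deny until a human confirms

def verify_manager_approval(employee_id: str) -> bool:
    """Stand-in: manager of record confirms the request via a second channel."""
    return False

def reset_password(employee_id: str) -> str | None:
    # 1. Identity proofing: two independent checks before anything happens.
    if not (verify_callback_number(employee_id) and verify_manager_approval(employee_id)):
        print(f"escalate: could not verify reset request for {employee_id}")
        return None
    # 2. Issue a random one-time credential; never read back a standing password.
    alphabet = string.ascii_letters + string.digits
    temp = "".join(secrets.choice(alphabet) for _ in range(16))
    # 3. Notify the real employee out of band, so a fraudulent reset gets
    #    noticed immediately (stand-in print for whatever channel you use).
    print(f"notify {employee_id}: a help desk password reset was performed")
    return temp

if __name__ == "__main__":
    assert reset_password("E12345") is None  # unverified caller gets nothing
```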
John Strand
My concern about it, though, is more in line with kind of like South Park. Did you read the EULA? Right?
Corey
Yeah.
John Strand
If you read the EULA from Microsoft, or you read it from Apple, I can't remember, there was a picture that showed the end user license agreement for many of these different companies, from Google to Apple to Microsoft, and it, like, drops down the wall and rolls on the floor by, like, 10 feet. A lot of these people, they have so much indemnification built in there to protect them from this. Like Microsoft. My cyber law professor, he told me, like, literally, if Microsoft goes sentient and takes all the computers and turns them into things like Maximum Overdrive that start attacking us, their contract has them indemnified from that particular situation. I can't imagine that Cognizant is any different in that situation. They've got to have layers upon layers in this contract agreement that says, hey, it's not our bad if we do bad things. Now, all of that aside, it's still kind of scary when you're going up against these huge companies that have this much data.
Corey
I'm, yeah.
Derek
I'm stuck on that. They could just give out the password? Why was it in clear text in storage? I don't understand.
Corey
Well, they could reset MFA for users, they could send MFA.
Derek
Fine, but, like, you shouldn't have the password to hand it out.
Corey
No, no, no. They probably just reset the password and then provided it. That's usual.
John Strand
But Derek's question and the reason why he's asking that is because that crap happens. It's like, oh, well, I got it over here.
Corey
Because he's a pen tester. Yeah, that part.
Derek
Well, I was gonna say that's a finding, you know.
Corey
Yeah. I mean, I kind of give them kudos for taking the hit on this. It seems like kind of a public lawsuit. It's kind of more of a screw-you-buddy move. I'm assuming, like John said, there's probably enough indemnification and, like, SLAs or whatever that it probably won't go anywhere. But I do like that they're like, oh, you caused us to get breached? Well, we're gonna drag your name through the mud, at least, and make people aware of what you screwed up.
John Strand
But this company and what they do, like, this company is in everything. Like, where are you gonna go? It's like if you get mad at Microsoft and you're like, screw you, we're going over to Google for our company.
Corey
I mean, yeah, you're not wrong. But I guess what I would say is, if you're a company out there and you see this lawsuit, you're going to go to Cognizant and you're going to say, hey, y'all aren't doing this, right? Are we safe?
John Strand
Hi, my name is John Strand. I'm the mock CEO of Cognizant, and I want you to know that in light of the recent breaches, we're doing absolutely everything we can to protect your data, because your data and your privacy is of the utmost, like, importance for our company and every single employee. You know, some stupid memo like that, right?
Corey
It's so much worse, dude. Here's what. Here was their response. In an email statement, they said, we didn't get hired to manage cybersecurity. We were hired for help desk.
John Strand
Okay, that is worse. But it's at least honest. They're at least going to be like, yeah, that's not our thing.
Derek
It's outside of the scope of work.
Corey
Well, okay. And people are mentioning in Discord, it's kind of a two way street here, because maybe this procedure was provided by Clorox and it wasn't secure, but they still followed it. Well, it's like, well, you have to vet your customers. If you're one of these help desk companies, you have to make sure that they're doing things right. Like, don't take just anyone as a customer. Although, I mean, that's anti-capitalist, so that's not gonna happen.
John Strand
But I don't know. I don't know. And, Derek, I guess we can talk about it a little bit. A number of years ago, we had an assessment where something very similar happened to BHIS. We were testing Company A, our customer. We did the rules of engagement, the scope, we did all the things, and we attacked their help desk portal. And it turns out that their help desk portal was being managed by a very, very large MSP. And we ended up taking over said MSP. We didn't pivot, like, it was literally, once we got access, Derek calls me up and he's like, dude, we're on a system that has access to, like, a thousand more companies. What do I do? Right.
Derek
Yeah. Yeah. The way it worked was there was a way to, like, create a help desk ticket through a web portal. And it was at the end of a red team. So, I mean, you know, all the bad tactics were on the table. You know, you get to be evil. And so I weaponized, this is probably 2017, 2018, I weaponized an Excel file. I mean, I should have just called and asked for the password, I guess, but it.
John Strand
Would have been easier. Why are you making things.
Derek
Why was I making it hard?
John Strand
And yeah, they ran the same play as these people from Scattered Spider. That's what I'm taking from it. And shout out.
Corey
They're all going to go to jail.
Derek
Actually, Unicorn, I think, back in the day, to make the payload. But yeah, I got a shell and I moved to another system. You know, basically got, like, local admin, moved to another system, started doing some recon, and was like, wait a minute.
John Strand
This doesn't look right.
Corey
Wrong domain. Oopsie.
Derek
Committing felonies over here.
Corey
I mean, the funniest. I don't know if this is true, but someone in Discord said apparently Cognizant did have the incident response retainer, which would be like, oh, this is awkward, but I guess we caused you to get breached, so we're going to help you clean that up.
John Strand
Yeah. Oh my gosh, that is so. That's.
Corey
I don't know. Anyway, that's it.
John Strand
Our customer was excited, right. And I think kind of what I wanted to get to at the end of the story, Derek, is, if you remember, we were terrified of being sued by the MSP. Right. I had you record a video of you backing out, cleaning up absolutely everything, so we could send it to them and all of that. Like, this is everything that we do, step by step by step. We backed out. The customer was ecstatic. They thought it was really cool. They're like, wow, great job, red team. And the thing that gets me still to this day, and I sometimes lay awake at 3 o'clock in the morning, is how completely blasé the MSP.
Derek
Was like, oh yeah, they didn't care at all.
John Strand
They didn't care at all. It was basically like, oh, thanks.
Derek
Yeah.
Corey
I remember being on a conversation about.
John Strand
What are we going to do to filter Excel spreadsheets coming in? Like, nothing. They didn't care. It was just like, thank you, that's neat. And even our customer, they changed nothing. I think that was the most terrifying thing for me, was just this company, as big as they were, they had no poos left to give at all.
Derek
Yeah. I mean, if I was a real threat actor, oh my God, that would have been just a gold mine.
Corey
Yeah, yeah. I mean, this goes back to, the fix for this is the same as the fix for every other SaaS or third party breach we talk about, which is, if you're using these kinds of products, make sure you get a pen test or, you know, even justification from.
John Strand
A reputable company.
Corey
Yeah. Like, understand the risk. I guess back in the day it wasn't really understood how much of a risk this is, but nowadays I think companies really do understand that your help desk is your front line. Like, EDR and help desk are going to get hit every hour. Every hour, maybe. Yeah. Anyway, so what else we got?
John Strand
We want to talk about businesses banned from paying ransomware.
Corey
Yes, we should. 100%.
John Strand
Oh, God.
Corey
So, Britain. Basically, the article is Britain banned most businesses, including all public entities and any publicly funded companies, from paying ransoms, and I guess other companies, like private companies, have to go get approval to pay the ransom. Yeah. Basically, the UK banned paying the ransom. What? Like.
John Strand
Yeah, yeah.
Corey
Why? I'm very confused. Especially ironic because Scattered Spider is a predominantly United Kingdom based ransomware group. I mean, they did arrest some people. I don't know. I wanted your take. Like, John, is this a terrible idea?
John Strand
We've talked about this a number of times, and the balance in this is just so painful. Right. If you look at it from a high level, it's like, yeah, we don't negotiate with terrorists. We don't want people to ever negotiate with terrorists. The more we pay the ransom, the more ransomware is going to happen. Right. And I get that. And that's easy to say if you're sitting up in a high castle. Right. You know, oh, well, we don't negotiate and we'll never pay. But the reality of the situation is, when your company is hit, that's life or death for your business. That's impacting the lives of all of your employees. It's not that clean. Right. And I think it's really easy for people to get into that position if they've never had to deal with actually being a victim of this. So I totally understand where they're coming from and what they're trying to do, but I think it's just going to drive the payments underground. And my major concern about this is, what's going to happen. Yeah, somebody just said paying may not save you. What's going to happen with forensics firms, right, that are trying to help deal with these situations? How does this impact insurance payouts, if basically insurance companies are like, well, according to the law, we can't pay? Where are we headed as soon as we start getting into this? Now, once again, I understand the other side. I totally understand that if we pay out less and less to these things, then the attackers are, quote unquote, going to just do cybercrime less. And I personally believe that that's a bunch of shit. I think that they're going to get people to pay. It's all going to be under the rug. You're going to have insurance companies that aren't involved. You're not going to have good IR practices in place. You're going to have companies that get hit more than one time. I think it's going to get real ugly, trying to adhere to some kind of lofty ideal rather than making it more nuanced. So, yes.
Corey
Yeah, I totally agree.
John Strand
Nerf Blaster said, I am sure insurance companies love the idea that it's going to become illegal to pay. And that gets into another point with insurance companies. There are a number of insurance companies that are literally walking away from cyber policies because they just cannot predict it. They can't charge correctly for it, and they just want it to go away and not have to deal with it. And I wonder how much the insurance companies are behind this. You know, we'll help you get back up and running, but we ain't paying ransomware anymore.
Corey
I kind of disagree. I think insurance companies would love nothing more than to pay the ransom. That's the easiest way out of a scenario like that. If you're a cyber insurance company and you're approached by someone who has a cyber insurance policy and they're looking at weeks or months of outages, and you have to cover the cost of their lost business, you'd rather pay the ransom and get them back up and running next week or tomorrow versus having to spend months recovering servers.
John Strand
I think logically, yes. But if they can just quickly say, hey, well, we're not going to pay because A, this is illegal and B, clearly you weren't in compliance with, you know, the core level of computer security standards. But this also gets into.
Corey
Yeah, but they still have to pay for the businesses in the United States.
John Strand
This is in the EU. This is in the UK. It's not the EU. The UK is separate from the EU. Never mind. But it's in the UK. But I can see executives at insurance companies, Corey, being like, screw it. We just don't want to pay.
Corey
And I mean, yeah, we'll see how this pans out over time. Basically my take on this is the companies who are going to find out that this is illegal are going to find out after they get hit by ransomware. There's no company out there that says, well, instead of building a defense against ransomware, we're just going to pay the ransom. That isn't a thing. The companies who are paying the ransoms are the ones who are caught totally flat footed. They get hit, right? And then they're like, oh, well, we have to pay the ransom because it's our only choice.
John Strand
Well, here's another question. Since this is a UK based law that's going into effect, with so many companies being multinational, I see this affecting more of the small businesses than a.
Corey
company who has an office in London and then has an office in.
John Strand
France and an office in the US and an office out in Shanghai, they're.
Corey
just going to pay from another country.
John Strand
Where they're not going to get affected? Yeah, well, okay, okay. But that still goes into, I don't think it's that clean. Right. It could be that they pay through another country where it's not affected. But if this starts rolling out through insurance companies in the UK and it proves successful, it's going to get rolled out internationally very quickly with other insurance companies. And then it's also possible, with multinational corporations, that. I don't know how the law is written. It may be illegal to, like, pay out through Singapore. You know what I mean? I don't know. I guess what I'm saying is job security. More job security, everybody. We're not going anywhere. Anybody that's telling you that AI is coming to replace all of us is wrong. But things are not clean yet.
Corey
Yeah, I mean, it basically forces companies to be secure. It forces companies to have no other choice.
John Strand
Selfishly, it's going to force more companies to get pen tests. Right.
Derek
I'm okay.
Corey
The problem is the companies that get hit and pay the ransom aren't getting pen tests, in my opinion. I mean, I could be wrong.
John Strand
It's going to ratchet up the FUD factor for every other organization.
Corey
True. You're right. I mean, you're right. But I'm like, not like this. Never like this.
John Strand
I agree.
Corey
Yeah.
John Strand
It's like if we came up with a business model of taking infostealer logs and then giving that to debt collectors. It's like, this would make millions of dollars. An investor in that would have to look at it and be like, not like this.
Corey
Like, yeah, wrong.
John Strand
Like, no.
Corey
I will say, though, you know what? It's still better to do that as your startup instead of just using it for straight up crime.
John Strand
That's true.
Corey
It's still one tick better than just using it to steal people's crypto wallets.
John Strand
Oh, no. Oh, yeah. All right. So I guess one of the things I'm taking from this episode is we can't blame Bronwyn. Usually Bronwyn's on the show and she talks about something and I'm like, that's just very dark, Bronwyn. I'm blaming you for bringing the show down. And we managed to get here, in the gutter, in the really dark place, without Bronwyn. So it's clearly not her. Right. So that's kind of disconcerting.
Corey
Maybe, in other news, we banned FUD.
John Strand
We did.
Corey
We banned it. It's illegal. Can't do it.
John Strand
I didn't do it. She's still here in spirit.
Corey
So what else we got? We're running close on time. We have probably time for one, maybe two articles, depending on if anyone's got anything spicy. Can you talk about the SharePoint thing? I mean, basically you were in the.
John Strand
depths of it last week.
Corey
Yeah, I mean, just to follow up. I got some DMs from some people who were working incidents, and I appreciate all those who shared information with me. Basically it seems like there are multiple PoCs out there now that are being used by nation state threat actors, I guess. John, did you follow how the disclosure went at all? Like the PoC and all that stuff?
John Strand
The PoC was released, and I know that we were talking about using it for some continuous pen testing customers, but there's always a little bit of nervousness when something like this hits and it's like, here's PoC code. And it's like, oh God, that's gibby.
Corey
Well, there was. So basically the researcher didn't really release a full PoC. There were some important omissions from the PoC that probably lessened the impact of this being weaponized, which I think people are perceiving as kind of a good thing. It was an interesting scenario where the information required to make it feel critical and important and worth patching was there, but the actual damaging exploitation detail wasn't. Basically, I think the exploit just published a list of all the machine keys, but it didn't really explain how to use them. You know what I mean? So it's a partially complete PoC. And that was good. I think it was, hey, this is a big deal, but I'm not going to give you the full explanation of how everything works. You're going to have to go that extra mile. I don't know, it's just interesting to see how a disclosure like that should go. There are so many different ways to do it. I think this was done pretty well: publish it, but don't give the full post-exploitation help and context and all that stuff.
John Strand
Literally you just made it a high level CTF challenge at that point basically.
Corey
Yeah, correct.
John Strand
With a couple of hints to get you in the right direction.
Corey
Yeah, pretty much. But still better than a fully functioning thing that just deletes the whole server or whatever.
John Strand
You know, one of the things that I kept wrestling with as I was talking to a number of people: they're like, why the hell is anybody running on prem SharePoint? And it's just like, legacy organizations that have massive SharePoint installations can't just magically switch to the cloud. And of course people are like, well, there's tools that do that. It's like, oh God, it's not that easy. It's kind of like the on prem Exchange thing, right? I can't remember who it was that tried to patch and update an on prem Exchange server, and they documented how difficult the process was. And this is where, once again, I think this ties into the UK thing. It's like, well, we don't pay ransomware. That's an easy sentence to say. You just need to patch your on prem Exchange servers. That's an easy sentence to say. You just need to migrate all of your SharePoint to the cloud. That's an easy sentence to say. The reality of the situation for all three of those things is it tends to be a lot more difficult than that. So many organizations just don't make that transition.
Corey
And this is also totally Microsoft's bread and butter: legacy products with really specific features that a very specific set of companies need. That's how they make their money. That's kind of their whole business model, or at least their legacy business model. Nowadays they're more in cloud services and a bunch of different spaces, but back in the day, that was all they had.
John Strand
Teep said, tbf, our on prem Exchange has had less downtime than 365. Don't listen to Teep. You need to stop using on prem Exchange, everybody. So I'm sure that you got.
Corey
I do get chickens. This podcast is sponsored by the Chicken Farmers of America. Got chicken? I don't know. I just made it up.
John Strand
It probably is. But, yeah, we like chickens. You love chickens. Delicious white meat and eggs. There you go. Chickens. I don't know how we got on.
Corey
Great job, John. I'll send you that check after.
John Strand
Yeah, yeah, exactly. We're just preparing for the inevitable AI apocalypse where we all need to feed ourselves. And chickens can be useful for that. So go get chickens, everybody. Piss off your neighbors. All right, later, everyone. Thank you so much for coming.
Corey
Bye bye.
Podcast Summary: Talkin' About [Infosec] News, Powered by Black Hills Information Security
Episode: UK Bans Ransomware Payments
Release Date: August 1, 2025
In this episode of Talkin' About [Infosec] News, hosted by the Black Hills Information Security team, hosts John Strand, Corey, and Derek delve into several pressing topics in the information security landscape. From AI-driven security risks to alarming privacy breaches and regulatory changes, the discussion is both comprehensive and insightful. Below is a detailed summary capturing the key points, discussions, insights, and conclusions from the episode.
The episode kicks off with a candid discussion about the burgeoning use of Artificial Intelligence (AI) in coding and its associated risks. The hosts express concerns over the reliability and security of AI-generated code.
John Strand highlights the inherent vulnerabilities, stating, “I'm not surprised that people are mad that LLMs deleted production databases” (06:31).
Derek adds, “Why is this a surprise to anyone?” (06:42), emphasizing that Large Language Models (LLMs) are trained on existing insecure code, making such outcomes predictable.
Corey humorously suggests incorporating safeguards: “So basically what you’re going to want to do is you're going to want to put please don't delete everything in every one of your prompts and then you should be good” (15:19).
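Corey's joke gestures at a real control: guardrails belong in the tool layer, not in the prompt. As a minimal, hypothetical sketch (the function names and deny-list below are illustrative, not any particular agent framework), a gate on destructive statements might look like this:

```python
import re

# Hypothetical guardrail: enforce "don't delete everything" in code,
# not in the prompt. Names and the deny-list are illustrative only.
DESTRUCTIVE = re.compile(r"\b(DROP|TRUNCATE|DELETE)\b", re.IGNORECASE)

def run_sql(statement: str, execute, allow_destructive: bool = False):
    """Run a model-proposed SQL statement through a safety gate.

    `execute` is whatever callable actually talks to the database.
    Destructive statements are refused unless a human opts in.
    """
    if DESTRUCTIVE.search(statement) and not allow_destructive:
        raise PermissionError(f"Refusing destructive statement: {statement!r}")
    return execute(statement)

# The model can promise to behave; the gate doesn't read prompts.
run_sql("SELECT * FROM users", execute=print)
# run_sql("DROP TABLE users", execute=print)  # raises PermissionError
```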
The hosts turn their attention to a concerning publication in the MIT Technology Review about the CommonPool data set, a vast collection of 750 terabytes of images scraped from the internet, inadvertently containing sensitive information like passports and credit cards.
Corey expresses skepticism: “I was looking at it before this podcast. The large dataset is like 750 terabytes of images. So yeah, there’s gonna be some bad stuff in there” (17:16).
Derek reassures that while the dataset contains sensitive information, measures are in place to prevent misuse: “If you go to one of these models and you say, you know, give me Jane Doe’s credit card number, it’s going to say, you know, I can’t do that” (17:16).
John Strand probes deeper into the implications for data governance and access control, questioning how sensitive information can be effectively filtered and protected in such expansive datasets (20:01).
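On the filtering question, one common building block is flagging likely payment card numbers before scraped text ever reaches a training set. A minimal sketch, assuming a regex candidate scan plus the Luhn checksum (illustrative only; a real PII scanner covers far more than card numbers):

```python
import re

# Candidate card numbers: 13-19 digits, optionally space/dash separated.
CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,19}\b")

def luhn_ok(digits: str) -> bool:
    """Luhn checksum: true for most valid payment card numbers."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def flag_card_numbers(text: str) -> list[str]:
    """Return substrings that look like valid card numbers."""
    hits = []
    for match in CANDIDATE.finditer(text):
        digits = re.sub(r"\D", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_ok(digits):
            hits.append(match.group())
    return hits

print(flag_card_numbers("card on file: 4111 1111 1111 1111"))  # test number
```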
A significant portion of the discussion focuses on the breaches of the Tea app, a platform aimed at providing stalker protection for women. With over 4 million users, the app has been compromised twice, exposing sensitive user data.
Corey underscores the severity: “If the goal of it is like, am I going to get murdered if I go out with John Smith? Then like it needs to have good security” (27:08).
John Strand connects this issue to broader concerns about data privacy, emphasizing that highly sensitive information should be treated as Protected Health Information (PHI) to ensure stringent security measures (30:14).
The hosts explore the emergence of startups engaging in unethical practices, such as selling infostealer data to debt collectors—a business model they critique as inherently predatory.
John Strand remarks on the immorality of the practice: “It's like, can you make it worse? Like can you, can you sell like guns to kids or something?” (33:12).
Corey concedes only that such a startup is marginally less bad than outright theft: "It's still better to do that as your startup instead of just using it for straight up crime."
A notable segment covers a lawsuit filed by Clorox against Cognizant Technology Solutions for a breach resulting from help desk vulnerabilities managed by a third-party MSP.
Corey outlines the scenario: "Clorox filed a $380 million lawsuit against Cognizant Technology Solutions... it led to a breach because Scattered Spider called the help desk and the company was like, here's the passwords" (40:33).
John Strand expresses concerns over contractual protections: "They have so much indemnification built in there to protect them from this" (43:32).
Derek adds, “Why was it in clear text and storage? I don’t understand” (43:39), questioning the security practices that led to the breach.
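For contrast with clear-text storage, the standard alternative keeps only a random salt plus a slow salted hash, never the password itself. A minimal sketch using Python's standard library (production systems would typically reach for argon2 or bcrypt; the scrypt parameters here are illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a slow salted hash; store (salt, digest), not the password."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
assert check_password("hunter2", salt, digest)
assert not check_password("wrong-guess", salt, digest)
```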
One of the most significant topics discussed is the UK government's decision to ban most businesses, including public entities, from paying ransomware demands without approval.
John Strand elaborates on the rationale and potential fallout: “The more we pay the ransom, the more ransomware is going to happen” (50:17).
John Strand acknowledges the tension between principle and practice: "It's like, well, we don't negotiate with terrorists... but when your company is hit, that's life or death for your business" (50:35). Corey, meanwhile, questions the practicality of the ban, noting that the companies who pay are the ones caught flat footed, who will only discover the law after they are hit.
John Strand warns of the possibility that the ban may drive ransom payments underground, complicate insurance payouts, and impede forensic investigations: “It could be that they pay through another country where it’s not going to be affected” (54:07).
The hosts discuss recent revelations about vulnerabilities in on-premises SharePoint servers, including partial Proofs of Concept (PoCs) that could be weaponized by threat actors.
Corey highlights the balanced approach taken in disclosure: “The exploit just like published a list of all the machine keys but it didn’t really explain how to use them” (59:34).
John Strand frames the partial disclosure as raising the bar for would-be attackers: "Literally you just made it a high level CTF challenge at that point," with a couple of hints to point researchers in the right direction.
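Why published machine keys matter even without a how-to: ASP.NET uses the machineKey to MAC-validate ViewState, so whoever holds the key can mint blobs a server will accept. A generic HMAC sketch of that principle (deliberately not the real ViewState format; the key and payload are stand-ins):

```python
import base64
import hashlib
import hmac
import json

# Stand-in for a leaked signing key; real ASP.NET ViewState uses its own
# serialization, which this sketch deliberately does not reproduce.
LEAKED_KEY = bytes.fromhex("deadbeef" * 8)

def sign(payload: dict, key: bytes) -> str:
    """Serialize a payload and append an HMAC-SHA256 tag."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).digest()
    return base64.b64encode(body + tag).decode()

def verify(token: str, key: bytes) -> dict | None:
    """Return the payload only if the tag checks out under `key`."""
    raw = base64.b64decode(token)
    body, tag = raw[:-32], raw[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, body, hashlib.sha256).digest()):
        return None  # tampered, or signed under a different key
    return json.loads(body)

# Whoever holds the key can mint tokens the verifier will trust, which is
# why key disclosure alone is most of the battle.
forged = sign({"role": "admin"}, LEAKED_KEY)
assert verify(forged, LEAKED_KEY) == {"role": "admin"}
```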
The episode wraps up with reflections on the interconnectedness of the discussed topics, emphasizing the need for comprehensive security strategies.
John Strand connects the dots between AI risks, data breaches, and regulatory changes, reinforcing the necessity for foundational security knowledge and robust data governance: “Understand the core fundamentals... to get the best quality work” (15:19).
Corey and Derek echo the sentiment, advocating for vigilant security practices and cautious adoption of emerging technologies.
Notable Quotes:
John Strand (06:31): "I'm not surprised that people are mad that LLMs deleted production databases."
Corey (15:19): "So basically what you’re going to want to do is you're going to want to put please don't delete everything in every one of your prompts and then you should be good."
John Strand: "Chickens can be useful for that. So go get chickens, everybody."
Corey (53:34): "If you're a cyber insurance company and you're approached by someone who has a cyber insurance policy... you'd rather pay the ransom."
John Strand (50:17): "The more we pay the ransom, the more ransomware is going to happen."
This episode offers a deep dive into the multifaceted challenges facing the infosec community, blending technical insights with ethical considerations. By addressing current events and their broader implications, the Black Hills Information Security team provides listeners with valuable perspectives to navigate the ever-changing landscape of information security.