A
Like, it's what you'd expect Tomorrowland to actually be,
B
but it's just a bunch of drugged up people asking you where the toilets are.
A
They got, like, real rockets everywhere. All right? How dare you. They have a moon rock you can touch, like.
C
No, I know.
B
I'm saying that's what it isn't. I don't know what Tomorrowland is, but in my head it's like an electronic music festival. Is that not what it is?
D
Yeah, that's what I was thinking.
A
That is partially what it is, man. You guys have never been to Disneyland. Tomorrowland is like the.
B
No, we don't live in Disneyland like you do, you Californian.
A
I just assume you all know what Disneyland is.
B
All right, yeah, that's the California perspective. Just like Bronwyn's. Like, I traveled to another state and the weather wasn't as good. Okay, welcome to California and first world problems. Yes. That wasn't a shock.
E
Hey, and you know what? Truthfully, walking around in an open air sauna day in and day out is probably wonderful for the skin.
B
Except for the sun wrinkles your skin, so maybe not.
F
Yeah. And then the downside is that you're in Florida. That's the downside to that thing.
A
I feel like our typical Floridian isn't here to defend himself.
B
Yeah, he'll show up. You never know.
F
It's like talking bad about someone when they're not here, basically.
B
Maybe he joined. I saw a little join as John Strand. Maybe Ralph's just confused.
F
I just saw a silhouette that said
B
seven and it's like the Bill Gates mug shot. The Bill Gates mug shot. Does anyone remember that little text? Yeah.
D
Model had to warm up first.
B
Yeah. Yeah. John's off screen going, how now, brown cow.
C
And it's like, I've been teaching all day long. And now Zoom is gonna poop the bed. Like, great.
F
Better now than when you're teaching, I guess.
C
Oh, right.
E
Definitely.
B
Yeah. All right. What do you think, Ryan? We ready to launch off? I have show.
A
I have one question. I have a question for John. Wait, John, have you seen the new
C
Star Fox, the game? Is it out, or is it just a video trailer?
A
It's the trailer, but it's supposed to be out at the end of the
C
month because I've seen the trailer and it looks awesome.
B
Okay, dude, Half-Life 3 is going to be out this month. I'm just kidding.
C
Stop.
B
All right, roll the intro. Let's do it. Like how Hayden's just ventriloquisting me, I don't know what's happening. Hello, and welcome to Black Hills Information Security's Talkin' About Infosec News. It's May 11, 2026. Should we just call it Canvas Day?
C
Everyone's wearing it. It also might be WAZA Day as well, because they got a nice.
B
All right, I'll start with introductions. I'm Corey, I'm here to podcast with you all today. We've got Wade Wells; he's here to wade through some logs.
G
Yeah.
B
And I caught your webcast last week, Wade. It was great. I. I enjoyed it.
A
I was about to say, weren't you? You were there before I started. You better have caught it.
B
Like, I was there. I listened to it, and you, like, plugged me at one point during it. I mean, honestly, if you're here listening to this podcast right now, you should check out the recording, because it's very much what we do. Genuinely, pretty much what we do on the show. So nice work on that. Hayden, how's it going, man? Hayden runs... can I say you run the SOC? Or are you just Claude?
F
I'm one of the people running.
C
Yeah, one of the people that runs the SOC.
B
He's one of the people writing the Claude that runs the SOC. Writing the Claude.
F
I spend more time talking to AI Than humans. Yeah, that's fair.
C
Oh, I have.
B
Okay. I have legitimately noticed that since I've been using Claude more and more and more that, like, my carpal tunnel is coming back because I just spend so much time being like, no, you need to refocus.
C
You're.
B
You're focusing on, like, I hit, like,
F
a five-F-bomb Claude day on Friday, and I was like, it's time to end the week. I think this is not going well.
C
Like, carpal tunnel. Excellent. It's working. It's working.
B
Hayden, were you the one whose AI just, unprompted, decided to call you Meatbag?
C
No, no, that's fine.
B
That's Derek Banks. Yeah, Derek.
C
Derek called his AI Skippy, from the book series Expeditionary Force, and Skippy constantly calls the lead character Meatbag. So if you're ever seeing Derek's screen and it's like, well done, Meatbag, or it calls him, you know, an unintelligent bacteria, or a bacteria with promise.
B
Yeah, that's because that makes more sense. I've seen some screenshots of his chats, and I've always been like, are you next in the AI Uprising? Are you?
C
Why? Why Is your AI so abusive? Like.
B
Yeah, yeah. It's funny, though. We also have John Strand, the owner of Black Hills Information Security and, like, 17 other companies, including a coffee shop, I think, maybe.
E
Yeah, definitely needs an intervention.
B
And then we have Bronwyn, who we already heard from. She's back from HackSpaceCon.
E
Amazing conference. Definitely going to be keeping an eye out for the CFP call because I want to go back.
B
You went as an attendee?
E
I was invited to participate in a panel talking about GRC and business. And it was really nice, because we had a mixture of people on the panel, so I was able to speak from my experience of what I see on the pen testing side, and then we had a couple of CISOs. I'm not sure what the other guy was doing, but it seemed to be very well received. The audience had great questions, and it was genuinely an honor to be there. And so many people had so much love for this web stream, the weekly newscast, and all the other stuff that we do. I mean, it was great. It had a wonderful BSides kind of vibe. It was small enough that you got that cool intimacy, but they had four or five CTFs and everything from OSINT to packet hacking. This is the real deal. So, yeah, I will be going back to HackSpaceCon if at all possible.
B
Nice, recommended. I was going to give you, you know, kudos for going as just an attendee and having a good time, but instead you decided to contribute. So kudos redacted. But, you know, I'll allow it. Redacted, or retracted? Retracted. Let's let the record show no kudos were issued. Redact the record. Yeah, strike that.
E
Reverse it.
C
We also have it.
B
We also have our community members and webcaster friends. We have Chad Mustache, champion of the Discord.
D
Another happy LMS customer who's here.
B
Yeah, I was gonna say, who's here to talk about Canvas. I'm sure he just has, like, a freaking whole... it's just gonna be therapy for Chad. And then we also have Elite with us, who is here to hang out and plug her upcoming webcast and workshop. So if you are interested in doing some social engineering or pretexting or other, I mean, fake evil things.
C
Got to ask Elite the question, though, because I just kind of like to kick things off. Here we go. If you were a tree...
G
No, I was actually once in a play. Go on.
C
So here's the question. Do you think social engineering is going to become more important in, like, this next evolution of AI, AI defenses, AI offenses, social engineering being able to implement that? Because social engineering has always been in the ether, right? It's absolutely been in the ether. But I'm wondering if the explosion of AI everything everywhere is going to increase or decrease the necessity for social engineering. Like, people stay the same.
G
I would agree that it is going to become more important and probably more prevalent. I think if you look at the more recent attacks, most of them start with social engineering. Whether they highlight that in the news story about it or not. It's usually buried somewhere towards the bottom. But there's a phishing email or a voice call that happened. And being able to not only put together social engineering campaigns, but also being dynamic and able to think critically and pivot is something that AI isn't really capable of doing quite yet. So a lot of the things that we're automating are going to be easily circumvented, in my opinion, using more advanced social engineering tactics.
B
I mean, also, to pivot into the article of the day. Well, first of all, go to Elite's webcast. It's two days from now. Yep. And that one is about building a pretext, which you will need if you're going to be doing social engineering. You don't want to just wing it and be like, I'm sure this will work; I don't have any reason to call or any recon done, but I'm sure it'll work. And then Elite also has a workshop next week. I'm assuming it's just expanding on the ideas from your webcast, or how is it different?
G
Yes. So we're going to be starting with sort of a precursor to how to make pretexts, and that's in our webcast on Wednesday. And then at the end of the month, May 29, we will be digging much more in depth into pretext fabrication and how to put that into social engineering. And that syllabus is pretty robust for four hours. So we're going to pack quite a bit into that time frame.
B
Yeah, I mean, that's an insane value. Four hours for 25 bucks. I mean, do the math.
F
Yeah, that doesn't sound right. It can't be only $25, right.
G
Somebody made it.
B
That's crazy. So, yeah. So, I mean, honestly, let's just get straight into it, because we're talking about Canvas this week, a breach executed by Shiny Hunters, who I would personally say are probably the most prolific social engineering threat actor of this modern era, by far. They have done tons of social engineering attacks. I will say, with this one specifically, it's kind of unclear whether it's social engineering or not. Honestly, not a lot is known.
G
Yeah.
B
It seems like the answer is it's not, but also it's Shiny Hunters, so I just assume it is. Right? Like Elite said, they might not say it right away, but if you scroll down far enough, eventually it'll be like, by the way, social engineering.
F
Yeah. They normally gain access through some sort of social engineering, don't they? Yeah.
B
A hundred percent phishing. Yeah.
A
Didn't they claim that they had a breach and then they had another breach?
C
Yes.
B
Right.
A
Okay.
B
So Chad, since you've been living in this, and just for audience context, Chad works in the higher ed industry, and so he's been just living this day to day, and I apologize for that. Chad, do you want to run us through, like, high level kind of how it's been playing out from your perspective?
D
Yeah. Now's a good time to mention that I speak for myself and not my employer. The Krebs article is interesting because he. He goes over the timeline. He also has some insider sources. Krebs does. So it looks like there was an initial access, which Instructure detected and then took down, but they called it a maintenance window. So already it appears that there were
C
transparency problems. And an instructor detected it?
B
Instructure, not "instructor." Instructure is the company that owns Canvas.
C
I misheard. I misheard. Sorry, continue.
B
So, yeah, there's a lot of different moving pieces here, and Instructure and Canvas are two of the hardest terms to just say and remember in a conversation.
D
So, yeah, anyway. Yeah, good call-out. But they called it a maintenance window, and because the initial access vector wasn't known, when they came out of maintenance, boom, they were hit again. But instead of Shiny Hunters just negotiating with Instructure, it looks like Shiny Hunters put that pwned placard in front of 8,800 customers. So then the customers are like, what?
B
So then it's like the PowerSchool thing. It's exactly like the PowerSchool thing, where it's like, oh well, your parent company didn't go for the ransom demand, so let's try again.
D
Exactly, exactly. And so before, you had Instructure possibly negotiating with Shiny Hunters, but then some universities started reaching out to Shiny Hunters to do the negotiations. Again, it's all in the Krebs article, which is excellent. Toward the end of the Krebs article is something interesting, where they mention the UPenn breach, which maybe is associated; Harvard was breached at about that same time. So who knows if old credentials were exfiltrated during that breach and then just weren't rolled on the Instructure side. Again, when these big breaches happen, it takes time for the digital forensics and the incident response to happen, so we don't really get the full picture until weeks or months later, when we get a fully written out technical blog on this. So, yeah.
B
So real quick, Ryan, can you scroll back up a teeny bit? One of the things here that I want to call out is this specific paragraph right here, where it says, we also identified a vulnerability regarding support tickets in our Free-for-Teacher environment. I want to bring everyone up to speed on, like, conceptually how this works. SaaS products have different tenants and different subscription levels, right? This is how almost every SaaS product works. Like, I can probably go sign up for a Salesforce account with my personal email and do some stuff, but there's also some company that's paying Salesforce a billion dollars a year that has an enterprise tenant with all kinds of capabilities that I can't access. When they talk about their Free-for-Teacher environment, that is the lower tier environment: any teacher can sign up and use Canvas without it being an enterprise level deployment. So, reading between the lines on this, a lot of people have been speculating that this is a SaaS industry problem where there's not appropriate separation between different tenants and different environments. In the Free-for-Teacher environment, the attackers created their own accounts and somehow exploited a vulnerability targeting that environment. It could be, like, social engineering via context or comments, or something like XSS. They essentially compromised one tenant and then were able to bleed data from that into the enterprise environment, the actual real paying customer environment. That's speculative and hasn't really been confirmed, but that seems to be what a lot of people are thinking. But it's all assumption.
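B's speculation boils down to a classic multi-tenant bug: a read path that forgets to filter on the caller's tenant. Here's a minimal sketch of that failure mode; every name and record in it is invented for illustration, and none of it is from Instructure's actual code:

```python
# Hypothetical multi-tenant backing store: free-tier and enterprise
# tenants share one table, so tenant scoping lives in the query layer.
RECORDS = [
    {"tenant": "free-for-teacher", "student": "alice", "grade": "A"},
    {"tenant": "enterprise-u", "student": "bob", "grade": "B"},
]

def fetch_grades_vulnerable(requesting_tenant):
    # BUG: the query never filters on the authenticated tenant,
    # so a free-tier account reads every tenant's rows.
    return RECORDS

def fetch_grades_fixed(requesting_tenant):
    # Correct: every read is scoped to the caller's own tenant.
    return [r for r in RECORDS if r["tenant"] == requesting_tenant]

leaked = fetch_grades_vulnerable("free-for-teacher")
scoped = fetch_grades_fixed("free-for-teacher")
print(len(leaked), len(scoped))  # prints: 2 1
```

The fix is boring but absolute: in a multi-tenant SaaS backend, the authenticated tenant ID rides along with every query, ideally enforced once in a shared data-access layer rather than re-implemented per endpoint.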
C
It kind of feels random, though, right? Like, the way that they drop it in, and it's like, oh, the free accounts, we're getting rid of those. It's like, huh, why? Any reason why? What an odd thing to say.
A
This is. This is like one of the like blue teamers worst nightmares. Right. When you say everything is all clear
B
and then it's not. Yeah. So there's a bunch of write-ups, and different people have reported different things, and like Chad said, it's going to be months with these kinds of incident response things until we really get a postmortem that actually fits together. I guess. Wade, do you have any lessons learned, you as the incident responder? Or Chad, feel free to hop in as well. What do you think?
A
There's some really interesting stuff, like how you said it's a maintenance window, right? That is a really interesting thing. Like, you're not going to say "incident" right off the bat. Especially if it's nothing.
C
Nor should they. Nor should they.
A
Exactly. Nor should they. So that's perfectly fine. But what I'm thinking is, they thought it was something smaller and then it evolved into something bigger. So it's not like they got caught with their pants down type of deal, right? Like, we don't know the initial breach or how they got in. I think they timed this well. I think this was on purpose, during taxes.
B
Yeah. The timing of it is so brutal. Yes.
A
Which makes me think for other platforms, right? Like, I know Intuit has a very strict policy of no changes during tax season.
B
Right. Most industries have some kind of a peak season where they're extra careful target
A
during Thanksgiving type of deal. Right. Like, but how do people build that into their threat models? Like this is the time, the worst
B
time a breach could happen.
A
Yeah. Yeah. Like, build "when's the worst possible time a breach could happen" into your threat model. Because most of the time it's very much like what we see during the holiday season, right? Like, Russian Orthodox holidays fall on much different days than our holidays, so you see threat actors time things a little differently.
B
Yeah.
C
What was the ransom? Did I read it right that it was about a million dollars?
F
So, I don't think they were saying that. When they attacked, I think it was the school in Pennsylvania, University of Pennsylvania, in September. That was a million dollar ransom. I don't know.
C
But I thought for this one, I thought I saw it in the Krebs article: $1 million.
B
That might be the per-school price, but I guarantee the price for Instructure was much higher than that.
A
They're saying "negotiate your settlement." So they don't actually have it in the photo that we see,
C
but the reason why I ask is, man, if it was a million dollars, I'm surprised they just didn't pay that. But if it's per school, or it's negotiating, then yeah.
B
Do you think most higher ed places have a million dollars laying around to deploy for something like this?
D
No, but I'm glad you asked.
C
No, I'm talking Canvas. Like, if Canvas was hit, and they're like, oh, we'll switch it to the universities. Even though universities are just hedge funds with classrooms. But the quote that I stole from...
D
someone should negotiate a raise.
C
Yeah, you should actually.
D
No, but I wanted to make a comment here. If you were not using Canvas and your leadership is of the mentality, "we dodged that bullet, we're not a Canvas customer," use that as a lever. You know, we've needed an audit person for a while and we haven't had the political capital, or the budget, or we've had the technical debt. Whatever you need, a tool, a person, an audit, a pen test: use the tabletop exercise, use the fact that you dodged this
E
bullet as political capital. Never let a breach or an incident go to waste, even someone else's.
A
It's like you just watched a really good webcast about watching the news, right?
C
Like, I always like looking at that, because you get into security conferences and they look at companies and they're like, well, they were stupid. I remember, going back to the Sony one, the Target one, and even this one, there's already some chatter like, well, their security team must suck. And I think that's kind of an emotional, human way of trying to push what happened further away from you, so you don't have to acknowledge or work on things at your own organization. When the Sony and Target ones happened, you know, I got to know those teams fairly well, and yeah, they got crapped on a lot. But the thing that I kept saying, kind of talking about those breaches, was: as a pen testing company, the stuff that hit those organizations, we see all over the place. It is not unique. And we still have yet to see how they got breached here. It may be something really stupid, but even something like default creds happens a lot, even to organizations that are trying to do everything right. So I think we have to be careful, as an industry, about crapping on people that have been breached, because the vulnerabilities and stuff that nail these organizations are not exclusive to these organizations. They happen everywhere.
B
This is totally speculation, and I want Hayden to react to it. Here's my theory. All right, so: I am Shiny Hunters. I create a free account, a free-for-teacher account. I craft some kind of pretext, some reason why I need to get support from Instructure. I submit some kind of payload, whatever pretext I've come up with, to support. I don't know what it is, probably just a zip file with an exe in it, lol. The support person who's looking at this analyzes the thing that I've sent in, that payload detonates on their workstation, and a support workstation is compromised. And then, my theory is, that support person has super over-provisioned access to everything, so the blast radius is absolutely insanely huge. And Instructure has the management problem of: okay, the support person had access to basically everything in this company. We're going to take things down for maintenance, do a quick threat hunt, see what we can see, engage incident response. They just failed to identify some kind of access that the threat actors gained, whether it was a token or a credential, something they were able to harvest from maybe a session cookie, I don't know. And then basically they said, okay, all clear. And then Shiny Hunters was like, we have 17 API tokens for everything, we're just going to make it rain. Okay, for context for the audience, that's completely made up. I made all that up. None of this is confirmed by any sources. Hayden, poke holes in it. How realistic do you think that is?
F
I think it's very realistic because eradication is very, very difficult. Like there's a reason that a lot of companies, if a machine is compromised, they're just like, well, we're wiping it no matter what happens because we can't really guarantee that it's safe. But in the world of cloud environments and all these other API integrations and everything, there's so many different ways you could have a backdoor. We have Playbooks for different cloud compromises where there are lists of things that we make sure that we check because there's so many of them in so many different ways that we're like, we will never remember this just off the top of our head. We have to go check for X, Y and Z. And maybe they did, you know, H over here somewhere. And so I think that is very plausible. And especially if someone is like, has an account, they have some sort of like established trust and, and they're submitting in a support ticket. So there's not as much, you know, scrutiny there as if it's coming truly from the outside. So I feel like your, your proposed scenario, I would not be at all surprised if that's exactly how it happened. And maybe the solution is like, scan the attachments, like.
B
Yeah. Well, it also comes down to social engineering. Right. This is where they're known, they're known for being good at this and tricking people into doing things they shouldn't do.
C
I'm going to do a completely different one. It keeps sticking in my head that they're disabling the free-for-teacher accounts that are out there.
B
Yeah. Temporarily, they say.
C
I think that was the vector and.
B
Well, that's what I'm saying.
C
No, but I, I don't think that they got an endpoint. I don't think that they got an endpoint.
B
Okay.
C
I think what happened is, whenever you're working in an LMS, especially as a manager, there's management tools where you can see your users, you can see their progress, you can see all of those different things. And my belief is that they got access to a free teacher account, they gained access to a management dashboard, and they found a way to trick the system into dumping the records for everything.
B
Through a web app type of thing.
C
Yep, like a web app type of thing. Probably a web app. Or, a lot of these things have APIs, so you may not be able to access certain things within the web UI, but you can access them through the API. And once you get granted that token for a teacher-level account, you have full control. I'm guessing that for one of those tokens, the restrictions were not scoped specifically to just the students in their unit, and they were able to pull down everything. And that's the only reason why I'm saying that I don't think the desktop is involved in this. I don't.
B
I think they had CrowdStrike, so it's fine.
C
Yeah, they had CrowdStrike, so it's fine. You know, CrowdStrike, it's like hacker kryptonite. Oh crap, commercials now. So the fact that they just, apropos of nothing, said, we're temporarily disabling free teacher accounts, that screams to me.
B
Yeah, right. It's like, we know these can continue to be exploited and we can't do anything about it. Right.
C
Or, like, if I was CTO and they got breached through this particular mechanism, you know. Okay, let's back up. If you are running a university, and I can't remember how many thousands of universities are actually using this, but if you're
B
running a lot. They're super dominant. A lot, right.
C
If you're running a university as an administrator, you're probably doing two-factor, you're probably doing all these protections, you're probably a low risk for trying to hack the platform, right? But if they provide the same level of tools that exist for managing your students, your student groups, and all of these different things, and they provide that to free accounts, I think they're trying to wipe that out right away, to stop the attackers from gaining free access to the back end and the management portal on these things until they can have it looked at a little bit better. So I don't think it went to the endpoint. That's my theory. I think it was exclusively within the SaaS app. They found a way to move laterally and pull data, either directly through the web UI, or, I'm going to bet that they gained access to a token for API access and abused the API access, because we're talking about a huge volume of data that they got. So that's my theory. Everybody take your bets. Let's go.
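C's token theory, an API credential that was never scoped down and then walked with ordinary pagination, can be sketched in a few lines. The endpoint, the token, and the data shapes here are all stand-ins invented for illustration; this is not Canvas's real API:

```python
# Fake backing dataset standing in for an entire SaaS tenant's records.
DATASET = [{"id": i, "record": f"student-{i}"} for i in range(250)]

def api_list_records(token, page, per_page=100):
    # Stand-in for a paginated "list records" endpoint. A properly
    # scoped token would return only the caller's own students;
    # this over-scoped one returns everything, page by page.
    start = page * per_page
    return DATASET[start:start + per_page]

def dump_everything(token):
    # Ordinary pagination loop: keep requesting the next page
    # until the API returns an empty batch.
    out, page = [], 0
    while True:
        batch = api_list_records(token, page)
        if not batch:
            break
        out.extend(batch)
        page += 1
    return out

stolen = dump_everything("teacher-token-abc")
print(len(stolen))  # prints: 250
```

Nothing here is an exploit in the traditional sense, which is the point: with an over-scoped token, plain pagination quietly exfiltrates the whole dataset, and only volume-based detection on the provider side would notice.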
B
That's a lot more plausible, I think,
F
because in, like, the scenario you described, Corey, I mean, they're not necessarily going to go, all right, well, let's just take down the help desk. It'd probably be a different kind of remediation avenue, like, how do we fix this? But for them to say, hey, we're shutting down this specific program, like, I think that might actually be it.
C
Also, let's go with Shiny Hunters. If you had the help desk scenario, if that went through, then I think the ransom would have been fundamentally different than what we saw. I think the attackers may have pivoted, tried to gain additional internal access, tried to shut them down, maybe do both. But the fact that it's just data, like I said, I'm leaning towards the SaaS app.
B
I think that's reasonable. I mean, obviously we don't have any real theories, but it's interesting. Part of the reason I brought up the scenario that I did is because I think we talked about that exact scenario, like, last week. And essentially the explanation there of "oh, well, why did this happen" was that they were missing EDR on that system, or it was misconfigured. So hypothetically possible, but for a more mature company like Instructure, you would hope that isn't possible. And yeah, abusing an API is probably not going to get picked up by most endpoint security products. You're kind of under the radar at that point. You're flying under, you know, you're dumping data, but not payloads. Sorry.
E
Another factor, too. I mean, think about how many teachers would be using this system, and realistically, how savvy are they going to be about cybersecurity practices and why they matter?
C
They're not going to be hacking the APIs. Well, most of them you would think, right?
E
Yeah, but getting in through reused creds or something like that, through the free teacher accounts? Yeah, there's a lot of plausibility there, I think.
C
The other thing I want to ask, since we've gotten to logs, right, and we have Wade and we have Hayden and Chad. Here's the problem I have, and this is one of the things, I submitted a talk that I'm hoping gets accepted somewhere, but it's like, zero days, zero trust, and zero visibility. When you're looking at the logs and the analysis that we have, we haven't even gotten good logs and good analysis out of Azure or Entra yet, right? And we still have a lot of these SaaS apps where the logging just sucks.
B
Right? Right.
C
So when you're trying to run a SOC, how in the hell... Like, right now, most SOCs that we deal with, on the offense and on the defensive side, they're barely keeping up with the crap that they have, not even starting to look at the custom SaaS apps that they're using, or their cloud SaaS apps. So what the hell are we going to do moving forward, as far as logging and analysis for these things? Because I'm willing to bet, and I don't want to crap on them too much, but there should have been, at some point, a very solid purple team against this SaaS app: an attack stimulus, a response, detects that needed to be engineered.
A
Where do we start? Right, so the logging. Here's one thing: yeah, SaaS logging usually sucks, but I have found that if you're a customer of theirs and you're paying, and you say you're not going to pay anymore, or you build it into your contracts, they will sometimes build out logging for you. I've seen that strong-arm work enough. The crazy part, I think, is that a lot of the new AI tooling does not have good logging either, and you have to do something very, very similar. Like, I don't think I've seen one good AI log from any of the big AI vendors.
C
Wait, why does AI tooling not have good logging? Is it, dude, because it's built on human coding?
A
You're about to make me flip a table. Why can't they even have, like, an export to S3 for me? Why do I have to build a custom Python script to pull from seven of their APIs just to get an okay log?
D
It was my understanding that we were supposed to move fast and break things.
A
So one of the things with the SaaS stuff is that a lot of the APIs usually have some sort of management API alongside them, where theoretically you can build out your own custom logging by querying that and storing the data yourself. But then you're getting way too heavy, right? And then your detection engineers aren't just detection engineers, they're engineering detections, right? Like, going above and beyond. And it just makes me sad.
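Wade's "build your own logging off the vendor's management API" approach usually comes down to this shape: poll an audit endpoint, keep a cursor, and append events somewhere you control as NDJSON. The endpoint and event shape below are invented stand-ins; real vendors each do this differently, which is exactly his complaint:

```python
import io
import json

# Stand-in for a SaaS vendor's audit-log API. It returns events
# newer than the cursor you pass. The shape is an assumption,
# not any specific product's API.
EVENTS = [{"id": i, "action": "login", "user": f"u{i}"} for i in range(5)]

def fetch_events(after_id):
    return [e for e in EVENTS if e["id"] > after_id]

def collect(sink, cursor=-1):
    # Pull everything newer than our cursor, append each event to
    # the sink as one NDJSON line, and advance the cursor so the
    # next run only fetches new events.
    for event in fetch_events(cursor):
        sink.write(json.dumps(event) + "\n")
        cursor = event["id"]
    return cursor

buf = io.StringIO()
cursor = collect(buf)
print(cursor, len(buf.getvalue().splitlines()))  # prints: 4 5
```

In practice you'd run `collect` on a schedule, persist the cursor between runs, and ship the NDJSON into whatever SIEM or object store your SOC already searches, which is the "way too heavy" engineering overhead Wade is lamenting.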
C
I think that's the whole game, right? Like, yeah, I keep talking about the SaaS apocalypse that's upon us. And, you know, you can literally build your own LMS in a week if you know what you're doing.
B
Um, and it depends on how many
E
In less than that.
C
I know, I know, Bronwyn. I know you have, and think about how painful it was, right? But now you can get something that's pretty damn functional and looks good, but is absolutely, you know, held together with duct tape, baling wire, and toothpicks. So that's where we're moving to. And I just, yeah, I don't know, the whole AI thing's going great.
B
So. Okay, I have two comments before we pivot to another article because we have burned 30 minutes on this.
C
We should talk about that, too.
B
Yeah, so, okay. I want to give a couple final thoughts before we close. One is, I want to be clear: the burden, the responsibility for having the ability to log this kind of, essentially what I would call a mass download event, is on the SaaS provider. And if you're a SaaS provider listening to this, do it before you get dumped by all your customers.
E
Customers.
B
Because every other SaaS product has had to do this when they got breached. Salesforce didn't have great logging for this. Guess what? Now they do, because they got breached like 17,000 times last year. So if you want to be proactive and you're a SaaS provider, have logging for these "all your data is belong to us" events. You need logs that indicate that's happening, right? Like mass downloads, enumeration, even simple things like logging the user agents that are hitting your API. Implement that stuff now, before you become the next headline. The other thing I wanted to talk about is the impact of the data, right? Because in higher ed, or in education in general, a lot of people would say, so what? Who cares if everyone knows I got a D in physics? I suck at physics. Like, what is the impact? Obviously they're claiming a huge, huge amount of data. I forget the exact numbers, but it was something just stupid. They said billions of private messages.
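For a picture of what "logs that indicate mass downloads are happening" could mean server-side, here is a minimal sliding-window sketch. The window, threshold, and event shape are all invented for illustration; a real provider would key this off its API gateway logs.

```python
# Illustrative only: the kind of "mass download" detection a SaaS
# provider could run server-side. Thresholds and event shape are made up.
from collections import defaultdict, deque

WINDOW_SECONDS = 300   # look-back window
MAX_DOWNLOADS = 100    # downloads per user allowed inside the window

def detect_mass_downloads(events):
    """events: iterable of (epoch_seconds, user, action) tuples, in time
    order. Yields an alert dict each time a user exceeds the threshold."""
    recent = defaultdict(deque)  # user -> timestamps of recent downloads
    for ts, user, action in events:
        if action != "download":
            continue
        q = recent[user]
        q.append(ts)
        # Drop anything that has aged out of the window.
        while q and ts - q[0] > WINDOW_SECONDS:
            q.popleft()
        if len(q) > MAX_DOWNLOADS:
            yield {"user": user, "count": len(q), "at": ts}
```

Feeding it 150 downloads by one user inside a few minutes fires alerts; a user downloading once a minute never trips it. The point isn't the algorithm, it's that the provider has to emit the download events in the first place.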
D
Billions of private messages.
B
That's what they claim. Okay, so to people that work in this industry, what are those private messages? Is it a student saying, I'm drunk, I need an extension on my Shakespeare paper? Like, what is this?
C
It's going to be that, but you're also going to have conversations between students. There are going to be private conversations, highly personal conversations of minors.
B
A lot of minors' data.
C
Oh my God, I didn't even think of that. Oh my God. But like, the level of violations, you know: PII, PHI, GDPR. Like, yeah.
B
Do any of those apply? Are there any regulatory frameworks for the education industry at all?
E
Yes, there are.
C
Well here, let me give you an example. I'm a student, I'm going to school, I think I have a venereal disease, and I'm talking with my friends through direct chat about that. That's PHI.
B
Yeah.
C
So you have the platform, and all of a sudden there's a bunch of data that could be classified as PII. Definitely PII, but also PHI: health, mental health issues, people talking about self-harm and having these types of conversations. Yeah, this is bad.
B
So, yeah, I mean, does anyone else want to chime in on the impact of this kind of data? Obviously social engineering, that's what everyone always says. Like, we're talking about building a pretext.
C
I'm thinking extortion.
B
Yeah, okay. Extortion and social engineering, they tie hand in hand. I mean, well, is there anything else?
E
So, just in terms of regulations, you've got PHI, you've got PII, you've got COPPA, the Children's Online Privacy Protection Act. You've got that stuff going on. So many things. What I really, really hope, though, is that downstream we get a positive outcome: that more people in education, meaning educators, teachers, administrators, will realize, no, you really, really do have to take cybersecurity seriously, and this is why. Because we're not just protecting ourselves, we're also protecting the children who are using these systems. And, you know, there's going to be fallout from this for years.
C
I'm going to bring in Chad on this one for no reason whatsoever. But here's the thing. This gets back to the dichotomy of universities, right? And it's the same dichotomy that exists in financial institutions, in healthcare institutions, across the board. If you look at financial and healthcare institutions, their profits are off the charts. If you go and talk to the people that are working there, their budgets are shoestring, right? If you go to a lot of higher ed universities, they're like, we have an endowment that's $46 billion. And it's like, well, could we update our operating systems over here? No, there's no money for that. Right? You constantly have that dichotomy. And I want to bring in Chad once again for no reason whatsoever, just apropos of nothing, because I think I've used that twice, because it's a fun phrase. People need to use it more. But this is not something that's unique to educational institutions. And okay, Chad, you go, and I want to come back at the end, because I can tell you the only organizations that actually take this crap truly seriously and put money into it. So go ahead.
D
I mean, even if I had a $46 billion budget, if I don't have the political will at the organization to recruit talent, add to the technology stack, build detections, and get SaaS platform and third-party buy-in, we're not moving the ball forward. So it's a complex problem. There's not a silver bullet, unless you count education as the silver bullet. But this is going to take some, you know, thinking to push the ball forward on this complex problem.
C
Perhaps we need some internal social engineering.
D
That would be awesome. Well, I have a legal.
C
I have a lead-in. That's a class, I think. I want you to write it: social engineering to get what you need as a security team. Like.
G
But no, I think there's more to it than just, this is going to be used for social engineering. The data they gain from these personal conversations leads directly into pretexts for extortion. Like, we see a lot of kids targeted with extortion scams that start as romance scams, and I could go on for hours about how messed up that is, but I will not. But 275 million affiliates across 9,000 schools' worth of personal conversation data? That is something I would guess people are going to be putting into LLMs to analyze, and then coming up with ideas for pretexts and different scams. I'm not going to call it social engineering, because it'll lead directly into scamming either the student directly or their parents, using this data to make those requests, or those campaigns, seem more legitimate to the person being targeted.
B
So, okay, final say, I guess, if you're in Instructure's seat. I know there are negotiations ongoing, both between Instructure and the threat actors and between each individual school and the threat actors. I know we always say don't pay the ransom; some people say maybe you should. What are the chances that paying the ransom actually has any impact? Or is this data bound to be released anyway? Kind of like the PowerSchool thing, right? Where they were like, oh, it's fine, we paid the ransom, and then, oh, you're being extorted again. I'm assuming that's the calculus here. But I'm kind of worried, because it seems like schools would jump to try to contain this, and I don't think that's legit. I don't know. What does everyone think? Are these threat actors to be trusted? I don't think so. I feel like you pay the ransom, you just get hit again in a month, right? Maybe even sooner.
G
Two sides to it. Number one, you can't expect the school to have the capital to pay the ransom. Number two, you can't anticipate what the bad actors will do with the data following a successful ransom payment. I think double or triple extortion, and then releasing the data anyway, is probably the likely outcome. I don't think you win by paying the ransom. Okay, come on, Case here.
E
If these malicious actors had anything resembling a moral compass.
G
Right.
E
You have to have a moral compass to make good on what you promised to do. Well, they've already demonstrated they don't.
B
Yes. Well, it's beyond that. Even if we assume they were acting in good faith, the chances that they can actually properly cordon off this data, not allow unauthorized access to it, and not let anyone else in the threat group get to it. I'm pretty sure the cat's out of the bag to some degree. But I mean, who knows?
C
Like, where are they gonna host it?
B
Well, it's got to be an S3 bucket. We all know it's an S3.
F
What does Instructure gain by paying the ransom? Like, the schools are not going to stop.
C
Well, I did look it up. The ransom is a million dollars per university.
B
$8.8 billion.
F
Oh, yeah. They'll definitely pay that for sure.
B
Yeah. Well, okay. I know we're still just circling the drain on this, but Hayden brings up a really good point, which is that they're super dominant, an oligopoly or whatever you want to call it. There aren't really legitimate alternatives for this. Schools aren't going to switch. I mean, honestly, people in the Discord were saying, apparently we're just going to go back to pen and paper. That's basically where we're at with this.
C
Maybe it's better. I do disagree, though. I think you are going to see a lot of universities switch off. I think they're going to look at other alternatives. Having worked at universities for a number of years, and with universities, and this is not a good thing, I just want to make that clear: a lot of universities will look at this and two things happen. First, a CTO, or, you know, the president of a university, is going to come in and be like, screw it, we're moving off, and we're going to start a project right now to do this. And the CTO and the president, whoever, are going to start that process, sink a crap ton of money into it, and then they're going to leave and go to another university, because universities have this constant churn in the upper echelons. And then the university is going to be stuck with this really big transitional project that's super painful and super expensive. That's my theory. But I don't know, other people may have a different take.
B
Hey, but basically: don't pay. If you're one of these schools negotiating, don't do it. I know, I get it, I understand. They're manipulating you. These threat actors are literally social engineering you into paying. Do not do it. Just don't.
G
And their customer service is fantastic right up until the point where they receive the money, and then they become completely unresponsive.
B
Yeah, yeah, yeah, don't trust them. They're manipulating you just like they manipulated the poor people who were part of the breach.
A
It.
F
It is sadly too late.
C
Yeah. I did want to point out the two industries that actually take security crazy, crazy serious: attorneys and investment firms. And the reason attorneys and investment firms take security so seriously is that their entire value is their reputation, and it's also very easy for people to pick up and go to a different attorney, or pick up and move to a completely different investment firm. Those are the only two groups we see that consistently have good security.
B
All right, moving on. John, you said you want to say what up?
C
I love this story. So I just shared the link in chat so Ryan can bring it up. And I just want to go over why I love this one so much. One, Wazuh's write-up is amazing, and it's on their own GitHub repository. I don't know where Ryan is. I think we lost him.
B
I just pasted in discord. We're good.
C
Oh, there we go. So I love this, and I want you to go through this with me, because this is just beautiful. First, it's a cluster sync path traversal in decompressed files that enables arbitrary file write and code execution from an authenticated cluster peer. If you go down, they have: here's the problem in the code. This is the vulnerable code, and these are the lines of the vulnerability. They describe the vulnerability, then they walk through the different Python scripts that actually call the function that has the vulnerability. So they break that down. Then the cluster sync archive format embeds the file path and the payload, and then you get down to the proof-of-concept code. If you keep going down. Oh, stop right there. You can see the two exploitation variants: the relative traversal, and the absolute path injection to run the backdoor. So that is just fantastic. Keep going down. And then Wazuh, in their GitHub repository, has also released the proof-of-concept code for the vulnerability as well. This is amazing to me. I know it's kind of a dumb vulnerability; directory path traversal vulnerabilities have existed since the early days of IIS and Rain Forest Puppy. It's been around, it should have been tested for, I understand that, and it doesn't sound like it's directly accessible through the web. But whenever you're talking about disclosure from a vendor, to give this much data, this much explanation, and PoC code: just, well done.
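The two exploitation variants described here, relative traversal and absolute path injection, both come down to trusting an attacker-supplied path while writing decompressed files. A minimal sketch of the defensive check (illustrative only, not Wazuh's actual fix) might look like:

```python
# Sketch of validating an archive entry's destination path before writing.
# Rejects absolute paths and any relative path that escapes dest_dir.
import os

def safe_target(dest_dir, entry_name):
    """Return the resolved write path, or None if the entry escapes dest_dir."""
    if os.path.isabs(entry_name):
        return None  # absolute path injection ("/etc/cron.d/evil")
    resolved = os.path.realpath(os.path.join(dest_dir, entry_name))
    base = os.path.realpath(dest_dir)
    if not (resolved == base or resolved.startswith(base + os.sep)):
        return None  # relative traversal ("../../etc/cron.d/evil") escaped
    return resolved
```

Anything that returns None gets skipped instead of written, which closes both variants at the write path rather than trying to sanitize names upstream.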
E
This is impressive.
C
Yes.
E
I don't think I've seen anything like this before. I mean, no. It's conceivable.
B
Wow. Hey, John, is it just Elastic, or what is this?
C
Yeah, it's an open source EDR. They actually do use Elastic, I think they still use Elastic, with the full ELK stack. But they are an EDR, and how they make their money is they release an awesome tool to the public and charge for services: implementation, maintenance, things like that.
B
So it's like the TrueNAS of EDR. Yep.
C
Yeah, I'd go with that. That sounds good. They're awesome. You know, we're big supporters of them in our classes, and a lot of small MSPs and small businesses use them, because they're good enough, they're cheap enough, and gosh darn it, people like them.
B
Are you telling me that there's an antidote to all this insane SaaS, extortion monopoly stuff and it's open source? Is that what you're saying?
C
Yeah, that's what I'm saying. It's open source. The future is open source.
F
If you're on Canvas, switch to Wazuh right now.
C
The quality of our interns just spiked.
B
Dude. Hayden. I'm pretty sure one of the alternatives is Moodle, and I have freaking PTSD from Moodle.
F
I use something called Blackboard.
A
It was.
B
Oh, it was almost worse. I was a Blackboard programmer. That was my first internship in college, writing Java.
C
Oh my God, that explains so much. I did want to call out: somebody said they moved away from Elastic a few years ago, to OpenSearch, which is the Amazon fork of Elastic. So thanks for the clarification.
B
Well, yeah, it makes sense, because Elastic is doing their own VC monetization strategy that would conflict with Wazuh. All right, what's the next story? What else is going on? I mean, Canvas is the big thing. Apparently Google Chrome is now rolling out a four-gig AI model with every install.
C
Need that my little spicy.
D
Shoving it into the silent install.
C
I mean, for me, quickly go into incognito mode. I don't understand. I don't. What. Sorry.
B
Listen, four gigs nowadays, that's one tab. So it's all good. I mean, it is what it is. Honestly, as long as you can opt out of this, I don't think you should be upset, because on-device AI is actually more private and secure than doing everything through Google. Right?
F
They're arguably using all of your space anyway. Like, Chrome uses all of your RAM. We might as well just add even more.
B
I.
C
So I've been thinking a lot, Corey. A couple of weeks ago we were talking about the AI wars, and you were talking about Anthropic and OpenAI, and you're like, Google is going to win this. And, you know, just the chipping away. They have money to burn. They can literally stand to lose the money that Anthropic and OpenAI are losing. And I just see this as, like, another brick in the wall. They just keep adding AI across their overall portfolio. Anybody that's using Google is now using Gemini; it's bringing the AI results right to you. Now we're putting it inside of Google Chrome. I just think this is the slow march of Google winning the AI wars.
B
Yeah. I mean, me personally, in this moment, I would much rather have my browser doing on-device model stuff instead of sending everything to Google. Now, I know it's probably still sending everything to Google.
C
Yeah, that's.
B
That's the key, right?
C
That's the key.
B
But the concept of on-device is great. That's how Apple does all their AI stuff. Granted, it's absolutely terrible in every way. But that, I think, is the secure and private way to do AI stuff. And some of the use cases they give, I think, are kind of interesting, like on-device detection for fraud and safe browsing and stuff like that. I don't know, we'll see. But either way, if you run out of disk space... first of all, how did you only have four gigs? Also, Wade, you're muted, and I want to hear this spicy comment you just made. So unmute right now.
A
Dude, I've been muted for so long, I've been making spicy comments. I'm over here reading things. Out of the frontier models, like, I will admit I haven't seen anyone, at least in my connections, using Google.
B
Me neither.
A
At a production level. Right. And it's missing so much functionality. Because I've seen it, I've used it. But, like, it didn't even have ways to share things. It didn't have ways to work together. They only have, like, a couple of different functions. I agree with you on the slow march. But it's been, like, a year since I've had both. Yeah.
B
And, like, think about Copilot. It still doesn't have agentic coding either. Copilot is still just, what's in my SharePoint files? Here's all your SharePoint. It's still completely useless, and companies are paying $50 a month for that.
F
Well, I feel like you're talking about two different markets. ChatGPT is in the AI chat market, and Anthropic is in the market for that and development. Google is in the, we will build a pretty decent model and you will be using it in all of these tools you already use anyway, and you won't even realize you're using it.
E
I've heard some pretty amazing things about NotebookLM, which is a Google product, and I've seen some pretty impressive demos.
A
I wasn't, I wasn't impressed with it.
C
I want to throw this out there, though. I think what Google is doing is: let OpenAI, let Anthropic be bleeding edge. Let them step out, let them find the features their customers want, and learn which ones to ignore. And there ain't a goddamn thing that is like, oh, well, Anthropic beat...
A
Did John go there's? Just.
B
John, you muted yourself.
G
Okay.
E
It wasn't just me.
C
Every time I touch my microphone, it mutes in a rage. All they have to do is just sit back, watch what people are using, and then start implementing those features. I just think it's a good strategy for them in the long run. Now, it's Google. They're like the Microsoft of today. I don't know, Microsoft's the Microsoft. But it's totally possible they can screw this up, even though they have the unlimited money glitch in their favor. But I just think they're gonna sit back.
E
All the AI companies are already doing that. I mean, they're like babies in a nursery. One starts to do one thing and, oh, gee, it's over here too. So I think that's.
B
That's a great analogy.
A
Instead of talking about AI all day, let's talk about some free, newly open-sourced security programs I'd like to tell you about.
B
Please elaborate. Oh, Trellix. Yeah.
A
So Trellix is the big security company that, like, anytime I hear the name, I just think: you ate more security companies and now you're Trellix. Type of deal.
C
Wait, did you get Trellixed? You look rough, man.
A
Bro, sometimes I feel Trellix.
C
Have you thought about getting. Oh my.
A
I'm gonna make that a T-shirt: Did you get Trellixed? But it seems like their entire software base, or at least a big part of it, got completely leaked this week, which is never good for a security team, let alone a closed-source security company. Unlike open source, where you're always leaked. But RansomHouse, right? Hit them up, and fun times, right?
F
Tech in there, doesn't it?
A
Yeah, dude, that's their email. That is. Don't get me started.
B
McAfee. And they own FireEye.
F
Oh, yeah.
B
This is one of those companies that you don't think of as an industry leader, but then somehow they have, like, freaking $2 billion in revenue. It's like an IBM: no one uses them, but they still have $2 billion coming into their pocket every month or whatever.
A
That is exactly them. It's. It's just so wild.
B
So how did it happen? I mean, I'm just gonna assume social engineering.
C
Like, there we go, social engineering.
A
They have not responded to comments. The company confirmed the breach: Trellix recently identified unauthorized access to a portion of our source code repositories; upon learning of this matter, we immediately began working with a leading forensic expert to resolve it. So you're telling me they don't have the forensic experts in house to resolve it? They got to go.
B
Okay, well, that's the Mandiant part.
C
I'm not gonna dig them.
B
Okay, no, hold on. New question. How much of this company's revenue is from people unintentionally subscribing to McAfee when they buy a new computer and they, like, don't know what they're doing?
A
You know how many people tell me, I don't have McAfee, I don't have computer security even to this day, and
F
I'm like, bro, this is 53,000 customers. That's got to be at least, you know, three quarters of that.
A
It's over. It's over 9,000.
G
I think none of us would be shocked at how many of these services are being outsourced to other vendors. I get asked to test vendors all the time.
B
Yep, true. That's a good point. That's a really good point.
G
So you've got, you know, on-demand, during-incident vendors, and those are probably the folks they're bringing in at this stage. Just from my experience with tabletops, we usually have to loop in external legal, external support for incident response. And a lot of times you've got two individuals responsible for all the security in the entire organization, which is a little scary. Teams are that under-resourced even today.
B
I agree. We've had a few companies reach out to us to have their vendors tested. And it's always like, you guys trust this vendor, who you barely even know, to do all this? Are you sure that's a good idea?
G
It actually hurts my soul when I pwn the living daylights out of, like, tier-one basic tech support for a business, and it's because it's a completely outsourced team that's managing multiple clients' help desks, and they just kind of, like, boop into the right one when they get a call. Oh my God, yeah,
B
help desk.
C
When that happens, it's always a pain in the ass, because, I think you would agree, a lot of times you just gained access to a lot more than what was in scope.
G
Yeah. And then I realize how tricky this is going to be to explain, because ultimately that individual has access to so much data that isn't even my client's. And then there are also some scoping conversations that have to happen before we agree to test third-party entities that don't know they're part of the test, in many cases.
C
Well, we had one where we were going after their help desk, and they never told us it was a third-party help desk.
G
Yeah, that's usually how I find out. Once you're in, the behavior of this person seems like somebody who doesn't work there. And then I go back and ask the POC, hey, what's going on here? And they're like, oh yeah, totally different company. I'm like, when were you going to tell me?
C
Would have been nice.
B
Yeah. Now, the other thing we've seen a lot is companies that have multiple help desks. Like, their on-hours help desk is in house, and
G
then they ship everything else out. Yeah.
B
Different numbers, and, you know, the outsourced people are basically just sitting around with Okta god-admin keys. This is fine. I don't have any flashbacks.
G
I cannot imagine why social engineering is so successful. In a lot of cases, the majority of these employees are trained purely for excellent customer service and customer experience, and not really trained on properly validating that this person actually works for that company, because they themselves don't work there. They don't know. But there's this assumption that the person must work for the company if they have the internal help desk number, which I can usually just call a branch and ask for, and then I'm in.
B
Yeah, no, it's bad out there. Honestly, the incentives are just incredibly misaligned. It's similar to the MSSP space; in any of these outsourcing scenarios, the incentives almost never align with security. It's always cost reduction, you know, capabilities. Security is an afterthought.
G
I'm absolutely shocked.
B
What a thought. What else we got? Got any chicken articles? Let's see.
A
We do.
C
Apparently chickens are.
B
We technically do have a chicken article, and that chicken article is that Rose Acre Farms, America's second largest egg producer, was hit by Lynx ransomware.
A
You ready for egg?
F
You try to crack your egg.
C
You want expensive eggs? Because this is how you get expensive eggs.
A
This,
C
this is ridiculous.
B
Seriously though, if you're a chicken company, what infrastructure do you actually have, and what do you need?
C
Actually, there's ordering. There's a lot of ordering,
B
distribution and ordering all out. What about egg-based local, or large, language models?
F
And your chickens are all IoT, so you've got to make sure.
G
I wish I had.
C
You know what's weird?
G
It'd be a lot cheaper, like, owning chickens.
C
I could totally see them having video monitoring by AI, saying, hey, this group of chickens is starting to exhibit weird behaviors.
B
Do you not remember the. On this show where we.
A
Yeah.
B
Talked about the AI translation app where you can talk to chickens?
E
It translated their squawks to let you know how happy they were.
B
Correct. It's. It's like. But for chickens.
E
Oh, yeah.
C
I still like the Far Side cartoon where the guy builds a device that translates what dogs say when they bark, and the only thing they say is, Hey.
A
Hey.
B
All right, so before we close out, we should let Wade plug his upcoming workshop this week. Right, Wade? This is on the 15th. Is that Friday or something?
A
Yeah, Friday I am doing a four-hour workshop on threat actor profiling. It should be really fun. I've added in some pretty cool labs. This leads directly into the AI 101 two-day course that's going to be in June, and this workshop definitely has a lot more about threat actor profiling than that course actually does. So that'll be fun. I actually was gonna throw something in here about reading blog posts and looking for, like, OPSEC failures, and I feel like the Wazuh one...
C
It.
A
Like, I can't fault them. That's too good. Like, do you give them vulnerabilities?
C
You can use that as the shining example of how to do it right.
A
That's probably exactly what I'm going to do, to tell you the truth. But it should be a fun course. I think it's only four hours. I don't remember the times or dates. I just wake up and Ryan sends me Zoom invites.
C
I got into an argument with my
B
wife for four hours and. Hope you did a course.
C
Yeah, I haven't slept much in the past, like, three days, but I got into an argument with my wife about what day of the week it was. She's like, it's Tuesday. And I'm like, no, hon, it's Monday. And she's like, no, no, it's Tuesday, we flew home on Monday. I'm like, no, we moved it to Sunday. And when she found out it was Monday, I think she started, like, choking up, like, oh God, it's only Monday. There have been a lot of time zones, folks. It's been a lot of time zones.
B
All right. Anything else to plug? There's a threat hunting summit coming up June 17th. That's, like, next year as far as I'm concerned. I don't want to.
C
Yeah, that's. That's.
B
Infinitely Elite has a webcast in two days, How to Build a Bulletproof Pretext, and then later this month a similar workshop.
G
Yes.
E
Well, apparently I'm doing a webcast on the 21st too, and I haven't figured out what I'm going to talk about. So if there are any requests, put them in Discord and I'll take them.
B
I think you should talk about using NotebookLM. Sounds like that was the, you know.
E
Well, I've dabbled with it, but nowhere near deep enough. The problem that I'm running into is that you can't go deep with every frontier model or tool out there. There just aren't enough hours.
B
You can spend 10 days deep diving.
E
Don't have enough.
C
You know what? Here's a request for you, and I think this is something you already have bits and pieces of. I would like to see a webcast that's straight up, like, the top five or six things about prompt engineering for infosec professionals, and how to do it properly, not general. Because all the presentations are like, here's how you use a push, here's how you use a poll, here's how you do this, and it's very general. I would like it security-specific. Like, what are the six rules for infosec prompt engineering that everybody needs to know right out of the gate? That would rock.
B
All right.
E
I reserve the right to possibly take it up to as many as 10 or drop it down to 5, depending
B
on what I get into.
E
When I get into the.
C
I trust you. You do that.
B
Sweet. Sounds fun.
E
All right.
B
All right, everyone, thanks for coming. We'll see you next week. Bye. Bye.
C
Do I have to go?
Host: Black Hills Information Security
Episode Theme: Deep Dive into the Canvas/Instructure Data Breach, Social Engineering, SaaS Security, and Notable Industry News
Date: May 12, 2026
This week, the Black Hills Information Security (BHIS) crew tackles the significant breach of Instructure's Canvas LMS, a SaaS platform used by thousands of schools, allegedly executed by Shiny Hunters. The panel discusses industry trends, impacts of the breach on higher education, speculation around attack vectors, weaknesses in SaaS application logging, and the increasing importance of social engineering. Additional segments highlight open source security (Wazuh), Trellix's source code leak, AI integration news, and even a ransomware attack on an egg producer, illustrating the range of cybersecurity threats.
Mixing technical rigor, dry humor, and industry war stories, the BHIS panel offers actionable analysis for listeners who missed this episode.
You’ll walk away understanding not just how the Canvas breach happened, but why SaaS security is so fraught, how social engineering methodologies adapt with AI, the limits of incident response without proper logs, and how every security headline is an opportunity for advocacy within your own org.
[End of Summary]