John Strand
And that's the rest of the story.
Brian
Nice.
Corey
Nice. I do not believe that's how you stole the Declaration of Independence. Only time will tell if it's the truth.
Brian
I know.
Corey
I just put a flag over it.
Brian
And carried it right out and it was fine. It was fine. I didn't say which flag, but I just put a flag over it. I mean.
Derek
It's okay.
John Strand
You'll get pardoned.
Corey
Okay, no, don't go there.
Brian
We are not going there. This is not happening.
Ralph
Okay?
Corey
I cannot remain a neutral party any longer.
Brian
Right, That's.
Corey
I think we only have, like, a trigger.
Brian
I think we have Brian. I'm sure we have a set amount of political courage and that just cost us one. Like, it just draws from the hand and you go. It's like, okay, 10, negative 10 points for mentioning it. That's it. Hi, everybody. We are live, right? So we are now live.
John Strand
Yes.
Brian
You know, it's really good to see everybody in 2025. This is my first time actually on. I call it Restream, but streaming with everybody in 2025. So, yeah, I hope everybody had a fabulous holiday break and all that. I know I'm saying that late in January now, but I'm not ready for the year to begin yet.
Corey
So we're kind of waiting on John before we blow this popsicle stand wide open. Yeah, but we'll see if he shows up. As an audience member, you're probably wondering how it feels to be a host. And this is basically it. We just wait around. If John Strand shows up, then we throw him into the rants. If not. If not, we just throw each other into the rants.
Brian
Yeah, we just go.
Ralph
I.
Brian
Well, Brian was just saying, I survived Australia. I. The west island, as our friend Brian likes to call it. Yes, I did go around the world and back again. I got to see an enormous amount of family. There were 46 people at my brother's Christmas hosted celebration, and they were all family.
John Strand
That's ridiculous.
Corey
You're fake Australian now. They probably judge you for not knowing how to do whatever Australians do.
Brian
They do. I get judged a lot for not being Australian enough. Oh, look, there's John.
Corey
Well, what I want to know is, Joff, how could you run America so poorly? You know, because we know that you're the one who's doing it. It's obviously working.
Brian
Damn it.
Corey
Yeah, that's a swear jar. I'm counting that: 10 points into the swear jar. I know what he said. Everyone watching, if you know what he said, we're good.
Derek
That's right.
Corey
Oh, no, it's public.
Ralph
It's public.
John Strand
Yeah. There was no audio. If it was podcast, we'd be fine.
Corey
That's true. Can AI, can the transcript read lips yet? Let's hope not.
John Strand
Yeah. What's happening in the world of hacking today, guys? Anything new in AI?
Corey
No, nothing's new. Well, John, this is going to be fun to throw you into this rant. I. I don't want to, but I. I want you to remember like I want to see the look on your face when I say export grade ciphers. Okay, that's a preview. That's a preview of what John's going to be doing in 10 minutes when we start talking about AI chips and things. It's like that, but dumber.
John Strand
Just so you know, I'm coming in real cold. I just got done teaching, so I don't know the news stories.
Corey
That's okay.
Brian
My comfort blanket's right here. I'll give it to you.
John Strand
Joff, I'm really happy that you and Derek are on, because we're going to need some help with the AI models, and I hope you guys have done at least some research.
Derek
I have a DeepSeek R1 running locally on my MacBook. Does that count?
John Strand
Yeah.
Corey
What you want to do is you want to use Hamseq R1.
John Strand
That makes you an expert, I mean.
Brian
Oh, crap. Camera. Oh, I don't know. I owe the swear jar now.
Corey
So, wait, Derek, how are you running that with export-restricted, extremely limited hardware? Military encryption. The politicians are trying to keep that out of your hands. Just kidding.
John Strand
Okay, let's go live.
Ralph
We are ready.
Corey
We already have everything. Do you mean roll the finger?
John Strand
Hello and welcome to another edition of Black Hills Information Security. Talking about news. My name is John Strand and I don't know what we're covering today because I'm teaching a pay what you can intro to security class. And the only thing I know is like if you had Nvidia stock, it's probably a bad day for you. So could we bring up the news stories? Let's start right there. Does anybody want to jump into this? Corey, do you want to.
Corey
So first, let's hit the reason why we're all here, which is the hottest, freshest AI news. This hot air goss will be dead in like two weeks, because AI moves very quickly. But basically there's a Chinese LLM called DeepSeek R1, which sounds like a submarine that you would go on and then get killed if you're a billionaire. But let's not go there. Basically this is a new LLM, and the kind of cool thing about it, and we'll get into it with Derek and Joff from a technical perspective, but the political thing about it, is the US government was like, you guys can't have the fastest AI chips to develop your LLMs. And then this Chinese company was like, well, we don't need that. So they basically published an LLM that competes with OpenAI's o1, which is like a reasoning model. So basically this went public today, it's been all kinds of active, and it.
Ralph
It didn't go public today, though. It's been out for a couple days.
John Strand
But wait, wait, right. There's more to this model. Just from what I know, I want to bring in Ralph and Derek.
Ralph
I've used it a bunch.
John Strand
It's interesting. This model, like, who cares? New models are released all the time. You have lots of open source models, all of that. However, this one jumps up in the why-do-I-care matrix, because the amount of energy and computational power required to actually make it hum and do AI things is much, much, much less.
Ralph
Okay, so China says, so China says.
John Strand
We don't know yet.
Corey
Right, well, and so it's now running on Derek's laptop. So now he's compromised himself. He's joined the Borg. Derek, how's it hanging? How's it feel? Are you learning Chinese yet?
Derek
Not yet, no. But I imagine it could help me with that if it wanted to. So yeah, I am running it. I'm running essentially a version of it: the 7 billion parameter model. The way I understand this model, it's essentially fine tuned using their secret sauce on top of a Llama base model. So it's not the big DeepSeek, which I think was, what, 671 billion parameters in the release?
Ralph
I think it's like over 100 gigs.
Corey
Of RAM or something like that.
Ralph
So like this gets into the RAM problem by the way, for anyone.
Corey
Yes. And by the way the whole thing we're talking about with the whole like restricted hardware, they didn't restrict the amount of memory the chips can have, only the speed. Right. So it's actually kind of a pointless restriction.
John Strand
I want to build that up so people know, once again, kind of why this can be a big fricking deal. Right? So if you can do it with less power, you can do it with fewer GPUs, and you can kick all these things out. That effectively kicks the entire IT stock market, NASDAQ, in the nuts. Hard. Yeah, because the entire stock growth that we've had in the IT infrastructure is pretty much fueled by Nvidia, and ARM too. But when you're looking at what is required for these AI models to properly render, it's the same thing you need for password cracking rigs. It's the same thing that you need for video game processing: lots and lots of simple mathematical algorithms running very, very fast, in massive parallelism, with floating point arithmetic. You need lots of GPUs to do that. You need lots of power to do that.
Brian
So if you all of a sudden.
John Strand
Now have a model where you don't need a monster crazy supercomputer to make it run, it's less power, fewer GPU chips, which means people are buying fewer Nvidia GPUs, which I'm going to come back to here in a second. And it also means that AI is now much more accessible for companies the size of BHIS, to basically compete with AI models that previously very large cloud computing companies like Microsoft, like Amazon, like Google had kind of cornered, because they had the capability of putting that stuff in the cloud and running it for large corporations. This potentially changes a lot of financial dynamics, a lot of stock market futures. Like, what's the value of Nvidia? Who knows? What's the value of cloud computing platforms like Amazon, Microsoft and Google when all of a sudden maybe you don't need to be running all of this stuff in Azure? Maybe it's going to be much easier to run it here in your own environment, on your own hardware, behind your firewall, than kicking it up in the cloud. Now I want to pause, because I've been out of it for five hours now for my class. Ralph, Joff, Derek, did I sum that up properly about why this is a big effing deal?
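The "less power, fewer GPUs" point John is making can be put into rough numbers. A common rule of thumb (an assumption here, not something stated on the show) is that a dense transformer spends roughly 2 floating-point operations per parameter per generated token:

```python
# Back-of-envelope for LLM inference cost, using the common (approximate)
# rule of thumb of ~2 FLOPs per parameter per generated token.
# These are illustrative estimates, not measurements.

def flops_per_token(n_params: float) -> float:
    """Approximate FLOPs needed to generate one token with a dense model."""
    return 2.0 * n_params

# A 7-billion-parameter model, like the distilled one Derek mentions running:
small = flops_per_token(7e9)
# The full ~671B-parameter DeepSeek release, treated as dense
# (an upper bound -- the real model is mixture-of-experts, so far fewer
# parameters are active per token):
big = flops_per_token(671e9)

print(f"7B model:   ~{small:.1e} FLOPs/token")
print(f"671B dense: ~{big:.1e} FLOPs/token")
```

Two orders of magnitude between the small and large figures is the gap between "needs a data center" and "runs on a laptop GPU."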
Brian
Yeah, you did. I'd like to jump in there and say a couple of things. One, Nvidia took a 17% hit on the stock market today. But you've got to put that in a little bit of perspective on the financials. Nvidia is trading at something like 56 times forward earnings.
Ralph
Oh yeah, they're insanely overpriced.
Brian
So they're insanely inflated right now. So that may not be as big a deal as the technical stuff that John was talking about. And let me speak to that briefly. If you look at Llama 3, for example, and you look at trying to do what's called supervised fine-tuning.
John Strand
Can you describe to people what Llama is? Because there's a whole bunch of people that are confused.
Ralph
Oh yeah, really kicks the llama's ass.
John Strand
Yeah. Swear jar.
Corey
Swear jar, yeah.
Brian
10, 10 demerit points in the swear jar. Again, is that going to cost me real money or is it just going to cost me cheap real money?
Corey
We'll bill you.
John Strand
I got you covered. Let's try control anyway.
Brian
Let's back out a minute and just say a couple of things. There's an open source community that's been running for some time called Hugging Face, and there are a lot of openly published large language models on Hugging Face. And actually this has been a concern of mine for some time, because I don't think the AI community has been as conscious about reining things in as they should be, because this is a significantly transformative technology. Right. But putting that aside for a minute, there is an enormous community of researchers that have tapped into these models that are published openly on Hugging Face and other places, to take those models in and adjust their weights for their own purposes, and that's the process of fine tuning. Meta publishes one, it's called Llama; Llama 3 is their latest generation. Google publishes one, and there's a number of others, Mistral, and I can't name them all off the top of my head, but there's a number of models out there.
Ralph
Right.
Brian
To build these large language models that are openly published takes an enormous amount of compute power, as we currently understand it, with the algorithms that most of the US and western-related scientists have invented. And when I say an enormous amount of compute power: these guys are building clusters of compute so large that they're in the process of relocating them closer to power generation facilities, because they're actually maxing out the ability of the power grid. So there's a couple of things at play here. One, the US for a while has tried to deny China the chips in order to build models. That seems to be something that they've walked around with their own manufacturing process. But the other thing, and this has been interesting to me, is that the United States has been quickly approaching the crossover point where it cannot generate enough electricity to supply the model-building process that's actually going on. China can produce massively more electricity. So they had us beat on infrastructure already.
John Strand
They're actually scrambling as fast as they can to get as many Gen 4 nuclear power plants up and running as quickly as they possibly can.
Brian
So they're getting a competitive advantage in energy infrastructure. And then they turn around and do this, and say, oh yeah, by the way, we're publishing a model that is way more energy efficient in terms of being able to train, and, one has to assume, being able to use for inference, which is the process of actually querying the model on a day-to-day basis with prompts and all the things that we do with large language models. But the corollary to that is you can fine tune that model with a lot less resource as well. And that's a really big deal, because taking one of these very large models and fine tuning it for a task-specific goal, or what's called a domain knowledge focus, takes a GPU resource of some sort, and to put that in an individual's hands is kind of like.
John Strand
Refactoring the whole model or a good percentage of it.
Brian
Yeah, it's refactoring a portion of the model. That is what it is. It's adjusting the weights of a subset of the model and leaving the other weights frozen. And it's a really interesting thing to do, because you're essentially taking a college student, let's call the model a college student, and you're taking them to school and training them in a specific domain, so they can tell you about that domain. Right. That's the way I like to think of it, because these things are digital brains.
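The fine-tuning Brian describes, updating a subset of weights while leaving the rest frozen, can be sketched in a toy form. Everything here is made up for illustration (the weight names, values, and gradients are not from any real model):

```python
# Toy sketch of parameter-subset fine-tuning: one gradient-descent step
# that only touches the weights named in `trainable`, leaving the rest
# frozen. The "model" is just a dict of scalar weights.

def fine_tune_step(weights, grads, trainable, lr=0.1):
    """Apply w -= lr * grad, but only for weights listed in `trainable`."""
    return {
        name: (w - lr * grads[name]) if name in trainable else w
        for name, w in weights.items()
    }

weights = {"base.layer1": 1.0, "base.layer2": 2.0, "head.domain": 0.5}
grads   = {"base.layer1": 0.3, "base.layer2": 0.7, "head.domain": 0.8}

# Freeze the base model; only the domain-specific head gets updated.
updated = fine_tune_step(weights, grads, trainable={"head.domain"})

print(updated["base.layer1"])  # 1.0 -- frozen, unchanged
print(updated["head.domain"])  # ~0.42 -- moved by 0.1 * 0.8
```

Real frameworks do the same thing by marking parameters as non-trainable (e.g., turning off gradient tracking for the frozen layers), which is why fine-tuning a subset is so much cheaper than training from scratch.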
Ralph
The other thing too about this recent model is that we're getting into the next phase of AI. I've been listening to a lot of stuff about all the different models and all the things. So at first we were just throwing power at it. We were like, more GPUs, and we were getting better results, we were getting these bigger models, and that was GPT-4 and GPT-4o. And then when we got to o1, we moved to a reasoning model, which is what this new DeepSeek model is doing as well. And what this is doing is it's actually pumping out a bunch of answers from the model, and then looking at all of those answers and picking out the best ones based off of other kinds of inputs. Right. And this reasoning model is how we're moving to the next step of these AI models. And the wildest part about all of this is that this is happening so fast. We're not talking about over the last, you know, five years; we're talking, like, in the last six months it went from no reasoning to reasoning models. Like, we're pushing it so quickly to the next thing.
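The "generate a bunch of answers, then keep the best" idea Ralph describes is often called best-of-n sampling, and it can be sketched in a few lines. Everything here is a hypothetical stand-in: sample_answer() plays the role of stochastic model sampling (made deterministic so the sketch is reproducible), and score() plays the role of a reward model or answer checker:

```python
# Best-of-n sketch: draw several candidate answers, score each one,
# return the highest-scoring candidate. All names and values are
# illustrative placeholders, not a real model API.

CANDIDATE_POOL = ["weak answer", "okay answer", "strong answer"]

def sample_answer(i):
    # A real system samples stochastically from the model; we cycle
    # through a fixed pool so the example is deterministic.
    return CANDIDATE_POOL[i % len(CANDIDATE_POOL)]

def score(answer):
    # Stand-in for a reward model / verifier.
    return {"weak answer": 0.2, "okay answer": 0.5, "strong answer": 0.9}[answer]

def best_of_n(n):
    candidates = [sample_answer(i) for i in range(n)]
    return max(candidates, key=score)

print(best_of_n(1))  # weak answer   -- one sample: cheap, often bad
print(best_of_n(8))  # strong answer -- more inference compute, better pick
```

This is also why, as Ralph notes later, reasoning models shift cost toward inference time: each query pays for n samples plus scoring, not one.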
John Strand
Yeah, can I jump in with a couple more things about DeepSeek and kind of why this might be important? The first thing I also want to talk about is the chips, right? There are people in the political spectrum that are saying it must be impossible for them to have done DeepSeek without having access to export-controlled hardware and software algorithms. There are people that are like, well, this is just espionage on a grand scale. What is your take on that? Because that seems like absolute BS to me. Let's say that they did somehow get Nvidia chips. It doesn't mean that this.
Corey
Well, they can buy Nvidia chips, they're just limited in performance in some ways.
Ralph
They can't buy Nvidia chips. They're literally, like, sold out. Like, Elon was dying for these chips. They were, like, doing everything.
Corey
Like. The point is, from a legal perspective.
John Strand
The point is, there's people on the political spectrum that are saying China broke the rules by gaining access to stuff that we were trying to stop them from getting. Dumb.
Corey
It's just like the export-grade ciphers.
John Strand
Thing that feels dumb.
Brian
What I alluded to earlier: the AI data scientist community has been openly, publicly publishing what they've been doing for some time. And so the chip and hardware side of it is a different question than the software and algorithm side of it. The software and algorithm side of it, and the models themselves, have been available. They're there. Right.
Corey
I mean, also, we don't even know what it was trained on, and it doesn't really matter. They could just buy AWS credits in the US. It's totally a political thing to say, oh, we export-restricted the chips, so they won't be able to compete with OpenAI. That's like us saying, well, we're not shipping any wheat to China, so they're all out of bread. That doesn't make any sense. That's not how it works.
Ralph
The wildest part is they spent all this money and they gave it an MIT open source license.
Derek
Right. I mean, I can run the thing today. Right? So, yeah, I mean, I guess I.
Ralph
Would say, yeah, you can't do that with OpenAI's model.
Corey
Well, yeah, no, yeah. If it was, if it was supposed to be for espionage or if it was like threat actor China. Why would they open source it? They would just keep it internal to China.
John Strand
And here's my theory on that. Hey, let's get political. If you're looking at China, their main goal is disrupting United States hegemony in the global order, right?
Corey
Yep.
John Strand
And they've said that. I mean, that's in their news. That's not anything that's classified. They've had a bit of a bug up their ass since, you know, the Boxer Rebellion, justifiably so, for a lot of things that happened in China. When you're looking at that kind of history, you're looking at what's going on, what better way to destroy the kind of pole position the United States has in technology than throwing an open source kick-ass AI model over the fence? That completely cuts against ChatGPT and OpenAI, that completely cuts against Nvidia, that completely cuts against Microsoft, that completely cuts against a number of these US-based IT companies. And you basically threw a free and open source hand grenade into the works on this. Yeah, damn, that is, that's fair.
Corey
But here's the thing: I predicted this six months ago.
Brian
Actually more than that, I wrote every senator I know, I said you guys better wake up because what's coming.
Corey
Well, but here's the counterargument to this. Here's the counterargument. Okay, so here's the problem, and they've already realized what the problem is. So they, DeepSeek, they already have a chat platform similar to ChatGPT, and guess what? Signups are limited, because everyone's signing up and it's crapping all over their hardware. Like, just because it's efficient doesn't mean they can answer every question about, you know, what if you put glue on pizza, and what if you generate this. Like, there's still hardware limitations to this. Just because it's more efficient doesn't mean. Basically the counterargument is they still don't have the infrastructure to really support this at scale, and that's still a requirement. And it's like, yeah, really different.
Ralph
That's the interesting thing about these new models as their way of progressing AI, right? These reasoning models actually require more compute now, because they have to go through all of these options and pick the best, as opposed to the pretrained model that just spit out what it knew. And the research shows the more reasoning they apply, the better answers they're getting. And you can see how this is starting to move from the training requirement of CPU to the reasoning side of CPU. So, like, both are going to be happening, right?
Corey
The GPU. Totally.
Ralph
So that's why, you know, it's putting such a strain on this. But you know what, wait till next month when ChatGPT drops O3 and it's like, you know, it's that fast guys, I swear to God it is that fast. I wish it was like one year, but it's going to be like two months.
Corey
So one question I have for Derek and Joff: is this the first reasoning model that I can self-host with reasonable hardware, or has that existed before?
Derek
That's the first one I know of.
Brian
First reasoning.
Ralph
Yeah.
Corey
Okay, so that's notable. It affects like people who self host or people who, you know, like nerds. It affects nerds.
Brian
My final point that I didn't get to make is: getting the technology into the hands of individuals, where individuals can actually do effective training and manipulation of these models, has been that last mile, that last little step. And that's why this is very significant, because, you know, I don't have teraflops of compute cluster in my office. I wish I did, but I don't.
Derek
It's just gigaflops, not teraflops. Gigaflops.
Corey
What kind of flip flops are those?
Derek
Flip flops?
Ralph
Flip flops, yeah, they're like agrees.
Brian
So that's the consumerization, kind of. Or the democratization, if you like.
Corey
Totally.
John Strand
So that's the question I want to ask everybody now. Right. And I really want to go there, because Ralph was saying some things that I think are really important. You know, everyone's saying this is a Sputnik moment. I don't know how much of that is true, but I'm going to throw this out there: isn't this shit inevitable? And what I mean by that is, as you're having these AI models improve, now you're using AI models to improve the AI models. This idea of open source, this idea of these models not being in the hands of super, super, super large tech giants, and being more out in the open, self-hosted, and those types of things. Like Ralph was saying, within the past few months things got real and they changed real fast.
Ralph
Yeah, really fast.
John Strand
And this level of disruption, this isn't the end of it. I was reading some articles that were talking about how these types of improvements were theorized by university students and all these things. And for people that are in the middle of this, this isn't a huge surprise. It's a surprise that it's China, and they just dropped DeepSeek, like, oh, open source, here you go, have fun. That is a bit weird. But from what I'm hearing, what happened in the past couple of days was something that a lot of AI scientists had been predicting for a while now: that it's just a matter of time before these models get distributed more, and they get more efficient and they get better, because that's just the general downward pressure that is in all technology stacks. Get cheaper, get more efficient, get more power efficient. Is that so?
Derek
Mark Zuckerberg actually said that's why Meta released Llama: to put it more in the hands of not just the, you know, tech elites. Right. And I think if anyone's read the Situational Awareness AI papers, this is a good example of unhobbling. Right? Like, somebody creative came up with a way to do something, and it just happened to be in a different country. It doesn't seem to me that they stole it or something like that. It seems like they just came up with something creative.
John Strand
So, Derek, I have a question here from Kip I want you guys to answer: the fact that the AI boom was so highly funded by venture capital is the root of the problems we have had with it. So what if we stole the training data? What if the number is going up? And kind of going off of Kip's question: all of a sudden, all these people have been sinking all of this money into these AI models, and VC funding, like ridiculous amounts of money, has been dumped into the space. And now it's like, hey everybody, here's a bunch of open source crap that's better than anything else you have. Have fun. What happens to that funding money?
Derek
I mean, I think that. I think we're still going to spend $500 billion on. Was it Stargate? I don't know why.
Corey
That's the next. That's the next article. Yeah.
Ralph
What an apropos way to spend more money.
Derek
So it's almost like I prepared a.
John Strand
Little bit or something.
Corey
Yeah, this is.
Brian
This is way different. And I'll tell you why it's way different. We've had technology leaps in the past that are potentially scary. But this is a broadly generally applicable massive advance. This is a transformative influence.
Derek
But, but you know, we also, I.
Corey
don't think, well, I mean, our companies.
Derek
And our researchers might come up with something that's almost like, you know, what Ralph was saying, like O3, the next reasoning model, from OpenAI or Anthropic or something.
Ralph
O3 is actually in the works. They've pretty much announced it and shown the data on the O3 model, and that comes from them pushing the reasoning models even further. Right. I think the bigger thing is to step back from this, from a transformative standpoint. So it's really interesting. It's almost like a grenade; I think that's the best way to describe what China did with the DeepSeek R1 model, right? They threw this thing onto the pitch, and everyone else is also working so hard to bring all these models in, and it is moving so fast. That's the stuff that blows me away, more than that China threw in this grenade, right? I'm in shock already, and then there's a grenade on the ground, because China is jumping in and everyone thought they were way behind.
John Strand
It's moving so fast. And kicking with that, that goes back to the investment, and this is going to get to Stargate, which I think Corey wants to get to here in a second. All this, like, how and why in the hell would you invest in this right now? Like, if you're looking at this space and you have China throwing these hand grenades as open source, just, like, out there, right? And it's growing and it's changing so fast. A lot of the VC private equity, if they're looking at time frames and they're investing in something, they're usually looking at, like, a six-month to a year to two-year investment horizon to develop the technology and get it rolling. All of a sudden the game has been changed dramatically. You could sink billions into something. Like, there are companies that have billions of dollars in, not revenue, billions of dollars in investment, two different things, boys and girls, whose valuation is now completely effed because of this.
Corey
Well, yeah, okay, so here's my take on all this, and to kind of inform where I'm coming from on this: I have a few friends who work at Intel making, you know, making rocks think, making semiconductors. And interestingly enough, in that industry they don't really file for patents; really, their trade secrets are just.
John Strand
How they do public and you're screwed. You wouldn't.
Corey
Well, it's not even that. It's more just: if you show me a picture of a transistor, I'm like, okay, I don't know how to make that. They've evolved so far beyond the concept of what you can put in a patent that their procedures are just trade secrets. But they're not patenting them; they're just moving as fast as they can. And similar to AI, things move so fast that by the time you were to patent something, it's already obsolete.
John Strand
You're done.
Corey
The second that the paper's published, that the patent's been filed, it's already obsolete. AI's in the same boat. They're just publishing this stuff as ads, basically, because it's already obsolete by the time it's published, because someone's training o3, o4, o5. Like, it's already behind.
Derek
Well, so to answer John's question of why would someone invest in it: well, I mean, now we can take, like, I just went and looked, there's an academic paper on how they did what they did. Right. And so our researchers take that and then scale it up, and now we have a better one to use.
Corey
Yeah. And that's the other thing. It's not about what you can do on a scientific level. It's about how you can make it into revenue. Right. Like, ChatGPT is valuable, or Copilot is valuable, not because of how sick their parameters are and how good their weights are or whatever. It's useful because of the, you know, secret sauce, the trade secrets, whatever they put on top of it that make it so it doesn't tell you to put glue in your pizza sauce, right? That element of it, and also hosting the infrastructure. Right. That's a big part of it. And why do we think that this affected the price of bitcoin? I mean, because the article that I read said. Yeah, I mean, I get the nosedive with Nvidia. I get that Google stock was down. But you would think that the price of bitcoin would go higher, right? Because it's going to be. You can make more of it quicker, right?
John Strand
I don't think that you can necessarily make more bitcoin quicker with AI models, just because of the way the NP-hard problem is established. It's basically brute forcing. So I don't think that AI is going to make those types of problems happen faster. Right. Because it's still a brute-forcing algorithm that has to be done. For that you would generally look at something more like quantum computing, but that's a whole other conversation, which is going to happen. Like, that's right around the effing corner there.
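John's point, that mining is brute force with no shortcut for a model to learn, can be seen in a toy proof-of-work loop. This is a simplified illustration, not Bitcoin's actual block format; the only resemblance is the "try nonces until the hash clears a target" structure:

```python
# Toy proof-of-work: keep trying nonces until the SHA-256 digest starts
# with `difficulty` zero hex digits. Each attempt is independent -- there
# is no gradient or pattern to exploit, which is why better models do not
# make this faster; only more hash throughput does.

import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Return the first nonce whose hash meets the difficulty target."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Small difficulty so the demo finishes in a fraction of a second;
# each extra zero digit multiplies the expected work by 16.
nonce = mine("block-1", difficulty=3)
digest = hashlib.sha256(f"block-1{nonce}".encode()).hexdigest()
print(nonce, digest[:8])
```

The expected number of attempts is 16^difficulty, which is why real mining is measured in raw hashes per second rather than anything an AI model could shortcut.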
Ralph
I've got another article on that. We will bring it up later.
John Strand
Yeah, we'll bring it up later. Here we go. But AI does help build those types of quantum computers and can help with all of that. So why did it cause a hit against crypto? As near as I can tell, and I'm just guessing, a lot of the same people that are investing in a lot of these cutting-edge technologies look at them without understanding what it is. They're just investing, and they're pulling back and trying to go to a safer position. Anytime you get into any type of, let's say, instability in markets, you're going to move to gold, you're going to move to commodities, you're going to move to real estate. And I think what you're seeing is a large number of people pulling their stuff out of crypto, because they tie it together: crypto and AI are both new tech, so they must be the same.
Ralph
They must be the same thing. They were totally built.
Corey
Their AI investor bot thought it was connected.
John Strand
And so that's my theory on that. That's my guess.
Corey
I don't think they're connected at all.
John Strand
I have family members that have been investing in Nvidia. A lot of the people that are investing in Nvidia and all these different AI models, they're also heavily in crypto. And today they're spooked, and they're trying to pull back on both of them, because I don't think that they know the difference between them. That's my guess.
Brian
I would just offer this guidance: I don't call it investing, I call it speculating.
Corey
Oh, yeah. No, yeah. I mean, so let's talk about the Stargate thing. Let's talk about the Stargate thing. So this is kind of, I mean, we're getting close to politics. We also have a politics jar that costs $15 to enter, so just keep that in mind. So let's talk about the Stargate thing. Basically this is a political thing. It's a little bit of a beef between Elon Musk and Sam Altman, which is just classic. But basically people are putting up money. So I think it's SoftBank, but there's other capital partners, Oracle and others. Yeah, $500 billion is, like, the viral number. And of course, they're already just tweeting at each other, being like, you don't have the money. Yeah, I do. No, you don't. Yeah, I do.
Ralph
I have 10 billion going in a room. All that money.
Corey
Yeah. So, like, I don't know exactly what it means, but essentially the government is working a partnership, or supporting a partnership, to collect capital to invest in data centers, is basically the best I can understand. Is that right, Derek and Joff? Is that basically what they're going to do? Like, give us those Nvidia A100s. A100s are so expensive.
Ralph
The data center's built, by the way. There already is a data center.
Corey
But they don't have any A100s. They just have open shelves. And to buy the A100s, they need $500 billion. It's like if you go to your boss: hey, John, I need some GPUs. And he's like, how much is it going to cost? And you're like, $500 billion. And he's like, oh, that's fine.
John Strand
Joff, Derek, no ideas, guys.
Brian
Like, I think we tried that, and my answer was: John, you don't have enough money.
John Strand
You know things are getting real at the Strand household when you try to get approval for something and I say, approved, and then 15 minutes later, Erica comes back and says, hold on, we need to talk about this.
Derek
Dinner.
John Strand
Conversations happening.
Corey
So just to put. You just hear Erica yelling from another room, John. No.
Ralph
So, just about that $500 billion, quick context: that's more money than we spent on the entire interstate highway system, adjusted for inflation, by a lot.
Corey
Yeah, we should probably rethink that, if we're being honest.
John Strand
Well, I.
Derek
Last year there was speculation of, you know, the race to the first trillion-dollar cluster. And I think the thought was it'd take a couple of years, and so we're halfway there, right?
John Strand
Yeah.
Ralph
So the real thing that kind of interested me about this is that we keep expanding. We're like, why not just throw more money in the pit? Then we could just build something faster and better. And the wild part is, every time they do build the next bigger model, it is faster and better in very large, meaningful ways. It's like they haven't found the end, where if they put more money in, it doesn't result in better models. Does that make sense?
Corey
Yeah, yeah.
Ralph
It's wild, and that's why it's happening. I guess that's what I see as why they're at 500 billion. I mean, come on, it keeps working.
Corey
Come on, we got this data center ready to go.
John Strand
Here's what I think will happen, though. Like, if I was investing in this space, I would get out of data centers, I would get out of trying to build models, and I would get into services around it. Yeah, like, if we're looking at what happened today, and this is me being selfish: as soon as it hit, I'm like, if people can run really efficient self-hosted models and they can do it on commodity hardware, all of a sudden we're looking at way more AI implementations out there. And that's really good for a pen testing firm that does services. Right? Because it's no longer like, well, you're going to run this in OpenAI, you're going to run this in Microsoft, you're going to run this in Google, you're going to run this in Amazon, with their infrastructure, where it's cookie cutter. It's now the Wild West, and people are going to implement it in a bunch of wild, crazy and beautiful ways. And that's great for the services industry.
Derek
Yeah, they were already doing that wild and crazy stuff, by the way.
Corey
What's that?
Derek
We're still going to see it. Companies are already doing the wild and crazy stuff.
Corey
It's going to get wilder and crazier.
Derek
Wilder and crazier.
Ralph
I just want to put it into context, too. So the full R1 model takes 16 A100s to run locally.
Brian
Right.
Ralph
And so because that's 100 or it's like one point.
John Strand
Can you break down what an A100 is please? Because we have a lot of people.
Corey
A100 is God's GPU.
Ralph
Yeah, yeah. If the CEO of Nvidia came down from the mountains like Jesus and brought something in his hands, this is what it would be, okay?
Brian
Well, the H100 is a little better, but the A100 is a cost effective inference GPU that costs about $8,000.
Corey
Yeah, it's basically a very expensive... yeah, the H100 is the better but even more expensive version. Someone linked it in Discord. Right now on Amazon it costs $22,000.
John Strand
One of them is $8,000.
Corey
Yeah, one of them is. Yeah, one of them is 8,000.
Ralph
That's honestly, John, that's like the starter model right now. So, like, it's gotten to that point.
Corey
So you need 16 of those, Ralph.
John Strand
That's attainable for almost any corporation, guys.
Ralph
Yeah, yeah, yeah. I guess I was just saying that there are also smaller models that you can run on more commodity stuff. But to Derek's point and to Josh's point, you can train these models for your really specific use case. And a lot of times you don't need the full 671 billion parameters to have the same kind of success, right?
Corey
No, I do, I do. I don't think you understand. I submitted another expense request for another H100.
John Strand
But if we're looking at this, like if you guys came up with a business need, right, and you said we need 16 of these things... you know, unfortunately, I trust you guys too much. I'm saying that's definitely attainable for Black Hills Information Security as, you know, a small to medium-sized tech business. And that's why I feel like this is, just like Ralph said, a grenade going off, right? So now everybody is like, why are we going to be running this on Amazon? We can literally just spin this up locally and save tons of money.
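An editorial aside: a rough back-of-envelope sketch of why the 16-A100 figure comes up for a 671-billion-parameter model. The precision and VRAM numbers here are illustrative assumptions, not a deployment guide:

```python
# Why a 671B-parameter model needs a rack of A100s: the weights alone,
# assuming 8-bit quantization (1 byte per parameter), take ~671 GB, and an
# 80 GB A100 holds only 80 GB, before KV cache and activation overhead.
params = 671_000_000_000      # full parameter count discussed above
bytes_per_param = 1           # assumed 8-bit quantization
a100_vram_gb = 80             # assuming the 80 GB A100 variant

weights_gb = params * bytes_per_param / 1e9
min_gpus = -(-int(weights_gb) // a100_vram_gb)   # ceiling division

print(f"{weights_gb:.0f} GB of weights -> at least {min_gpus} GPUs")
# Weights alone need at least 9 cards; 16 leaves headroom for KV cache,
# activations, and parallelism overhead.
```

Lower-precision quantization or smaller distilled models shrink this quickly, which is the "commodity hardware" point made above.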
Ralph
There are also some other websites out there right now that aggregate all of them. I've been using them recently. OpenRouter is one of them. And what you can do is actually ask the same question to all the different models at once. And really the point here is that you can get the direct API price. So there's a really competitive market now for making requests to these more advanced AI models at a really low per-token rate. Right? So you can get this to work for you really cheap. You don't even need to build these systems. You can build the cost of asking all of these questions into your price.
Corey
So yeah, yeah. I mean, basically, it's classic. It's an industry that's evolving so quickly. The question is basically, is someone going to trip? Everyone's sprinting. OpenAI's sprinting, Grok or whatever Elon's thing is. Everyone's just sprinting. No one knows where they're sprinting; they're just running. Elon's sprinting, looking left and being like, Sam Altman, are you sprinting in the same direction as me? And if one of them trips, it's over, and then another one takes over as the winner. But right now it's like, who can dump the most money into H100s and hope that leads to an outcome on the stock market.
Ralph
But the outcome is like actual AGI, which, by the way, we've improved so much that they're already talking about improving or changing the tests they've built to determine whether it's AGI. They're like, we've got to move the...
Corey
Freaking field goal, because you guys are getting too close. Yeah, I mean, yes, I got you, but I don't know what the goal is. Derek, Joff, what is the goal? Is it AGI? Is it just, like...
John Strand
Well, you've got to put it in.
Brian
The broader societal context, right? Everybody's forgetting that this is massively transformative, because biotech and quantum computing are coming along with it, and artificial intelligence is researching itself to improve itself. And so we're reaching this point of exponential inflection where we won't be able to stop. This is actually going to transform everything. I mean, we as humans are not even capable of understanding how much this is going to transform the world. It's out of the bag, right?
John Strand
And that's why I come back to this, and what's happened. Like, man, I would hate to be investing money in this space right now, because it's entirely possible that you could sink billions of dollars into something that's completely wiped out and irrelevant in the middle of you investing those billions of dollars.
Corey
That's the whole point of the stock market, my friend. That's the whole point. That's why it exists. The stock market exists for one purpose.
John Strand
To raise capital, to get people to invest in you. And boy, that sounds a lot like VC funding, and Series A and Series B. It's...
Ralph
Not about how can you give me 100x return, John? Just show me. Just show me.
Corey
What you're going to want to do is approve this Amazon purchase of another H100. Now, I mean, the way you're going.
Brian
To get your 100x return is to look at the actual services that are researched as a result of the technology. Technology, not the technology itself. So a good example is biomedical device miniaturization, right? If AI researches that and produces a device that's extremely popular, why, you know, and will sell wildly over the next two years, then you can get your, get your speculation bucks out of that.
John Strand
Did you see that article about AI creating RF chips?
Corey
Yeah.
John Strand
Where, they say, it's getting more and more efficient for power. It's getting better and better as far as range and all these things. And when humans are looking at the chips and what they're doing, they're like, we have no idea why this works. Why does this technology inside of this RF chip work? Like, one of them had this weird spirally thing, and they're like, we don't even know if that's used.
Brian
That's actually the real point. We are getting to the point where we don't know what's being created and how it works. Is that comforting?
John Strand
Yeah, yeah, yeah.
Corey
But Josh, that's already been true, dude. You don't know how half the stuff you use works. The shoulders we're standing on are so high, no one's been able to see the ground for at least 25 years. I mean, if I'm being honest, those...
Ralph
Shoulders are going to be one-upped by shoulders even higher than the ones you thought were the highest. Like, even the people who are the experts are going to be confused.
Corey
I think this is part of it, but it's just another bubble. This is the same thing the Internet was doing in the 1990s. It's the same thing.
John Strand
It is. But I disagree in a way. I think it is a bubble, just not from that perspective. Whenever you're thinking bubble, a lot of people are like, it's fake. Like, the housing market was a bubble because that value was fake, and they're looking at AI as a bubble because, well, it's fake. I agree that this technology has a lot of room to grow, but if I look at it from an investment speculation perspective, it is a bubble.
Brian
Like, oh, financially it's a bubble.
John Strand
Yeah, financially it's a bubble. But if you're looking at what this technology is going to do, we don't know where it's going to go.
Brian
Yeah, technologically it is not. This is a transformative change.
John Strand
But sinking billions of dollars and being like, this is where it's going to go? It's like saying, I'm going to sink billions of dollars into Betamax or LaserDisc.
Corey
Yeah, you put, you put, you put.
John Strand
Radio on the Internet. It's the VHS, Betamax, LaserDisc argument that happened back in the '80s, and now you have people putting $500 billion on Betamax. Like, that's where the bubble is going. You're gonna have...
Brian
There's the T shirt and the bumper sticker.
Ralph
Right.
Corey
So. And.
Ralph
And the wild part is, you don't have to wait four years to find out whether your investment was good or bad. You get to find out in two months. Two months, right?
Corey
That's the crazy.
Ralph
That's the wildest part to me personally. It's just how fast.
Corey
All right, let's talk about some normal hacking stories. There are some fun ones we can just quick-fire through. DDoS...
John Strand
Story that I think is kind of interesting.
Corey
Apparently there are multiple DDoS stories.
John Strand
This company DeepSeek is currently facing a large-scale cyber attack.
Corey
Yes, yes. So this is what I was talking about, right? I mentioned this. This company DeepSeek that published this model we've been talking about, they also have, like, a ChatGPT-style instance of it, and it got smashed. Now, whether this is intentional, or it's just a bunch of kids that can't get on TikTok getting excited about it, we don't know. But it's being attacked, and they have new user registrations disabled. It actually is pretty funny. Look at the criteria. Zoom in a little bit on the picture. It says: only login via email, Google, or +86 phone number, which is mainland China, supported in your region. You can see the writing on the wall there. Obviously they've already open-sourced the stuff, but you have to have big hardware to really use it. But yeah, I mean, there's another DDoS...
John Strand
So switching out. Like I, I agree with Corey. We got to move on. We've got other stories.
Corey
John, where do you want to move?
John Strand
I just posted.
Corey
This is the right move.
John Strand
Cyber Safety Review Board is cleaned out in Trump's move to eliminate misuse of resources. This impacts CISA. This impacts the Salt Typhoon research by the Cyber Safety Review Board. Oh God, some of this stuff is going to be so hard to tiptoe around. But I agree with the idea, like, hey, the government should be more efficient. I do agree with that. But this review board, and what they're funding with CISA, looking at cyber attacks, understanding cyber attacks, getting in front of cyber attacks, and sharing information, we kind of need that in the industry. And there's a bunch of research into cyber attacks that have happened over the past year where the investigations are just done: lock up the door and move on. And part of me is like, you know, a lot of the businesses that were kind of behind this move, that were pushing politically on the back end, they're like, we don't need the government to tell us what we did wrong; we will self-regulate. And honestly, this is tied to the AI stuff and everything. It's like, when have these companies doing self-regulation ever worked out well? And I'm hoping that it's just a temporary thing and the funding and stuff comes back, because you're looking at some of the investigations, like Salt Typhoon. What's the healthcare company that was compromised? It's 190 million records.
Corey
Well, see, John, it's fine because they said, they said they think they removed the hackers. So it's fine, it's fine. They said, we think the hackers are gone. We're, it's, it's fine.
John Strand
I don't want to sound cynical, but with all the stories we're talking about in our industry, I kind of see all of this as an absolute win if I'm being completely cynical, right? If it's just about money for computer security. If you're removing the agency that's supposed to teach us how to get better and protect ourselves, there's just going to be more cyber attacks. And if we're looking at the AI stuff, now all of a sudden AI is scattered to the winds and anybody can do it at this point. Anybody with $150,000, $160,000 can pull it off. It's just like, that's good for our industry. There's a great quote, and I think a lot of the stuff that we're seeing right now is chaos as a ladder. And whether we're looking at offensive security or just general security, there's a lot of chaos out there right now.
Corey
Well, so John, here's my take on this. The government is the thing that ties people's value in their own data and companies together. Because companies will just get breached all day, every day and not give a crap, and it'll have no effect on their bottom line. There's no money coming back to the cybersecurity industry because they're just paying ransoms and moving on, right? It's great for the criminal industry. But the problem is, the government is the thing that says you're not allowed to get breached. If the government doesn't say you're not allowed to get breached, they'll just be like, well, getting breached is just part of our bottom line. It's called breakage, and it's there in the P&Ls right there: ransomware, $50 million a year, we just pay for that. All the people's data that gets leaked? We're not too worried about it. I know that's the cynical side of things, but that's how I look at it. The government are the ones that make it illegal or wrong to get breached. If they decide it's fine, companies are just going to be like, oh well, we just get breached every year, T-Mobile style. It's fine.
John Strand
Yeah.
Brian
They even... like, didn't PayPal get to pay some paltry sum for their breaches? To where it's like, oh well, the fine is less than what it would cost to remedy the situation.
Corey
Yeah, I mean, that's just how I see it. But, you know, arguably the government are the ones that are like, you can't get breached. It's kind of a state of mind, right?
Ralph
It is a state of mind. That's right.
John Strand
Well, speaking of not getting breached, I really want to talk about the UnitedHealthcare breach, because it is so effing crazy. So what is it, 190 million records? And it was a subsidiary of UnitedHealth that was actually compromised, but of course they were part of UnitedHealth. So when you're looking at this particular case, UnitedHealth paid the ransom, but what's interesting is the actual people that did the hack never got paid out. So you have this weird scenario where Group A hacks UnitedHealthcare, and Group B is basically the front for this attack. I think it was ALPHV, or BlackCat, I can't remember. But once they got the, I think it was $22 million, they never paid the actual hackers that got the actual data. And then the hackers went back and re-ransomed UnitedHealthcare and got a second ransom out of them, which, you know, you're kind of getting into that honor-amongst-thieves type situation. But this is just like... UnitedHealthcare, that's a significant amount of money. And I think they talked about the total hit being, like, 1.6.
Brian
Yeah, 2.9. But is this really a significant hit for them?
Corey
Percentage-wise, yes, but only because it was ransomware. If it was any other kind of cybersecurity incident, it would have no impact. It only matters like this. Basically, to expand on John's previous point, this means the only kind of threat we're trying to defend against is one that impacts our ability to operate. If it's just data being leaked, we don't really care, because there are no fines, there's no whatever. It's only if our systems go down that the numbers start to get big really fast, which is what happened to Change Healthcare. They were down for... they couldn't process claims, they couldn't do anything for, like, was it two months? It was a long time. So that was why they had this huge billion-dollar impact. If it was just data leakage of 190 million people, that would be like, that's free. That's already happened multiple times. So it's kind of crazy to think about, but that number is huge because it was ransomware. If it wasn't ransomware and it was just, like, extortion or that kind of thing, they'd be like, eh, whatever, you already have the data, it's fine.
John Strand
So I did want to ask you all. I just shared another link, the BleepingComputer article that was talking about the payment of the ransomware. So it was BlackCat behind the attack. They stole the credentials; they got six terabytes of data as well. UnitedHealth Group confirmed it paid the ransom to get the decryptor and prevent the attackers from publicly leaking the data. They paid allegedly $22 million, according to the BlackCat ransomware affiliate who actually conducted the attack. The ransom payment was supposed to be split between the affiliate and the ransomware operators, but BlackCat suddenly shut down and kept the entire ransom for themselves. And there's a great screenshot, if you scroll down, it's right there, if we can kind of zoom in on that, where they talk about, you know, what they gained access to and how they weren't paid. And then they say this is where it gets worse for UnitedHealth: the threat actor behind the actual attack said they did not delete the data, and they did not get paid the way they were supposed to.
Corey
This is hacker tears.
John Strand
Yeah, hacker tears. And then they literally went back and re ransomed UnitedHealthcare again.
Corey
Just so, absolutely.
John Strand
Just brutal for them. You know, basically: hey, you should pay the ransom. They pay the ransom. The people that took the ransom are like, well, we're not going to pay the people that did the hack. That's a lot of money. See ya. And then they got ransomed again.
Corey
Yeah, well, this is exactly it. People are going to jump on this and say, well, this is why you don't pay the ransom. Right? This is the whole no-honor-among-thieves thing. You basically have to accept these risks if you're considering paying. I will say, they chose this. It's basically a one-time exit, right? Now no one is going to work with this group. I do think ALPHV actually has shut down. I think they formally said, we're done. But yeah, this basically was their one-time exit from running this ransomware operation. The affiliate program has to work on trust. Ironically, criminal enterprise works on trust. The people who are actually doing the hacking have to trust the operators to give them their fair share, to communicate with the victims and get the payout. So now this ransomware group is dead. The ALPHV name is dead, because no one will ever trust it after they just said bye-bye. But the hackers behind it are probably alive and well and continuing. They'll probably switch to another affiliate program. There are, like, a thousand of them. So, yeah, it's a weird, weird situation.
John Strand
Yeah. Middlemen.
Brian
Well, it's sort of like everybody has a price, right? They found that price. They're like, okay, we're out.
Corey
Yeah. Oh, I tell this... yeah, it's the same thing we say in hacking: if you're gonna take it, you better take enough to live in a non-extradition country for the rest of your life.
John Strand
22 million is a good start to that. Yeah, reputation is everything.
Corey
So let's run through a couple of fun little hack articles. So there was an issue with Cloudflare, which isn't really an issue, but it is kind of a unique way of using a CDN to learn a user's geolocation. So this is essentially an article where a security researcher figured out that if you send someone an image on Discord or other chats, then you can check essentially which region of Cloudflare cached that image, and depending on that, it'll tell you roughly where that user is based. So it's kind of a fun little functionality thing, not really a vulnerability, but it would give you... if someone's like, oh, I live in China, and then you send them this and it comes back New York City or whatever. Obviously VPNs are at play, so it's not really that big of a deal, but I did think it's kind of funny. It's basically like a mini dox. It's like, you don't live in China, you just hit this from New York City. I know, it's kind of a funny little idiosyncrasy of how clouds and CDNs work. That Subaru hack was pretty interesting. Did anyone see that?
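A quick sketch of the mechanic just described: once the victim's client fetches the attacker's image, the attacker probes the cached URL and looks at which datacenter returns a cache hit. `CF-Ray` and `CF-Cache-Status` are real Cloudflare response headers; the parsing helper and sample values below are illustrative, not the researcher's actual tooling:

```python
# Toy helper for the CDN-geolocation trick: Cloudflare tags each response
# with a CF-Ray ID ending in a three-letter datacenter (colo) code, and
# CF-Cache-Status says whether that colo had the object cached.
def parse_cf_headers(headers: dict) -> tuple[str, bool]:
    ray = headers.get("CF-Ray", "")                    # e.g. "8abc123-EWR"
    colo = ray.rsplit("-", 1)[-1] if "-" in ray else "unknown"
    hit = headers.get("CF-Cache-Status", "").upper() == "HIT"
    return colo, hit

# A HIT from the EWR (Newark) colo suggests the image was cached near the
# US east coast, regardless of where the target claims to live.
colo, hit = parse_cf_headers({"CF-Ray": "8abc123-EWR", "CF-Cache-Status": "HIT"})
print(colo, hit)  # EWR True
```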
Ralph
Did they donate all the money to dogs?
Corey
No, but they probably should have. So this is an interesting... kind of something that shouldn't have been a thing, but was a thing. Basically, security researchers, as they do, took the Subaru app and they took it apart, and they found an API call that essentially lets you take over someone's Subaru account. Then you can become an authorized user of their account, which means you can track their location, you can unlock their doors, do all kinds of... whatever control is enabled in this app. The security researcher did disclose it to Subaru, and they fixed it in very short time. It was like the next day they fixed it.
John Strand
Which is awesome. Hats off to Subaru for that.
Corey
Which is awesome. But it is terrifying, at least from my perspective, and anyone who's done an API or app pen test reading this is going to agree. Which, by the way, it's really well written. Kudos to Sam Curry for the write-up on this.
John Strand
Very nice.
Corey
It's seriously, like, the most basic... you could use this as a how-to-hack-an-API class. It's so transparent. Literally, if you watch the video, it looks like something we would put in a pen test report. They, you know, monitor the traffic coming out of the app, find an API endpoint, realize that there's this issue with the endpoint, send a Repeater request, take over the account, get the email that says you've been added as an authorized user, then log in to the app. Done. It's, like, a one-minute video that just exposes it. Obviously it...
Ralph
Was an auth bypass. I didn't read the article, so, I mean, enlighten me what the exact little thing was. Did they just skip checking to see if you were the authorized user if you hit this URL?
Corey
So I guess the problem is that the password reset function would just function without verification, without confirmation. So it's a classic... I don't know the exact API term for it, but basically the two components aren't talking to each other. It's like, well, hopefully they have a token, but then the code doesn't actually check if the token is valid. It's like, oh yeah, I reset...
Ralph
The password. Right, on the password reset. So typically, if you were rolling this out, there would be some kind of token in the URI to say, hey, this is a valid request that was sent in and was checked. Like, this token was sent in the email, so that proves that you are the only person to receive that email. Right? But what I guess you're saying is that they could have put anything in that token. It didn't matter, as long as they just said which email they were trying to reset.
Corey
You just skip that step. Yeah, you skip confirming the token. You're like, no, I already got the token, now reset the password. Right? Yeah, I got you.
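A minimal sketch of the flaw class being described here. This is illustrative pseudologic, not Subaru's actual code; every name and value below is made up. The broken endpoint accepts a token parameter but never compares it to the one it issued:

```python
# Hypothetical model of the broken password-reset flow: the endpoint takes
# a token but never validates it, so an attacker who knows only the email
# address can reset the password with any token value.
ISSUED_TOKENS = {"victim@example.com": "f81d4fae-7dec"}
PASSWORDS = {"victim@example.com": "original-password"}

def reset_password_broken(email: str, token: str, new_password: str) -> bool:
    # BUG: 'token' is accepted but never checked against ISSUED_TOKENS
    PASSWORDS[email] = new_password
    return True

def reset_password_fixed(email: str, token: str, new_password: str) -> bool:
    if ISSUED_TOKENS.get(email) != token:   # the missing validation step
        return False
    PASSWORDS[email] = new_password
    return True

# Attacker knows the email but not the token:
assert reset_password_broken("victim@example.com", "anything", "pwned") is True
assert reset_password_fixed("victim@example.com", "anything", "pwned") is False
```

The fix is a single comparison; the bug is that the server assumed the token step had already happened somewhere else.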
John Strand
But here's the thing: basically you need to have the victim's last name, zip code, email address, phone number, or the license plate. And then any...
Brian
Or.
Corey
That's an or.
John Strand
Any or.
Corey
So if you have their last name, their zip code, which isn't hard, that's.
Ralph
So easy to get.
Derek
Those are probably API calls too, right?
Corey
You can just drive around and just.
Ralph
Find Subarus right there.
John Strand
Here's what you can do if you have that: start, stop, lock, unlock. Retrieve the current location. Retrieve the location history for the past year within five meters.
Ralph
This is a carjacker's kit.
John Strand
Retrieve the personally identifiable information for any customer, including their emergency contacts, authorized user, physical address, billing information.
Corey
So nothing. No impact.
Ralph
No impact.
John Strand
Access, what is it, miscellaneous user data, including support call history, previous owners, odometer readings, and sales history.
Corey
Wow, that's pretty bad. I mean, it's a stalker's dream. It is a stalker's dream.
John Strand
Guys, boys and girls, go check out this article.
Brian
Yeah, I also really like the paragraph about bypassing 2FA.
Ralph
Where they just go. We thought of this.
Brian
We did try the simplest thing we could think of, removing the client-side overlay from the UI, and it worked.
Ralph
Like, oh my God.
Corey
That was their. Yeah, you can do this with the.
Ralph
Their whole authentication model was a modal.
Corey
Yeah. Well, again, it's nothing new. If you're an API tester, you're going to be like, yeah, I did this on the last six pen tests. But it is concerning to see that Subaru didn't get a pen test on this, or at least not a very good one.
John Strand
I gotta be honest, it's one of those things where I dove into our history and I'm like, did we pen test this app?
Corey
No, we would have found this, dude. You would have found this.
Ralph
Jesus Christ.
Corey
But, yeah, that's pretty crazy. Does that count? Another.
John Strand
Another 15?
Corey
Ralph, you are expensive to have on the podcast.
Ralph
I'm gonna. I'm gonna reflect for a moment and realize that I can't swear.
Brian
Damn it.
Corey
Right?
Brian
Oh, damn it.
Corey
That's at least two. That's at least two. So keeping on with the quick hits.
John Strand
The quick hits. By the way, we're keeping this going for all of 2025, like, for all of our news segments, and I will write one big check to the EFF at the end.
Corey
We've got to be over 100 bucks by now.
John Strand
Like, I want it to be one of those ridiculously large checks. And I'll go to the john.
Corey
I have bad news. The large check costs a thousand dollars. And you're like, that's fine, it's worth...
John Strand
It, but we'll take it there.
Ralph
We'll hand it, like, can we have a small one?
Corey
So, real quick, last quick little story, because this is terrifying and just dumb. Basically this is the story about the European power grid, right?
John Strand
European power grid, yeah.
Corey
So basically this is like a hacker story, similar to the API thing, where your friend just happens to have a Subaru and you just happen to poke at the app. This is two security researchers, I believe, or maybe more, who are living in Berlin and were just looking at RF signals, which, if you've ever been a hacker, you've probably done: you just fire up your SDR and you kind of hang out and look at what's going on. And it's funny, it actually says in the article, they noticed antennas on light poles and were like, what do those do? Classic hackers. So basically they figured out that there were unencrypted communications for almost all of Central Europe's power grid, essentially used for shedding load. They captured some of the RF signals and then replayed them and realized, oh, this just shuts off some stuff on the power grid. So it's kind of an OT SCADA type of story, but essentially this is a legacy thing where you can just capture these requests. There's no authentication at all with the request, and there's no encryption either. And this isn't rare; a lot of OT and SCADA stuff has no encryption, no authentication. It's just an afterthought. But it is weird that it's being used in power grids to actively turn on and off different relays and things.
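A toy model of why a capture-and-replay attack works against this kind of protocol. The frame format below is invented for illustration (the real protocol details are in the researchers' write-up): with no MAC, nonce, or timestamp, a receiver cannot distinguish a frame sniffed off the air from a fresh, legitimate one:

```python
# Toy receiver for an unauthenticated, unencrypted control protocol: it
# checks only a fixed preamble and a command byte, so a frame captured
# once with an SDR stays valid forever and can be replayed at will.
PREAMBLE = b"\xAA\x55"     # illustrative sync bytes
CMD_SHED_LOAD = 0x01       # illustrative "shed load" command code

def receiver_accepts(frame: bytes) -> bool:
    # No MAC, no nonce, no timestamp: nothing binds the frame to a
    # legitimate sender or a point in time.
    return len(frame) >= 3 and frame[:2] == PREAMBLE and frame[2] == CMD_SHED_LOAD

captured = PREAMBLE + bytes([CMD_SHED_LOAD])  # sniffed off the air once
assert receiver_accepts(captured)             # replayed later: still accepted
```

The standard fix is an authenticated frame (e.g., a keyed MAC over the payload plus a monotonic counter), which makes a replayed frame verifiably stale.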
John Strand
Yeah, we're just going to see another reason for people to ban Flipper Zeros.
Ralph
Yes, I know, right?
Corey
It's like, yeah, Central Europe is now banning Flippers, because that's going to solve it. Just like banning GPU exports to China will fix all the AI problems.
John Strand
Problem solved.
Brian
Oh, man.
Ralph
If you back them in a corner, they might find a way out. That's all I'm saying.
John Strand
Full circle. Hey, to close things out, do you guys want to take a guess at the value of Nvidia losing 17% of its value? Like, what the real dollar amount is. Like a.
Corey
Hundred billion more than we'll ever see in our whole lives.
Ralph
Well, weren't they the first? Not first. It's a trillion, right?
John Strand
No, 500.
Ralph
They were the first.
Corey
They were. No, no, Apple was the first trillion. And that was, like, at least three years ago.
Ralph
Well, no, I know, but I think Nvidia was over a trillion. So that's Nvidia.
John Strand
Yes, they were $589 billion.
Corey
Literally, the price drop could have built that other building.
John Strand
Could have built it, could have built a building. It also could have solved hunger in America, could have afforded to educate every American child all the way through a doctorate program. We could have gone back to the moon probably two or three times.
Ralph
Or we're going back either way, but that's irrelevant.
John Strand
Funded 100 cougar runs, diplomatic immunity. That's a good. That's. Dude, deep cut. Deep cut. All right, so that's it, everybody. Thank you so much for coming. We will see you all next week. Be sure to check out our podcasts this week. We also have Mile High Hackin' Fest in Denver, Colorado. We are coming to your city, so please check it out if you're in the Denver metro area, and if you can't attend locally, we'll also be doing it virtually and you can register there. Thank you so much, everybody, and we'll see you next week.
Podcast Summary: "Fake Australian" | Talkin' About [Infosec] News, Powered by Black Hills Information Security
Release Date: January 29, 2025
Episode: 2025-01-27 - Fake Australian
Host/Author: Black Hills Information Security
Black Hills Information Security presents their engaging weekly infosec podcast, where a team of experienced penetration testers and ethical hackers dissect the latest cyber attacks, breaches, and technological vulnerabilities. In the episode titled "Fake Australian," released on January 29, 2025, the hosts delve into groundbreaking developments in AI, significant cybersecurity incidents, and the evolving landscape of tech investments.
The episode kicks off with the hosts exchanging light-hearted remarks, setting a casual and collegial tone. Corey jokes about hosting styles and engages in playful banter with John, Brian, and Ralph, establishing a friendly atmosphere.
Timestamp: [04:34 – 09:44]
The primary focus of the episode revolves around DeepSeek R1, a new large language model (LLM) developed by a Chinese company. This model is noteworthy for its efficiency, requiring significantly less computational power than Western counterparts like OpenAI's o1.
Technical Insights:
Political and Economic Implications:
Impact on Nvidia and IT Infrastructure:
Timestamp: [09:44 – 22:35]
The hosts discuss the ramifications of DeepSeek R1 on venture capital (VC) investments and the broader stock market, particularly focusing on Nvidia's financial vulnerability.
VC Funding Concerns:
Stock Market Reactions:
Transformation of AI Accessibility:
Timestamp: [43:38 – 52:42]
A detailed examination of the ransomware attack on UnitedHealthcare reveals the complexities and challenges in cybersecurity defenses.
Attack Details:
Implications for Cybersecurity Practices:
Lessons Learned:
Timestamp: [54:13 – 58:58]
The hosts discuss a significant vulnerability discovered in the Subaru app, highlighting the importance of thorough API security testing.
Vulnerability Breakdown:
Technical Insights:
Security Implications:
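The core flaw, per Ralph's quote later in the episode ("They just skipped confirming the token"), was a reset flow that issued a confirmation token but never actually checked it server-side. A hypothetical sketch of that bug class, with a corrected version alongside; all names here are illustrative, not Subaru's actual API:

```python
import secrets

TOKENS = {}     # email -> expected reset token
PASSWORDS = {}  # email -> current password

def start_reset(email):
    """Issue a reset token; in a real system it would be emailed to the user."""
    token = secrets.token_hex(16)
    TOKENS[email] = token
    return token

def reset_password_broken(email, submitted_token, new_password):
    # BUG: the submitted token is accepted without ever being compared,
    # so an attacker who knows only the email address wins.
    PASSWORDS[email] = new_password
    return True

def reset_password_fixed(email, submitted_token, new_password):
    # Correct behavior: single-use token, constant-time comparison.
    expected = TOKENS.pop(email, None)
    if expected is None or not secrets.compare_digest(expected, submitted_token):
        return False
    PASSWORDS[email] = new_password
    return True
```

The lesson for API testing is to exercise each endpoint with a deliberately wrong token, not just the happy path, since a flow can look complete from the client while the server validates nothing.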
Timestamp: [58:56 – 62:30]
A concerning incident involving the European power grid showcases vulnerabilities in legacy operational technology (OT) and Supervisory Control and Data Acquisition (SCADA) systems.
Attack Methodology:
Security Shortcomings:
Preventative Measures:
Timestamp: [42:23 – 61:38]
The hosts briefly cover several other cybersecurity anecdotes, providing swift insights into diverse topics:
Cloudflare Geolocation Exploit:
API Testing and Penetration Testing:
European Power Grid RF Vulnerability:
The episode wraps up with reflections on the rapid advancements in AI and their broader implications for technology, security, and investments. The hosts underscore the transformative nature of current technological trends and the necessity for robust security frameworks to navigate the evolving threat landscape.
John Strand sums it up, “A lot of the chaos out there right now is chaos as a ladder. And when we're looking at offensive security, we're looking at just general security. There's a lot of chaos out there right now. [46:07]”
Brian adds, “The broader societal context... this is a transformative change. [41:36]”
The hosts encourage listeners to stay informed and proactive in their cybersecurity practices, emphasizing the importance of understanding and adapting to the fast-paced changes within the infosec domain.
Notable Quotes:
John Strand [08:21]: “If you can do it with less power, you can do it with fewer GPUs and you can kick all these things out. That effectively kicks the entire IT stock market...”
Brian [09:44]: “Nvidia took a 17% hit on the stock market today...”
Corey [51:18]: “This is the whole no honor among thieves thing...”
Ralph [56:52]: “They just skipped confirming the token. You're like, no, I already got the token.”
Final Remarks:
"Fake Australian" provides a comprehensive examination of pivotal events shaping the cybersecurity and AI landscapes. The hosts at Black Hills Information Security deliver insightful analysis, blending technical expertise with engaging dialogue, making complex topics accessible and compelling for both seasoned professionals and newcomers to the field.
For more in-depth discussions and the latest in infosec news, tune into subsequent episodes of Talkin' About [Infosec] News.