A
You're listening to the CyberWire network, powered by N2K.

In July 2023, the Securities and Exchange Commission imposed new cybersecurity reporting requirements on publicly traded companies in the United States. Concerned that companies might be underemphasizing the impact of cyber incidents to the potential detriment of investors, the SEC required companies to report material cybersecurity incidents within four business days of determining materiality. Materiality was defined as a substantial likelihood that a reasonable investor would consider the information important for investment decisions. Sounds reasonable at first blush, doesn't it? Contextually, though, this change sent a ripple of fear through companies. The SEC was in the midst of investigating the SolarWinds breach and, for the first time in history, had announced its intention to pursue charges directly against a CISO, Tim Brown. This, combined with the vague phrase "reasonable investor," pushed many companies into a better-safe-than-sorry approach to incident reporting to the SEC. If a company felt that an incident might even potentially be material, it sent notifications to the regulatory body. Time and time again, I heard from leadership at various companies that it was better to create processes and generate paperwork than to become the next SolarWinds. The end result? As of February 2025, only 14% of all 8-K filings for security incidents had actually declared a material impact. Laws and regulations are often kept deliberately vague in order to account for both innovation within the tech stack and the evolution of public policy. That said, regulation made without sufficient input from those who will be impacted may cause more bureaucratic harm than actual good. For this reason, not only are most changes to regulation opened up for comment prior to being enacted, but many federal agencies employ civilian advisory committees to better inform policymakers with the depth of their experience.
Recently, though, we have seen the role of the advisory committee pilloried and eliminated as a waste of resources. In January 2025, the administration dismissed members of all of the Department of Homeland Security (DHS) advisory boards. This included the dismissal of the Cyber Safety Review Board. The CSRB included public- and private-sector experts who issued reports and recommendations addressing major cybersecurity incidents. The dissolution of the CSRB represents an additional limitation on the ability to provide practical cyber expertise to government officials. While various congressional committees can get expert testimony from industry leaders and from members of U.S. Cyber Command, the loss of an advisory committee that could not only investigate major cyber incidents but also provide a nongovernmental, nonmilitary perspective represents a potential gap in federal knowledge as the government addresses cyber challenges from a legal and regulatory perspective. If between 20% and 80% of U.S. critical infrastructure is in civilian hands, it is important that civilian experts have a standing, structured mechanism to make their concerns known. My two cents.

Welcome back to CISO Perspectives. I'm Kim Jones, and I'm thrilled that you're here for this season's journey. Throughout this season, we will be exploring some of the most pressing problems facing our industry today and discussing with experts how we can better address them. On today's episode, I had the opportunity to sit down with Ben Yelin. Ben is the Program Director for Public Policy and External Affairs at the University of Maryland Center for Cyber Health and Hazard Strategies and teaches at UMD's School of Law. Today's conversation revolves around examining how regulations have been evolving in recent months and how this has been impacting businesses. Let's get into it.

Ben, welcome to the podcast, and thanks for agreeing to do this.
B
Absolutely. Good to be with you, Kim.
A
Fantastic. So you and I, while we have mutual acquaintances, have never actually met before sitting down here. So this is not just for my listeners but for me as well: tell me about Ben Yelin.
B
Well, where to start? First of all, I'll start with some shameless self-promotion. I am the co-host of the Caveat podcast, which is on the CyberWire N2K network. We talk about the law and policy of cybersecurity and data privacy, with a focus on issues related to electronic surveillance. So, highly recommend: if you haven't listened to that, that's our baby. That's a good show. I co-host that with...
A
I'm going to interrupt you and double down on the shameless promotion. It is an absolutely fabulous podcast, so it is absolutely worth your time. If you're not listening, add it to your list.
B
Fantastic. I appreciate the endorsement. For my day job, I work for the University of Maryland Center for Cyber Health and Hazard Strategies. We are a not-for-profit organization housed within the university that does academic work, including teaching courses at the University of Maryland Carey School of Law, and consulting work on public policy issues related to cybersecurity and emergency management. We've had an increasing focus in recent months on state-level policy related to artificial intelligence. And then I teach a couple of courses at the law school as well, including one on national security and electronic surveillance. So that's the very basics of me.
A
I got the right guy. This is good. We're going to touch on a lot of what you've already mentioned, Ben. One of the things I want to take a look at in this episode is the changing landscape around regulation, and we're going to take it layer by layer. Let's start at the highest level: federal. What are you seeing at the federal level regarding existing regulation, pending regulation, viewpoints on regulation, et cetera? Talk to me.
B
So we've had a change in the administration, obviously, in case you've been in a coma for the last year. We have a new administration, and they came in with their own policy priorities. I think for the first few months things were a little bit chaotic because of a lot of the layoffs that took place as part of the Department of Government Efficiency effort. We saw layoffs, for example, at CISA; we saw large-scale layoffs at the National Institute of Standards and Technology; and we saw certain agencies pretty much shut down entirely, things like the Consumer Financial Protection Bureau. I think we're out of that immediate storm at this point, and we can start to take a higher-level view of where the regulatory landscape is headed now that we kind of know where the DOGE bomb is going to stop exploding. Right? So beyond staff reductions, we had a regulatory freeze on all Biden-era regulations, which is common when you have a new administration. Some of that was for things like artificial intelligence, where we are in our relative infancy in terms of federal regulation. We just recently got the administration's newest guidance on artificial intelligence, which represented a major change. In terms of other regulatory changes that I've noticed when it comes to cybersecurity, as you'd expect with a more conservative administration, there's been a shift away from mandates, so less of a focus on mandatory compliance and more around risk-based resilience. We've seen this in emergency management as well. And in both cybersecurity and emergency management, I think there's also been an increased openness to offensive operations, so developing rules of the road for offensive cyber operations against our adversaries.
A
So I'm going to dig down on a few pieces of this, as you can imagine. And for all of our listeners who are filling out your buzzword-bingo cards, I appreciate you, Ben, hitting AI early so we could fill out that space on the card. Let's start with AI. You talked about a major shift in the administration's view on AI. I read the initial executive order, and I read President Trump's AI plan, or AI strategy, that came out in June or July. From your standpoint, what is the fundamental difference, or differences, in the approaches between those two documents?
B
So there are certainly some similarities. I mean, they both mention things like oversight and accountability and transparency. But then there are obviously differences in principles, reflecting the ideological makeup of each administration. As you would expect, there was a focus in the Biden era on bias and equity. Those are two very separate things in the context of artificial intelligence. When we're talking about bias, it's things like the use of AI applications in facial recognition and in hiring; I think there was a major focus on that. And equity is just access to some of these tools. Equity is a word that's simply not used in the Trump administration. If you look inside Trump's approach to artificial intelligence, it's more free-market focused. Deregulation is the name of the game: we want to foster innovation, and it's important for us to stay competitive on the international stage. The one area of commonality relates to building AI infrastructure in the United States. So I think there's some commonality there, but less of a focus on the bias and equity and how we can put guardrails around these new artificial intelligence tools, and more of a focus on, let's unleash the market and see how we can be more competitive on an international stage so that we don't lose out to China and other countries.
A
Okay, so let's shift gears and talk about risk-based approaches to resiliency versus setting up regulatory mandates within the environment. Fundamentally, as a senior cyber guy, one would think I shouldn't have a problem with that, because theoretically everything I do is designed to balance risk. I often argue, in the classes I teach, in episodes of this podcast, and when I go out and speak and evangelize, that absolute security, by definition, is an oxymoron. I can secure you absolutely if you shutter your doors, wipe your computers, wrap them in Lucite, and drop them in the Mariana Trench. But then again, you ain't going to make no money.
B
Yeah.
A
So we need to understand that it amounts to a risk-versus-reward calculus, which is hard for a lot of enterprises, particularly commercial ones, to understand. You know, if I tell the CIO that I want to go from 99% uptime to 99.99% uptime, he or she knows exactly what to do to get there and can measure it. At the end of the day, though, if you tell me you want certain things to happen within the environment, I can't give you the same type of binary guarantee. So on one hand, a risk-based approach to tackling cyber, versus a regulatory framework, would seem to make sense. The challenge I have with that, though, is the accountability and responsibility that go with it. What is that going to do in terms of responsibility? Are we really just saying, hey, users, if you choose to do this, you've accepted the risk, sucks to be you? And what does that do to me as a cyber professional when, as usually happens, the answer is, I don't want to do this, I'm just going to accept the risk? Talk to me about those.
B
Yeah, that's really interesting and well put. I mean, on my worst days I kind of have the dismissive attitude of, if you are so hostile to regulation, that's fine, we'll let you destroy yourself if you don't comply and the worst happens to your organization. But the better angel in me realizes that we are part of a larger ecosystem. And if you look at the largest cyber incidents, they have massive downstream effects. I'll give you an example. There was a massive cyberattack on Change Healthcare, which is not even a health provider. It is associated with UnitedHealth Group; they do insurance processing, that sort of thing. They're kind of a middleman organization. When you look at the downstream effects of that attack, it doesn't just affect the company, and it doesn't just affect the insurers; it starts to affect the providers, because if they can't process claims, or if they don't have proper data for medication management, then they can't do their jobs in medical facilities. And if you get really far downstream from that, ambulances have to be diverted from the emergency room they usually go to because that ER is completely hamstrung by the cyber incident. So again, that was something that affected an entire ecosystem, even though the attack was on a single entity. The other thing that bothers me a little bit about a risk-based approach, and I think this echoes what you said, is that it's very reactive instead of proactive. It's good at addressing risks we already know exist, but it is not good at putting up guardrails around the entire industry to protect us from risks that do not yet exist.
A
So I'm going to push back on that just a little bit. As a guy who lives in that space, one would argue that today, even with a heightened regulatory framework, all of our analysis tends to be somewhat predictive, and we're all dealing with unknown unknowns, to borrow from the former Defense Secretary, as well as the known unknowns that are out there.
B
I was just going to say that's the classic Rumsfeld quote.
A
Yeah, we're all dealing with that in the environment. Regulation, by definition, tends to be reactive. In many cases, we see something happening and say, oh crap, this could be bad, or enough constituents have complained about it that we need to do something. And even when we do, we are still trying to balance, from a regulatory framework, how we give people a good sense of peace of mind without restricting the ability to innovate. After 9/11, they took steel knives out of meals in first class. And I'm sitting there saying, you know, I taught self-defense for 15 years; I can do more damage with the pen in my pocket than I can with the knife here. Yet you're not restricting me from using pens.
B
The good news, Kim, is that we just started not having to take off our shoes at the airport.
A
So that's a good thing. That's a good thing.
B
Times, they are changing.
A
They are changing. So when we talk about a risk-based approach, in any environment, regulated or otherwise, there is always going to be a set of unknowns. If I am doing appropriate risk analysis, I can account for that set of black-swan events and create frameworks that can deal with at least a goodly portion of the unknowns. Any well-structured, risk-based environment is always going to react to some level of unknown, even from a regulatory standpoint. But if I'm doing things appropriately, I should be able to react well. So is the question that we may have unknowns we aren't prepared for? Or is the question that we're not doing continuous risk management, as in, we've decided we're accepting this risk today, screw you, I'm not going to look at it again, which leaves us unprepared to deal with those unknowns down the road?
B
No, I think there's a lot to that. I mean, first of all, I should say that government regulations, especially federal regulations, are never going to keep up with industry; that's just the nature of the regulatory process. It's designed to be slow. So it's not an exaggeration to say that we are always in the process of regulating technology that came on the market a decade ago. We keep finding out about agencies that are still using fax machines and floppy disks. So what concerns me about a risk-based approach is more the execution of it. Do smaller organizations have the infrastructure to know how to assess risk? That's one aspect of it. And then, when we do need regulators to step in, do they have enough resources to either ensure compliance or even encourage compliance, given that there's such a balkanization of risk across different types of agencies? I would say those are my primary concerns. I think your broader point is correct that a risk-based approach does allow us to be nimble. But those are just a couple of concerns I have. I think you could take a hybrid approach, which is kind of what the European Union has done with its AI Act, where they have things like risk tiers, so different regulations apply depending on the level of risk, and the risk isn't specific to one threat vector; it's what's the worst thing that could happen if XYZ was attacked.
C
At Thales, they know cybersecurity can be tough and you can't protect everything. But with Thales, you can secure what matters most. With Thales' industry-leading platforms, you can protect critical applications, data, and identities, anywhere and at scale, with the highest ROI. That's why the most trusted brands and largest banks, retailers, and healthcare companies in the world rely on Thales to protect what matters most: applications, data, and identity. That's Thales, T-H-A-L-E-S. Learn more at thalesgroup.com/cyber.

AI adoption is exploding, and security teams are under pressure to keep up. That's why the industry is coming together at the DataSec AI Conference, the premier event for cybersecurity, data, and AI leaders. Hosted by data security leader Cyera and built for the industry, by the industry, this two-day conference is where real-world insights and bold solutions take center stage. DataSec AI 2025 is happening November 12th and 13th in Dallas. There's no cost to attend; just bring your perspective and join the conversation. Register now at datasecai2025.com/cyberwire.
A
Let's also talk about the swing toward offensive operations. I'm going to lead in with a bit of a story. For those who may not be familiar, Google recently announced that it was going to take a more offensive approach to the bad guys. The bad guys came back and said, we have your data, and unless you fire two people within your threat team, and I'm assuming one of them is one of the people named in the announcement, we're going to release your data. Google said, we're going to do this anyway. The bad guys said, we'll see you and raise. So obviously the pushback against offensive operations is multifaceted, including the problem of botnets: doing harm to somebody who doesn't even know they're attacking your system, because their system has been hijacked. And what if that hijacked system attacking yours happens to be a medical system? I could cause harm needlessly to an individual. There's some of that, and there's also the question of what the bad guys may do in response. So I would love your opinion on taking a more bellicose stance, a la the executive order supposedly being signed today renaming the Department of Defense back to the War Department. Taking a more offensive approach has some appeal, but there are some challenges there. I would love to get your perspective on this.
B
Yeah, I mean, it's one of those things where, because of my limited technological aptitude, which I'm sure comes through, I always try to think of this in terms of things I know, which is conventional warfare. There is always going to be risk in offensive operations. There's always going to be the risk of escalation. So if you are going to engage in offensive cyber operations, I think you have to have a certain confidence in your own capabilities: that you have the resources to outwit your adversaries, and that if this ends in a full-on conflict, Google, for example, in setting up a cyber disruption unit, can beat the best that North Korea, China, Iran, et cetera, could throw at it. So I do think there's a certain appeal to that type of offensive cyber operation.
A
The question that still is not clearly answered is, when does hacking, using the generic term, and/or hacking back become a potential act of war? And war carries a lot of baggage. I'm probably one of a handful of people, more than a handful, but a handful of people outside the federal government or pure academia, who has actually read the Tallinn Manual, the international document that addresses how international law applies to cyber operations. If you read that, as well as several of the other documents on the subject, what constitutes cyber war, versus cyber warfare as a theater of operations during a kinetic conflict, is not well defined. If we're at a point where we're in a kinetic conflict and throwing big pieces of steel at one another, there are clear definitions of what constitutes lawful and unlawful cyber operations. But is there a scenario where, okay, I'm Google and I have been attacked by North Korea, so I go back and use the resources Google has to shut down a portion of the North Korean infrastructure? I am an international company, incorporated in the United States, whose resources are based here, and I have now launched an attack on a nation-state that has shut down part of its infrastructure. So the question arises: do we get to a point where that lack of definition lets someone say you've committed an act of war? What's your thought process?
B
No, I mean, I think a couple of things. It's a natural legal gray zone, because it's somewhere between what we would call espionage and what we would call warfare. So that's ambiguous. You can plausibly initiate cyberattacks without triggering the type of conflict threshold that would invoke these international agreements. So that's bad. And then there's the problem you addressed: these non-state actors. Is there some principle in international law under which, if Google in its offensive cyber operation shuts down North Korean infrastructure, there would be some type of acceptable or justified response with pieces of steel, giant pieces of steel, against the people of the United States? Yeah, I don't think our international frameworks are built to account for that type of scenario.
A
Yeah. But here's another interesting take on that. Non-state actors causing harm to infrastructure: that sounds to me like part of the fundamental definition of terrorism.
B
It sure does. It's kind of funny to think about the Googles of the world in the same breath as non-state terrorist organizations.
A
Well, you used the term non-state actor, and it is the correct term, obviously. But, yeah, Google and al-Qaeda in the same category? That's kind of weird.
B
I mean, we had to adapt our domestic antiterrorism laws post-9/11 to account for the fact that we weren't fighting against nation-states anymore. And I foresee that, from a domestic perspective, our government, in the face of a cyber 9/11, would be nimble enough to make those changes. There could be a second Patriot Act that says any cyberattack by a non-state entity affiliated with one of our adversaries is considered an act of war against the United States and justifies a kinetic response. I think we could thump our chests and do something like that pretty easily.
A
And when China does the same thing?
B
Do you know where a good fallout shelter is? How's your basement?
A
I'm in Arizona; we don't have basements in Arizona. But yeah, that's the thing. I'm glad we're having this conversation, because there are days when I bring this up and my peers look at me like I'm crazy, or like I'm on some sort of heavy hallucinogen and am not sharing. It sounds out there, but as we change this framework, it just feels like we're not necessarily thinking through the pieces and the parts, et cetera.
B
So, I mean, I think this Google thing is a good example of why we need to start rethinking the processes, because it's a unique circumstance if one of the most prominent U.S. companies hurts civilian infrastructure, again, in a country that we are not sympathetic to, I get that. What does that mean for the United States in the context of international law, to the extent that international law actually exists in the first place?
A
Which brings me to one of the things, in terms of the change in administration and outlook and thought process, that we saw early on in January: the pillorying of advisory committees to the federal government, to the point where President Trump disbanded all of the advisory committees for the Department of Homeland Security, including the Cyber Safety Review Board. In my mind, these committees do several things in the environment. They give a perspective that is outside of the potential federal echo chamber that exists within D.C., and that's natural; echo chambers are natural, and I'm not dissing it, but these committees give you perspective in terms of impact. They become a method to provide input into that environment. And depending on whose figures you believe, at least 20%, if not up to 85%, which is a common figure that gets bandied around, of our critical infrastructure is run by the civilian sector. If advisory committees are no longer recognized as having value, what is that going to do to the impact of decisions regarding regulation, and the promulgation of regulation, within the federal sector, if we've lost our voice or part of our voice? Talk to me.
B
I think it's bad. I mean, I think we have lost a lot of institutional expertise for no real reason. And this is not just me saying this, and it goes beyond advisory boards, but there have been instances where they literally, accidentally, fired a bunch of nuclear scientists.
A
Yep.
B
Just because they pressed the wrong button. Or the National Weather Service, where, for whatever reason, that became an enemy of the new administration and Elon Musk and DOGE, and that agency was gutted to the point where they had lost forecasters and were unable to staff operations centers for severe storms, to the point where, come July and August, you see frantic job postings saying NWS needs weather forecasters. So yeah, I think that loss of institutional expertise is certainly a problem and a concern. We've seen it in the context of public health, with what happened to the vaccine advisory board, which was fired. That has generally been something bipartisan; I don't think there's ever been a partisan focus on the vaccine advisory board, and certainly never an instance where the entire board was fired. I always think about how we could solve this at a policy level to prevent something like this from happening again. There are supposed to be, at least for some of these agencies, protections for appointees so that they can only be fired for cause, so that you have a sense of insulation from the political whims of a presidential administration. That entire concept is up in the air right now, and I think...
A
"Up in the air" being a generous legal term for saying it seems to be nonexistent right now.
B
It is. And I suspect that by next June we might get a Supreme Court decision overruling Humphrey's Executor, a case from the 1930s that allowed Congress to create quasi-executive positions protected from the whims of a presidential administration. At least the early indications we have from the Court are that that precedent is certainly in question. So yeah, the impact is going to be felt rather severely. One other potential solution I've seen is states trying to recreate the role of gathering expertise for some of these advisory boards, not just within individual states but in regional groups. We've seen that in public health, where California, Oregon, and Washington just formed some type of compact to have a shadow vaccine advisory organization among those states. And I think we could see that in all different types of contexts as well. If there's a vacuum of expertise among these advisory boards that we've relied on for so long, then maybe that's something that states, either individually or collectively, can try to recreate in the meantime, while all this is going on.
A
I like to end these sessions by offering my guests an opportunity to close us out. What is one thing you would like my audience to know about or think about? What's one thing you want to bring to the table that we haven't discussed? The floor is yours.
B
One of the most boring things in the world, if you are not a lawyer, is administrative law. That is the process of making regulations and the rules surrounding the administrative state, and most people have very little knowledge of how that process works. What happens is that a federal agency will propose a regulation, and the process, through the Administrative Procedure Act, calls for things like notice and comment to affected stakeholders. I've gone through notice-and-comment files; it's either the titans of the industry that are commenting or the craziest MFers you can possibly think of, who live in, you know, the woods in Vermont. All of this is to say: if you care about the regulatory state, learn how the administrative process works. If you work for a smaller organization, or if you're self-employed, follow what happens within regulatory agencies. There are daily updates to the Federal Register, which gives notice of the opportunity to comment on regulations. I think that's an area where people can have a real impact, especially on issues that are more obscure and not politically charged, and I think it's a disadvantage that a layperson just doesn't understand how that process works. So I am always happy to give a brief primer on administrative law, but that's my call for people to get involved if you're not already part of the Googles, Apples, et cetera, of the world.
A
Ben, this has been a lot of fun, and I think my audience is going to really benefit from this conversation. Thank you for taking the time for this, man.
B
Thank you Kim. It was a lot of fun.
A
And that's a wrap for this episode of CISO Perspectives. I hope today's conversation gave you new insights and practical takeaways to navigate the ever-evolving world of cybersecurity. Leadership, strategy, and shared knowledge are key to staying ahead, and we're glad to have you on this journey with us. To access the full season of the show and get exclusive content, head over to thecyberwire.com/pro. As a member of N2K Pro, you'll enjoy ad-free podcasts, access to resource-filled blog posts, a deeper dive into the CISO Perspectives research, and a wealth of additional content designed to keep you informed and at the forefront of cybersecurity developments. Visit thecyberwire.com/pro to get the full experience and stay ahead in the fast-paced world of cybersecurity. We'd absolutely love to hear your thoughts; your feedback helps us bring you the insights that matter most. If you enjoy the show, please take a moment to leave a rating and review in your podcast app. This episode was edited by Ethan Cook, with content strategy provided by Mayan Plout, produced by Liz Stokes, executive produced by Jennifer Eiben, and mixing, sound design, and original music by Elliott Peltzman. I'm Kim Jones, and thank you for listening.
Episode: The Existing State of Regulation
Host: Kim Jones (A), N2K Networks
Guest: Ben Yelin (B), Program Director for Public Policy and External Affairs at University of Maryland Center for Cyber Health and Hazard Strategies
Date: September 23, 2025
This episode delves into the rapidly evolving landscape of cybersecurity regulation in the United States, focusing on the effects of new federal policies, the tension between risk-based and regulatory approaches, and the impact of dissolving advisory boards. Host Kim Jones engages public policy expert Ben Yelin in a candid, practical exploration of recent changes and what they mean for CISOs, businesses, and the broader ecosystem.
The episode is conversational, frank, and laden with industry-insider observations. Both speakers blend wit with gravity, underscoring the high-stakes, nuanced trade-offs facing cybersecurity leaders.
For leaders charting strategy in turbulent regulatory waters, the episode offers grounded perspective, practical warnings, and a call for informed participation.