
A
You're listening to the Cyberwire Network, powered by N2K.
B
Welcome to Afternoon Cybertea, where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to groundbreaking advancements shaping our digital future, we will bring you the latest insights, expert interviews, and captivating stories to stay one step ahead. Today, I am joined by Dr. Lorrie Cranor, Director of the CyLab Security and Privacy Institute at Carnegie Mellon University and one of the world's leading researchers on usable security and privacy. Lorrie's groundbreaking work has transformed how we think about authentication, passwords, and the human side of cybersecurity. Lorrie, welcome to Afternoon Cybertea.
A
Thank you.
B
So I am really excited to dig into your research and what it means for our chief information security officers who are trying to build security that works not just in theory, but in practice. And I definitely want to start with this usability gap we have in cybersecurity. I know you have spent your career studying how people actually interact with security tools. So can you tell the audience: why do so many security controls fail in practice, and what does that tell us about the usability gap?
A
Yeah, I think in practice, when people are designing security tools, they're focused on security, and they often don't take the time to think about the users and how the tool would fit into their workflow. And often the security experts behind the tools are not actually usability or human factors experts. And so without the security people working in partnership with usability people, we often forget to consider the human and the user.
B
Makes a lot of sense. And let's pull a thread on that a little bit. When you think about CISOs and how they are designing their programs today, what is the most common mistake you see them make in terms of usability?
A
I think just not thinking it through.
B
Yep, that makes sense because as you said, they're cybersecurity professionals. They're not actually looking at it from a user lens. They're looking at it from a risk lens or from securing their, you know, their environment lens.
A
Yeah. And increasingly, I think we are seeing CISOs who get it and who are trying to figure out how they can consider the user end. But that's, I think, a relatively new development.
B
Good. Well, I hope it increases, honestly, because when you think about the work that you've done on passwords and authentication, it's been foundational. But it's another place where we have tremendous improvements we need to make from a usability standpoint to particularly make consumers, but also employees at all different types of organizations safe. We know that passwords themselves are flawed, yet we're still relying on them. So why do you think it's been so hard for the industry to move on from passwords?
A
Well, we haven't really found a great solution that is better than passwords that meets all the criteria that we have. I think we want something that is going to be more secure than passwords, easier to use, compatible with a wide range of different devices, and also, by the way, compatible with all sorts of legacy software. And it's really hard to find something that meets all of that criteria. I think in some specific domains we've been successful. So I think in the context of mobile phones, the biometrics that are used on a lot of mobile phones, either face recognition or a fingerprint, are effective in that context. But they're not effective in contexts that don't have a camera or a fingerprint reader, and they may not be secure enough for a lot of contexts.
B
And I think that's right. I also think that there's so much friction, right, for end users when you move from using some type of biometric, when you try to get them to use even a hardware or a software token or some type of YubiKey, et cetera. It just creates something in their environment. We'd really like to get, honestly, Lorrie, to this place of passwordless authentication. Right? But there still then has to be some type of authentication. Do you think passwordless is going to become mainstream? And let's talk for a second about passkeys. I get the prompts on my phone, right? Do you want to use a passkey for this app? And it's always like, yes, I want to use a passkey for the app. But as a cyber professional and also a consumer, I often think about what the user experience is, because I look at it and say, okay, if this is complex for me, who ostensibly has been doing this a long time, what's it like for the average person? So do you really think passkeys are the thing that's going to remove the friction?
A
Not anytime soon. I think the concept behind passkeys is good, but they're confusing. And, yeah, I also am confused by them. If I accept the passkey here and then I want to access this account from another device, what do I do? And I often, in the passkey process, get confused about where I am and don't know whether it succeeded or what's going on. And so when my less technically sophisticated friends say, should I use passkeys? I don't really know what to tell them. Yes, in theory they're more secure, and it will eventually be easier, but if you run into problems, I'm not going to be able to help you.
B
Makes perfect sense. We really need to truly get to the place where we're passwordless and then truly get to the place where we make the user experience of logging in incredibly simple. And I know you talked a little bit about biometrics, but for most users that's at least the simplest thing for them to do and it's something they're reasonably familiar with.
A
It has worked reasonably well on recent models of cell phones, though it wasn't always that way. I remember the first phone I got that had face recognition. I probably got it about 18 years ago. I turned it on for a couple of weeks when I got the phone, and then I turned it off, because anytime I was not in a well-lit room it didn't work. And then the last straw for me was when I left my phone sitting on the kitchen counter and my six-year-old child picked it up and authenticated, and I was like, okay, maybe I shouldn't be using this. But it's very different 18 years later.
B
It is. The technology has certainly improved. Eighteen years ago I was actually at RSA Security, doing hardware tokens, up until about 11 years ago. And I think about just the light-years we've come, right, in just that short period of time. Speaking of that, take us out 5 to 10 years with your research. What does digital identity look like, and what role is usability actually going to play in making it real and better?
A
Yeah, I'm not very good at predicting the future. And when you say digital identity, that's not just authentication; there are also issues like age verification, and there's more to identity than just unlocking the phone. I think things are coming to a head, where politicians are getting involved. Age verification is a good example: in jurisdictions all over the world, politicians are saying, well, we need to age-verify kids before we let them access all sorts of things. And the current solutions that vendors are offering are pretty privacy invasive and not actually very secure, and can be easily routed around by not very clever kids. So that's clearly not how we should be doing this. There are also proposals and systems where everybody has some sort of a digital wallet, which can be used to store various identity information and credentials. And we'd like to get to a point that anytime you need to prove that you are over 18 or over 21 or under a certain age or whatever, you should be able to use this digital wallet to prove that without having to send all your personal information to whatever website wants you to do that.
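The digital-wallet idea described here, proving a single yes/no fact like "over 18" without handing over a birthdate or any other personal data, can be sketched in toy form. Everything below is illustrative: real wallet credentials use public-key signatures and selective-disclosure schemes, not the shared-secret HMAC stand-in used here, and all function names are assumptions.

```python
# Toy sketch of predicate-only disclosure: the issuer signs just the
# yes/no claim ("over_18"), so a verifying website never sees the
# underlying birthdate. HMAC with a shared key stands in for a real
# digital signature purely to keep the sketch self-contained.
import hmac, hashlib, json

ISSUER_KEY = b"demo-issuer-secret"  # stand-in for the issuer's signing key

def issue_claim(predicate: str, value: bool) -> dict:
    """Issuer puts only the boolean predicate in the credential."""
    payload = json.dumps({predicate: value}, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def verify_claim(claim: dict, predicate: str) -> bool:
    """Verifier checks the issuer's tag, then reads only the predicate."""
    expected = hmac.new(ISSUER_KEY, claim["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, claim["tag"]):
        return False  # tampered or forged credential
    return json.loads(claim["payload"]).get(predicate, False)

claim = issue_claim("over_18", True)   # the wallet stores this, not the DOB
print(verify_claim(claim, "over_18"))  # → True
```

The point of the sketch is the data-minimization shape: the website receives one attested bit, nothing more.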
B
I think that that is a great example. I love the way you expand the conversation about authentication, because I did ask you more than an authentication question. And it's also a really great lead into the next topic, because I want to talk to you about privacy, and you brought up age verification. There's a tremendous gap here. When I talk to my own child, I have a 24-year-old, this concept of privacy is a little bit foreign to the Instagram, Snapchat, TikTok generation. They just don't think about it the same way we do. But I think they should. So how do you see users' expectations about privacy shifting now that we are in an era where we have pervasive data collection, we have AI-driven systems, we have people voluntarily putting all of their information out on social media for the world to see? How do you think about privacy?
A
Yeah, so I've been doing privacy research for about 25 years, and I think people's attitudes have shifted some, but not in the way that it's often characterized. I often hear the media say things like, you know, young people don't care about privacy anymore; actually, nobody cares about privacy, look at all the data they give away. And I don't really think that's true. When I started doing research in this area, when you talked to people about various technologies that were invading their privacy, they actually were quite surprised. Sometimes they didn't believe that these things were real. I remember talking to people about third-party advertising on the web, and people said, really? They can do that? That sounds like science fiction. And they definitely didn't like it. Once they heard about it, they said, it sounds like they're following me behind my back. This is terrible. Are you sure this is really happening? Today, you talk to people about these sorts of things, even new things that are just barely happening, and people are not surprised. They're like, yeah, I know, everybody can spy on you all the time and there's nothing you can do about it. They don't like it. They still would like to protect their privacy, but they feel powerless to do anything about it. And many of them will say, well, I've really just given up. I like the convenience of using all these privacy-invasive services, and since there's nothing I can do about it, I've just given in and I use them.
B
Yeah, I agree. And to your point, I don't think people understand. Everyone is concerned about privacy, but when you explain to them the data they're freely giving away, they suddenly realize, oh, well, I'm not actually acting on my own concerns.
A
Well, you said freely giving away, and I would argue that often it's not free. You don't have to give away this data, but then you're going to miss out on something, or it's going to be a lot harder to do the thing that you want to do. And the workarounds to not give away the data are cumbersome and time-consuming or expensive. And so when people feel like they don't really have a choice, in many ways they're right.
B
Yeah, exactly. Most people actually don't read the terms of service, but if they need access to a service, they're going to give their data away if they need that access. Right. Yeah, I wish terms of service were a little less complex and more explicit, and said, here's a summary, right, a TL;DR: here are the five things you're agreeing to. That would be ideal.
A
It would be. But then beyond that, we need to actually have real choices for people so that you can get useful services without having to give everything away.
B
Exactly. So when you advise organizations, what do you tell them about designing for transparency and trust? Speaking of which, you know, I wish there was a small TL;DR. Not just compliance, but actually designing their systems for transparency and trust.
A
Right. Yeah. So the first thing to realize is that compliance is not enough if you want to actually have a trustworthy and pleasant user experience. You could say, well, we comply with these 10 things, but that doesn't mean you're done. So it's really important to actually do user studies and see how users are navigating your system and interacting with the privacy-related features, whether they're getting information, changing their settings, or understanding what their current settings actually are. So definitely start by looking at what users actually do on the system. And then to improve designs, there's a lot, and I've written a lot on this. We start with things like keeping it simple, trying to put all of the privacy-related things in one place where you can find them, but also putting just the piece you need to know, just in time, in the place where the data is collected. So if I'm filling out a form, having a little blurb to the side of the form explaining what you're going to do with what I fill out is great. And then a link there to the full privacy policy if I want all the gory details. But probably I just want to know right now about this form and not all the other stuff that your company collects. So those are some examples. We're actually working on a framework at Carnegie Mellon called Users First that is designed to help designers actually improve the privacy-related interfaces in their products and services. It basically has a list of, we call them threats, but basically common things that can go wrong. And we ask designers to go through and systematically look at every touch point they have with a user related to privacy, go through this list, and ask: is the information comprehensible? Are the choices easy to understand? Is there a reasonable number of choices? And things like that.
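The systematic walkthrough described here, checking each privacy touch point against a list of common pitfalls, can be sketched as a simple checklist review. This is a loose illustration of the idea only, not the actual Users First framework; the check wording is taken from the questions above, and all function names are assumptions.

```python
# Illustrative touch-point review: run every privacy touch point through
# a fixed list of checks and collect the ones it fails.
CHECKS = [
    "Is the information comprehensible?",
    "Are the choices easy to understand?",
    "Is there a reasonable number of choices?",
]

def review_touchpoint(name: str, answers: dict) -> list:
    """Return the checks this touch point fails (unanswered counts as failed)."""
    return [check for check in CHECKS if not answers.get(check, False)]

# A signup form that passed only the first check still has two open issues.
failures = review_touchpoint("signup form", {CHECKS[0]: True})
print(len(failures))  # → 2
```

The value of this shape is that nothing gets skipped: every touch point is scored against the same questions, and the output is a concrete to-fix list per interface.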
B
I think that's all fair. And I do think that the simpler you can make it, the better; humans are busy, humans are often in a hurry. So putting in practice, not just for consumers but for your employees, things that make it simple and call out the important things is just fundamental. Which takes me to a question about behavioral insights for security leaders. I know that one of your key contributions has been showing that human behavior is actually central to how we secure enterprises and environments. How should CISOs apply your research to improve adoption of security practices from a human behavioral standpoint?
A
CISOs need to look at the research that is applicable to the particular problem they're trying to solve. So if, for example, they're trying to improve their password policy, they should read the research on password policy. If they're trying to improve their access control system, they should read that research. And I think looking at what has been empirically tested, and then trying to figure out how that applies to their particular situation. Because of course we haven't tested their exact situation, most likely, but nonetheless, there are probably things that they can take away from what we and other researchers have tested to figure out how this would apply in their situation. And then I strongly recommend, once you think you have a solution, doing at least a small user study to make sure that it actually works the way you think it will work.
B
I think that makes a lot of sense. And the one thing that occurs to me is that in academia you actually have the time, you know, probably never as much time as you want, but you have the time to complete meaningful and well-researched papers and to do the work that you do. Whereas a lot of businesses are always moving very quickly. So how do you think about advising folks on that balance, right? If they want to move really quickly, what shouldn't they sacrifice as they're moving quickly?
A
You know, my job, besides teaching students, is to do research. And so yes, we spend a lot of time on it. But there are ways that you can get the information you need to make a business decision a lot more quickly and inexpensively. So there's a range. What a research study means at the low end of the range, or the easy end of the range, is to say, you know, get a handful of employees to try the system, and watch them use the system before you launch. That's the easiest, low-hanging fruit. Better would be to get people who are not familiar with your product or service to do that. Or if it is a security system for employees, make sure that it's not the security team who are testing it, but whatever other random employees in the company will have to interact with it; get them to test it. And even having five to 10 people test something can actually give you really useful insights. So at the very minimum, you want to do something like that. And then depending on what it is that you would like to roll out, there are other ways of getting information. It may be doing some focus groups with people in the target audience. Those also don't take a lot of time, and you can get eight people in a room and in an hour get a lot of feedback about something. So those are all good things to do. Now, if you have a little bit more time and resources, one of the things that we actually even do in research is to take advantage of crowd workers in order to do research studies quickly and inexpensively. So you can put up a survey for crowd workers, and depending on how particular you are about the demographics of the people you're recruiting, you could in an hour have a few hundred responses and just pay people essentially minimum wage for their time. There are definitely ways that you can get a lot of feedback very quickly.
B
I think that makes a lot of sense. And just a lot of organizations will test with a small pilot group and make sure they get things right. So I think that's something for everyone to remember. I'm going to ask you a hard question now.
A
Okay.
B
When you think about closing the usability gap, if you could redesign one widely used security control from scratch to actually make it work with humans instead of against them, what control comes to mind?
A
Ooh, passwords is an obvious one. I think we all realize that the system of having people remember, supposedly remember, 100 unique passwords, which is about the number a lot of people have, is completely not working. And so I think there are a lot of efforts to try to replace that with something else. And I think the workarounds that we have right now, including password managers to remember them for you, are a step in the right direction, but we're not there yet.
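The password-manager workaround mentioned here works because the human never memorizes anything: the tool generates a unique high-entropy secret per site and stores it. A minimal sketch of that core idea, using only the standard library (the vault structure and names are illustrative; real managers also encrypt the vault at rest):

```python
# Sketch of what a password manager automates: one unique, randomly
# generated password per site, held in a vault instead of human memory.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    """Draw each character with a cryptographically secure RNG."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# Illustrative vault: site -> its own independent password.
vault = {site: generate_password() for site in ("bank.example", "mail.example")}
print(len(vault["bank.example"]))  # → 20
```

Because every site gets an independent secret, a breach of one site's password reveals nothing about the others, which is the property the "100 memorized passwords" system fails to deliver in practice.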
B
I think that's fantastic. Let's do the opposite. Is there a security tool you can think of today that actually gets usability right?
A
So I think encryption in web browsers. You can browse the web and have encryption between your browser and the website, and you don't have to do anything to make it happen. It says HTTPS, and it just does it automatically behind the scenes. And that's beautiful.
B
That's great. I love that. And it was really easy and something everyone will understand. Yeah. So at Afternoon Cybertea, we always close with a note of optimism. What gives you hope that we can finally bridge the usability gap in cybersecurity?
A
Well, we have actually seen progress. When I started working in this area about 25 years ago, there was, first of all, very little research. I started looking for usable security papers, and there were like two or three out there. And I started looking for usable security researchers, and I found a dozen or so people. And I looked at what companies were actually thinking about this, and there were very few. Today, there are thousands of usable security research papers and at least hundreds, if not thousands, of usable security researchers. And we're seeing that companies are increasingly trying to make some efforts to find more usable security solutions. There's still a lot of work to be done, but I feel that we actually have made progress. And things like encrypted web browsing are a good example of how far we've come.
B
I agree with you. And just the fact that you are doing this work and people like you are focusing on it honestly gives me optimism. Somebody's actually paying attention and has been for a while. We will solve the usability problems and hopefully the next generation of technology, as we adapt, it will continue to help us.
A
Yes.
B
So, Lorrie, thank you so much. I know you're incredibly busy. I really appreciate you joining today. Your research has definitely reshaped how we think about usable security, how we think about privacy. And I know the listeners are going to walk away with some practical advice, which is what we always try to give them. Thank you so much.
A
You're welcome.
B
I enjoyed this, and many thanks to our audience for listening in. Join us next time on Afternoon Cybertea. So, I invited Lorrie Cranor to join Afternoon Cybertea because we don't talk enough about usability in cybersecurity, and I think it's an incredibly important topic. Her research and her work over the past many years have led to fantastic outcomes. Great conversation. I know the audience will like it.
This episode explores the persistent and vital issue of usability gaps in cybersecurity—the disconnect between security controls as designed and how real people actually use (or avoid) them. Dr. Lorrie Cranor, a pioneering researcher on the human side of security and privacy, joins Ann Johnson to discuss why so many security tools fail in practice, what barriers keep us stuck with flawed controls like passwords, the future of digital identity and privacy, and practical advice for leaders seeking to build systems that are both safe and user-friendly.
This episode bridges the human and technical sides of cybersecurity, emphasizing the imperative to design with—and not against—real people. Dr. Lorrie Cranor provides illuminating examples, clear advice, and optimism that reflects the genuine progress made (and still needed) in bridging the usability gap. For CISOs and security leaders, the message is clear: Empathy, testing, and science-driven simplicity are foundational to systems that actually keep us safe.