Root Causes 476: The Need for Security KPIs
Jason recounts a 2024 Black Hat talk about the need for objective measurements of our IT defenses and whether the good guys or bad guys are winning. Jason breaks down how to define and measure the impact of security measures.
- Original Broadcast Date: March 10, 2025
Episode Transcript
Lightly edited for flow and brevity.
-
Tim Callan
Jason, you were telling me offline before this recording session about a very thought-provoking presentation that you saw at last summer's Black Hat, and it sounded like something that we might want to discuss and share with our listening audience.
-
Jason Soroko
Obviously, every cycle of cyber security conferences there are some standout talks that are interesting. One of the personalities I keep track of is Jason Healey, who I believe teaches at Columbia University and is also a fellow at The Atlantic Council. He's a very prolific writer and speaks at a lot of these conferences. He was talking about a subject that is of interest to me, something I think about a lot. I think a lot of people think about it, Tim, which is asking the question, is it still a golden age to be a bad guy? Are the bad guys winning with respect to cyber security in general?
-
Tim Callan
Sure feels like the bad guys are winning, doesn't it?
-
Jason Soroko
I think, Tim, that's the gut feel, and that's exactly what Jason Healey is saying in his presentation: let's turn it from a gut feel into objective measurements, so we can actually say whether defense is winning or the adversaries are winning. Let's try to develop a framework around it and do a better job of keeping data so we can answer that question more rigorously.
-
Tim Callan
That's an interesting idea. Does he give us a framework?
-
Jason Soroko
He talked about breaking the problem into three phases, and I think we should stick to that structure because he had a lot of ideas in his presentation. The initial phase is just asking the question, is defense winning, and bringing up a number of propositions and examples. In other words, trying to define what that actually means. It has to do with everything from changes that we see in TTPs to how responsive the defensive software industry is at watching and reacting to the changes the bad guys make. That's phase one.
But I think it starts to get interesting in phase two and phase three. Phase two is more about creating a really complete catalog of indicators across the threats themselves, the vulnerabilities, and also measuring their actual impact, which is really at the heart of what we're not good at, especially when you're reading technical journalism. With the kind of clickbait you see on the internet, you might think every one of these attacks is some sort of sophisticated, high-level attack, and if you look at the way a lot of enterprises report when they've been attacked, everything is a gigantic, sophisticated APT. But is that really true? Is there anything really novel going on? Even the way attacks get reported is extremely skewed and non-rigorous. So how do we fix that?
-
Tim Callan
Extremely. That's hindered by a few things. One of which is kind of, I think, what you're getting at here, what Jason Healey is getting at here, which is lack of objective criteria. Like we can tell how big an earthquake is because we have a way to measure it, and we don't really have those objective criteria in the realm of breaches or attacks or vulnerabilities.
The other one, though, is there's a secrecy problem. There's no reason to hide the fact that there was an earthquake. But when you get into breaches or vulnerabilities, you get into all kinds of tricky issues about, what do I want to reveal? When do I want to reveal it? How much do I want to reveal? What's the nature of what I want to reveal, and what's the nature of what I don't? I can see where that would be a big impairment to building some kind of fact-based, comprehensive, consistent model around these factors.
-
Jason Soroko
You got it, Tim. I think one of the most interesting ideas in the talk, which he discusses as part of phase three, is to evolve from indicators to actions. Right now we're looking at indicators of threat, vulnerability and impact, but the actions he's targeting are really specific goals. For example, instead of vaguely reducing a global mean time to detect, first answer the question honestly: is mean time to detect currently measured in days, weeks or months? Then set a goal: can the average organization get its mean time to detect an attack under 24 hours? I think that's a really interesting idea from the talk, and it's just one example of what it would look like to be able to answer that question. I actually went through this in preparation for this podcast, just to see what is commonly said, and I gotta tell you, this comes up a lot more often than technical journalism makes it out to be: this whole idea of time to detect, and who is actually doing the detecting. Companies have such a hard time detecting their own breaches that they're typically told by another entity, and the amount of time between knowing you've been attacked and actually being able to respond does seem to be a major problem we have right now.
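The "mean time to detect under 24 hours" goal described above is simple enough to sketch in a few lines. The incident timestamps below are entirely made up for illustration; the point is only how the metric and the goal check would be computed.

```python
from datetime import datetime, timedelta

# Hypothetical incidents: (estimated intrusion start, time the victim
# actually learned of it). These dates are invented for illustration.
incidents = [
    (datetime(2024, 1, 3), datetime(2024, 1, 20)),   # told by a third party
    (datetime(2024, 2, 10), datetime(2024, 2, 11)),  # self-detected
    (datetime(2024, 3, 5), datetime(2024, 4, 1)),    # found during an audit
]

# Dwell time per incident, then the mean across all of them.
dwell_times = [detected - started for started, detected in incidents]
mttd = sum(dwell_times, timedelta()) / len(dwell_times)

print(f"mean time to detect: {mttd}")
print("goal met" if mttd <= timedelta(hours=24) else "goal not met")
```

With these invented numbers the mean works out to 15 days, which illustrates the gap Healey is pointing at: the interesting action isn't computing the number, it's collecting honest inputs for it across many organizations.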
-
Tim Callan
I'm not sure you would always be able to measure that, for a couple of reasons. One is that breaches evolve over time. Sometimes it's a single catastrophic event, but very frequently it's not. Very frequently there are multiple stages, and the attacker walks through those stages, just trying to get access and escalate permissions, and that can go on for weeks before they're suddenly able to do something that's actually damaging. They've been working this process for weeks before they get to the place where the gold is. So where are we measuring from, and what are we detecting? It's not one event, it's a dozen events that lead up to the ability to steal something that matters. That's what the whole defense in depth philosophy is about: you got through my perimeter, but because I had defense in depth, it turned out it didn't matter. You couldn't get to the thing you were trying to get to. Or you did compromise me, but not as badly as you wanted. All of that plays in too. I can see where these things could be defined, but they would need to be defined.
Then lastly, sometimes you just don't know. Some attackers erase their activities so you can't figure out what they did, and sometimes that's effective enough that it obscures things like timelines, so there will be times when this isn't really measurable at all.
-
Jason Soroko
You're right, Tim. In phase two, he actually talks quite a bit about these things, including how incredibly difficult it is right now for cybersecurity companies themselves to report defensibility-relevant statistics that are actual time series, or that are mapped to the catalog of threats, vulnerabilities and impacts. So the problem is, right now we have almost nothing, because we don't think about how to defend in a systematic and rigorous way.
-
Tim Callan
When there is a problem, when there is a breach, there's a tendency to try to thwart the attack, plug the hole, stop the gap, get rid of the data leakage. That's the focus. It's kind of a we're-in-combat-now attitude. I have to imagine that the forensics of going back and understanding how we got here, though it's valuable, occurs later if it occurs at all, and probably to a lesser degree.
-
Jason Soroko
I would say, Tim, if people actually had these kinds of statistics, if they had ways to measure it, it would actually be really embarrassing and really scary.
-
Tim Callan
Then there's that. There's the reporting thing again, which is: we are whatever we are. We're a bank. Do we really want to tell the world how badly we screwed the pooch on this one thing, or do we want to shut up about it because we're afraid there won't be confidence in our bank, or whatever the company is? That would be a worry I would have, too.
-
Jason Soroko
So anyway, Tim, it's an interesting idea, and I think that it has a lot of hurdles, a lot of very human hurdles to have to overcome, but at least somebody is asking the question. It's just a bit of a look back in terms of the last conference cycle. I think there's at least one more I'd like to talk about at some point in the future.
-
Tim Callan
One thought on that last point is that we also don't have to get all the way to perfect on this. You run into this problem a lot: the perfect is the enemy of the good. And progress is progress; maybe you can just make progress. I'm the father of two young boys, and we're very much focused on progress is progress. I don't expect them to get all the way to what you would expect from an adult in terms of behavior and knowledge and responsibility. What I'm trying to do is get them to move forward from where they are now, and I think we could approach this the same way. There are a lot of factors that hold us back from being the perfect ideal city on a hill that we want to be as an industry, as a secure ecosystem. What we can do is set our sights on being better than we are today. Maybe that city on the hill doesn't feel achievable to any of us, but better than we are today is definitely achievable. That's perhaps where you start, and that allows you to make forward progress and get going, and then maybe you figure out a way to get even better, and then even better again. This is the evolution of technology. If you look at what technology really has been, it's very few giant quantum leaps and a whole lot of incremental improvements. Maybe we get into the habit of making those incremental improvements, and even if it doesn't get us to perfection, maybe it gets us an awful lot closer than we are today.
-
Jason Soroko
Well, Tim, let's put it this way. You and I do a lot of podcasting on some extremely heavy duty topics, such as PQC. It's almost comical in a way: we're talking about, really, the keys to the kingdom, and how we're going to have a fundamental change in under a decade. Things are coming, I think, a little faster than people realize, and yet the rest of cyber security is so incredibly backwards that it doesn't even have a framework to try to understand whether or not there's basic success in defense right now. That's how bad it is.
-
Tim Callan
I mean, to even pooh-pooh your talk of PQC a little, I feel like in the world of cryptography, in a lot of ways, it's the same. We walk around and we tell people to inventory their cryptography. For most organizations, this is not something they've ever even thought about. They don't have a method. They don't have a definition of cryptography. They don't have an idea of how to go forward. It feels like such a basic thing, but just having a software bill of materials for cryptography is an absolutely foreign concept to almost everyone in the world, and that's a similar thing to what you're talking about. We don't have a framework. We don't have metrics. We don't have common vocabulary. We haven't thought through and defined and codified what this needs to look like. That work has never been done by humanity.
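To show what "inventory your cryptography" could even mean in practice, here's a minimal sketch of a cryptographic bill of materials. The assets, field names, and values are all hypothetical examples of my own, not a standard schema.

```python
# Hypothetical cryptographic inventory entries. In a real organization
# this data would come from certificate stores, code scans, and config
# audits; here it is hand-written for illustration.
crypto_inventory = [
    {
        "asset": "api-gateway TLS certificate",  # made-up asset name
        "algorithm": "RSA",
        "key_bits": 2048,
        "quantum_vulnerable": True,   # RSA falls to Shor's algorithm
    },
    {
        "asset": "backup archive encryption",    # made-up asset name
        "algorithm": "AES",
        "key_bits": 256,
        "quantum_vulnerable": False,  # symmetric crypto: key size suffices
    },
]

# The PQC migration work list starts with whatever is quantum-vulnerable.
at_risk = [e["asset"] for e in crypto_inventory if e["quantum_vulnerable"]]
print(at_risk)
```

Even this toy version makes the point above concrete: without a definition of what counts as cryptography and a place to record it, you can't even produce the `at_risk` list, let alone plan a migration.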
-
Jason Soroko
You got it, Tim. I think we're walking into a world, and we've said this before, and if you guys are wondering why we say it, I'll say it again. We're walking into a world where we will have operational systems that are completely insecure.
-
Tim Callan
That's unavoidable. Another topic I think we should cover somewhere along the line is how to prioritize. Because, to be clear, this is a triaging exercise for almost everyone: you're going to decide what gets done first, what gets done later, and in some cases what gets done never, and that is going to happen whether you want it to or not. In general it seems better for that to be a considered, informed decision than an uninformed, happenstance decision, but again, part of what we're impeded by is that we don't even have models for thinking about this stuff today.
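The triage idea above is, at its simplest, a scoring and sorting problem. Here's a toy sketch using a classic likelihood-times-impact score; the findings, weights, and the scoring formula itself are illustrative assumptions, not a method from the talk.

```python
# Toy risk-based triage: score = likelihood x impact, remediate in
# descending order. All items and numbers below are made up.
findings = [
    {"name": "unpatched VPN appliance", "likelihood": 0.9, "impact": 8},
    {"name": "weak password policy",    "likelihood": 0.6, "impact": 5},
    {"name": "legacy internal app",     "likelihood": 0.2, "impact": 3},
]

for f in findings:
    f["score"] = f["likelihood"] * f["impact"]

# Highest score first: a considered, informed ordering rather than
# an uninformed, happenstance one.
queue = sorted(findings, key=lambda f: f["score"], reverse=True)
print([f["name"] for f in queue])
```

The formula here is deliberately naive; the real gap named in the conversation is upstream of it, in having defensible likelihood and impact numbers to feed in at all.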
-
Jason Soroko
No. Exactly right. That is what I wanted to bring up. I think in 2025, Tim, it doesn't have to be every podcast, but we have to keep coming back to the theme of, let's really try to bring some rigor to this. Bring some framework thinking back to this.
-
Tim Callan
I agree. Let's you and I start to define that, or at least take our crack at it. We have a lot of people listening to this. Maybe someone else will take what we did and make it better. Maybe those people will publish it in blogs and articles and speak on it at conferences. Maybe we all as a community can start driving this forward.