Behind the Research: The Risk You're Already Accepting
I didn't plan to admit it on the recording. But somewhere in a conversation about ransomware recovery, I confessed I don't always test my own backup plans. Angela's response was blunt: "If you're not testing, you are accepting the risk."
Part of the Cyber Resilience Maturity Model series
Angela Brandt explains why your organization is probably more fragile than it thinks—and why the gap between preparation and practice is where companies actually break.
Angela missed our first call.
"I was not camera or even conversation ready," she said when we finally connected, adding a quick apology before diving straight into the topic. No drama. Just honest. She'd had a rough week—she was in St. Louis with a colleague's team, flying home Friday morning for a recording session that same afternoon, and somewhere in there, life had gotten in the way.
We kept going because that's how these conversations work. And honestly, the delay didn't hurt anything. What she said in that session was worth the wait.
The Word Nobody's Using Yet
Angela is WWT's Director of Global Security Consulting. She works across healthcare, finance, and critical infrastructure. And in our first prep conversation, she said something I hadn't heard before: the word "restoration" itself is relatively new.
"I don't really think this concept is very common yet in the parlance," she told me. "I think it's something that we at WWT are really at the leading edge of in terms of a terminology."
Think about that. We've been talking about cybersecurity for decades. We have words for prevention, detection, incident response. But the specific capability of rebuilding a technology stack from a known good state, in the right sequence, fast enough to matter—that concept hadn't really found a name yet in most organizations.
That's not a small thing. Problems without names don't get budget. They don't get tested. They don't get plans.
The Question That Lands Differently in January
Angela kept coming back to health insurance as her analogy for risk tolerance. It's a good one, and I told her it hit particularly hard for me this time of year.
"I'm already seeing the results of deductible decisions we made this year," I said during the recording. "Something that costs ten dollars this time of year is four hundred with the higher deductible, and I'm like... yeah, this is gonna hurt for a while."
She laughed. That's exactly the conversation organizations have with cyber risk—except the stakes are company survival, not a pharmacy copay.
What you're spending on security controls is your premium. What you've accepted you might lose when something goes wrong is your deductible. The gap between those two numbers is your risk tolerance. Most organizations haven't sat down and made that calculation explicitly. They've just accumulated tools, checked compliance boxes, and assumed they're covered.
Angela's job—and this episode—is about finding out if they actually are.
The Moment That Stopped the Recording
About halfway through, Angela paused us mid-conversation.
"Can I pause for a quick minute? There's something happening out in the house that shouldn't be."
We waited. It turned out to be some people wandering around outside who seemed a little sketchy. Everything was fine, but it was a funny moment—a cyber resilience expert briefly unsure about the threat level outside her own window, during a recording session about knowing your threat level.
She came back, we picked up, and I couldn't let it go without noting: that's kind of the whole lesson. Something unexpected happens. Do you have a plan, or are you figuring it out in real time?
The Confession I Didn't Expect to Make
Late in the recording, we were talking about testing—the idea that tabletop exercises and penetration tests are only useful if you actually do them, regularly, before anything goes wrong.
And I heard myself say something I didn't plan to say.
"I don't always test my plans. And I know that's a whole thing, and I've accepted that risk, I guess."
Angela didn't let it slide.
"If you're not testing, then you are accepting the risk."
Not as a gotcha. Just as a fact. The clearest possible statement of what accepting risk actually means: it means you've made a choice, consciously or not, about what happens when the plan doesn't work.
I've sat through a lot of these conversations. That one landed differently. Because it wasn't abstract—it was me, admitting I have the same gap I was warning others about.
The House Leak That Finally Made It Click
I brought up a personal story during the recording because Angela's ER/ICU analogy needed a concrete anchor for me.
A few years ago, I had a major water leak in my house. USAA sent a rapid response team immediately—they stopped the leak, pulled baseboards, ran fans and dehumidifiers everywhere the water had spread. That was the ER moment. Stop the bleeding.
Then came the completely separate phase. Carpenters. Replacement materials. A general contracting process that had nothing to do with the emergency team—different people, different skills, different timeline.
That's what WWT does in a cyber event. The forensics teams—Mandiant, Unit 42, CrowdStrike—they're the ER. Stop the bleeding, contain the threat, identify the root cause. WWT comes in after—the ICU in Angela's framing, the general contractor in mine—to rebuild: fresh technology, known-good state, identity services first, everything restored in the right sequence.
Angela put it plainly: "We do not do forensics. We do not do incident response. We intentionally work with partners to be the incident responders."
That's not a limitation. That's a deliberate design choice. And understanding the handoff matters—because your incident response retainer doesn't include rebuilding your infrastructure. Those are two different plans, requiring two different partners, both of which need to be in place before something goes wrong.
What the Video Covers
Angela walks through the full arc: understanding your actual risk posture, business impact analysis, dependency mapping, tabletop exercises, and what a tested restoration plan actually requires. Rob Joyce's quote from the previous episode threads through—"Who decides to shut us down, and will they have my backing at 2:00 AM?"—as the authority question that nobody has answered until they've practiced it.
The research identifies the specific steps to move from Level 1 to Level 2 on the maturity model. Angela explains what Level 3 actually looks like, from the perspective of someone who's helped organizations get there.
Watch the full conversation: World Wide Technology Presents Research — Cyber Resilience Part 2
The Part That Stayed With Me
Angela said something near the end that I keep coming back to.
She was talking about how organizations think about preparation, and she described the minimum question every organization should be asking itself—regardless of size, regardless of regulation:
"If we were hit with ransomware today and unable to recover for two months, would we still be in business?"
Not a complicated assessment. Not a maturity model. Just that question.
If you can't answer it with confidence, you already know what to work on.
Resources:
- Full research paper: Cyber Resilience Maturity Model (available at wwt.com)
- Part 1 of this series: Episode with Madison Horn and Rob Joyce
This is part of the work I do with World Wide Technology's research team. More at explainerds.net.