
No Way Out
Welcome to the No Way Out podcast where we examine the variety of domains and disciplines behind John R. Boyd’s OODA sketch and why, today, more than ever, it is an imperative to understand Boyd’s axiomatic sketch of how organisms, individuals, teams, corporations, and governments comprehend, shape, and adapt in our VUCA world.
Digital Guerrilla Warfare: Deception in the Cyber Age with Tim Pappa
Step into the shadowy world of cyber warfare with No Way Out, where episode co-host Steven McCrone joins cybersecurity expert Tim Pappa to unravel the cutting-edge tactics reshaping digital defense. From guerrilla warfare-inspired deception to behavioral analysis honed at the FBI, this episode dives deep into how modern cybersecurity borrows from historic battlefields—like the Mujahideen’s ambushes against the Soviets—to outwit today’s cyber attackers. Discover why trolling ransomware gangs, exploiting human vulnerabilities, and embracing uncertainty could be the keys to staying ahead in an escalating invisible war. Perfect for tech enthusiasts, security pros, and anyone intrigued by the mind games behind the screens. Tune in to explore the future of cyber conflict!
Tim Pappa, a cybersecurity expert with a unique background from the FBI, shares insights on how understanding human behavior can shape security strategies. Tim emphasizes the importance of crafting defensive tactics that rely on deception and behavioral analysis to outsmart cyber adversaries effectively.
Throughout the conversation, Tim highlights that organizations often miss the chance to implement cybersecurity strategies that leverage psychological insights. By recognizing the behavioral patterns of attackers, organizations can develop more robust defenses instead of simply responding to threats reactively. The discussion also covers the role of artificial intelligence and adaptation in evolving cyber landscapes, showing that effective cybersecurity must be dynamical and contextual rather than static.
Applying Models of Historical Mujahideen Ambushes and Raids to Cyber Deception Practitioner Design
NWO Intro with Boyd
March 25, 2025
Find us on X: @NoWayOutcast
Substack: The Whirl of ReOrientation
Want to develop your organization’s capacity for free and independent action (Organic Success)? Learn more and follow us at:
https://www.aglx.com/
https://www.youtube.com/@AGLXConsulting
https://www.linkedin.com/company/aglx-consulting-llc/
https://www.linkedin.com/in/briandrivera
https://www.linkedin.com/in/markjmcgrath1
https://www.linkedin.com/in/stevemccrone
Stay in the Loop. Don't have time to listen to the podcast? Want to make some snowmobiles? Subscribe to our weekly newsletter to receive deeper insights on current and past episodes.
Steve McCrone: Kia ora, everybody, welcome to the No Way Out podcast. Our guest today is Tim Pappa. Tim is a cybersecurity specialist currently working with Walmart. Tim's background includes some time at the FBI Behavioral Analysis Unit. Tim is a specialist in the use of deception in cybersecurity. I met Tim when we were doing a cybersecurity strategy for a utility here in New Zealand. We wanted to do what we thought was best-in-market cybersecurity, and we found some of Tim's work through that process. In particular, we were very interested in his work using Mujahideen and guerrilla warfare deception tactics and bringing them into the modern age of cybersecurity. So we'll probably interrogate you a lot on that, Tim, but for the first part let's have a more general conversation about cybersecurity.
Steve McCrone: And really what I would like to know from your perspective is: over the last few years, what's changing in the field of cyber?

Tim Pappa: Stephen and Brian, thanks for having me on the show. I come in at an interesting time because I've been in the private sector, in the industry, for about a year and a half, and before that I was in government, in the intelligence community and at the FBI. So maybe I have a different perspective than people who have been in industry for a while. What was interesting, and I think this provides some reflection on your question, is when I was transitioning out of the FBI, I wasn't sure how difficult it would be for someone with my skillset to land in the infosec industry, in cybersecurity, because what I specialized in at the FBI and the Behavioral Analysis Unit was specifically cyber behaviors of offenders, nation-state actors, things like that.
Tim Pappa: But a lot of big companies don't necessarily have positions with a concentrated focus on the behavior of attackers. A lot of people are talking about the psychology of attackers. A lot of people for many years have been talking about humans and human behavior as the weak link or the vulnerability in organizations. But what I'm talking about is crafting a response for your defense, and maybe your offense, based on the behaviors you observe in an attacker: looking for their vulnerabilities, their personality, the way they function online, their technical skills. It's kind of niche; it just wasn't widespread. I had heard that you could go to a company like Google or Mandiant and there's not going to be somebody who just does that, and sure enough, that was the case. I was fortunate that Walmart has a dedicated cyber deception team where they were looking for that kind of skill set.
Tim Pappa: So I think one of the challenges that a lot of people are facing, that we're facing too, is how, and to what degree, do we incorporate AI into what we're doing? I just got a question today about, essentially, how would the way we do cyber deception defend against an agentic AI type of actor? And I know there are a lot of question marks with that, but I would argue we would be able to defend in the same way, because if you think about agentic AI and how an agentic AI actor is created, through an LLM, a large language model, and prompts, and how it's refined, you're still talking about prescribed behavior. And that's what this is all about: can we see it, can we document it, and can we craft something to respond to it that is the most behaviorally responsive?
Tim Pappa: I just haven't seen a lot of that in the industry. There doesn't really seem to be a space for that. I think that'll change in the next 10 years, and I think the same for cyber deception. For a long time, deception tech has had its place in industry and a lot of people were buying that software off the shelf, and I think in general people in industry have mostly acknowledged there are severe limits to taking off-the-shelf software and putting it on your network; you've got to customize it and adapt it. We have the same perspective in industry, and that kind of goes back to my time at the FBI too: you've got to customize your responses to these cyber threats, because obviously you see it in the headlines every day. They're getting better and better, they're victimizing more and more, and I have found the most consistent way to effectively counter that is to have behaviorally based design in how you respond to it.

Steve McCrone: So do you use that behaviorally based approach to identify the bad actor? In my time as an IED specialist, the guys used to say that when people were manufacturing devices by hand, they could identify the bomb maker by the way that they, for instance, soldered the wires. Can you identify bad actors, or is it potentially viable to identify bad actors, through that behavioral analysis?

Tim Pappa: It definitely is. I mean, surely there are limits, because a lot of times offenders, bad actors, are unknown, and we look for those signatures too. Behavioral analysis as a methodology is designed so that, even if you have major limits on what you know about them, you can still apply it. You can still apply those frameworks for analysis to something.
Tim Pappa: So there'd be cases in the past where a field office would come to us and make a request related to some kind of activity, some kind of offender, and they usually would have a goal in mind, some investigative goal: we want an interviewing strategy, we want an engagement strategy, what's the best way we should approach them online. But that all kind of starts with the behavioral analysis, and in some cases there'd be a lot of information for us to look at, depending on the nature of the case. Sometimes it'd be nothing; it'd be whatever you can find on social media, good luck. And even that's a representation too, and that's okay. What I think sets behavioral analysis apart as a very effective methodology is this: think about the example of an intelligence analyst. They usually have key intelligence questions, and those questions might be something like, we're going to look at someone and ask how loyal are they to this government? How nationalistic are they? Well, we wouldn't approach things that way. Our approach would be: on this particular person, what behavioral artifacts do we have? It's literally everything we can possibly know about them. We want that. But then we want some kind of timeline. So you have all that information in the aggregate and you can look at different times in their life. Why'd they make that decision back then? What else was going on in their life? Does that reflect the nature of their relationships at the time? There are always going to be gaps.
Tim Pappa: But when you take a lot of the literature that's out there in academia, that's been empirically tested or maybe is just qualitative, you have a structure, you have rigor, you can add to it. That's why I often found it was so effective, instead of investigators and operators just trusting their gut, which isn't always wrong: they've got experience, they've been trained, they have skills. But what was eye-opening to me as an agent going to BAU was this: I used to always do interviews in a particular kind of way, I thought I could develop relationships in a particular kind of way, and I was good at it. I knew how to talk to people. There were approaches or techniques I would use, and it's not that those weren't grounded, but behavioral analysis was different.
Tim Pappa: The way we practiced it at BAU was a much more methodical approach. It's like: let's get together everything we might know about them, let's look at it over a period of time. Can we establish any kind of baseline for when they made those decisions and why? And then can we start dropping them into brackets? That's where you start to see that people's lives are so dynamic. We want to consider all of that.
Tim Pappa: That's really a hard part for cyber threat actors, and definitely for industry too, because you can look at the research literature on how people make decisions, and it's very naturalistic, it's very dynamic, and there's a lot we don't know. And certainly that reflects that same space as the OODA loop, where it's like: I'm trying to make a decision, that threat actor is trying to make a decision also, maybe in the same environment, but there's so much we don't know. And can we do anything to influence their decision making? Yeah, that's difficult to do at scale in industry when you have a massive enterprise and there are a lot of moving parts.
Tim Pappa: I think one thing I'd say that might be a little counterintuitive, that we've been trying to do and that definitely reflects our approach when I was at the FBI, is this whole idea of scalability. I often get the question: is this even scalable? And I'm not saying scalability is some kind of myth; it's an important question. But what we've found is that what is actually most effective or efficient in the organization is to look at demonstrated behavior and activity, the documented activity, and then craft a response to that with all these tripwires. That's much more passive, and unlike the kinds of alerts you can't always depend on, that's how we respond effectively and efficiently to malicious activity. It'd be the same thing at BAU, where we might be talking about one person, one threat actor, and sure, maybe they're part of a larger criminal enterprise, maybe they're part of a government, but for that particular individual we're looking for vulnerabilities in their behavior that we can exploit behaviorally, and it might be technical or it might just be how you influence them during an engagement.
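The pattern Tim describes, crafting tripwires around demonstrated attacker behavior rather than relying on generic alerting, can be sketched roughly in code. This is a hypothetical illustration, not any real team's tooling: the decoy paths, tactic labels, and session format are all invented for the example.

```python
# Hypothetical sketch of a behavior-based tripwire: plant decoy assets,
# then treat any touch on a decoy as demonstrated (not inferred)
# malicious interest, and summarize what the attacker went after.
# All names and categories here are invented for illustration.

DECOYS = {
    "/srv/finance/q3_backup.db": "data-theft",
    "/home/admin/.ssh/id_rsa_old": "credential-harvest",
    "/opt/scada/plc_config.bak": "ot-recon",
}

def analyze_session(events):
    """Return a behavioral summary for one observed session.

    `events` is a list of file paths the session touched. Legitimate
    users have no reason to touch a decoy, so any hit is high-signal.
    """
    hits = [(path, DECOYS[path]) for path in events if path in DECOYS]
    return {
        "tripped": bool(hits),
        "decoys_touched": [path for path, _ in hits],
        "inferred_interests": sorted({tactic for _, tactic in hits}),
    }
```

A defender could then match the response to `inferred_interests`: a session flagged for credential harvesting might be steered toward further decoy credentials rather than simply blocked, which is the "craft a response to demonstrated behavior" idea in miniature.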
Steve McCrone:So you're looking for an asymmetry between the attack and the attacker?
Tim Pappa: Sure. A lot of people ask: besides the cost of trying to do cyber deception to defend an organization, what's the value in it? And you can measure that in a lot of different ways. You can ask: what's the notification or alert time span? How does this support incident response efforts? Whatever. I guess my big answer to that big question is that it's about managing your environment. So, yeah, you can have an agentic AI facilitate steps in a cyber attack chain, but it's probably going to follow relatively predictable pathways, or it's going to be looking for particular kinds of things that we know it's looking for. So cyber deception is really about managing that environment: influencing the direction of that agentic AI, influencing the direction of that actual human actor, and influencing what they end up doing.
Tim Pappa: That's a part that might be hard for people to traditionally measure, in terms of how many threat actors did we mitigate. But that's not our goal. That's where I wonder if a lot of academic research on cyber deception keeps getting stuck in a strange funnel of applying cyber deception as if it's enterprise-wide cyber defense, and it's not. I think cyber deception plays just one small part. So for us, we want to go to different teams and look for how we can augment what they're doing, how we can play a small role in a function they have, and use it to deter attackers. So that's often the challenge: how do we take a lot of this research literature and theory and then apply it in practical, real-world application?
Steve McCrone:So just to sort of roll back and maybe for some people who are less informed about this whole idea, could you explain. When you use the word deception, just explain what you mean.
Tim Pappa: Yeah, okay. With deception, you're trying to create erroneous decision-making in someone else. Maybe another way to think about it is that you're distorting their decision-making process, you're distorting their analysis. A lot of people want to craft deception to reaffirm things that most people would have expected. I think they call that Magruder's principle: you want it as close as possible to what they expect in reality. It's really why, if you're familiar with Soviet cognitive models like reflexive control, that's exactly what they were doing: they modeled how organizations and people get information, how they analyze the information, how they make decisions, and they just injected right into it. That's why it's so hard to catch and why it's so effective. So, in deception, we're trying to do a lot of those same things. That's really what it comes down to, though.
Tim Pappa: Can I distort the processing, thinking, and decision making of a threat actor in a cyber deception space? That's what I'm looking to do. Sure, there are parts of that that come down to influence communication, but really, at the end of the day, that's what we want to manage. Because even with reflexive control, it's not necessarily about controlling behavior; it's about managing behavior. That's a slight nuance to it, but there are parallels there. There are close parallels.
Brian "Ponch" Rivera: Yeah, to me it's all about getting in the mind-time space of the opponent, understanding what drives them. And we talk about taking actions that generate information, which could be false or deceptive information, that gets back inside their mind-time space, and they make a decision based off of that. So that's the whole idea of getting inside the OODA loop, or getting inside the mind of your opponent. We're seeing a lot of talk about guerrilla warfare now. Online, we have a lot of information flow, misinformation, disinformation. How does that factor into understanding the behaviors of these bad actors, and sometimes good actors too? What's changing the landscape so fast for you when it comes to information flow?

Tim Pappa: I think my starting point is how I've looked at a lot of campaigns in the past couple of years that have been highly publicized by different platforms like Meta and X and other organizations, where they've looked at what the Russians are doing in terms of manipulating social media accounts and platforms to do large-scale disinformation. My read on a lot of those reports is that you see increasing sophistication in how they build and grow their infrastructure in terms of accounts; the content creation has obviously accelerated, and so has the delivery. The thing I wonder about, though, is how many people are actually affected by this. Certainly people are being impacted.
Tim Pappa: We just don't always know to what degree, and it's really hard to measure. What I've often seen in these broad campaigns is that they're not necessarily targeted at individual people, and again and again, my experience at the FBI and the Behavioral Analysis Unit was that if you don't have a very customized, concentrated campaign on one person or just a handful of people, it's probably not going to be effective. When I'm looking at how you craft something to influence even one person, I'm very interested in their close relationships. I want to know the two or three people closest to them.
Tim Pappa: I want to know the people they love, the people they hate, because if you think about a lot of our major decisions, they're usually reflected or influenced by other people; thinking about other people and how a choice might impact them is usually part of a consequential decision. I don't really see that in these broad campaigns. In terms of weaponizing that for our own purposes, I really don't see corporations and organizations doing that. I think it's a huge gap. If you look at some of the research going back to 2018-2020 on cyber deception, there have been some great studies by Kimberly Ferguson-Walter. One of her studies, although it was artificial and used red teamers as the actors, was really interesting because she had so many of them, the virtualized environment they were attacking was so massive, and she wired them up to study their physiology. They told the red teamers, before they tried to penetrate or compromise a part of the network, that there is deception on this network, or there might be deception on this network, and they wanted to see if there was a rise in their heart rate. And there was. I don't know if that surprised a lot of people, but their takeaway from that, they said, was very counterintuitive: you actually shouldn't hide deception. You should tell people you have deception, even if you don't. A lot of organizations were hiding the fact that they use deception on their network, but they're saying go ahead and shout it from the rooftops even if you don't have it, because this will cause attackers to pause. It's really just more conceptual at this point. But if ransomware gangs and nation-state actors, whoever, are so keen on damaging or disrupting the reputations of organizations, because they know organizations fear that, and there can obviously be real loss for corporations because of reputational damage, what if their reputation was targeted?
Their reputations are so important. If you take a ransomware gang actor, they kind of live and die based off of how many affiliates trust them and want to work with them and how much those affiliates believe in their capabilities. If you undermine that reputation, I think you're going to have fallout pretty quickly, and people aren't really trying this. Theoretically, what would happen if a pretty well-known threat actor attacked your corporation and it failed, and then that corporation said: it didn't work, basically we spanked them, they tried this, this is how poor their tradecraft is, whether that was true or not? I mean, people can manipulate artifacts, but that would be pretty embarrassing to a threat actor.
Tim Pappa: Most organizations fear that this will incite more attacks. I don't know if that's true. I think that's folklore for a lot of people. I think it's a feelings-based fear: they don't want any attention, they don't want to incite attacks. I would argue it is a very strong message. It's very strong signaling that we're confident enough in our network defense that not only did we stop you, but we could do it again. And anyone who's not as good as you definitely is not going to be successful. I think that would deter a lot of attackers.
Steve McCrone: So when we talk about, well, you talk about adaptive strategy, but the headline for the piece of work I've just finished is adaptive cybersecurity, and it really goes to the heart of the idea that humans in particular are hardwired to avoid uncertainty. That's why you see the heightened emotional state when you tell them there's deception on this network: their perception of uncertainty lifts, as does their psychological and physiological response. But I guess what I'm hearing, and certainly the view that we take, is that the discomfort you feel when uncertain should be something you seek, because it's then that you're actually in a state where you can engage with and better understand the system. So comfort is not a desirable outcome, particularly in cybersecurity.
Steve McCrone: If you're comfortable with your cybersecurity, you're probably in a dangerous place. That's the sort of rule that we use.
Tim Pappa:Yeah, I would agree with that.
Tim Pappa: Also, although it's anecdotal, there have been plenty of cases I worked in the past at the FBI where actors didn't act the way I thought they would. It's not an unwritten rule, but maybe there's an impression that if you have ongoing communication with a threat actor, you don't want that communication to end: don't anger them, maintain the communication. I don't think that's true. I've seen too many cases where other agents have angered them, have done something very offensive to them, and it didn't disrupt the communication at all. And I think that's because they had other motivations that were much more important. They wanted money, they wanted to find out what the FBI knew, they were curious, and that could be very damaging to them, someone's curiosity to find out what's going on here. Those situations, I know they seem abstract to some people, but I think we can look to our own lives and think about moments where you've been in distress, you've been highly emotional, you've been upset about something. When you create those moments, you create an overload: a cognitive and affective overload. People just miss things; they don't scrutinize things as much. That's actually what we would want. We want someone in a highly emotionally aroused state so they're not paying attention to certain things. They're more vulnerable, and this is an opportunity to exploit people behaviorally.
Tim Pappa: I've also seen, in the past year, a few cases where you look at a ransomware enterprise like LockBit, which has had a lot of trouble reputationally because the FBI, DOJ, and then the NCA in the UK have made it a point to troll them, troll some of those people. That's definitely different, and it's the kind of thing we had proposed for a long time and people didn't really bite on, because you can imagine, in a law enforcement organization where you've got a certain order to things, you've got a criminal case, you've got actual political sensitivities out there internationally as well, and you're suggesting: you should troll this guy, you should mock him. It doesn't seem like a connection for people. Like, what's that going to do? We can point to the research and say this will degrade this guy's decision making, because once you do that, he's going to want to respond. He's very reactive to what people say about his reputation. He's very reactive to what people say about affiliates trusting him or not.
Tim Pappa: And once you create, or he's created, that reputational stage, everybody's watching, and now you've got him in a very vulnerable spot. As an idea, look at the Department of Justice and Rewards for Justice, where they're saying, basically, we'll offer reward money if you have any information on, say, these Russian cyber actors. Of course these cyber actors are paying attention; of course they're watching that. They want to see what people say about them. So if someone's in a chat on a forum and they put up that poster, and some anonymized individual says "$10 million? More like $10,000," that kind of crack at someone is really hurtful. They care about that because they don't want to be perceived that way, because it does impact them.
Tim Pappa: I just haven't really seen governments, and definitely not corporations, willing to try this. But in the few examples you've seen, like LockBit, it's actually effective. And in the research into human behavior in general, it's very effective. Again, people can look at their own lives and see how they respond in these kinds of situations.
Brian "Ponch" Rivera: It kind of reminds me of trash talking in the NBA, or in basketball, or on the pitch. We do that to get in the mind-time space of our opponent, and that's natural behavior that, I hate to say, any kid around here would do; you see it often. So this exact idea of taking a practice that you see somewhere else and bringing it over to this type of defense makes sense to me. When we used to troll for MiGs in Vietnam, we'd set up traps for them. We were taunting them, encouraging them to do something: getting inside their OODA loop, having them act, and basically ambushing them. Which is, I think, part of the connection to your view of looking into guerrilla warfare as an inspiration behind all this. So this all makes sense to somebody like me.
Brian "Ponch" Rivera: I just didn't know this wasn't happening already in this contested space. Question for you: are cyber crimes underreported?
Tim Pappa: I mean, I don't know. Without knowing fully, I don't think they necessarily are. There are definitely increased reporting requirements now with the SEC in terms of whether there's been a compromise. But in general, I think another way to phrase it is that they're under-publicized. That's probably what it is, because most companies are going to want to resolve it very quickly; they're going to hire vendors, they're probably going to call the FBI, or the FBI already knows about it. So I think it's just a matter of publicity.
Tim Pappa: What has surprised me is that I haven't seen as much promotion or projection or highlighting of attacks like ransomware gangs attacking hospitals, because you hear stories about that. Does it result in anyone losing their life? I think there have been some situations where that may have happened, or they knock out a network so people can't get surgeries, or there are issues in the ICU, something like that. But that's actually something that could be projected much more loudly by those victims, because, again, that actually does matter.
Tim Pappa: It does matter. In some circles there might be limits that are kind of eroding with ransomware gangs; among many of them, they don't care about that anymore. They will attack hospitals because they know hospitals have money and they know the hospitals have to pay. I just don't see enough of an effort to give it back to them. I don't know if I would correlate all this, but around the time I was leaving the FBI and the intelligence community, there were a number of denial and deception shops across the government and DOD that were shutting down, and these shops were more for training and education. But anyone who studies irregular warfare and deception in the military knows it's the same old story: there are cycles. It's not until there's a conflict or a war that it spins up again and there's a lot of attention on it, and then it spins down.
Steve McCrone: Let's explore that. We'll put a link in the recording to your article on Mujahideen and Russian deception tactics and how that is transposed into cyber. But could you, Tim, just for the listeners who don't want to geek out on that stuff, give us an example of the interplay between the Russians and the Mujahideen in terms of deception, and then maybe we can explore how it translates to a cyber idea.
Tim Pappa: Yeah, of course, thanks. Our exploration into Mujahideen ambush and raid tactics against the Soviets kind of started as a proof of concept. It was really just to demonstrate, for our own team and other people, this concept of how teams can be creative, how they can come up with campaigns, and, in terms of design, this idea of near and far analogies. That sometimes helps people, whether they're technical or not or have different backgrounds, where they're like, hey, it's kind of like this, and that's really helpful.
Tim Pappa: I was in a used bookstore, just perusing the shelves, and I thought, what's this? It was a book from 10, 20 years ago, and it was interesting because it was by a former Marine and an Afghan soldier who went back to Afghanistan and said, this is a little bit different: we want to collect vignettes from Mujahideen commanders and fighters and try to organize them into different categories, so there's more than just ambushes and raids. And I thought, there are actually so many examples of this. I'm sure you could go to any conflict and find something, because certainly Barton Whaley is a deception scholar who did that.
Tim Pappa: He studied as many famous battles and wars as he could to try to see, empirically, whether there's a difference when someone uses deception or not. And overwhelmingly he found that people who used deception typically won the battle, and even if they didn't win the battle, they typically had lower casualty counts. This is a significant finding. That was in his book Stratagem, which came out many years ago.
Tim Pappa:But it's the same kind of approach: can we take any kind of analogy, this one in particular, guerrilla warfare, and apply it to the way we design cyber deception? I think most people might think, oh, as a large corporation, we should take the view of a massive power like the Soviet Union. But we thought, no, the Mujahideen actually make more sense for us, because this network is ours. We can't be everywhere at the same time, but we know it, we know the land, right? Same thing with the Mujahideen. Not only were some of them not from Afghanistan, but even for the ones who were, it's a big country, and they couldn't be everywhere. The Soviets had too many people there, and of course the Afghan government was under Soviet control, so they had to look for ways to adapt to that kind of situation. We thought this is similar to our network landscape: we just can't be everywhere at the same time. Our attack surface is too big, it's global. So the idea was, can we look at any of these ambushes and raids and find parallels with Barton Whaley and those deception frameworks of simulation and dissimulation? And sure enough we could. It wasn't that difficult at all. I think you can look at any of these situations and find those parallels.
Tim Pappa:What was also illuminating with Barton Whaley, and I don't see this in a lot of his writing, but he's got a number of unpublished works in his special collection up at National Intelligence University, and I've been looking through the archives at some of them. There is one work in particular where he writes a lot more about his deception framework of simulation and dissimulation, and he said a lot of people make the mistake of thinking it's just one or the other. He said it's a simultaneous thing. You can't have simulation without dissimulation and vice versa. So when you're planning and designing something, you've got to keep that in mind. The example he wrote about was trying to hide the fact that he's wealthy. He doesn't want people to know he's wealthy, so he lives in a poor-looking house. He's dissimulating his wealth, but he's simulating that he's poor. And if he buries a bag of gold coins in his backyard so people don't know about it, he's again dissimulating that he has the gold coins and simulating that he's not wealthy. The example, I think, is useful, because when we look at this we're like, how can we do that as well?
Tim Pappa:One of the ambushes in that book's vignettes was actually focused on the Kabul River. I haven't been to Afghanistan, so I'm pulling this from those vignettes.
Tim Pappa:But they talked about how, in certain spots, the Kabul River looks crossable, it looks shallow, but in certain parts that was not the case: people tended to get swept away, and especially large military vehicles would get stuck in the mud and the undercurrent. And in one of these ambushes, the Mujahideen drew the Soviets to the Kabul River. The way they did that was to ambush them with limited people and let the Soviets see them retreating back to the river. This was just meant to draw them in, and there were other Mujahideen hiding down by the river with light camouflage. They wanted the Soviets to see Mujahideen crossing the river on foot, but obviously they knew where to cross it safely. So the Soviets continued to pursue them and tried to cross that river with tanks. The tanks got stuck, the other Mujahideen opened fire and killed a lot of Soviet soldiers.
Brian "Ponch" Rivera:But this was by design, though, right? It wasn't by accident, it was by design.
Tim Pappa:So another example, and I think this is included in the paper: there was a certain hotel down a major highway that a lot of Soviet convoys traveled, and there were constantly ambushes there. The Mujahideen would constantly ambush Soviet convoys near it, in it, around it. And obviously they had an issue too, where people spying for the Soviets would tell them, hey, the Mujahideen are setting up at this particular hotel again. The thing we pull from that is that, because this was so frequent, the Mujahideen were establishing patterns, and the Soviets were too. And you can manipulate that, right? This is the opportunity to apply deception. You can simulate that you're setting up for an ambush but then leave the site empty. Now you've created another situation: what do the Soviets make of that? Are they still going to slow their convoy? Yeah, they might. Are they still going to add more people to that convoy and pull forces from another area, leaving it more vulnerable? Yeah, they might. This is where it becomes very dynamic and naturalistic. There were situations where the Mujahideen would set up ambush sites and leave their weapons there, and part of it is they wanted to be seen so people would report on it, but then they wouldn't occupy the site until a later stage, because they knew there might be Soviet helicopters flying over looking for them, and the Soviets would conclude they're not actually there. That can have a lot of cascading effects, right? The people they rely on as informants, how reliable their intelligence is, all those kinds of things. But we would take that and say, what does that look like in a cyber deception space?
Tim Pappa:Well, that could be a firewall on a particular part of the network that we only partially configure. An attacker doing reconnaissance might get to a particular part of the network on the perimeter where they have some level of access, but they're going to be probing that firewall, because they want to see what kind of rules are on it, how it's configured. So what if we configure it to let a lot of traffic through, and they try to go through? To them, this is similar to an ambush site. They would probably presume, of course the defenders are going to configure this firewall to keep people like us out and keep certain activities out. So if we let them in, can we plausibly signal why that could be the case? It might be that it's on a part of the network where there's a lot of DevOps work on software and people don't use it as much. It could be a credential we've left somewhere that they use to gain access to that part of the network. It could be a lot of reasons. But the whole idea here is that everyone's got to get through that firewall. So what happens if we let them through a few times, but later on we reconfigure it to stop them or ensnare them? That would be an example of how, theoretically, we might apply it.
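The partially configured firewall Tim describes can be sketched in a few lines of Python. This is a hypothetical illustration of the idea, not any real firewall product or API: the class name, port list, and threshold are all invented for the example. During a "lure" phase the policy deliberately allows traffic a hardened firewall would block, while counting probes per source; once a source has committed enough, the policy flips and routes that kind of traffic to a decoy instead.

```python
# Hypothetical sketch of a "partially configured firewall" lure.
# During the lure phase we allow suspicious traffic through on purpose,
# track who is probing us, then flip to an ensnare phase that redirects
# those probes to a decoy. Names and thresholds are illustrative only.

from dataclasses import dataclass, field

@dataclass
class DeceptiveFirewall:
    lure_threshold: int = 3                      # probes tolerated before flipping
    phase: str = "lure"
    probes: dict = field(default_factory=dict)   # source IP -> probe count

    def evaluate(self, src_ip: str, dst_port: int) -> str:
        """Return the action for a packet: 'allow' or 'decoy'."""
        suspicious = dst_port in {22, 3389}      # ports a scanner tries first
        if suspicious:
            self.probes[src_ip] = self.probes.get(src_ip, 0) + 1
            # Once a source has probed enough, stop luring: "reconfigure".
            if self.probes[src_ip] >= self.lure_threshold:
                self.phase = "ensnare"
        if self.phase == "lure":
            return "allow"                       # simulate a sloppy firewall
        # Ensnare phase: suspicious traffic now lands on a honeypot.
        return "decoy" if suspicious else "allow"
```

Used this way, the first couple of probes from an attacker are allowed, simulating a half-configured perimeter, and only after they have invested in the access does the rule set change underneath them, mirroring the ambush site that is left visibly empty until it isn't.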
Tim Pappa:But now, can we build an actual cyber deception function out of that?
Brian "Ponch" Rivera:Let me ask you this: if there's a network of bad actors out there and they see this and learn from it, they can adjust too, right? They're fighting the same war, with the same type of mindset. So what if you're working with other corporations on this, or other vendors? Does this network approach work? I mean the network approach to sharing information, such as putting a firewall on a DevOps space. Would that be something you'd look into, or have you done it?
Tim Pappa:Partnership or collaboration with other corporations? So far, in my experience, I don't think it's very practical. Too many corporations don't want to share information about that kind of stuff, like the design of their network. And then you've got different sectors, right? You've got the retail sector, but a couple of companies like Microsoft and Amazon are very different from Walmart, because they're vendors.
Brian "Ponch" Rivera:Yeah.
Tim Pappa:So there's going to be a different dynamic there.
Brian "Ponch" Rivera:So how do you share information and grow this type of thought leadership within corporations? That's what I'm kind of curious about. How does this work, I mean?
Tim Pappa:How do you take your knowledge and share it with other folks? I think a lot of it's internal, which is obviously a problem, right? I think the opportunities are actually still vast in terms of how the national security enterprise and government agencies interact with the private sector. An example, in theory: the government in the United States relies heavily on Amazon and Microsoft for their services, right? That's still a client relationship, and certainly they have a number of teams involved in countering threats, so you read about things like Microsoft in Ukraine playing a major role in rooting out bad actors. In those situations that's valuable. But what about companies like mine that don't have that kind of client relationship with the government? Do we actually have more freedom to be creative and build different things? I would argue we do, because we're not worried about delivery of services to the government and we don't have to fit what that client wants. So potentially it creates flexibility and opportunities.
Tim Pappa:I think, in reality, with corporations sharing with each other, there's a lot of concern about proprietary stuff. So some of that has to happen publicly, non-affiliated, right? I've seen some things recently, in the past few months, from Amazon and Microsoft about some of their deception work, which was new. They can do that. It's harder for other companies, and again, a lot of companies don't actually have dedicated cyber deception groups, as much as everybody needs them. So if you look at something like the Mujahideen example, sure, we can present on that, we can publish on that, people might take some lessons from it. But I actually think the real value at this point would just be in corporations talking about it publicly. That's it. Because that can be quite a deterrent.
Tim Pappa:My takeaway with deterrence theory is a key point I read about a couple of years ago, where they were saying you can't look at deterrence theory as if it's just meant to prevent activity or stop it. A lot of times it doesn't. What it does do is change the nature of that interaction, right? And I think that's what we're trying to do. Would it give a lot of attackers pause if a bunch of retail companies like Target and Walmart suddenly said, oh yeah, we've overhauled our whole cyber deception approach and incorporated it in different ways? Or what would happen if they came out and said, actually, for the past five years we've been applying cyber deception in a customized fashion across the network, whether that's true or not?
Tim Pappa:Some of the time we get away with that right.
Steve McCrone:Yeah, I think we discussed, Tim, the analogy of looking at cybersecurity and cyber attack as a business, and in that business you have entrepreneurs, innovators, government agencies, small organizations, each of which has its own unique agenda and goals and aspirations.
Steve McCrone:Typically it goes ego, information, money, in terms of the gain. And one of the things that we talked about, or try to do with our clients in particular, is to say, well, if it is a business, then, to your point about keeping stuff out, we want to increase the energy required to get into our system, whether that's through deception or through hard barriers, and then we want to decrease the expectation of return if people do get in. And that could also be the expectation that you might actually be made a fool of, you might be mocked.
Steve McCrone:You know, you're running that risk. So you increase the risk across a broader set of parameters. And I'll just go back to the theme I see emerging in this conversation, which is that if we treat cybersecurity as a very serious topic, then we actually rob ourselves of the ability to be playful, experimental and creative, and that's what drives our ability to create deception, or create a mismatch, an asymmetry, between the attacker's resources and our own.
Tim Pappa:Yeah, I agree with you, there's all kinds of space for that. I mean, look at just why people share certain content online. A lot of it is when it's funny, right? It's funny, it's emotional, and usually, in in-group/out-group types of studies, that's always the contrast. If you see content online that's funny about something, like making fun of an ISIS terrorist or whatever, we might want to share that, and our in-group would be like, yeah, they're being ridiculed, this is funny. Whereas the out-group, the ISIS terrorists, they'll be angry about that.
Tim Pappa:And they will also share that widely.
Tim Pappa:That's right, yeah. That fits within that framework of why people share what they do online. It's got to be emotional, and usually it's either angering or it's funny, depending on what group you're in. If you hit both groups, that's what often makes things go viral on a lot of platforms. In past examples we've tried to do that, and sometimes it's worked and sometimes it hasn't. Things can get complicated when it hasn't worked, trying to create content that will drive an adversary to want to share it to their detriment, right? They share it within their group, and it's probably influencing people's perceptions of people in the group, whether it's true or not.
Steve McCrone:I think maybe stop taking cybersecurity so seriously. It's not such a good sales pitch. I don't know.
Tim Pappa:Yeah, that might work for some people. There's that, but for a lot of companies it's also, what's your method and your rigor for how you craft something to respond to this? I'm fortunate that I'm at a large corporation that has a lot of resources to be able to do that. One thing I'm looking at right now, for research I'm developing, is doing a survey among nonprofits and NGOs about their attitudes toward hiring former cyber criminal felons. And why that's potentially interesting is that in the US, you come here and go to jail, you're extradited here and go to jail, and when you get out, you're still a felon. You usually don't have immigration status. The federal government's not going to hire you. Nobody wants to hire you, even though, yes, you're massively talented. That's what got you in jail in the first place, because of who you were victimizing.
Tim Pappa:So, beyond reputational concerns about things like that, which are valid, local and state governments have started to relax their approach on some of that, and in a lot of US state and local governments it's part of the law now that you can't have a question on applications for employment about whether someone is a felon or not. You can ask in an interview, but the whole point is this opens the door. I don't know what the response will be.
Tim Pappa:My theory, my hypothesis, is that a lot of nonprofits and NGOs, which have hardly any staff and hardly any money for network defense, would actually have a positive attitude toward the idea of consulting a former cyber criminal, like a Russian cyber criminal who's living in the US now and has done their time in prison, to ask, what can we do to make this stronger? If that's a positive response, I think it could potentially influence policy in the US. I think it could influence larger corporations that are completely unwilling to take a chance on them, even though they want to hire them, but they can't, because they've got people who own stock and people on the board, and they're concerned about their reputation.
Tim Pappa:But I bring that up to say there's got to be some kind of solution for the small guy out there, and I think part of that is not only talent like that, it's influence. Can you imagine a situation for cyber deception where the kind of threat actors who would normally look at a very small local government and want to attack them, or attack critical infrastructure like a water utility, knew that a very notorious Russian cyber criminal was taking a look at that network and trying to make it better? That might give them pause. I know it's a provocative idea. I know a lot of people are like, I don't think that's a good idea. But I think it's something to explore, because otherwise, as things are right now, a lot of people continue to get victimized.
Brian "Ponch" Rivera:You've got a good point there. To me that's like a red team technique. You don't want a red teamer who doesn't understand the system challenging you, right? So, thinking outside the box here, you go find somebody who may be motivated not by money but by the challenge, the puzzle. And, by the way, is there a profile for a cyber criminal, an archetype you can give us, or anything like that? Is there a type of person, characteristics you know make up a good cyber criminal?
Tim Pappa:I'm going to give you a different kind of answer. Okay, when I see a lot of general profiles of a cyber criminal, it gives me pause, because it's often something like young men who were loners growing up.
Brian "Ponch" Rivera:Living in the basement eating Doritos Mountain Dew.
Tim Pappa:They love to tinker with things and figure out how they work. Yes, there's truth to some of that, but I feel like it's a very tired cliché. I think the approach is to just wipe whatever mental schema you have of a cyber criminal, because that whole idea has become much more diffuse. I did a case study recently on a hacker based in Switzerland who was a trans anarchist hacker-leaker, and her whole thing was, I want to find organizations that are not protecting data that's important to the public, and I want to expose that. And, sure enough, she got indicted, and she is stuck there.
Tim Pappa:But you have someone who has all these labels, all these identities. They could be called an activist, they could be called a cyber criminal, they could be called all kinds of things, and they have all these motivations. So how could I possibly be like, oh, here we go, it's the guy in the basement? You can't. It's very different now.
Tim Pappa:And I think that also troubles the way we estimate threat. So, going back to the way we'd approach things in the BAU, and I do it now too: you really have to try and start with a clean slate, because that'll help you manage the things you probably have to anticipate these days. You might be looking at something that's generated by AI, right? You might be looking at a false flag. You might be looking at a nation-state actor that's trying to act like a hacktivist. If you can put aside those presumptions about what you're probably looking at, based just on the activity you're seeing or what they claim are the reasons they're doing this, I think it actually helps clarify some of your analysis. So for that reason, I usually don't like to point to a prototype of a cyber criminal, because they're too different. It's too variable.
Brian "Ponch" Rivera:I like that contextual answer, that it depends, right. And that's just me being naive and talking about the kids I knew growing up that are probably hackers now.
Tim Pappa:Well, there are people like that, there are, but I think there's been a very, very slow recognition of what the reality of the cyber crime ecosystem is. The stuff that keeps coming out from some really great longitudinal studies of dark web markets and forums is that it's kind of like professional sports. There's a very small percentage of people in a population that are actually good enough to become professional athletes. These are the malware developers. These are the people running a ransomware gang. Literally everyone else is doing odd jobs and small functions, and they're not being paid much for it at all, and that's the reality of cyber crime. An organization might be talking to customer service for a ransomware group, but it might be someone whose English is good and who, other than that, isn't even that technically capable. So that's why, again, I think you've got to put aside what we think they might be and start with what we're actually observing and experiencing, based on their behavior.
Steve McCrone:So the analytical approach of you know, predict the bad actor and then buy something off the shelf to guard against them is not really a reality in the face of the complexity of the system we're in.
Tim Pappa:Yeah, I think, at a minimum, if you do that, you've got to repurpose it to something that really fits your organization. Deception tech vendors will tell you nowadays, oh, you can customize it, we can adapt it to whatever you want. Okay, that's true, that's fine, but that doesn't work for a lot of people. They can't afford it, or they're not sure how it's going to function correctly in their unique network enterprise.
Steve McCrone:So it comes down to this: you've really got to customize it if you want it to be effective. Before we wrap up: most people listening to this probably don't work for a large retailer or government agency that's massively well-resourced and can afford Tim Pappa and the likes to come in. What advice would you give to the average person and the average organization in respect of the things we've talked about today? I mean, how can they involve themselves in deception, or in a more complexity-friendly approach to cybersecurity?
Tim Pappa:If we're talking about deception, I think the most cost-effective thing they can do is look for ways to communicate, whatever works for them, that they have deception on their network, that they use these kinds of functions. Whatever it takes to slow down an attacker and give them pause, they should do it. It's going to be very difficult for anyone to verify that, right? And then, the advice I've given many nonprofits and small NGOs before, the things you hear in October for Cybersecurity Awareness Month: a lot of it still comes down to this. You've just got to talk to other people in your organization and try to influence people's security practices.
Tim Pappa:Obviously, people would be stunned if they knew how few employees, even in InfoSec, use multi-factor authentication on everything they do. A lot of them don't. If an organization did that, they'd be in the 90th percentile in terms of deterring and stopping attacks. But a lot of that change happens by people modeling that behavior, people they respect, people they like. Studies keep finding that that truly makes a difference in organizations. I've seen things too in spear phishing, where people have been writing lately about whether spear phishing tests are as effective as people thought in terms of educating organizations so they don't fall for it. That might be true; I haven't been involved in those studies, but I've read about some of it. What I have seen be effective, and I know it's a practice for some in industry, is that if you get a spear phishing email from your own organization, they want you to tell people on your team.
Steve McCrone:Do they explain spear phishing to the uninitiated?
Tim Pappa:No, they just want you to not only report it, but share it with people on your team.
Steve McCrone:Tell them about it, like, hey, I got this email.
Tim Pappa:This is what was involved in it. Where that's happened, they've seen rates of detection increase dramatically. That's not cheating; that's how it should be, because people start to normalize talking openly about that stuff and sharing the cues they find about what might be suspicious or not, and it models shared behavior. Because now, if you find out someone you like and respect has been using MFA and you're not, you might be more likely to do it as well. So I think that would be the second piece: small organizations should talk openly and share openly about these practices, and then model that behavior, because there are simple things they can do to really make a difference.
Steve McCrone:Cool, that might be about where we leave it. Tim. Thanks very much for your time. That was an amazingly interesting conversation. Thank you, guys. Thank you for setting it up, appreciate it.
Tim Pappa:No, thank you for your time. I appreciate the discussion.