No Way Out

Drones, Airspace, AI, and Decision Making: DroneUp CEO on Leading in Complexity

Mark McGrath and Brian "Ponch" Rivera Season 2 Episode 117

Send us a text

When Tom Walker walked into a Best Buy and saw drone technology more advanced than the space shuttle selling for under $1,000, he knew aerial autonomous systems would fundamentally change society. What he couldn't have predicted was reconnecting decades later with an OCS classmate from Class 13-96 to discuss how military experience shapes innovation in the rapidly evolving drone industry.

The stakes couldn't be higher. With over a million drones now flying in US airspace—outnumbering manned aircraft four to one—and that number projected to double by 2027, we face an unprecedented airspace integration challenge without adequate systems to protect general aviation. Recent near-misses, go-arounds, and actual collisions with manned aircraft highlight the urgency of finding solutions that balance innovation with safety.

Drawing on his submarine background, Walker brings unique perspectives to these challenges. The military's emphasis on teamwork, contingency planning, and mission focus has directly influenced his leadership approach in an industry where pivoting is essential for survival. "I don't want the subject matter experts defining the outcome," he explains, "but I know I need them to help me understand the complexities of each stakeholder."

Perhaps most fascinating is the discussion around autonomy and human oversight. As AI advances, Tom and Ponch agree that human judgment remains irreplaceable in critical decision-making. When a drone approaches a target and discovers a daycare center nearby, or when a delivery drone encounters an unexpected dog, who makes the final call? This tension between technological capability and ethical judgment permeates both commercial and military applications.

The conversation culminates with a powerful insight about resilience: "Sometimes getting punched in the mouth means you're on exactly the right path. You just didn't realize the significance of the change you were trying to make." In an industry transforming how we think about airspace, transportation, and autonomy, this philosophy may prove essential for those brave enough to lead the way.

Want to hear more conversations with innovative leaders applying milit

NWO Intro with Boyd

March 25, 2025

Flow Learning Lab

Find us on X. @NoWayOutcast
Substack: The Whirl of ReOrientation

Want to develop your organization’s capacity for free and independent action (Organic Success)? Learn more and follow us at:
https://www.aglx.com/
https://www.youtube.com/@AGLXConsulting
https://www.linkedin.com/company/aglx-consulting-llc/
https://www.linkedin.com/in/briandrivera
https://www.linkedin.com/in/markjmcgrath1
https://www.linkedin.com/in/stevemccrone

Stay in the Loop. Don't have time to listen to the podcast? Want to make some snowmobiles? Subscribe to our weekly newsletter to receive deeper insights on current and past episodes.
Recent podcasts where you’ll also find Mark and Ponch:

The No Bell Podcast Episode 24
...

Brian "Ponch" Rivera:

Tom Walker, good to see you, man. Good to see you, Class 13-96. That's right, you and I sat on a flight about five years ago. We were sitting right next to each other, talking about all the people we knew. Do you remember this? Oh yeah, and then it took about 30 minutes.

Tom Walker:

What was interesting is that we didn't even recognize each other when we first started talking. I think it was one of us mentioned OCS and Seaman to Admiral.

Brian "Ponch" Rivera:

Yeah, yeah, we were talking about Gunnery Sergeant Anderson, and we were talking about everybody. And then we're like, wait a minute, we were in the same class. That's right. So, yeah, small world. It's been about what, 30 years, is that right? 29 years? Yeah, it's been a long time. So here we are. Your company, DroneUp. You founded it in 2016.

Brian "Ponch" Rivera:

All right, and you did something really stupid. You left the Navy at 16 years of active duty, right? Yeah, very similar. I punched at 16. Not a lot of folks do that. I discovered that the folks who do have something they want to do that's, I'd say, bigger than themselves, and they take the lessons from the military and go apply them somewhere else. One thing I discovered about being in the military is a lot of folks like to stay in because it's safe. There's a process, there's a way to get promoted, there's a way to do things, and you lose a lot of innovation in the military because of the system. That's what I've discovered over the last 10 years or so as a senior officer: the system kind of stifles innovation.

Brian "Ponch" Rivera:

So, guys like you jump out, you create a company. I want to hear a little bit about how you came up with DroneUp, you know, when you left the military, and your path to there. So let me know, how did that work out for you?

Tom Walker:

Well, when I first left the military and first off, thank you, this is great I never imagined that when we were in OCS together we'd be sitting here nearly 30 years later having this conversation. So that's kind of exciting. You know, when I first left the military, my wife owned a technology company. So I went to the safest job that I could find and that was working for her. You know, if you think taking direction and taking orders in the military is tough, get out and go to work for your wife. Nice, but it worked out. We worked together for a very long time, built that company up.

Tom Walker:

But I always had an interest in autonomous systems and drones. You know, a lot of people ask me, oh, you must have been exposed to these in the military, and I'm like, well, in submarines we didn't really use drones a lot. Right, but even in my career later on, when I was working and supporting the special operations community, I never was really exposed to drones. But the truth of the matter is, I walked into Best Buy one day and I was picking up some equipment for web techs, and I looked on the shelf and I saw this piece of technology that was more advanced than the space shuttle in many ways and the systems that it had, and you could buy it for less than $1,000.

Tom Walker:

And I knew at that moment that these aerial autonomous systems were going to fundamentally change society, for the good or for the bad, and decided that day that I was going to do something with them and hopefully influence it to be more positive. And I like to tell people, you know, starting a business is risky and everybody knows the percentages of success. But in our case, we started a business around drones, not really knowing what it was we were going to be selling, to customers who didn't know what it was they were going to be buying, in an industry the government had not yet figured out how to regulate. So if you want to take those hyperbolic failure curves and bend them even a little more, then that's what we did.

Brian "Ponch" Rivera:

So there's an interesting arc that you and I talked about before we started recording, which is that when you departed the DOD, you departed the Navy, you ended up on a similar path to the one I ended up on, which is software development. I ended up doing a lot of work in the agile space, learning about all these frameworks and arguments about Kanban versus Scrum, and I ended up working with Jeff Sutherland. A lot of industry folks were looking at the military at the time. A lot of folks don't know that Scrum actually came from fighter aviation and the Toyota Production System. They don't know that mission command is something that we're all taught. People read David Marquet's book, they get really excited about that, and they forget that there are 150 well-trained people... 120? How many people are on a submarine?

Tom Walker:

roughly as many as 300. Okay.

Brian "Ponch" Rivera:

Yeah, so you get this Dunbar number in some instances where it's less than 150 well-trained, really educated people. I want to know what your background in Nuke Power School did for you, and your background as enlisted, that's something that I want to point out. You were enlisted and you became an officer. But how did that help you really create this company?

Tom Walker:

Well, it's interesting. I don't know how much nuclear power, in terms of the training, influenced me, but if you go back, I graduated high school in Arkansas, in Southern Arkansas, and at the time we were ranked a solid 50th in education. And I like to joke there was a BusinessWeek article one time about the education level in Southern Arkansas, and, like I said, we were last in the country, and it said that a high school diploma in Arkansas was equivalent to a seventh-grade education in the rest of the country. And so I often joke about my seventh-grade Arkansas education. My staff hears me joke about it all the time.

Tom Walker:

But I applied and went through the standard MEPS application, and then you had the NFQT, and I got into this program. And what most people don't know, everybody talks about the loss rate in the SEAL training program and in the pilot training program, but most people don't understand that the highest loss rate is in the nuclear power program. I mean, because you start out with Nuclear Field A School and then you go into Nuclear Power School, which is just the most intense training that you can imagine. Every nuke who's gone through it appreciates that you essentially get a college degree in about 18 months.

Tom Walker:

And I had never had to study, and so I think I went in a little cocky with: this will be easy, it's the military, it's an easy punch out from Arkansas. And I realized it was the first time that I really had to study. But it was also the first time that I had ever collaborated with other people around me to kind of leverage their strengths, where they had had better high school training and education in certain areas of math, chemistry, and physics that I just simply didn't have. So it was kind of my first exposure to teamwork. But I will tell you, it wasn't until the submarine.

Tom Walker:

You know, actually making it and doing my first deployment on a submarine and realizing you're stuck with this. This is your team, and you know, day one, you like some and you don't like others, and some are strong in some areas and some are weak in others, but you're forced to work with that team. You can't just replace somebody, you can't just order new parts. I mean, it was really that small collaborative teamwork where you're surrounded by hyper-intelligent people who are there because they want to be. You don't end up on submarines by accident. That's a tough row to hoe. So it was that small-team, collaborative, mission-driven, outcome-driven environment of submarines that I think really instilled in me the way that I approach even business today.

Brian "Ponch" Rivera:

You know, there's a group of folks that study submarines, flight decks of aircraft carriers, fighter aviation, healthcare, nuclear power plants, and they look at this and they say it's a high reliability organization, high reliability theory. And out of that we get resilience engineering, which can be applied to, and is being applied to, software development, to get into DevOps and things like that. But the origins, again, are what you just described. It's the starvation of resources. You're almost forcing people to learn how to work together. In our case, in aviation, we're taught how to work together through crew resource management. You learned that from Nuke Power School.

Brian "Ponch" Rivera:

You learn the power of, I don't think you call it debriefing, but critiques. You do the same team life cycle as anybody else would in, like, fighter aviation and SEALs. By the way, I don't know if you know this, but the SEALs borrowed a lot of this from both of our communities to really learn how to work as high-performing teams. So one of the things we look at on the podcast is those great lessons from the military, like mission command, and we touched on that with David Marquet briefly. But this high reliability theory really applies to what's going on in your space now, right? I mean, I saw you write recently about airspace, the dangers of uncontrolled airspace and the rise of, I guess we call them, uncontrolled drones. Can you describe a little more about what that is?

Tom Walker:

Yeah, so you know we can go into that a lot deeper, but I mean, the real issue right now is that we have so many of these, you know, drones or, as we talked about, aerial robotic devices that are flying around, and for years and years and years the FAA has been trying to figure out how we integrate them into the airspace safely. But, as you know, our air traffic control system is woefully out of date. I mean, we're still using copper wiring as our primary transmission backend, you know, radar systems that were built in the 80s. It's just so unbelievably out of date, and there hasn't been funding for that.

Tom Walker:

And so I think when our industry came along about 10 years ago and started really talking about flying drones and scaling it as an industry, the FAA's position was, we've got a big enough problem trying to fix what we've got; these are not really... But fast forward: now there are over a million drones flying in the national airspace in the United States, compared to only about 250,000 manned aircraft. So manned aviation is already outnumbered four to one, and the number of these drones is going to double by 2027 and double again by 2030. And as of right this minute, there is no system in place to protect manned aviation from drones. There's nothing, no technology, no anything.
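
A quick back-of-the-envelope check of those numbers, taking the figures quoted above at face value (roughly one million drones today versus about 250,000 manned aircraft, with the drone count doubling by 2027 and again by 2030) and assuming, purely for illustration, that the manned fleet stays flat:

```python
# Illustrative projection only; the starting figures are the ones quoted in
# the conversation, and the flat manned fleet is an assumption.
drones = 1_000_000
manned = 250_000

for year in (2025, 2027, 2030):
    print(f"{year}: ~{drones:,} drones vs ~{manned:,} manned aircraft "
          f"({drones // manned}:1)")
    drones *= 2  # "double by 2027 and double again by 2030"
```

On those assumptions the ratio goes from roughly 4:1 today to 16:1 by 2030, which is the scale of the integration problem being described.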

Tom Walker:

And so now what we've got to do is this really complex issue of how do we work with the policymakers, how do we work with the regulators, how do we work with the people in the industry to scale what we know is going to scale as an autonomous industry, but how do we do it in a way that protects general aviation? How do we do it together? How do we coexist? And, unlike the other communities and everything we talked about, where we can all come together as a collaborative small team, this is just a population of teams that have their own self-interest in mind. General aviation wants to protect the national airspace, wants to protect passengers and those crewed platforms. The unmanned industry wants to just continue to expand and build their scalability, and nobody's interests are similarly aligned right now, and that's a challenge for people like you and me, who are used to being able to get everyone around the table and say, ok, we're going to agree and we're going to dissent, but at the end of the day, we're going to all go out and collaboratively work together and make this happen.

Brian "Ponch" Rivera:

You know, I spent some time in an air and space operations center. I spent three years there as a master air attack plan chief and combat plans division deputy, so I spent a lot of time in the Air Force. And there's this piece of airspace we just didn't touch, and that was for the helicopters. I think it was a thousand feet and below. We just kind of left it blank, like, yeah, that's a mess that somebody else is going to figure out. Well, that mess is here, and it's for all of us now. Can you kind of walk us through the dangers to, you know, mom and pop sending their kid off to school for the first time this summer? I mean, is there a threat to them with these drones in the airspace? What are your concerns at the moment?

Tom Walker:

Well, this is an interesting conversation to have, because, and I like to joke, this is where my bipolar disorder is an advantage. By and large, over the last eight years we built the largest drone services operation that had ever been built. We had 55,000 operators operating drones all over the United States and beyond the borders. We moved past that and, in nine months, built the largest drone delivery operation in the world, which, at the time, could serve 4 million households. And the entire time, when the media and maybe the general public was saying, we're a little concerned about drones, our position was, they're very safe, we've got a perfect safety record, a perfect track record. Now you fast forward. We are now on the other side of that. Now we're addressing the concerns that we spent eight years saying weren't concerns. But they weren't, because the density of operations was so much lower. Now it's not.

Tom Walker:

So now, if you look just at the last year, year and a half, we've had 24 different near misses between drones and aircraft at a single airport. Those are the ones that are reported, though, right, and that's important. These are the ones that are reported, at one single airport. We've had six airliners have to deviate and do a go-around to avoid a potential collision with a drone. We've had three instances of drones hitting manned aircraft, and one just happened out in California with a firefighting aircraft. And the only thing that we have in place to protect general aviation from drones is policy, and that policy has been reported as violated, to your point, reported as violated 277,000 times.

Tom Walker:

So do we have a problem? Is there a threat? There is, and that's really where we are today, and that's what we're focusing on. We're talking to Congress, we're talking to the Department of Transportation, we're talking to the White House, and we are trying to come up with a solution, because unmanned systems are not going to go away. Drones are not going to go away. The future is autonomy. If you saw Maverick, the recent Top Gun movie, that message came through loud and clear. I mean, but not today. Not that day, yeah.

Brian "Ponch" Rivera:

No, this is great. So now we've seen in the last few months, you know, DeepSeek coming out. We have new, different types of AI moving into the space rapidly. So how does artificial intelligence play into how we manage our airspace? How are you looking into that?

Tom Walker:

Another very interesting question, because our platform is built on leveraging AI. But, you know, what it really is doing is learning through algorithms that we've already traditionally used for deconfliction and separation and for optimization of the airspace, and so now we're moving forward by leveraging AI to speed up the processing of those. So, you know, one of the things that we've heard from traditional aviators is, I'm not really ready to have AI control my airspace. And I understand that, but it's algorithms we've already used and a mechanism by which we're speeding up the processing of those algorithms. And then they kind of look at it a little bit differently. Right, planes land themselves today. Planes can take off by themselves. And people say, but that's only in an emergency, and you have to tell them...
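
To make the deconfliction and separation idea a little more concrete, here is a minimal sketch of the kind of pairwise separation check traditional airspace algorithms perform. This is not DroneUp's system; the separation minima, data layout, and names are invented for illustration, and a real implementation would predict conflicts along planned trajectories rather than just checking current positions.

```python
from dataclasses import dataclass
from itertools import combinations
import math

# Invented separation minima, for illustration only.
HORIZONTAL_SEP_M = 150.0
VERTICAL_SEP_M = 30.0

@dataclass
class Track:
    """A simplified position report for one drone or aircraft."""
    ident: str
    x_m: float    # east-west position, metres
    y_m: float    # north-south position, metres
    alt_m: float  # altitude, metres

def in_conflict(a: Track, b: Track) -> bool:
    """Two tracks conflict when horizontal and vertical spacing are both too small."""
    horizontal = math.hypot(a.x_m - b.x_m, a.y_m - b.y_m)
    vertical = abs(a.alt_m - b.alt_m)
    return horizontal < HORIZONTAL_SEP_M and vertical < VERTICAL_SEP_M

def conflicts(tracks: list[Track]) -> list[tuple[str, str]]:
    """Return every pair of tracks currently violating separation."""
    return [(a.ident, b.ident)
            for a, b in combinations(tracks, 2)
            if in_conflict(a, b)]

# Example: two drones too close to each other, one helicopter well clear.
tracks = [
    Track("DRONE-01", 0, 0, 60),
    Track("DRONE-02", 90, 40, 70),
    Track("HELO-11", 2_000, 500, 150),
]
print(conflicts(tracks))  # [('DRONE-01', 'DRONE-02')]
```

The AI layer described above would sit on top of checks like this one, predicting and resolving conflicts faster and across far more tracks than a person could work through by hand.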

Tom Walker:

No, believe it or not, you've probably... there are bored pilots out there right now. There are very bored pilots. You know, it reminds me, I think I told you this when we met: I meet people all the time and they're like, oh, my son or my daughter would really love to come work for you, they love to fly drones. And I'm like, yeah, they wouldn't have as much fun as you think, because flying drones today is sitting beside a couple of panels, waiting for something to turn red.

Brian "Ponch" Rivera:

I mean, that's really where we are. So it's really just an operations center, more of a command center. A command center, okay. So you have to have great situational awareness. What's going on? Communication between the platforms. A lot of lessons we learned in the military, right? Yeah.

Tom Walker:

And people say I've had people look at, you know, the panels and say there's no way one person can monitor all this.

Tom Walker:

And I have to go back to the days on the submarine when we would sit in front of these reactor control panels and electrical control panels, and there are 600 different gauges and meters, and you gained an awareness, a situational awareness, where you didn't have to stare at it. You just knew when something was out of whack. And it's the same thing today, and it's even a smaller amount of information. But we gather so much data from these devices that are flying around that we're still not even leveraging it. I think AI is going to be a key to unlocking the ability to really exploit some of the data that we're collecting that we just simply don't use today.

Brian "Ponch" Rivera:

So I read some of your articles and some of your ideas and maybe some of your values, and what I understand is you're looking at this from how people work together with technology, more so than the process and the technology. I think, if I'm reading what you wrote correctly and understand your background, you're putting the people first and the technology second, but the interaction between them is critical.

Tom Walker:

You know, when you're in the business of autonomy, and you're in the business of improving autonomy with AI, it always creates the question, where are the people, right? But one of the things that we understand, it's just like right now, we have jobs that are transitioning within our own company, that are moving from traditional pilot operations into AI engineering. In other words, we still have to train these systems to understand the different flight characteristics of different platforms. So when you are deconflicting them or creating that lateral separation, this platform doesn't operate exactly like this platform. We have to still train the systems to learn that. They don't know that inherently. And it's very similar, you know.

Tom Walker:

If you think about rules of the road in boating, everybody says, well, boats should never collide. Well, if I know the rules of the road and you know the rules of the road and we are both following those, then everything works great. But what happens when one particular element in that system isn't following those rules? One particular boater in the water, not that this would ever happen, doesn't know the different ways that you interact and who has the right of way. And so until we have those systems able to understand those deviations from expected normal operation, we have to have those people driving those decisions. We have to have those people who are there to intervene, and I have not encountered an autonomous system today that's even close to being ready to do it.
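
As a small illustration of the platform-specific flight characteristics mentioned above, the sketch below shows how a required lateral separation might be widened for a platform that is slower to stop or turn. The platform names, performance numbers, and formula are all invented placeholders; a learning system like the one described would estimate these characteristics from flight data rather than have them hand-coded.

```python
from dataclasses import dataclass

@dataclass
class PlatformModel:
    """Illustrative performance figures for one airframe type (made-up values)."""
    name: str
    cruise_mps: float      # typical cruise speed, m/s
    max_turn_deg_s: float  # how quickly it can change heading, deg/s
    brake_s: float         # time to slow to a hover or safe speed, s

def required_lateral_sep_m(a: PlatformModel, b: PlatformModel,
                           base_sep_m: float = 50.0) -> float:
    """Widen a baseline separation by how far each platform travels before it
    can realistically manoeuvre clear, plus a buffer for sluggish turners."""
    reaction_a = a.cruise_mps * a.brake_s
    reaction_b = b.cruise_mps * b.brake_s
    agility_buffer = 100.0 / min(a.max_turn_deg_s, b.max_turn_deg_s)
    return base_sep_m + reaction_a + reaction_b + agility_buffer

quad = PlatformModel("small quadcopter", cruise_mps=15, max_turn_deg_s=90, brake_s=2)
mapper = PlatformModel("fixed-wing mapper", cruise_mps=30, max_turn_deg_s=20, brake_s=8)

print(round(required_lateral_sep_m(quad, quad)))    # two agile platforms: tighter spacing
print(round(required_lateral_sep_m(quad, mapper)))  # one sluggish platform: much wider
```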

Brian "Ponch" Rivera:

So there's always going to be a person in the system. There's always going to be a person, okay. So this is interesting. I learned about this through Toyota.

Brian "Ponch" Rivera:

You know, they started with a loom. They started automating looms many, many years ago, and one person was able to monitor many looms, and that's the same kind of automation. Within the Toyota Production System it's called jidoka, which I believe is automation with a human touch, right? Human in the system the whole time. We want the human in the system. I fear, and I don't know what your thoughts are, but as we see the war in Ukraine continue, we see more use of drones, and I think there's a report this morning about drones flying out of Ukraine towards Russia. There still needs to be a person in the loop in decision making, correct? I want to hear your thoughts on that a little bit more.

Tom Walker:

So, you know, I'll give you two different examples of how this conversation has evolved within our industry and within our kind of close group that really focuses on this. We were talking the other day, somebody had a question and they said, well, if a drone is given a target, and it goes in, and as it's getting ready to deploy its, you know, kinetic platform on this target it realizes there's a daycare center adjacent to the target, how does the drone make that decision? And if you really want to get into a really deep philosophical AI conversation, it is, well, AI can understand the geopolitical impacts and the value of this target versus the potential implications of a collateral damage incident, and it can do all those calculations in seconds and make the right decision. And somebody in the room said, is that the decision I want the drone making? Right? I mean, it's similar to a pilot in that situation, when a pilot sees that.

Tom Walker:

I would hope that they would make sure that somebody knows about that. And then, what do we want to do? And do we have time to maybe reassess the situation before we deploy this? And it comes all the way back down to delivery drones. We're getting ready to deliver, there's a dog on the ground, the dog could grab the package. Do we go ahead and deploy it? Well, we've got a guaranteed 30-minute delivery time. At least in the foreseeable future, and I kind of believe forever, there needs to be a human element in that final decision. I could give you a dozen of those, but those, I think, are two that we can relate to today, where there has to be somebody making that call.

Brian "Ponch" Rivera:

I got a question. So with autonomous vehicles on the rise, I imagine you're looking at the same issues in aviation. But if a Tesla runs into a human, what's the difference between a Tesla running into a human and a drone running into a person and injuring them? Who owns that? I mean, is it the software developer?

Tom Walker:

The software developer? Is it the sensor developer?

Brian "Ponch" Rivera:

Do we have to assign blame?

Tom Walker:

I mean, yeah, exactly. And when you get into situations like that, where you say, you know, the probability of that happening with a self-driving vehicle, ironically, is infinitely smaller than it is with a human-driven vehicle, do we have some acceptable risk in the fact that it's going to happen? Is there some number where, as long as it's less than 1.5% as compared to the 4% over here, does that make it safe or acceptable? But we're a society where we feel like somebody has to take blame and assume responsibility for anything that deviates outside the norm. Yeah, I don't know that we'll ever change that. But, you know, at the end of the day, is that really an autonomy question? Is that a sensor question? Is that a software programming question?

Tom Walker:

I mean, at what point do we move that trust level from a human? The big question that we're dealing with today, in one of the groups that I'm involved in, is, do you trust AI more than a doctor? And you know, we had this same conversation a year ago at a conference, and I would say it was 90-10 that I would rather have a doctor. I'd rather have a person who's gone to medical school, who's looking at me, who smells me, who can see me, making that decision, and it was overwhelming, it was 90-10. We just had that very same conversation about three weeks ago, and it's 90-10 the other direction.

Tom Walker:

I mean, it's completely changed, to the point of, this platform has access to so much more information than any one individual could ever have. But that comes back to the question you asked. I'm not going to deny it: if I have something wrong, I'm going to AI, I'm asking about it, and then, when I'm done, I'm going to my doctor and saying, here's what AI has said. And I'll tell you the other change I've seen: doctors, by and large, despised WebMD and those types of online platforms. So you've diagnosed yourself using the Google, right?

Tom Walker:

That's what I had a doctor tell me one time. The doctor that I have now is probably one of the best doctors I've had in my life. I'm really fortunate to have him as my general doctor, and we discuss AI in terms of how he's using it, and how he appreciates it when people do the due diligence and bring it to him, because it can shorten the conversation and actually improve it. So you see that kind of change. But again, I still want a person. I still would rather have you tell me, yeah, you're going to make it or you're not, rather than ChatGPT.

Brian "Ponch" Rivera:

No, that's fascinating. Thinking about this in the cockpit: I used to work for an airline years ago and I went through aviation crew resource management, threat and error management. Errors come from you, threats come towards you. So errors leave the cockpit, meaning if I hit the wrong switch or do something like that. I think in a normal flight there's an error that gets trapped within the cockpit seven to ten times a flight, something that can be somewhat of a threat to the passengers.

Brian "Ponch" Rivera:

You have a team in there, a crew, working together and checking each other, making sure they're doing things correctly. So those lessons are very valuable in healthcare. It's the same thing: threat and error management, operating on the wrong foot, putting the wrong meds in a body. I've seen it. So I've taken these lessons from the Toyota Production System and from aviation and seen them work inside of healthcare. It is fascinating, right, because it's not the one ego in the room, it's not the one god in the room, it's the team working collaboratively with the surgeon, trying to improve patient safety, if you think about it like that. So these lessons are kind of universal. I don't think that's any different from what you guys have to deal with, right?

Tom Walker:

No, you know, I think the bigger issue for us, and I think it's a challenge in other industries as well... I think about fighter pilots, for example. There's probably nobody who has more direct physical control of the platform than that, right? There are very few things that you actually trust something else to do when you're in a combat situation, other than yourself. And so I try to imagine, and there are two parts to this answer, but I always try to imagine, you know, you have pilots who are sitting in the right or left seat letting the plane land itself, and they're ready to intervene. But if you were getting ready to engage with an enemy aircraft in a dogfight, would you say,

Tom Walker:

Okay, I'm hands-off and I'll let the system control it? I think I'd rather that be just a fully autonomous platform with no human on board. I don't see a day where there's a human...

Brian "Ponch" Rivera:

Well, you've got to think about it. Like, beyond-visual-range situations, I think I'm fine with the system shooting for me, right? Shooting a timeline and all that, it's just math and geometry. At that point, fighting an autonomous platform, I don't know. I imagine, going back to your 90-10 thing a year ago, the way AI is moving right now, I think AI can kick our butts, right?

Tom Walker:

Yeah, I think it can. But then the question becomes, should there even be a human in that loop, in the cockpit, or in the loop at all? So, yeah, let's say in the cockpit. You know, it's interesting, because if you look at most of the major metro systems, the trains all drive themselves, right? They have automated stops, but there's a human on there. Why do we need conductors on trains anymore? I mean, between the sensors that we have and everything else, they basically start, stop, go. But we do, and same with pilots. We have planes that can fly themselves. ... I was on the phone on the way here... not the same thing, but autonomy.

Tom Walker:

We've proven the ability to take sensor data and to leverage algorithms to fly devices from point A to point B safely. We've proven that. We have the ability to deconflict those platforms. We've proven that. And so all we're doing by adding AI into that is being able to not just do it with one-to-one controls or one-to-five controls, but, how do we have one operator for 100 platforms or 1,000 platforms? AI becomes a tool to help make that job easier for that individual, not do that job for the individual.
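
One way to picture that one-operator-to-many-platforms idea, the "waiting for something to turn red" model described earlier, is as an exception filter: software watches the whole fleet and only surfaces the flights that need a human judgment. The fields, thresholds, and flight identifiers below are invented for illustration, not taken from any real system.

```python
from dataclasses import dataclass

@dataclass
class FlightStatus:
    """Simplified telemetry summary for one flight (illustrative fields only)."""
    ident: str
    battery_pct: float
    cross_track_error_m: float  # deviation from the planned route, metres
    link_loss_s: float          # seconds since the last telemetry update

def needs_operator(f: FlightStatus) -> bool:
    """Decide whether a flight should 'turn red' on the operator's panel."""
    return (f.battery_pct < 25
            or f.cross_track_error_m > 40
            or f.link_loss_s > 10)

fleet = [
    FlightStatus("D-017", battery_pct=78, cross_track_error_m=3, link_loss_s=0),
    FlightStatus("D-042", battery_pct=21, cross_track_error_m=5, link_loss_s=0),
    FlightStatus("D-090", battery_pct=64, cross_track_error_m=55, link_loss_s=2),
]

# Only the exceptions reach the human; everything else keeps flying autonomously.
for flight in filter(needs_operator, fleet):
    print(f"ATTENTION: {flight.ident}")
# Prints D-042 (low battery) and D-090 (off route).
```

The human still makes the final call on the flagged flights; the automation only decides what is worth their attention.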

Tom Walker:

So, in all of those situations we talk about, are we really trying to eliminate humans, or are we trying to reduce the workload on humans to the point that we can be more efficient and fly more platforms or operate more robotic devices? The moment we start transitioning to allowing AI to control that ecosystem and that environment without a human ultimately owning that decision-making process, I think that's where we run into real problems. Not just real problems, but real challenges in coming to agreement that that's a good idea. And then throw the regulatory piece on top of that and try to get a decision where everybody's going to agree on it in Washington. So I think you and I do agree that a human has to be in the system. Absolutely. Okay.

Brian "Ponch" Rivera:

All right, I'm curious about the cost. You brought up healthcare a moment ago. You're delivering... was it medical supplies?

Tom Walker:

Yeah, prescription medications and medical supplies.

Brian "Ponch" Rivera:

Okay. So weather, you have weather issues in play. Who would you rather send up, a drone or a helicopter? I mean, that's a threat. You can't really fly in IFR conditions, bad weather, but can you do that with a drone?

Tom Walker:

I mean, absolutely. And drones are getting more and more resilient, and we can build them. I mean, if we can build aircraft that can fly in anything, we can build unmanned aircraft that can fly in anything. There's just got to be a demand for it. And think about the argument, just like what happened in Charlottesville when we had that helicopter crash. There was no reason that couldn't have been drones doing the very same aerial surveillance. In fact, it should have been, for a variety of reasons. Cost, yes. Safety, yes. And instead of one aircraft having a singular view of the situation, you could have had five or six or seven unmanned systems creating a much better cooperative operating picture.

Brian "Ponch" Rivera:

So this is interesting. I spent some time at the safety center. I do a lot of work in safety. If you go back two or three years, I would disagree with this whole conversation.

Tom Walker:

I know, right? Yeah, go back two or three years and I would have too.

Brian "Ponch" Rivera:

But what I'm seeing now is humans: we have intent, identity, intelligence that changes throughout the day. We're different people based on what we eat, how much sleep we get, the moods we're in, and all that. So we have crew days for a reason, right? You have crew rest. For those of you who are not familiar, you want your pilots, the ones flying your commercial airlines, to get some good sleep, stay away from alcohol, eat some good food. Why is that important? Because they're taking care of you. They have your safety in their hands, along with an aircraft. That's what they're there for. But they're humans, and humans are inherently flawed. Yeah, we are. So now, like I said a few moments ago, if we were having this conversation a few years ago, I wouldn't agree with it. I do now.

Brian "Ponch" Rivera:

We are removing some of those flaws, those biases, those human factors that go into the cockpit, and reducing it down to, I'm going to use some language here that we talk about quite a bit, a complicated thing, right? So the difference between complex and complicated: a 747 or an Airbus A350 is complicated. You put a human in it, it's complex, right? Same thing in fighter aviation. So if we remove the complexity from a system... and again, I'm throwing this out there because I hadn't thought about this years ago, but I'm thinking about it now. You're moving from the complex domain to the complicated domain, and that is the domain of subject matter expertise, high process, high repeatability, where the relationship between cause and effect is known beforehand. In complex, you don't know that. We don't know how the turning of an aircraft, the mood of the crew, or, going back to the accident up in DC, all those factors that went into the helicopter crashing into that type of aircraft.

Tom Walker:

The CRJ-700, yeah, I believe it was.

Brian "Ponch" Rivera:

So many things were going on in there. It's not one thing; it can't be reduced down to one thing. It's many, many things. It's complex. But if we can reduce that, we can move into the complicated domain, and I think that's what you're trying to do: find high repeatability, create safety, and you're never going to be perfect, right. Yeah, that's how I'm looking at this now, Tom. I didn't think I would have this view years ago. I think I do now.

Tom Walker:

I think my opinion has changed on it as well. I kind of view the situation with a slightly different analogy than what you just gave, but metaphorically: systems work, algorithms work, when they take an input and produce an output, and they work when they're given a known input from which they can calculate a known output. That's just how systems work. And one of the things I talk about on the road is, people ask, when will we be able to have self-driving vehicles? And it's really a breakover point, because the truth is, if every vehicle out there were autonomous right now, it'd be perfect.

Tom Walker:

The problem is, the moment you put one human driver in the loop, and maybe they're texting, or maybe they're singing, or maybe they had a bad day, or maybe they just got some bad news, or maybe they're a little hungover, or maybe they just sneezed, or who knows what. The moment that piece of data is thrown into that ecosystem of autonomy, it creates an input that maybe every vehicle doesn't have time to properly run through its algorithms and adjust to; the human becomes the problem. You're right. You put 25 autonomous vehicles with little detect-and-avoid sensors on this table and they can run for 24 hours without ever touching each other. We know that. The moment you put one person in with a remote-control vehicle, something's going to crash.

Tom Walker:

There we go, and that's the complex, right? I just made a complicated system that works into a complex one by the injection of one thing. But my opinion has changed too, and I'll tell you this. For years, when I would give talks, people would say, when are we going to be riding around in self-driving aerial taxis, autonomous aerial taxis? And I would always tell this story, and it was my way of saying, here's where I am today on my view. And the story was: the first self-driving vehicle was actually displayed in 1939 at the World's Fair in New York.

Brian "Ponch" Rivera:

I didn't know that.

Tom Walker:

And it drove around for months and months and months, and it just had sensors on the ground, it had mirrors; it was, at the time, relatively advanced tech, but there really wasn't anything to it. But everybody then thought, this is the future, this is the Jetsons, we're going to have self-driving cars any day. Eighty years later, after that first one, I was in New York, I mean, I was in San Francisco, and I was standing outside the Virgin Hotel, and the latest self-driving, I won't say which brand it was, the latest self-driving vehicle was going to pick us up and drive us to my speaking engagement, me and a colleague. And all it had to do to pick me up was drive 238 feet from a parking lot to the front door of the Virgin Hotel, and in that 238 feet it sideswiped a parking meter and took out the back tire of a California Highway Patrol motorcycle.

Tom Walker:

And my answer to everyone was, if in 80 years, in a two-dimensional environment, I can't get a self-driving vehicle to drive 238 feet without crashing, how close am I to getting into something that's actually unmanned in the air? But the truth of the matter is, now that I know the data, now that I've seen the performance data, now that I have the benefit of a history and a safety record showing how much more incredibly safe these unmanned aerial autonomous assets are compared to everything else... If you'd asked me a year ago, I would have said there's no future for that that I see myself participating in. Just in the last year, I'd say my opinion has shifted the other direction. I'm actually starting to think I'd feel safer in that than in a taxi cab.

Brian "Ponch" Rivera:

You know, some of the safety features in cars right now, keeping you in the lane, the radar capability with the cruise control, those things are fantastic, and they're aiding my ability to drive. One thing that pops up quite frequently is that we lose human capability as things become automated. We lose the ability to know how to drive a car, right? I don't think many people around here drive a stick shift anymore; I don't know if they even manufacture them here in the US. There might be a few, but I haven't seen one. You lose capability.

Brian "Ponch" Rivera:

I'll give an example: when we got GPS in our cars, we stopped carrying maps, right? And I teach my kids how to read a map and all that, because one day they may need it. But, and I think it's called automation bias, we start paying attention to our Google Maps or our Waze or whatever we're watching, and we just lose sight of where we are right now. We're just following something blindly, and we show up at the destination and we're happy. That's a loss of capability, and that's a real threat too. I'm curious, what kind of loss of capability are we facing with more autonomy in travel?

Tom Walker:

So it's funny that you asked this question, because there are two things that have happened in the last 48 hours that I think have such relevance, and I didn't realize they had relevance until you just asked. A very intelligent person, and this person's going to laugh when they hear this later on because they'll know who I'm talking about, one of the most intelligent people that I know, said to me the other day, talking about their new car and how it essentially drives itself: it's really, really helpful to me, because now I can get more work done while I'm driving. And I thought, that's about the scariest thing that I've ever heard anybody say. In other words, it takes care of everything for me, so I can text and I can look at my phone and I can check things out. And I thought, what we've done is we've allowed this automation to give us a false sense of security that the automation is always going to work.

Tom Walker:

And what happens now, when you actually have to drive? Are you suddenly going to say, well, now that I'm having to drive, I'm going to discontinue the habit of looking at my phone, I'm going to discontinue texting? The truth is, no, you won't. That has now become a component of your driving behavior; you've developed that habit. And what happens when that system fails, the one you think isn't going to fail? They do fail, and I had one fail on me several months ago. I was riding down the road going 70 miles an hour, and I had my auto speed and auto distance set on, and I wasn't texting or working, I was driving, and all of a sudden the car that was in front of me went off the sensor and my car started speeding up. If I hadn't hit the brakes, I would have rear-ended them. So you can't rely on these systems, but I think people do.

Tom Walker:

I think the threat we have, and I've seen this in the drone community, is that we're so reliant now on autonomy, it's push a button to take off, push a button to execute the mission, that when we do lose that autonomy, the quality of the piloting, when they have to take over manual operations, has decreased markedly. And you could offset that by saying, well, they should be doing continual training in manual flight operations. But the truth is, in business, when you're having to train them how to use the new autonomy, train them on the new policies and regulations, train them on the new maintenance procedures and everything, you know... I mean, pilots have the opportunity to go to the simulator, but very few people argue that that's as effective as flying in a critical situation. I worry about that. I know that's happening in my industry. I worry that maybe that's happening in just general driving, and maybe it's happening in commercial aviation. I don't know that; I can't say that for sure.

Brian "Ponch" Rivera:

I was doing some work with the Navy a few years ago. We had some drone operators and their air sense wasn't there, right? They'd all gone through some basic training on flying a drone, or what do you call them, UAVs? UAVs, yeah. And I remember they were outside, we were on a platform out at sea somewhere, and they're looking at a piece of sky like they could see where this thing was, and I tapped them on the shoulder. I'm like, it's actually over there. And it's just situational awareness. They're not used to having that.

Brian "Ponch" Rivera:

They don't have that air sense, and that's just what I gathered from being an aviator and watching these guys operate. You brought up how people are face-down looking at technology. I see this with kids crossing the road, you know, going back to those examples of putting a human in the system: kids crossing the road with headphones on, not looking left and right, because they think they're in a safe space where cars aren't going to hit them. But I believe the number one place to get hit by a car is in a crosswalk, right?

Tom Walker:

Yeah.

Brian "Ponch" Rivera:

Because that's where people cross the road. So there's a social aspect to this as well. The idea that you're going to get more time to do more work... more work doesn't always equate to productive work, right? It's just busy work. Just because you're busy doing something in a car that's driving for you doesn't mean you're getting work done. That's right. Yeah. So I'm curious, what are your thoughts on your industry? What's the biggest challenge at the moment for the drone industry?

Tom Walker:

I think it's an interesting maturation of our industry, because we have fought as an industry for so many years, we had almost a Tourette's-level expression that the key to scaling our industry is BVLOS, beyond visual line of sight. As soon as we can fly beyond visual line of sight, and as soon as we have a supply chain, then the industry will scale. And the truth is, and always has been, that that's not exactly true. What we need is a mechanism, an ecosystem, where we can have multiple operators operating different types of platforms in congested or contested airspace, whether it's manned or unmanned. And that's the thing: we've always viewed it as, this is an unmanned issue and this is a manned issue, or a crewed issue or an uncrewed issue. But the reality is, this is an airspace issue, and it's not general aviation's responsibility to figure out, it's not the unmanned systems industry's responsibility to figure out. And so when you've got different competing interests, it becomes very difficult for us to come together and say, first off, what technologies do we need, and then let's implement the policy to leverage those technologies. We've done it backwards. We have viewed it as a policy issue: we need policy, we need policy, we need policy. But policy doesn't actually prevent anything from happening. Policy maybe mitigates some risk. Policy gives you the opportunity for dealing with it after there's an incident, holding people accountable. But we have not done a good job of anticipating, you know...

Tom Walker:

I'll tell you, when I was back working in my wife's company years ago, I was in Chicago and we were working with, I won't name the customer, but it was a very large encyclopedia company, and a major investment company, one of the biggest in the world and most respected, had bought this encyclopedia company. And this was right when mobile applications were starting to come out, and so we were pitching to build a mobile application to take that encyclopedia and digitize it. And I remember the representative from the investment company that bought them listened to us for about 15 minutes, and finally he interrupted the conversation and he said, look, this mobile phone and app thing is a fad, and as soon as it passes we'll be back to door-to-door sales of encyclopedias. And that oversight really cost that company. It's still around, but it's nothing like what it was in its day. And I think we kind of took the same approach. I think there was a little bit of, drones are cool, they're nifty now, but they'll go away.

Brian "Ponch" Rivera:

Yeah.

Tom Walker:

But they're never going to replace, you know, the large-scale aircraft and the piloted aircraft that we have. And now it's a million to one... you know, it's four to one, with a million of them in the air, and I think we're late to the game in figuring it out. And so now we've got the complexity of, okay, what are the policies, what are the regulations, how do we continue to scale this?

Tom Walker:

And then this becomes a bigger deal: how do we maintain a competitive edge in the United States, in terms of being a leader in these systems, when, if you look at the major manufacturers around the world, and even policies around the world, they've evolved much faster? And then, on top of all of that, we have an aging air traffic control system, we've had all these recent incidents, so we've got that particular issue, which is totally unrelated to the unmanned system incursion problem. And then, if you really want to smear on a little more complexity, now drop in AI and the sudden growth and recognition of both the excitement around the opportunity and the threat that it brings. This is a mess, but it's not a mess that we can ignore any longer, and I think there's a really clean, clear path and a way to get there, and that's what we've been pitching up in DC and on the Hill: let's solve it before it becomes a crisis.

Brian "Ponch" Rivera:

So I also read some of your articles where you talk about not hiring, like, subject matter experts. What I mean by that is, the solution to these challenges is more than likely not going to come from an expert in the field. It's going to come from somewhere else. It's going to come from your background, you coming from nuke power into this drone space, right? The organization that can bring those cognitively diverse people together is going to be the company that wins, right? So are you guys doing that right now? Are you bringing diverse people together?

Tom Walker:

We're talking about not service diversity, but this type of thing, right? Right, diversity of background and technology and education and experience, yes. But you know, your best weapon during a moment of chaos is clarity: getting the competing interests that I just talked about to reach a moment of agreement, or at least alignment, on what are we trying to accomplish. Let's ignore how we get there. What do we want to accomplish? And so, you know, I've been at tables, just like this one, this one in fact, where you've had people with 25 years in the FAA and air traffic control, 25 years of policy and enforcement, and then subject matter expertise on the Hill, whether it's the majority or minority community, and then unmanned systems folks.

Tom Walker:

And then you go around the room, and the biggest challenge isn't figuring out how to solve the problem. The biggest challenge is, what problem are we trying to solve? What you hear from one side of the room is, the problem is we don't need drones in the damn airspace. Okay, well, there's a position, that's great.

Tom Walker:

And then the other side is the future is unmanned, and then okay, then that's a threat.

Tom Walker:

And then you have air traffic control saying, well, you're simply trying to put us out of a job. And so you stop and just go back and say, okay, can we define what success is? Don't worry about how we get there, let's just start with what success is. And if you take all of those people with that experience, all that background experience, out of the room, and you bring in a doctor, and a guy who's done construction his whole life, those are in some ways my favorite people to have in the creative process, right, because nobody's had to deal with the crap that you and I have had to deal with in terms of solving it. Then you say, how quickly can we get to what success looks like? You get there in five minutes. Then you get them out of the room, you bring all the subject matter experts in, and you go, okay, this is where we're going. Now let's leverage your experience to figure out how to get there. That's where we've had the most success.

Brian "Ponch" Rivera:

We talk about how systems drive behaviors, the system that an organization is in, or the competing organizations. In the military, a lot of folks are trying to get promoted; that's what the system drives them to do. You're not really innovative in that space.

Brian "Ponch" Rivera:

That is the motivation in the military, right? Yeah. But all these organizations are motivated by something. You said that one of them is looking at it and saying, hey, you're going to take away our jobs. Okay, that may be an outcome; you may lose jobs, you may gain jobs, we don't know. But that's driving their behavior: that paycheck. Hey, I need a paycheck, and this is what I know. So it's the human condition that's driving a lot of, I guess, creating the molasses for your industry. Is that a good way to put it?

Tom Walker:

It's funny, because I had a meeting, a lunch, with a former governor about a month ago, and we were having a conversation, and he said his least favorite word on the planet was stakeholders. And he said, let's have a meeting and get all the stakeholders together... essentially, he wouldn't go to those meetings, because there was no positive outcome, because everything in the room was a specific self-interest.

Tom Walker:

And I think it comes back to the same thing we were just saying a few minutes ago. When you bring people together from diverse backgrounds who don't have, you know, a play in the game, then I think you get to a faster outcome. The problem is, you can't do it entirely without them.

Tom Walker:

The unique thing I've learned is, I don't want the subject matter experts defining the outcome, but I know I need the subject matter experts to help me understand the complexities of each of the stakeholders, in order to be able to negotiate, to collaborate, to make sure that... you know, my father-in-law, who passed away a few years ago, had one of my favorite sayings. He said, people are always motivated for their reasons, not yours. And the key is being able to understand what everybody is motivated for and against. But the value of knowing those motivations comes in when you're ready to start executing on a plan, not when you're trying to figure out how to get there.

Brian "Ponch" Rivera:

So I want to build on this. On the drive over here I was talking to Moose, who also does this podcast with me, and we were talking about an allocentric view and an egocentric view of the world. The allocentric view is kind of what you're looking to create, and I don't think a lot of companies do this. Early on in the conversation you said we didn't know what this looked like, we were just kind of probing and trying to figure out what this company would be. Same type of thing.

Brian "Ponch" Rivera:

You looked at the landscape and said, hey, I know this is out there. We're going to go do this. You start with understanding the context and then you figure out, hey, what should we do to get there? And I think this is something we learned in the military. There's a positive side to it and a negative side. The positive side is we always start with multiple perspectives, understanding the outcome.

Brian "Ponch" Rivera:

Right, we run rock drills, rehearsal-of-concept drills, we do simulations, we test things out to see how they work, but we always look at the landscape, or a map, or something first. Industry doesn't do this. They look at their mission statements, they bring in McKinsey, McKinsey tells them what to do, and they just do it. So they're losing the ability to think for themselves. What I think you're saying is the same thing we've been saying on the podcast all along: understand the landscape and then work your plan out. Don't start with the individual who looks at the world from their own vantage point, right?

Tom Walker:

I think another thing we learned in the military: I have a joke when people ask, well, do you have a backup plan? And I'll say, I'm an old military guy. I have a primary plan, a secondary plan, a tertiary plan, and a holy-shit-I-hope-nobody's-looking plan.

Tom Walker:

Right? And I don't see that a lot in today's entrepreneurs specifically. I sat on a Shark Tank thing recently, and my favorite question to ask was, okay, if you get this investment into your company, and two days from now you realize you were totally wrong about one of the most important assumptions you made, what's your backup plan? Not one person had a backup plan. They were like, well, then we'll go back and reassess, and whatever. And I think, you know, in the military, especially when you think about submarines, when you deploy you're underwater. The team you've got is the team you've got, the supplies you have are the supplies you have, and the food you have is the food you have.

Tom Walker:

So there was always a backup plan. If this particular sensor dies, then this is how we fix it. If we can't fix it, then we'd run drills: what happens if you don't have one of those? Well, then I've got to determine which system has the reliability and redundancy I can steal from over here. You were always working contingency plans. I don't see that today.

Brian "Ponch" Rivera:

They talk about it. They talk about antifragility, optionality, risk mitigation. They just do not build it in.

Tom Walker:

It's just not built in as part of a business model or a business plan. And my favorite term I keep hearing today is, I want to know what your de-risking models are. And it's like, there's so much risk that maybe I'll spend a little less time de-risking and a little more time on contingency planning. I'm not saying that's right or wrong, but I think that comes from that military background.

Tom Walker:

I'm going to minimize risk, but I'm used to risk. I have to accept risk, so I'm going to develop alternative strategies in case that risk actually turns into a threat.

Brian "Ponch" Rivera:

Well, I'm going to ask you a question. Tomorrow the drone industry kicks your ass. What can you do with the folks in this room, or in this building?

Tom Walker:

You know, believe it or not, I don't think many people think about that. In the military, we used to think about, okay, if half the team gets knocked out tomorrow, then what are we going to do? The reality is, we are in a volatile industry. I just told you about all the complexities going on. There's no clear path for any of us, even the leaders in the industry, and there are some great ones. We've had companies that were rocking and rolling, 300, 400, 500% growth, and then gone the next day.

Tom Walker:

I think the reality is there's so much opportunity and there are so many ways to participate in this industry, whether it's manufacturing and supply chain, supporting regulatory work, helping develop policy, or software, sensor technology, AI. There are so many different opportunities that, you know, we've pivoted twice already. We pivoted from drone services to delivery, and then we pivoted back to what I think our core was in the beginning, which is technology. If we were to get smacked, well, it's like Mike Tyson, the great philosopher, said: everybody's got a plan until they get punched in the mouth, right?

Tom Walker:

You know, if we were to get punched in the mouth tomorrow, I think we would shake it off. We might take an hour or so to lick our wounds, and then we'd turn around and be back.

Brian "Ponch" Rivera:

Sounds to me like your company has been living the complex adaptive systems view of the world, which is: you probe, sense, respond. You go in a direction and you see what's out there. This is what we tell our clients: you can't define a future end state in a space that's complex, so you have to probe. You did some probes, you got punched in the mouth, and you pivoted. That's the whole point of probing, right? Find out what's going to work, and once the landscape changes, you change with it. And the drone landscape changes all the time, right?

Tom Walker:

Well, an interesting element I would add to that, and this is the missing piece: everybody says when you get punched in the mouth, then you rotate or you pivot or you spin, whatever the analogy is. But the missing piece is, why did we get punched in the mouth? Because sometimes you get punched in the mouth because you are a threat to an industry that's been around longer and is bigger than you, and what you were trying to accomplish or achieve wasn't the wrong thing. Maybe you just weren't quite prepared for how big the wall you were going to have to climb was. That doesn't mean we stop and go in a different direction. It may mean we back up, regroup, and go back in the very same direction.

Tom Walker:

And so I don't like it when people say you get punched and you pivot, because sometimes a punch can mean you're on exactly the right path. You just didn't realize the significance of the change you were trying to make. Nobody changes the world without resistance, right?

Brian "Ponch" Rivera:

So the big talk right now is resilience in an organization. How do you build that? We talk a lot about allostasis: as you find change in the landscape, you have to keep changing who you are as a living system or organization. That's opposed to robustness, right? Robustness is just getting punched and coming back and getting punched again, and you don't really learn, or you're being foolish.

Tom Walker:

Yeah, yeah, yeah.

Brian "Ponch" Rivera:

So the whole idea of resilience, again, is an emergent property of the complex adaptive system, of the people and what you're bringing together. And again, man, one of the reasons I wanted to have this conversation is that you took these lessons and you're living them, but you're not talking the language of a theorist. You're actually executing what needs to be done, whereas the academics out there will tell you what you should do. You're actually living it.

Tom Walker:

The problem I have with the theorists and the academics is that they spend more time telling me why what we're doing won't work, and why we're taking on too big, too significant a change in the way our national airspace system operates. If you listened to them and to the problem as they frame it, I mean, it's like going back to the very beginning and the impact we had on that hyperbolic failure curve by making the decision to be here. I'm not saying I don't listen to those people, but at the same time, I go back to my military days, both when I was in the submarine community and afterwards in my subsequent communities. I didn't want a leader who stood up and gave me a lot of theoretical hyperbole.

Tom Walker:

Tell me what the mission is, convince me that you're the right guy to follow on this mission, and then tell me how we're going to execute it. Save all the rest for the war college; when you get there, that's great.

Brian "Ponch" Rivera:

I think that's a great place to end this episode. Tom, appreciate your time. Any parting thoughts?

Tom Walker:

No, I appreciate it. Thank you for the opportunity. It's great to connect again.

Brian "Ponch" Rivera:

Small world, huh?

Tom Walker:

Absolutely. All right, cool man.
