The hosts sit down with security and intelligence expert, Brian Kime, to discuss the implications of surveillance technology and AI on intelligence, journalism, and civil liberties. They explore the dual nature of surveillance as both a tool for security and a potential threat to privacy. The conversation delves into the challenges posed by misinformation, the importance of expertise in navigating these issues, and the ethical considerations surrounding the use of surveillance technology. The episode concludes with reflections on the future of journalism and the role of AI in shaping public perception and trust.
[00:00:00] AJ Nash: Hello and welcome to another episode of NeedleStack. I'm one of your hosts, AJ Nash, the digital intelligence advocate over here at Authentic8. For those who don't know me, my background is 20-some-odd years in the intelligence community, chasing war criminals, fighting terrorism, and those kinds of things.
[00:00:33] AJ Nash: I've been in the private sector about 10 years now, primarily building intelligence programs, advising, intelligence-driven security as usual. I'm joined today by co-host Rob Vamosi. Rob, introduce yourself.
[00:00:43] Robert Vamosi: Sure. Hi. I'm a cybersecurity expert. I've been an award-winning journalist, author of two books, and now I'm focusing on intelligence and cybersecurity.
[00:00:53] AJ Nash: Yep, exactly. Rob's the smart one, for anybody who doesn't realize it. He's the smart one of the two of us. Except today we have somebody on who might be smarter than both of us. He'll certainly tell you he is, if you ask. No, I'm just kidding. Brian Kime is actually our guest today. Brian and I are old friends.
[00:01:05] AJ Nash: Brian's been a security and intelligence expert for decades. You'll notice, for anybody who's watching as opposed to just listening, Brian's got new gray hair in the front of his head showing just how old he is today. ("Thanks, AJ.") Like I say, we're old friends, right? Well, it just shows experience, Brian. But no, his background's fantastic.
[00:01:20] AJ Nash: This guy's retired Army Reserve. He worked at Forrester; he was an analyst in the community there. He's worked for Carrier. He's got a very interesting educational background as well: a master's degree in Information Security Engineering and a master's in Urban Policy Studies, but that was built on a bachelor's in architecture.
[00:01:36] AJ Nash: The bachelor's, by the way, is from Georgia Tech. Great university. So he comes from a really unique background; he's a really smart guy. And I'm really excited to talk today with both of you, Rob and Brian. So what are we talking about today, Rob? What's your idea?
[00:01:51] Robert Vamosi: So I want to talk about the rise of surveillance in our society. I know Europe's had CCTV and other things for years, but in the United States we now have Flock. We have other technologies that are tracking where we go in our cities, and I'm wondering how that either helps or hinders the intelligence community.
[00:02:11] AJ Nash: What do you think, Brian? Let's start with you, man.
[00:02:13] Robert Vamosi: Yes,
[00:02:14] Brian Kime: yes,
[00:02:15] AJ Nash: Yes. My answer's yes too. Go ahead, Brian.
[00:02:18] Brian Kime: The short answer: in general, there's just so much data, so many sensors out there, that it's hard to move around without being tracked. And that is beneficial and harmful for both sides.
[00:02:36] Brian Kime: So if you are the counterspy, if you're trying to track an agent in your area, if you're trying to track that case officer, you have a ton of data. There are a ton of sensors, like you mentioned, Rob: Flock cameras. I've got one just a couple blocks away. And law enforcement, you know, can use that to track a vehicle, which may not necessarily correlate to a person, but it certainly helps if you don't have to always put a physical tail on that agent or on that case officer.
[00:03:12] Brian Kime: You can be a little sneakier. You don't always have to have that physical tail if you're using something like a Flock camera. But if you are that intelligence officer, there's so much data out there to collect. Especially when we talk cybersecurity, you have tons of fingerprinting technologies, like JARM and JA3, JA4, all those over-the-wire fingerprinting technologies.
[00:03:40] Brian Kime: You have everything on your phone that is tracking your location, and that data's being sold. Then some law enforcement or intelligence agency can acquire that data and track your device everywhere it goes. And of course, by putting all these things together, you can do some pretty interesting tracking.
[00:04:02] Brian Kime: So it really benefits both sides, both the spies and the counterspies.
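[Editor's note: for readers unfamiliar with the JA3-style fingerprinting Brian mentions, here is a minimal, illustrative sketch of the idea. A JA3 hash is just an MD5 digest over a few fields parsed from a TLS ClientHello; real tools (e.g. the Salesforce JA3 project, or JA4 from FoxIO) parse these values from live packets and handle details like GREASE filtering, which this sketch omits. The parameter values below are illustrative, not taken from a real capture.]

```python
import hashlib

def ja3_fingerprint(tls_version, ciphers, extensions, curves, point_formats):
    """Compute a JA3-style fingerprint.

    JA3 concatenates five ClientHello fields with commas, joining each
    list of decimal values with dashes, then takes the MD5 of that string.
    Two clients whose TLS stacks offer the same parameters in the same
    order produce the same digest, which is what makes it a fingerprint.
    """
    fields = [
        str(tls_version),
        "-".join(str(c) for c in ciphers),
        "-".join(str(e) for e in extensions),
        "-".join(str(g) for g in curves),
        "-".join(str(p) for p in point_formats),
    ]
    ja3_string = ",".join(fields)  # e.g. "771,4865-4866,0-10-11,29-23,0"
    return hashlib.md5(ja3_string.encode("ascii")).hexdigest()

# Illustrative ClientHello parameters (not from a real client)
fp = ja3_fingerprint(771, [4865, 4866, 4867], [0, 10, 11], [29, 23, 24], [0])
print(fp)  # a 32-character hex digest, stable for identical ClientHello shapes
```

The point of the conversation holds in the code: the fingerprint is derived entirely from how a client speaks on the wire, so it tracks software (and often the person behind it) without any cookie or login.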
[00:04:07] AJ Nash: Well, yeah, and I think it's interesting. One of the challenges, or actually, let's start with the advantages. I guess one of the nice things about all this technology is the retrospective piece, right?
[00:04:16] AJ Nash: If something happens, I think there's an expectation, frankly, at this point, that we'll solve that mystery, that crime, really rapidly. Right? And in America, unfortunately, a lot of times that's gonna be a shooting. I mean, that's just where we are right now. There are a lot of public shootings and mass shootings.
[00:04:30] AJ Nash: It's very surprising if somebody gets far, right? Because even if you didn't see it coming, if you didn't have any heads-up or any warning, whenever somebody commits one of these heinous acts, there are cameras everywhere. Whether it's, like I said, Flock cameras, whether it's, you know, public tracking.
[00:04:44] AJ Nash: There's also the ability to, you know, grab all the content from cell towers, legally speaking, with proper warrants, et cetera, to start sifting through and see who was in the area, right? Plus, there are all the people that are there. Wherever you are, there are people, and they have their cameras and their phones, and so there's all of this content.
[00:04:59] AJ Nash: So when there's a manhunt that lasts more than 24 hours at this point, it's pretty remarkable. Now, I think that also opens up a real challenge to, you know, where we're gonna be in the future, and maybe where we've been in the past: the conspiracy theorist opinions of, you know, manufacturing that data or manipulating that data.
[00:05:17] AJ Nash: And then if you throw in things like AI now, right? Well, is a video an accurate video? Did somebody create the video? Did somebody manipulate it, change it to incriminate somebody who may not have been guilty? So we have all this content, all this technology available.
[00:05:30] AJ Nash: So forensics are really good, you know, on the back end, right, to be able to see what happened. The proactive piece is interesting because there's so much content. First of all, obviously there are all the laws, you know, that prevent people from just hoovering all this stuff up and watching it all live domestically.
[00:05:44] AJ Nash: For anybody who doesn't know, the US government doesn't do that, despite what anybody might tell you. There are a lot of laws in place. It very rarely happens with mass collection, and if it does, there are a lot of good reasons and a lot of legal hurdles to go through. But this would be a challenge for that as well, because now there's so much more coming in, you know, the whole hiding-in-the-noise factor.
[00:06:02] AJ Nash: If your content, your pinging of, you know, where your cell phone is, or wherever your facial recognition picture showed up, whatever, is mixed in with everybody else's, you know, there's a lot more to sift through. And then of course, technology helps us sift through it faster.
[00:06:17] AJ Nash: So it's this constant effort of: we need more and more content, because that gives us a better chance of getting the answers quicker; but then we need more and more technology to process it, 'cause you don't have enough people to get through it fast enough; and then you have to trust all the technologies, because you don't have time to double-check everything.
[00:06:31] AJ Nash: It's an endless game. I think overall we are in a much better position. You know, I was watching, there's a show on TV, I won't name it 'cause it doesn't matter much, but it's set in 1850s America. Every time there's a conflict there, I'm just thinking, well, why don't they just go ahead and kill him and drag him into the woods and dump him? Like, there were no forensics back then.
[00:06:50] AJ Nash: Just make 'em disappear and it won't be a problem anymore. It was so much easier back then. People died, and it happened all the time. You were in the wilderness, and, well, Bob walked away and he never came back. Now, that would not be the thought process. It shouldn't be anyway, by the way, to just kill people.
[00:07:03] AJ Nash: But it would be much more difficult now to just go, oh, I've had enough of this guy, I'm mad at him, I'm gonna kill him and throw him in the woods somewhere. No, you're not. You're gonna get caught very, very quickly by all this technology. So I think it's made us better, but it's also very, very complicated.
[00:07:20] AJ Nash: What do you think, Rob?
[00:07:21] Robert Vamosi: So we're recording this episode in the middle of December 2025, and over the last weekend there were some dramatic, violent acts committed. One of them was at Bondi Beach in Australia. Immediately after that, somebody concocted a story that the individual who stopped the shooting from being worse than it was, was a white IT male from Australia, even though the video disagreed with that.
[00:07:48] Robert Vamosi: The point that I'm making is, Ben Collins pointed out that it was absorbed by some of the AIs. And for a while, this individual that was created, his name was Edward Crabtree, propagated through these AI systems as the individual, when in fact it was Ahmed El Ahmed, a 43-year-old immigrant to Australia.
[00:08:13] Robert Vamosi: So where I'm going with this is that we have these AI engines picking up on misinformation and propagating it out. Then along comes the investigator, and the investigator may be under a time crunch. I know there are rules and there are ways it's done, but nonetheless, that first glance at what they see is what they might include in a report, or what informs the analysis going forward, because that's what they picked up.
[00:08:43] Robert Vamosi: Is that necessarily true? Or has that, you know, always been the case, and we're just more sensitive now because of these AI agents?
[00:08:51] AJ Nash: Brian, you wanna go first? I mean, I'll jump in, I'll do it, but I just feel like I just spoke for 12 minutes, it felt like, so I don't wanna dominate the show. Brian, what are your thoughts?
[00:09:00] Brian Kime: Yeah, I mean, AI and surveillance, I think, go hand in hand. I don't think you can talk about surveillance without talking about AI. And to AJ's earlier point: can you trust the video, the first reports coming out of any event? You know, AI can certainly help us. This podcast is called NeedleStack, right?
[00:09:21] Brian Kime: It can help us find that needle in the needle stack. But like any data analysis, if there's garbage in, you're gonna get garbage out. And so, you know, as Rob mentioned, if someone has an erroneous report, or the wrong information, and gets it out first, those LLMs sometimes will pull that in first and then continue spinning out the wrong information.
[00:09:47] Brian Kime: Same thing goes if you're, you know, looking at what happened at Brown University, or we're talking about cyber breaches. If the data is bad, then it's gonna be hard, you know, for us to use AI to get accurate insights and accurate answers, and then use that for whether it's indications and warnings, or use it on, you know, the prosecution side of the house.
[00:10:17] Brian Kime: So it's a mess.
[00:10:23] AJ Nash: So I think, to answer part of what you'd asked, Rob: it's always been so, right? The challenge with intelligence, and law enforcement has the same challenge, is a lot of times there is that time crunch, right? And so what we're all trained on is caveat language, right?
[00:10:41] AJ Nash: Okay, so I don't have a lot of time, I'm gonna do something really quickly. You need an initial assessment, not a final assessment, an initial assessment. I only have an hour. There's a good chance whatever I give you is going to have a lot of caveat language, you know: based on limited reporting, possible, probable, likely. It's a lot of maybes.
[00:10:57] AJ Nash: Really good professional intelligence personnel rarely jump in within an hour and go, this is who did it, absolutely, a hundred percent. That's actually a red flag. You see somebody doing that, there's a good chance you should pull them aside and find them something else to do with their time. So there's gonna be a lot of caveat language early on, and I think we're still gonna see that.
[00:11:15] AJ Nash: But I do think there's a concern. The challenge we have with AI is, it can be a wonderful tool and really, really helpful, but there are a couple of challenges. Evolutionarily, humans are designed to find the easiest solution to problems, the easiest and quickest solution. That's just how we're built.
[00:11:31] AJ Nash: It's how we've been built forever, right? That's just how we're designed, so AI taps right into that. As opposed to going out and doing a bunch of research and trying to really find the answers and the original sources: let me just ask this platform, and oh, it'll give me sources. Will I check those? I will. Will everybody?
[00:11:48] AJ Nash: Probably not. And the longer you get into it, the easier it is to be sucked into that, even if you check sources, even if you're an expert and a professional. I got 200 sources. All right, I checked the first 12, they all seem good. I'm probably not gonna check the other 188. I should, but there's a chance I'm not gonna, 'cause I'm on a time crunch and somebody wants an answer, and I've got some caveat language anyway, so it's good.
[00:12:05] AJ Nash: And so we're gonna see more of that happening, where people rely more and more on these platforms, and we don't know what's going into them. And you've given a really good example. You know, Ahmed El Ahmed deserves an incredible amount of credit. He is a hero; no other word for what he did at Bondi Beach.
[00:12:21] AJ Nash: He took a couple of rounds and saved a bunch of people's lives, and could have easily run away. He was not part of the fray; he was behind the shooters. He could have left, and he chose not to. And to have that turned into a different story that then had to be unraveled, and has been, thankfully, publicly, very quickly.
[00:12:35] AJ Nash: So the fake story's been overwhelmed; I'm sure the LLMs now would give you the right answer. But had that fake story gotten a little further out, it gets harder to reel that back in. And what I'm really worried about is the next generation. So there was a story not long ago; it just got some new hubbub.
[00:12:51] AJ Nash: A report came out saying 80-some-odd percent of all ransomware is AI-driven. Now, listen, that report's garbage; there's no other way to play it. It was branded by MIT; it's actually from a consortium between MIT and the private sector. And when you dig into it at all, you can see that it's garbage.
[00:13:08] AJ Nash: There's no other word for it. I hate to be so bold about it, but there's no other word for it. I mean, they included examples that were WannaCry and NotPetya, which were from 2017. For anybody who doesn't know the timeline here, that's five years before ChatGPT came out as the first public AI.
[00:13:23] AJ Nash: So it's very unlikely those were AI-driven. I don't think the cybercriminals were sitting on multi-billion-dollar technology so they could do little ransomware attacks. And from there, the data gets worse and worse. The problem is that product went out in April, and nobody paid a lot of attention to it early on.
[00:13:38] AJ Nash: By the time it got some rebuke from experts in October, it had been all over the internet, and MIT quietly just pulled that paper down; it doesn't exist publicly now. You gotta go to the archive to find it. If you Google the terms, you'll see dozens and dozens of websites that have this data, this quote, this inaccurate information: vendors and academic institutions and publishers.
[00:14:01] AJ Nash: And that's all content that then goes into AI and trains the next round of AI and the LLMs. So how am I, knowing that it's wrong, going to win an argument in six months with somebody who says it's right, when they show me all of their sources, which is all circular reporting leading back to MIT's false report that no longer exists?
[00:14:17] AJ Nash: How am I gonna convince them that they don't know the truth? And that's gonna be our problem with a lot of our research and a lot of our investigative work: all this content that's coming in has to be accurate. This extra observation we're talking about, and, you know, all the surveillance: if it's manipulated and it's inaccurate and it's poured en masse into these technologies that we're gonna become more and more dependent on, we're gonna lose sight of what truth is.
[00:14:41] AJ Nash: And it's gonna be very hard to win an argument, or win your freedom in court, when the preponderance of the evidence, which is all fake but is impossible to trace back, is against you. And so that's the far end of where I think we could go with this technology, unfortunately.
[00:14:58] Brian Kime: Yeah, I think the reality is that gen AI makes everybody a mediocre intelligence analyst, a mediocre investigator, a mediocre lawyer, you know, a mediocre doctor, whatever it is. And expertise actually may become even more valuable than it was even just a few years ago.
[00:15:22] Brian Kime: So if we're talking, you know, surveillance: in the case of Bondi Beach, the AI is gonna look at that individual, and maybe it's going to inadvertently classify that person as something they are not. But that expert will look at that and correctly diagnose it, pull apart that video, and accurately tell you what happened and why.
[00:15:50] Brian Kime: Whereas the AI just helps, like, a random individual kind of get it 50 percent correct. And so true expertise hopefully will come out of this and be identified. Everyone else can just be mediocre, but the true experts will rise above and actually, in the case of intelligence, help decision-makers make better threat-informed decisions; or, in the case of digital forensics or crime-scene forensics, actually help find killers and so forth quicker. Where the other, you know, 99 percent of people are just
[00:16:32] Brian Kime: looking up ChatGPT or Copilot or Claude or whatever, throwing a few random pieces of information in there, and then hoping that the math tells them what happened in the real world, which, you know, it rarely can ever do accurately.
[00:16:45] AJ Nash: Right. Well, and Rob, I mean, we're talking about this from an intel standpoint and from a law enforcement standpoint, but what about journalism?
[00:16:50] AJ Nash: I mean, that's your background, right? Aren't we seeing the same issue, where it's creating a bunch of mediocre amateur journalists? Is it doing the same with professionals? Do they get tricked into the laziness that affects all of us with AI and start publishing things that are inaccurate?
[00:17:03] AJ Nash: Like, how's this gonna impact journalism, which we're all gonna depend on as a source of truth?
[00:17:08] Robert Vamosi: Well, in a previous episode we talked about citizen OSINT and how easy it is for everybody to become an investigator today. The same is true in journalism. You have people that claim to be journalists and don't really have the background.
[00:17:22] Robert Vamosi: And yeah, they'll take a story and they'll run with it without asking those questions, like: is this too good to be true? Can I get a second source to confirm it? That takes too long; we don't need a second source. No, you do need a second source. You really need to vet the information. And certainly when I write a book, I worry about all the things that I'm saying in the book.
[00:17:44] Robert Vamosi: It's like, well, somebody's gonna come back and say that's not true, because here's this thing that I didn't find. Ah, you know. But that hasn't happened, fortunately. There are ways, and I think, Brian, you and AJ emphasized that: with good training, the expertise can actually get through all of this stuff.
[00:18:02] Robert Vamosi: I'm thinking of a decade ago, when the Boston Marathon bomber was caught. He and his brother were identified because it was such a public event. Everybody had their cameras out. But even then, there were cafes and shops that also had their cameras. And within, Brian, tell me, a couple days? It was less than a couple days. It was very quick.
[00:18:26] Robert Vamosi: They were able to sift through all of that and identify the individuals of interest.
[00:18:34] Brian Kime: Yeah, exactly. That terrible bombing in 2013 was a good example of using all these CCTV cameras and the power of social media. At the same time, if I recall, some of these citizen journalists actually pointed to a couple other people that were not suspects, and that type of crowdsourced investigation
[00:19:02] Brian Kime: can harm real, you know, innocent people, because everyone wants to be first. And I know that's gotta be a challenge for you, Rob, as a journalist: there's such an emphasis on being first with a story, and not as much on being correct. But, you know, expert journalism will rise to the top, and it kind of makes me want to pay for more journalism, because then I know
[00:19:32] Brian Kime: that they are following good journalistic practices. They're not just taking random surveillance video and throwing it out onto social media without context, or, you know, claiming that it is something that it isn't. Right? I want true expertise, and, you know, hopefully, Rob, you and your brethren in the journalism industry will continue to do great journalism and make it worth paying for. Right?
[00:19:59] Robert Vamosi: Going back to this past weekend at Brown University: they identified somebody, had the person's photograph and name plastered everywhere. This was from the FBI. And then they retracted it.
[00:20:12] Robert Vamosi: That individual is no longer in custody, and they're now looking for somebody else. Meanwhile, on the flip side of the weekend, you had Rob Reiner and his wife murdered. And of all the sources in the world, People magazine came forward and said it was most likely his son that they're looking at. And nobody touched it for about 24 hours.
[00:20:36] Robert Vamosi: And then all of a sudden the LAPD confirmed that they were looking at the son, et cetera, et cetera, et cetera. So there are good journalists out there. Sometimes they're in places like People magazine, where you least expect them to be, but they do have good sources, and they do vet them before they go public.
[00:20:53] AJ Nash: And I think those are two really good examples, in my opinion, and I'd like your guys' opinion on this, of why it was handled differently, right? So the university case is a case that was going to affect the public, right? There's a fear of its randomness, so anybody feels like they could be a victim.
[00:21:10] AJ Nash: This was just some students at a university. Rob Reiner? Not many people were going to fear that the person who killed Rob Reiner was gonna be on the loose looking for them next. Right? Rob Reiner is part of a different class of people, in a different community. It was likely gonna be personal.
[00:21:22] AJ Nash: It was a double stabbing of a husband and wife. So I wonder if that had something to do with it: the public pressure, right? When there's a mass shooting, like a school shooting, or a random attack or a bombing, for instance, something public like that, I feel like there's a lot more pressure to get resolution fast, right?
[00:21:39] AJ Nash: To calm the public, to ease the public's mind. The Rob Reiner tragedy, and it is a tragedy, I think affects a lot of people emotionally, but I don't imagine there were a lot of people that thought of it and immediately thought, oh God, somebody's on the loose killing people, we could be next. Right?
[00:21:53] AJ Nash: So I wonder if that doesn't allow law enforcement and journalists the right amount of time, right, to process and to work through this. Whereas it seems like when it's a much more public case, there's such a rush to get something, to prove an effort, even if it means we brought in the wrong person. Hey, look, we're doing something, right? And then unfortunately that person's life is turned upside down.
[00:22:14] AJ Nash: What do you guys think? Does it matter if it's something that feels more isolated? Does that give people more space and time to do proper, you know, procedure, and quiet, as opposed to something that's a public panic, where it just feels like there's pressure to get some answers out faster?
[00:22:28] Robert Vamosi: Well, I'll take the lead on that. I mean, I think journalism has caused the problem, because there's this expectation of 24/7 news. And as I know, news doesn't happen in regular fits and spurts. It just trickles out, and then sometimes there's no news. So what do you do? How do you fill that gap?
[00:22:50] Robert Vamosi: How do you keep people interested? And so it doesn't always work with journalism. We've built this sort of expectation that 24 hours a day you can get news. Well, news doesn't always happen, and this past weekend is a great example, where we had three major stories break, boom, boom, boom, one right after another.
[00:23:09] Robert Vamosi: That doesn't happen very often, fortunately. And, you know, there were different circumstances, as we just said, about why people were misidentified and why news outlets waited before they actually confirmed, and so forth. All different expectations. But I'm gonna say that journalism kind of caused this rush to judgment and the snap decisions that we're now being asked to make.
[00:23:31] Robert Vamosi: But that's my take. Brian?
[00:23:34] Brian Kime: I think, sadly, politics plays a role here in each of these stories, and it's related to journalism. I apologize in advance, Rob, if this offends you, but I think some journalists, you know, are too aligned politically with certain other groups, and come at stories, current events, longer-term stories, whatever it is, from a particular political angle, and will sometimes, oftentimes perhaps,
[00:24:10] Brian Kime: work that story based on how it's either gonna affect their political tribe or the other political tribe. And so Rob Reiner, you know, while it's a terrible tragedy, I don't think politically you can spin this too much. But the other two events, Brown University and Bondi Beach in Australia, are both politically charged topics. Stateside,
[00:24:34] Brian Kime: there's the, you know, persistent debate over the Second Amendment and gun rights, and then overseas in Australia you have antisemitism and immigration. And these are all very, you know, hot, passionate political topics. And so how the professional journalist and the citizen journalist go at that current event, I think, is definitely colored a lot by their
[00:25:03] Brian Kime: politics, frankly.
[00:25:05] Robert Vamosi: I agree with you on that. And we're all liberals, just putting it out there. All journalists are liberals; that's just a fact, and we can't shy away from it. But no, that's why you have newsrooms, that's why you have different people editing, and various processes. But again, it goes back to that
[00:25:24] Robert Vamosi: rush to judgment: being first, having something new in that 24/7 news cycle. I think that is really the problem there.
[00:25:33] AJ Nash: Yeah, no, I agree. I've said for years that the worst thing that ever happened to journalism was when it became a big for-profit business. You know, when I was a kid, which now seems forever ago,
[00:25:45] AJ Nash: you had the morning newspaper and the evening newspaper, and you had the nightly news, and that was, you know, basically it. And I mean, I remember when CNN first started, right? But: morning paper, afternoon paper, nightly news. And I don't feel like the people who ran the nightly news were really worried about their advertising dollars.
[00:26:00] AJ Nash: I'm sure those concerns existed; there was obviously advertising still, but it wasn't the same, right? And it wasn't as much competition either. You had, like, three networks, and that was about it. And it felt like the ideal was: hey, let's get the best story out, the most accurate news out to people, not the story that's gonna catch the most attention.
[00:26:16] AJ Nash: You know, there weren't a lot of teasers. I mean, hell, now everything's a teaser. Even if something's important, it's, you know, "What's gonna kill you in your own house? See us at five." It's like, oh my God, why don't you tell me right now if something's in my house that's gonna kill me? You know?
[00:26:29] AJ Nash: And that's cliche, of course, but back then it was like: hey, here's the top story, here's what's going on, here's what you need to know. It was just facts and figures, you know, and eyewitness accounts and estimates and numbers, and they had time to do it properly. And then came this 24-hour cycle that you mentioned.
[00:26:45] AJ Nash: And the competition for it: there are several networks that are 24 hours a day. And then it became more about entertainment, it became more about advertising dollars, and then you stop saying things the same way, right? You may still try to get to the truth, but: how do you want to present the truth?
[00:27:02] AJ Nash: You know, as opposed to just being bold and brash and not worrying about people's opinions, 'cause truth is truth, it became: well, what are we gonna do that's also not gonna offend the audience? How are we gonna attract more audience and not lose audience? And so I think over time we've run into a real issue with that.
[00:27:14] AJ Nash: And every once in a while you see some shining examples of people that just go against that and say, Hey, these are the facts, like it or not. You know, I know our particular demographic won't care for this, but this is how it is, this is what we've got as evidence. And we see people that, you know, change sides, which is unfortunate, that there are sides in journalism now.
[00:27:30] AJ Nash: But I think all of this still comes back to the original discussion, which is where does the surveillance state fit in? Where does the technology fit in? And I think, ideally, if we can get to truth and get people back to just saying, Hey, this is what we've seen, this is all the evidence we have, and we validated it, we should be able to agree on what facts are.
[00:27:51] AJ Nash: But I feel like we're at a point now where we don't anymore. I think the term alternative facts became a popular thing seven or eight years ago, which, by the way, there's no such thing as an alternative fact, for anybody who's paying attention still. I think we started down this path of changing truth to meet whatever we want it to be, and we have people now living in alternate worlds as a result.
[00:28:08] AJ Nash: And I think that's super dangerous, and it's gonna affect us more with more technology and more observation and more available evidence. People are still gonna try to manipulate it to further their argument as opposed to furthering the truth, which I think most people are just going to lose track of, unfortunately.
[00:28:24] Robert Vamosi: I wanted to take it back to the intelligence community and how surveillance is impacting that, in the sense that sometimes you have to be deceptive. You have to go undercover and you have to meet in person with someone. But with all the surveillance technology, with facial recognition, with your cell phone pinging towers everywhere, et cetera, how's somebody
[00:28:49] Robert Vamosi: in law enforcement supposed to do that type of human intelligence?
[00:28:55] Brian Kime: It's very tough, obviously, when, yeah, you are being tracked everywhere. Obviously we have, you know, online forums where you can meet up, but even those can be surveilled as well. Deception and counter-deception is a big part of this now, and you could take
[00:29:16] Brian Kime: a phone and tape it to, like, someone's Uber or something and have them drive around and confuse, you know, the teams that are tracking these individuals, and then slip out the back door. Perhaps, if the means of tracking an agent or a case officer is the phone, you know, if you send the phone out on a trip somewhere and people are following that around,
[00:29:44] Brian Kime: then you can walk out, you know, the back door. So, you know, intelligence and deception and surveillance have all changed. You know, how we manage our personas both online and in the physical world, and how we, you know, conceal movement or how we fake movement, it's all changed, and it's, you know, incredibly complex.
[00:30:06] Brian Kime: And, you know, I think for a while, you know, people mocked the, there was a case a couple years back where this sort of amateurish individual, an American traveling in a foreign country, had, like, masks and wigs and stuff and, you know, was trying to kind of go old school with his, you know, clandestine meetings.
[00:30:30] Brian Kime: But if you can fool these AI enabled cameras, you know, with some pretty cheap Hollywood-type makeup and special effects, you know, maybe you can have some freedom of maneuver, of movement, in a non-permissive environment like a Moscow or a Beijing or something like that.
[00:30:54] AJ Nash: Yeah, I mean, I think we've seen that there's some capability to fool the facial recognition, right? Like you said, you know, it's complicated. I think the era of COVID helped; for some folks, masks work really well, frankly. And the ability to be in public with a mask on and not be seen as out of the ordinary certainly can be helpful,
[00:31:12] AJ Nash: 'cause you can mask a whole lot of your features that way. And there's a lot of different types of masks that people have used, not just the standard hospital masks. So there's that, and, you know, minor prosthetic work that most people can do, you know, at a costume shop is probably enough to help out.
[00:31:23] AJ Nash: But I think you gotta make sure you know what you're doing on the other end too. Like you said, Brian, you do all that and then you bring your phone with you, it doesn't really do you a lot of good. Or you bring your, you know, Fitbit with you, or any number of things that can be tracked, or you use
[00:31:36] AJ Nash: a credit card or a payment system that ties back to you. Like, it's the full gamut. You've gotta be able to be a different person. So I think for those who are in that line of work, you know, in clandestine work, I think it's probably more challenging, but certainly within the scope of what they're trained to do, right?
[00:31:54] AJ Nash: You just have to assume the environments are more hostile than they used to be because the technologies are everywhere. Whereas you might have once limited your highest level of obfuscation to the most hostile and concerning environments, now you have to assume that's everywhere, right?
[00:32:08] AJ Nash: Because again, the technology is everywhere. But I think those who are familiar with tradecraft and are skilled are probably still doing well. You're not gonna be able to get away with a fedora and a big, you know, a big trench coat. I don't think that's gonna play out very well.
[00:32:22] AJ Nash: You can't just hide your head from the cameras like you see on TV. But certainly, the facial recognition software is far from perfect, and it doesn't take a lot to modify. Still, you know, a mask and a good pair of sunglasses and a decent hat, and you're probably gonna make it very difficult to prove that that person is who they are, and the systems won't pick 'em up.
[00:32:38] AJ Nash: But you also have to make sure your technologies aren't with you, you know? And that becomes more challenging, right? And you gotta do the same thing with whomever you're meeting with, you know, if they're under surveillance. So I think all of that makes it more challenging, but I suspect, as we said earlier, it's more about the back end, unraveling something after something's happened, being able to go back and find evidence.
[00:32:56] AJ Nash: I think then you're in a better position to look back and go, oh, it turns out, yeah, these people were here and this looks suspicious. Proactively, it's very difficult. These systems aren't just out warning people, Hey, I saw this guy's face, here's where he is right now. There aren't people just actively monitoring, you know, millions of cameras all the time looking for, you know, facial matches or anything.
[00:33:11] AJ Nash: So I think it might help people piece things together at best, but, you know, on the front end, it's still gonna be really difficult to stop people from doing clandestine activities. The technology is easier to track than the faces.
[00:33:25] Brian Kime: Unless everyone starts wearing those AI enabled glasses that have connectivity and are scanning everyone's face as you're just walking around town, and, you know, sending that back to a database, and then saying, you know, this is Rob and over there is AJ.
[00:33:42] Brian Kime: And that's scary as f.
[00:33:44] AJ Nash: Yes, and that's gonna be happening. I don't even disagree with you; that's not even sci-fi. You get all those Meta glasses out there, you know, again, it's a whole nother stack of data. And yeah, if they're tied to facial recognition or they're tied to my connections, right?
[00:33:56] AJ Nash: So if I meet up with you for lunch, the system doesn't have to guess who you are, because you're one of my connections. You're one of my contacts. So my contact list is gonna be accessed, and it's gonna say, oh, hey, Brian Kime, this is who you're meeting. And by the way, here's his resume and here's what he's done.
[00:34:09] AJ Nash: You know, social media, here's what's new in his life, so make sure to ask him about his new, you know, car or whatever, right? All this stuff's gonna just get populated into my eyes, so I'm ready to go. Well, somebody else has all that information too. There's databases, there's something collecting that. So if everybody has these AI enabled glasses on.
[00:34:24] AJ Nash: Everybody's gonna know where everybody is all the time. And somewhere there's gonna be content available to be accessed, making it virtually impossible to hide, because even if you're hiding from somebody, you're probably not hiding from your own people. And so it's gonna be very difficult to hide for somebody who is sought out as those technologies continue to advance. And people are gonna do it, 'cause they make their lives easier.
[00:34:44] Brian Kime: And I see, if law enforcement were to adopt these types of technologies, what does that mean in the context of the Fourth Amendment, right? Normally, on the street, I don't have to identify myself to a police officer unless, like, they have observed me or there's a report of me committing some crime, right?
[00:35:05] Brian Kime: But if I'm just walking down the street and an officer's walking the other direction and identifies me, is that a Fourth Amendment violation? That's tricky. Are we gonna allow law enforcement to use this? Now, typically, if you're, you know, on the public street, right, you don't have an expectation of privacy.
[00:35:25] Brian Kime: So I think this is gonna be really interesting if that technology becomes, you know, affordable for law enforcement, and if it's adopted at all. Of course there'll be challenges to it, but I really worry, you know, about whether that's gonna be kosher under the Fourth Amendment. And, you know, I'm sure it'll take some, you know, some fugitives off the street, but is it at the cost of, you know, our civil liberties?
[00:35:54] AJ Nash: Well, imagine a simple traffic stop, and an officer walks up and they have AI enabled glasses on. So it's a simple traffic stop; they don't have any probable cause to search your vehicle for anything beyond the traffic stop.
[00:36:05] AJ Nash: You had a broken taillight or you were speeding or whatever. But suddenly, as they walk up to the car, the glasses identify this person as somebody who has a long history of drugs, you know, use, selling, whatever it might be, right? Does that now become probable cause? 'Cause you didn't have it before, and that person hasn't displayed any activity that would've qualified as probable cause in that moment, but now you know a lot more about them.
[00:36:24] AJ Nash: You know, guilty knowledge, because your eyes have told you. Does that now create a scenario where you can, right, search their vehicle? And again, this is assuming they're not on probation, where anybody can search the car of somebody who's on probation automatically.
[00:36:33] Brian Kime: Another excellent question. Alternatively, the passenger, right?
[00:36:37] Brian Kime: Because typically at a traffic stop, right, only the driver's really in scope. Yeah. And now, if the officer bends down, he sees that, you know, AJ's sitting shotgun next to me, and, you know, he was caught doing bad things, you know, a while back. Now is that probable cause to go arrest the passenger or search the vehicle? There are all kinds of murky areas here, and that technology can be pretty scary.
[00:37:05] Brian Kime: And of course, what happens when it goes wrong, right? When a human makes a mistake, in the wake of Bondi Beach or Brown University or whatever, it sucks. You know, when the AI does it, I mean, one, who's responsible for that? And two, what does law enforcement do?
[00:37:24] Brian Kime: What does the citizen do when they are not the person that the AI told the cop that they are, right? And then you just comply and hope it works out? Hopefully some folks are thinking about this; hopefully some lawyers are thinking about guardrails around that technology if law enforcement were ever to use it.
[00:37:46] Brian Kime: And, you know, I would tend to hope they would be very deliberate and slow in adopting that. And hopefully those guardrails will be in place, and, you know, there'll be some opinions from some very knowledgeable, you know, judges and prosecutors and so forth about the admissibility of some of that surveillance data.
[00:38:06] AJ Nash: Sure. Well, and then it gets back into how trustworthy is the data? Is it gonna be a mistake? Is it gonna be manipulated intentionally, right? Is there gonna be, you know, again, anytime there's a large database of content that's useful to somebody, that means there's a couple things. It's a target to be stolen, obviously, and it's a target to be manipulated.
[00:38:22] AJ Nash: So, you know, what if somebody wants to go in with the intent of manipulating data to create specific outcomes? You know, you could have a whole class of people that find themselves being discriminated against unintentionally. Law enforcement, for instance, doesn't have anything to do with it.
[00:38:36] AJ Nash: They don't know, but the data has made this happen, and now a class of people are being discriminated against, or a specific individual is being targeted who has, you know, done nothing wrong, because some nefarious third party manipulated that data that everybody's depending on. So again, we get back into more data security and reliance on all these technologies.
[00:38:52] AJ Nash: And we're seeing some changes right now in the laws, and what's allowed when you stop somebody, and what's probable cause now, and what's, you know, what is profiling. So the laws are currently shifting as we're going, and if this technology gets introduced, we're gonna learn the hard way. I think things will happen,
[00:39:06] AJ Nash: and then afterwards we'll get to the courts and find out what should or shouldn't happen. 'Cause we're gonna have law ahead of technology, which happens, or technology ahead of law, which happens. I mean, this is, unfortunately, you know, not an uncommon thing. And so we're gonna find out after the fact.
[00:39:18] AJ Nash: And it's all happening very, very rapidly. Technology is getting in the hands of good guys, bad guys, all the above, you know, whichever side you think you're on, good or bad, and being used in ways that weren't anticipated, weren't intended, weren't understood. And so we're gonna see all these consequences.
[00:39:35] Robert Vamosi: So one of the good uses of AI is pattern matching. And I think Brian raised this in another conversation: whereas you'd be looking at satellite photos comparing two images of a location, the human eye can detect only so much, but an AI is going to excel at determining that, like, oh, the garage door was left open in this picture, but it's closed in this other picture.
[00:39:59] Robert Vamosi: That type of detail is possible with a good use of AI, is it not?
[00:40:06] AJ Nash: Oh, absolutely. I think it also can help debunk mistakes on things like time of day, for instance. I think imagery analysts are very good at this in general, right? To be able to look at a photo and understand what time of day it's supposed to be, where, you know, the sun is in relation to that geography, and be able to look at where the shadows are supposed to be and measure that out.
[00:40:24] AJ Nash: Imagery experts do a really good job with this, but it's still challenging en masse. Whereas AI can do that kind of work quicker and be able to say, no, you're saying this was at a specific location at a specific time, and that does not add up. That is not how the shadows would look here, that's not where the sun was in the sky, and actually it was in this location.
[00:40:38] AJ Nash: So there'll be a lot of opportunity to use AI to debunk mistakes, or intentional manipulations as well, at larger scale, right? Humans do this and do it pretty well, but it takes time, and I think the machines are gonna be in a better position to do some of those things. So AI is really good at sifting through large amounts of data and helping us get to better answers on things like that rapidly.
[00:41:00] AJ Nash: It's just when you get into the more complex questions and answers and problem solving that it gets more difficult, and easier to manipulate, and, you know, to get something in the mix that's either a mistake or an intentional manipulation. But I think you're right that, you know, the change engine of AI, being able to recognize change quickly, is something it's really, really good at. It can see a very complicated picture and quickly go, no, this, this, this, and this moved.
[00:41:25] AJ Nash: You know, it's something it can pick up very quickly. It's good at pattern recognition.
[00:41:30] Brian Kime: Definitely, AI will make every geospatial analyst adequate at their job. But yeah, it really takes that expert that's been studying a target for a long time and understands all those intricacies. If you're watching a nuclear weapons facility, right,
[00:41:51] Brian Kime: you know, what does it mean if a louver on a ventilation shaft shifts? If it changes, you know, if there's five vehicles parked out front on Thursday and there's nothing on Friday, what does that mean? That's where the cultural knowledge, that's where other knowledge about nuclear engineering and all sorts of other things, comes into play, that AI may not really, you know,
[00:42:16] Brian Kime: take into account, right? But AJ, I want to go back real quick to a point you were making about a minute ago, about if we do have law enforcement wearing these AI enabled glasses and you're collecting all this data, that data becomes a source for theft. Well, there's a data breach that was announced the day before
[00:42:34] Brian Kime: we are recording this, of a very popular website that caters to people's vices, and using AI to sift through that stolen data might be interesting. Oh boy. And then it's a hub, to your other points.
[00:42:51] AJ Nash: Oh, it's a hub, is it? Okay.
[00:42:56] Brian Kime: Rob.
[00:42:57] AJ Nash: It's a, oh, is it? I gotcha. Okay.
[00:43:03] Brian Kime: And yeah, manipulating data. Both sides, the group that has said they have stolen this data and the victim, have admitted that there's been a data theft. And now, yeah, can that data get manipulated? What if that gets into the public domain? That presents all sorts of challenges.
[00:43:28] Brian Kime: And then there's plenty of metadata inside that data breach that, you know, could be enriched with other surveillance data and so many other things. And, you know, think about, you know, family law cases and things like that. It's, yeah, it could be very, very damaging for a lot of folks, and for a lot of folks that weren't actually victims.
[00:43:51] Brian Kime: But if I just decide to throw AJ's data in there and claim that, you know, he was doing whatever, right, then he's gotta prove that he's, yeah, not guilty of it.
[00:44:03] AJ Nash: Well, and that's a big part of it, is just creating a scenario that forces people to spend time and energy to defend themselves.
[00:44:08] AJ Nash: Even if it's completely false information, at best I gotta waste time and energy disproving it. At worst, there are people who will never believe me no matter what, because that's just how life works, right? Or people won't even hear the truth, right? They'll hear an accusation, they believe the accusation, they go on about their day,
[00:44:24] AJ Nash: 'cause we get a million headlines a day. And it turns out months and months later that that person was exonerated, but you never hear that part of the story. And so there are people who believe all sorts of things that aren't true in this world, because the original headlines said so and they never followed the case to see what the result was.
[00:44:37] AJ Nash: Because we don't have time and energy to do that, and it's not the splashy headline. That splashy headline of what happened is front page; the retraction's on page seven in a small corner someplace, because, you know, it's not newsworthy, you know, as part of it. And plus people don't wanna admit it.
[00:44:52] AJ Nash: But yeah, you talk about a breach like, you know, this website. There's also been, I mean, the Ashley Madison breach, what was it, a decade ago or whatever. Yeah. And that was really embarrassing for a lot of people. For those who don't know, it's a website that caters to people who want to have extramarital affairs, and some of the folks who were breached on that one, they used government email accounts.
[00:45:10] AJ Nash: I mean, it was pretty bad all around. It's bad behavior, but you really shouldn't use your government email account to do that; makes it just a little bit worse somehow. Now, for what it's worth, it never even occurred to me then. I didn't dig into it too deeply to begin with, but it never occurred to me.
[00:45:24] AJ Nash: Then it would've been very easy to just ram in a few other people. Hey, here's somebody I think is a real jerk at work. You know, lemme get part of the download of the breach and just throw in this guy's data and then send it out, you know, republish it someplace. And all of a sudden this GS-14 that I've never liked working with,
[00:45:39] AJ Nash: well, now he's in the Ashley Madison data and he's gotta explain himself to people. It would just cause problems. And there's so many opportunities to do that again, because there's so much data, right? There's so much coming in, you know, back to that surveillance discussion we started, whether it's video, whether it's audio, whether it's metadata, whether it's website, you know, content.
[00:45:56] AJ Nash: There's so much out there, which offers up so many opportunities to manipulate it, to create, you know, scenarios, to create stories that may not exist. It's not hard to get these data dumps off the dark web, and actually they're remarkably cheap, so it's not hard to grab this content.
[00:46:09] AJ Nash: So you can actually have the real content if you want, and then just inject what you want. You're getting files, it's just spreadsheets, basically, eventually; just throw in what you want and then send it back out to somebody and say, this is the data. It's remarkable we haven't seen more of that, frankly.
[00:46:26] AJ Nash: Fun stuff, right Rob?
[00:46:34] AJ Nash: Yeah. There's always somebody out there with an idea, man. I don't think I'm the first one with that idea, I'm certain of it. I'm certain I'm not the first guy to come up with that one. It's probably even happened and I didn't even realize it, if I'm being honest. I've probably processed data that had some in there and didn't know it.
[00:46:49] AJ Nash: It just seems like it's an obvious thing for somebody to do. So of course it's happened at some point. Now, the flip side is that creates a scenario where you have maybe plausible deniability if you actually are guilty of things. I mean, that is the opposite of all this. With all this data out there and all the ability to manipulate and change, and this gets back to what I've said all along about the loss of truth.
[00:47:06] AJ Nash: At what point is it just, I didn't do it? Well, there's video, there's audio, there's witnesses, there's all of this evidence. None of it's real, it's all fake, it's all AI, it's all fake news, it's all a hoax, it's all whatever. People just lose track of what's real, and then it gets to the point of just, I'm just gonna trust the person I want to trust, even if I know I shouldn't, but you're not giving me enough evidence to prove I shouldn't,
[00:47:27] AJ Nash: 'cause there's no such thing as evidence anymore at that point. And that's a fear I have, is that, you know, back to the beginning of the discussion, does all of this surveillance, does all of this content, does it make it better or worse? Does it make us, you know, better at security or worse, better at intelligence or worse?
[00:47:42] AJ Nash: And as we said, it's both. Ultimately it could be just impossible, because it's gonna come down to what people are willing to believe. And while I like Brian's opinion that expertise is gonna have more value in a world with so much of these technologies that can't be trusted, I don't think that's what we've seen as the pattern over the last decade.
[00:47:58] AJ Nash: Expertise is being devalued. Education is now elitism. You know, expertise, you know, the intelligence community now can't be trusted, because we're gonna trust a foreign adversary instead, because that's who some people in power wanna believe instead. Like, expertise has been devalued for a long time.
[00:48:13] AJ Nash: Journalism now, instead of having a whole set of skills and responsibilities, now it's just somebody with a camera who says, I'm a journalist, and that's it. You know, medical science is now reduced to conspiracy theories. I haven't seen a pattern that suggests that expertise is getting more valuable in our societies as we just rely on these technologies, and people are allowed to believe what they wanna believe instead of what is truth, because they've got enough technologies that'll tell them what they wanna believe is the truth, and they can argue equally with you, even though it's all garbage, because nobody knows what's real anymore.
[00:48:44] AJ Nash: So all your evidence and all my evidence, they all look like evidence now. So I worry that that's gonna be where we end up. And if that's the case, then yeah, this is all really, really bad for intelligence and collection and law enforcement, and just truth in general, journalism, all of it.
[00:49:01] AJ Nash: Hard to disagree there. Happy holidays, everyone.
[00:49:09] AJ Nash: Oh, please tell me, Rob, you got a bright note to end on. We can't end on what I just said there. This is a dystopian future.
[00:49:13] Robert Vamosi: Just gonna say, does Brian have any parting thoughts that he would like to share?
[00:49:19] AJ Nash: Come on, Brian. Tell everybody good news after I just said all that.
[00:49:23] Brian Kime: No, I do agree with the folks that think
[00:49:27] Brian Kime: all this AI is going to place a premium on true expertise. And we need to, as consumers, as folks in business and in government, we need to seek out those experts and promote them, and encourage maybe a return to expertise, or the value of expertise. Because, you know, former
[00:49:55] Brian Kime: Director of the NSA and Director of the CIA, General Michael Hayden, is infamous for saying, we kill people based on metadata. So if we are allowing AI to find metadata that warrants certain things, we better have experts who are validating, who are contextualizing, and doing the things that AI cannot do.
[00:50:20] Brian Kime: I mean, at the end of the day, on the intelligence side of our conversation, AI can tell you a lot about what has already happened, but it takes really that expert intelligence analyst, with all that knowledge, all those insights in the brain, and the proper tradecraft, to advise a policymaker or a business leader about
[00:50:40] Brian Kime: what the threat is likely to do next, and get them to make a decision that reduces the risk of that event or threat. And so, you know, true expertise will continue to be extremely valuable, and maybe more rare.
[00:50:58] Robert Vamosi: All right. Well, thank you, Brian for being our guest today. This has been a great conversation.
[00:51:03] Robert Vamosi: Really appreciate the expertise that you bring to it, and I want to thank our audience for listening. We appreciate all of you for joining us each week here on NeedleStack. You can find transcripts and more about the show at authentic8.com/needlestack. That's authentic with the number eight, dot com, slash needlestack, all one word. And be sure to let us know your thoughts.
[00:51:25] Robert Vamosi: There's a comment button there where you can leave comments, or you can go out on social media, where we're @needlestackpod, and we're found on almost all the social media platforms. And lastly, subscribe wherever you're listening or watching us today. That way you won't miss any episodes in the future.
[00:51:44] Robert Vamosi: All right, aj, I'm out.
[00:51:46] AJ Nash: No, I appreciate it. As you said, thanks to everyone for being listeners and watchers of the show. Thanks, Brian, for coming on, and, you know, you, like all of our amazing guests, are what makes this, you know, such a good show, 'cause Rob and I don't know what we're doing without you guys.
[00:51:58] AJ Nash: And so if you're enjoying the show, please let us know, you know, as followers. If you're not enjoying the show, please let us know too; we wanna make it better. Actually, let Rob know, I don't really wanna hear about it. But with that in mind, I think we'll close it out for today.
[00:52:10] AJ Nash: Again, thank you everybody. We appreciate your time. This has been another episode of NeedleStack.