[music plays]

CHRIS PAUL

A lot of the techniques that Russia uses date back to the Cold War. It's old wine in a new bottle. It's active measures; the information environment is just so much more ripe for easy dissemination, for planting a story one place and then using a different source to say, hey, look, somebody said that, and kind of amplifying your own baloney. And we as humans are super vulnerable to being tricked, misled, deceived, and manipulated.

[music plays]

MATT ASHBURN

Welcome to NeedleStack, the podcast for professional online research. I'm your host, Matt Ashburn, and I still remain skeptical even when something seems believable.

JEFF PHILLIPS

And I'm Jeff Phillips, tech industry veteran and curious to a fault. Today we're continuing our series on fact checking and debunking. Several of our prior conversations over the last couple of weeks have touched on disinformation, and that brings us to our next guest. So we're joined by Chris Paul. Chris is a senior social scientist at the RAND Corporation and professor at the Pardee RAND Graduate School. He spent more than two decades in policy and defense research, focusing heavily on inform, influence, and persuade operations. So really excited to talk with Chris today. Welcome to the show.

CHRIS PAUL

Hey, thanks. Good morning, Jeff. Good morning, Matt. Happy to be here.

JEFF PHILLIPS

Great to have you. Chris, your work has devoted a lot of attention to Russian propaganda and disinformation campaigns. Can we start off by asking you to talk a little bit about the nature of the Russian playbook on this front and any ways that it differs from other nations' practices?

CHRIS PAUL

Yeah, I had the great fortune in late 2015, early 2016, to get the opportunity to really dig into and look at Russian propaganda. And I have to admit that at the outset, I was kind of surprised that they were enjoying some degree of success with primarily falsehood-based propaganda. I come from an influence and persuasion background that says that credibility is king and the truth will always win. So my colleague Miriam Matthews and I dug into the Russian propaganda model, and we labeled it the fire hose of falsehood because it has four salient characteristics. First, it's high volume and multichannel. Second, it's rapid, continuous, and repetitive. Third, and this is where the falsehood comes in, it makes no commitment to objective reality. It's not wholly false. Maybe it's false, but backed up with manufactured evidence. Or maybe it's a little bit false, but is consistent with narrative frames. Or maybe it's true, but it's spun in a way that is completely misleading. Or maybe it's false, but presented by credible-seeming sources. And then the final characteristic is that it makes no commitment to consistency. If one lie doesn't stick, they'll quickly try another one. You'll see the same channel broadcasting different accounts of events within a very short period of time.

MATT ASHBURN

I absolutely love that phrase, fire hose of falsehood. It really does conjure up an actual image in your head of what this could look like, right? And it really is descriptive of the problem and, I think, the scope. I'm curious: how much of this is Russia being really good at what they do in their tradecraft to promote this propaganda, and how much of it is maybe the vulnerability of our populations, with the tendency in human nature to digest information in different ways?

CHRIS PAUL

Yeah. So I agree, a fire hose is a really evocative metaphor, and I'm happy to have stumbled across it and seen it propagate. And I think it does help in sharing awareness of this challenge. Now, coming back to why it works, I don't think it's actually particularly good tradecraft on the part of the Russians, and maybe I can pick that apart. Falsehoods are persuasive, but their actual tradecraft has to have been kind of sloppy, or I fear we never would have noticed it. Honestly, if they were actually more subtle about this, if they were actually good at it, they could have had better linguistic skills and a softer touch, and it might have taken us a lot longer to notice it. Whereas in early 2016 we could look at things they'd been doing since the 2008 incursion in Georgia and some of the other things that have happened, and kind of pick out what they were doing. Recognize clearly that the balance on RT is more propaganda than it is good investigative journalism; that it is infotainment. Catch their election interference with their ham-handed Facebook pages. That still got some results, but kind of poor tradecraft. So to your question, "Hey, is it more about the human vulnerabilities?", my answer is going to be yes. A lot of the techniques that Russia uses date back to the Cold War. It's old wine in a new bottle. It's active measures; the information environment is just so much more ripe for easy dissemination, for planting a story one place and then using a different source to say, hey, look, somebody said that, and kind of amplifying your own baloney. And we as humans are super vulnerable to being tricked, misled, deceived, and manipulated.

MATT ASHBURN

Yeah, Chris, I think you're right on that. A lot of the stuff that you see is a bit sloppy, a bit obvious, certainly in hindsight. And so it seems that Russia is simply taking advantage here, and they seem to be very good at exploiting those human vulnerabilities that people have.

CHRIS PAUL

Yeah. Another observation about Russia's approach is that they're very opportunistic. I like to point out that this is a nation of chess players, and in chess you have a long-term plan, but your long-term plan evolves based on what your opponent gives you. And so you may be planning to move in one direction, but if your opponent does something kind of dumb and creates an opportunity, you're going to pounce on it. So Russia's opportunism looks at specific events, disasters, calamities, or just natural divisions within society, and says, oh, here's a society that is already picking at itself, already has a contentious division. Let's see if we can't amplify that. Let's promote and fire up both sides of the issue and see if we can't escalate the level to which this society is slamming into itself. We've seen that here in the US. They've helped us with our polarization. Thanks so much, Russia. They've done it a bunch in the UK, and again, other places. If you look around the world where Russia is heavily engaged: their interference in the separatist movement in Catalonia, in Spain; their interference in the Scottish independence referendum, or in the aftermath of the Scottish independence referendum. Same themes over and over.

MATT ASHBURN

What are some of the human vulnerabilities that can enable this propaganda and these actions? Can you expand a bit on that?

CHRIS PAUL

Going through the list of characteristics of the Russian fire hose of falsehood, starting with high volume and multichannel: quantity has a quality all its own. The social psychology research shows that the more often you've heard something, the more it's in your brain, in recall, the more likely you are to believe it. And the same goes for the more different places you've heard it from. That kind of makes sense, right? You think, oh, I'm cross-checking. But Russia's broadcast approach, with lots and lots of different micro channels and lots and lots of different speakers, or apparent speakers, reinforces that. Another important factor isn't the high volume and multichannel but the second characteristic: the propaganda is rapid, continuous, and repetitive. Not only do you hear it frequently, but you hear it first. We all know that first impressions matter, but we all underestimate how powerful first impressions are. And this has to do with how we as humans store information. We are not computers. We do not keep our memories in card catalogs as separate pieces. We are Homo narrans, the story people. We store our life experiences in one gigantic holistic story, one big worldview. So when we're presented with a factoid, and that's the word I'll use for something that's presented as fact but may or may not be, when we're presented with a factoid and we accept it, we don't write it on a mental filing card and put it in our filing cabinet; we bake it into our worldview. And so then if someone else comes along six minutes, six hours, six days, six weeks later and tells us, "Hey, that fact, that wasn't actually the case," they're not asking us to go to a filing cabinet and tear up one card. They're attacking our worldview.
And so the burden of proof for a refutation is much higher than for the initial factoid penetrating our consciousness, because that repair effort not only has to dig out the falsehood, but it also has to offer us something to make our worldview whole again, or it's not going to work. So that's one of the biggest factors, but it relies on some other human vulnerabilities, like the fact that we are just not that good at adjudicating truth from falsehood. A famous social psychologist, Daniel Kahneman, has this great book called Thinking, Fast and Slow. And in it he isolates what he calls System One and System Two thinking. System One is your autopilot. System One is the shortcuts and the heuristics that you use to get through a day bombarded by data and attention-grabbing things. So it's anything that's easy. You just take the easy path. It's what you do when you're tired. It's what you do when you get behind the wheel and you think, "Oh, I'm going to work." And then you kind of zone out, and 20 minutes later, "Oh hey, I'm in the parking lot at work. Good for me." System Two is when you're really turned on and attentive. That's when you're focused on something. You're devoting all your attention, your concentration, all your faculties. You're zoomed in on something. Something's provoked you and stimulated you to that higher level. And when we're in System Two, we're harder to trick. But being in System Two is demanding. It takes energy, and we don't have enough energy. It's not possible to be in System Two all day long. So a lot of propaganda relies on the fact that most of the time we're a little bit tired or a little bit inattentive. We're cruising around in System One. So imagine you're cruising around the Internet. You're watching videos.
You see a video and it has up in the corner some kind of logo that says News, and it's clearly on a sound stage, and there's an attractive anchor person with a clean accent, and at the bottom is a footer of information, data, headlines, spooling by. You look at that and your brain says, "News." And provided that it's not a logo that you recognize as something you should be skeptical of, or provided the anchor person doesn't say something that's completely contradictory to your worldview, you're probably going to accept what they say as plausible or true, but it does not have to be. And those heuristics, those shortcuts, leave us very vulnerable to being tricked. And I could go on. There are a bunch of others too, but I'll pause.

JEFF PHILLIPS

That's super interesting. And I sit here as myself and wonder, how do I become more discerning about everything I read? It just seems overwhelming at the end of the day, Chris, to try to keep on top of everything and understand what's true.

CHRIS PAUL

Sorry to interrupt. That is a great point, because I'm pleasantly surprised that you are being reflexive about, hey, what can I do to be better at this? Because often at this point in the conversation, the people I'm talking to are kind of snidely looking to their left and their right and thinking, oh, you poor humans, you are so vulnerable to being tricked and manipulated, with the subtext being: not me, I'm special and magical. Well, that's called blind spot bias, and we all have it. So I'm here to remind you and your listeners: you're vulnerable. You're vulnerable. I'm vulnerable. Everyone else, we're all vulnerable. It's not just the other humans. We all do System One and System Two. We all have heuristics. We are all not that hard to trick.

MATT ASHBURN

That's an interesting point. And, you know, as the news of Russian propaganda, that is, the awareness that it exists and that they're attempting to exploit these vulnerabilities, has become more mainstream, do you think people have had that introspection? Do you think they've learned very much over the past few years as this has become popular in the news cycles?

CHRIS PAUL

I hope so. And despite the presence of blind spot bias, even if people are personally inclined to believe that they are more resilient to propaganda than they are, people take an altruistic view: oh, I want my fellow humans, my fellow Americans, to be more resilient to this. So, yeah, I think we should do something about those naughty Russians doing this. We should have more civics education. It's okay to have media literacy interventions. There's goodness in that. And I hope that continued spreading of awareness, continued spreading of understanding of how it works and what we might do about it, can really help. I think one of the things that is most helpful is what's called prebunking. Given how important this first-mover advantage is, given how important first impressions are, what are the opportunities for the counter-propagandists to get in front? And I think a great example of that is what happened at the outset of Russia's invasion of Ukraine. The Western intelligence services publicly released, hey, it looks like Russia's massing forces along the border. They're claiming it's an exercise, but to us it looks like they're postured for an invasion. Do not be surprised if Russia invades within the next couple of weeks. And then right before the incursion actually started, they said, you know what? We're pretty sure Russia's going to invade, and they're going to create some kind of thin pretext. There's probably going to be some kind of false flag operation where they claim Ukrainians have attacked them and provoked them, so they come pouring across the border. And with that broadly broadcast, when it actually happened, people were like, oh yeah, I got warned about this. Shenanigans. Shenanigans, Russia. I see you're up to shenanigans. But mentally, go back in time and think: what would have happened without that? We'd have this situation where, other than some warning that Russian forces are along the border, their intentions unclear, we're not sure what's going to happen. 
Silence. And then Russia claims that Ukrainian forces in eastern Ukraine shelled their positions across the river, and there's an immediate response. And it would have been, oh, ambiguity, uncertainty, I don't know what happened. Whereas with the forewarning, you know what happened. You believe the true story.

JEFF PHILLIPS

Well, speaking of getting better, if I switch this back to what's going on in Ukraine, we also have a midterm election coming up, and we've talked about how good Russia is at information warfare. I guess my question would be: have Ukraine, and the US for that matter, evolved their information operations since 2016, when people like me became aware that Russia was very good at this during that election, or even since the invasion of Ukraine?

CHRIS PAUL

Yeah, so compare Ukraine 2022 with Ukraine 2014, when, in a thoughtful hybrid informational and physical campaign, little green men who were Russian soldiers but without insignia on their uniforms infiltrated Crimea and displaced mayors and other public figures, but created ambiguity about what was happening. Enough ambiguity to paralyze response in the rest of the West and catch Ukraine on the back foot, in terms of what to say about what was happening to them and in terms of their own understanding. Well, in the meantime, I'm sure Ukraine had a deep soul-searching moment after that and said, hey, there are other parts of Ukraine that Russia might like to take a bite of. What do we do the next time this happens? And so they launched a campaign both of improving their military, of improving their noncommissioned officer corps to be a more effective and nimble and decentralized fighting force. But they also did a bunch of thinking about crisis response and how to prepare narratives and how to respond in the face of ambiguous information. So they have become much more sophisticated information warriors. Now, some of that is personally contingent. President Zelensky has amazing charisma and good media sense and has been a real cornerstone of their effective narrative dissemination. But I think there are lots of other things that have happened, like the way Ukraine has built a global reputation, so that more people in the world know who and where they are and respect them as a prospective member of the West and someone who is worth protecting. Whereas in 2014, I think most people in the West, especially in the United States, thought of Ukraine as a former Soviet republic. And so, if the story is, oh, Russia is taking back some territory from a former Soviet republic, how much does the average American care about that? Whereas now: Ukraine is a sovereign nation. Ukraine has a World Cup soccer team. Ukraine has a distinctive culture. Ukraine has this flag, and we recognize the two colors of it. Russia is invading this sovereign nation that has Western sympathies and has its own unique cultural history. That's not okay. I want to stand with Ukraine. That movement took place between 2014 and 2022.

MATT ASHBURN

Yeah, that's an important note right there. There's an incredible amount of marketing skill and crisis response in 2022 that we've seen from Ukraine. It's just really, really cool to see.

CHRIS PAUL

And then some of the other things, like the use of memes. Authoritarian governments hate being mocked. And so the use of kind of aggressive humor does two things. Beyond the mockery, it also solidifies identities, because when you see something funny, you are either laughing with or laughing at. And so this thing we were just talking about, identifying with Ukrainians: when you see one of these great Ukrainian farmer tractor-and-tank memes, you're like, ha ha, evil, bad, nasty Russians. Hahaha. Funny, clever, like me. Funny, clever, Western Ukrainians. Okay.

MATT ASHBURN

Ukraine's efforts have been incredibly powerful, right? You can go from neighborhood to neighborhood, in the US at least, and you see Ukrainian flags all over the place, even at locations or residences where people may have no formal affiliation with Ukraine. But they're supporting the effort in part because of the very effective communication and information operations. So it's been a great success.

CHRIS PAUL

Yeah, just so. I certainly see Ukrainian flags in my neighborhood as I walk around, and I've had colleagues tell me about their Ukrainian relatives or their Ukrainian ancestors. And it's interesting, because I bet they had these same relatives and ancestors in 2014. So why is it more salient now? It's because it's been made salient, because these efforts have been put in place too.

MATT ASHBURN

Yeah, I think the use of the term meme is very appropriate there, in that it was derived originally from something that is viral in nature. And so that certainly applies in this case. I wanted to go back to something that you mentioned earlier, which is helping other people be better, right, helping them become better at this stuff. As we start to wrap up here, is there any advice that you would give to fact checkers or professionals, or even consumers, to help battle propaganda efforts by adversaries?

CHRIS PAUL

Yeah, I'm hugely supportive of fact-checking efforts, citizen fact-checking efforts: Bellingcat, the toolkits they make available to help amateur fact checkers become more professional. But I guess, based on the research that I've done, the advice that I would offer to fact checkers is that the truth doesn't have magical properties. So while it's important to ferret out the truth, you also have to think about the human psychology. How are you going to get your factoid, which happens to be true, to propagate and spread? So think about things like the first-mover advantage. And boy, in fact checking, it's really hard to have the first-mover advantage, because a liar who makes something up is always going to have the scoop. They're always going to be first. If something didn't happen and they tell a story about it, there's no way you can get ahead of that. But recognizing that disadvantage, if you are debunking rather than prebunking, what can you do to help humans repair their worldview with the new information? So it's not just, hey, here's what actually happened. It has to be more than that. Here's what was done and said and why it isn't true. Here are the techniques that are involved. Here is what actually happened. Here's how you should think about that. And then it's an opportunity to speculate a little bit. Hey, I'm a fact checker. I have consistently seen this Russian source generate falsehoods. I expect that in the future, this Russian source will continue to generate falsehoods. Hey, now all of a sudden, you're the first mover. Take what has happened in the past and project it into the future. 
Now, you have to be careful, because you don't want to over-prognosticate about what specific things they're going to say, and then if they don't, all of a sudden you're not a fact checker anymore. But reporting that a source has repeatedly engaged in debunked falsehoods that you have good documentation for, and saying, I bet they're going to continue, that seems like it's in bounds. And that can definitely be helpful.

MATT ASHBURN

Those are certainly good insights, and I appreciate you being here today, Chris. So thanks again to our guest, Chris Paul. If you liked what you heard today, you can always subscribe to our show wherever you hear your podcasts. You can also watch episodes on YouTube and view transcripts and other episode info on our website at authentic8.com/needlestack. That's authentic with the number eight dot com slash needlestack. And also be sure to follow us on Twitter. We're @needlestackpod over there. And we'll be back next week with more on fact checking, debunking, propaganda, all things disinformation. We'll see you then.

Fact-checkers’ biggest nemesis is the proliferation of disinformation in the digital age. Chris Paul joins the episode to talk about Russian propaganda and disinformation techniques. Why are they so effective? Is it skill or just an innate vulnerability for humans to want to believe what they see? Chris Paul walks us through his research from the technical to the psychological.

Key takeaways

  • 4 key factors of Russian disinformation tactics
  • How we're all vulnerable to disinformation techniques
  • Strategies to counter the effects of disinformation rather than the symptoms

About Chris

Christopher Paul is a senior social scientist at the RAND Corporation and professor at the Pardee RAND Graduate School. Prior to joining RAND full-time in July 2002, Paul worked as an adjunct at RAND for six years and was on the statistics faculty at the University of California, Los Angeles (UCLA) in 2001–02. During the course of his more than two decades in policy and defense research, Paul has employed a range of methods including comparative historical and case study approaches, quantitative analysis, expert elicitation, analytical wargaming, and evaluation research.  His current and recent research efforts include analyses supporting operations in the information environment, security cooperation, counterinsurgency, irregular/unconventional warfare, and operations in cyberspace. Paul received his Ph.D. in sociology from UCLA.
