Dr. Larissa Doroshenko details how "reflexive control" forms the framework of Russian disinformation.

Producer: Johannah James and Michael Gordon
03/19/2022 • 08:47 AM EST


Dr. Larissa Doroshenko, Political Communications Scholar at Northwestern University, traces the Russian Internet Research Agency from its origins and early work suppressing the 2011 election protests in Russia to its later work of sowing discord, promoting polarization, and interfering in elections around the world. She also details how the Cold War framework of "reflexive control" uses fake news as just one component of a larger, more sophisticated Russian disinformation campaign.

Transcript:
0:00: Johannah James:

With us today, we have Dr. Larissa Doroshenko, an emerging scholar of political communications. Most recently, Dr. Doroshenko published a study investigating Russia's manipulation of information in the lead-up to what is now a military invasion of Ukraine. Today, we're talking to Dr. Doroshenko about this incredibly relevant analysis in light of the ongoing military conflict in Ukraine. Dr. Doroshenko, thanks so much for joining us.

0:22: Dr. Larissa Doroshenko:

Thank you so much for inviting me.

0:24: Johannah James:

Can you give a little background on the Russian Internet Research Agency and how it has played into this kind of information warfare?

0:33: Dr. Larissa Doroshenko:

Yeah, so with the Internet Research Agency, there are different conversations about when exactly its birth date was, but most scholars concur that it started its activity around 2010. One of the first tests, or possibly the first test, that we've seen in the research was its use in 2011, after the parliamentary elections, when there were large protests on Bolotnaya Square in Moscow, in St. Petersburg, and in cities all over the country. Russian trolls were used to suppress the narrative around the protests and the fraudulent elections: they flooded the conversation with other types of information and pushed other agendas online to distract people from the protests that were happening.

And we know for sure now that the bots were used in the 2014 operation in Donbas, which our paper focused on. But there is also a lot of evidence that the Internet Research Agency was active during military operations in Syria. As we worked through the archival dataset we had, a lot of the accounts were using Arabic script, and a lot of them used the hashtag #Syria. Unfortunately, we did not have any experts at the time who were able to read those tweets. But knowing that Russia backed Assad's regime and has a lot of military presence and interests in Syria, it was not surprising that it was engaged in disinformation campaigns in that area as well. And we also saw it in 2016 in the US, and, I think, during Brexit as well, where there were investigations about it, and in interference in other election campaigns afterwards, all over the world.

2:39: Michael Gordon:

You mentioned that initial IRA efforts were directed at the domestic audience for a couple of years, and then the agency gradually turned to international operations.

2:50: Dr. Larissa Doroshenko:

Yeah, that is exactly what we observed. They started right in the aftermath of the 2011 Russian protests, when it was becoming clear that Russians were growing more and more dissatisfied with their government, and with this chess game Putin was playing with the constitution, installing a proxy for a couple of years because he could not legally remain in office for too long. Afterwards, these techniques were tested in Ukraine, and then they started exploiting the pretested strategies that they saw worked.

Right. For example: if I post this type of tweet, it gets a lot of retweets. We also looked at the features they were using, at the links, but also at the types of hashtags and @-mentions they used to get traction, to get more retweets, and to gain more followers as a result. The most successful strategies were then employed later on, at least in the United States, which is what we studied for sure, and I see no reason to think they were not exploited elsewhere as well.

4:07: Johannah James:

Regarding IRA activity on social media back in 2014, you have said that your analysis revealed that they didn't just spread falsehoods; they engaged in reflexive control. Can you elaborate on what reflexive control is?

4:21: Dr. Larissa Doroshenko:

Yeah, absolutely. I will start with a little bit of a preview. I think it's very important for us to realize that disinformation campaigns are not just fake news. That assumption has made a lot of people complacent. I also see a lot of conversations now about how Russian propaganda doesn't work anymore, just because we see a lot of it being countered in the West and in Western media. That can put us in a complacent state of mind, where we say, oh, it's not as scary as we thought it was. But it also makes us chase specific instances of falsehoods and look at the trees rather than at the forest behind them, at the bigger patterns.

The theoretical framework of reflexive control really enables us to look at the patterns in disinformation rather than at specific instances. It enables us, as scholars or as citizens, to ask whether certain pieces of information we see on social media or in the news feed those goals and those patterns, rather than trying to chase the instances one by one.

Reflexive control as a framework was developed back in the 1960s, at the height of the Cold War between the US and the USSR. I've also seen some works where experts argue that it stings for Putin personally, because it was believed that effective use of reflexive control helped the US and the West eventually dismantle the Soviet Union: the West was much stronger at information and propaganda, and that is seen as one of the reasons it won and the Soviet Union dissolved. So, to the extent one can speculate about his psychology, it's really interesting that for Putin, establishing a very strong reflexive control operation could be almost a personal revenge for the defeat of the USSR.

The strategy essentially feeds an adversary erroneous information that leads that adversary to make a wrong decision. You are not telling them what to do, and you are not necessarily saying false things directly; you are feeding information in such a way that they become more likely to make a wrong decision. There is a famous 4D strategy here, which I'll simplify: distort, dismiss, distract, dismay. Distorting and dismissing we've seen a lot: distorting real facts with fake news, or dismissing by denying your involvement or refusing to discuss it. Right now, for example, Russia is distorting facts by claiming there is a Nazi or fascist government in Ukraine, and dismissing facts about high casualties, about targeting civilians, and about losing a lot of manpower and equipment in Ukraine these days.

But the other two strategies, which I argue are especially targeted at foreign audiences, are much more subtle: distracting and dismaying. Distracting means introducing certain issues, or certain takes on the information, that divert attention from the main point, or that create or exacerbate divisions among people or between countries. Dismaying plays more on emotions and fear-mongering. Especially in this crisis, we see a lot of nuclear threats; even yesterday we witnessed Russian forces shelling the nuclear power plant in Zaporizhzhia, and I think that is a very deliberate part of this dismay strategy. So it's much more complex than just creating false information, and we should be more careful and look for these patterns in the information we see these days.

9:12: Johannah James:

In your most recent study, you mention that the IRA's activity is often referred to as Cold War 2.0 because of its use of strategies such as reflexive control. Are there any other Cold War-style propaganda techniques you've noticed in Russian involvement in the media?

9:34: Dr. Larissa Doroshenko:

I think in general, we see certain conversations trying to distract from the issue. Recently, around the State of the Union, I've heard a lot of "build the wall" talk, and more generally this narrative that Biden should care more about our Southern border than about the border between Russia and Ukraine. That narrative might be organically created by partisans, or, as we call them in propaganda research, "useful idiots," but it can also be artificial, and either way it is artificially amplified by Russian propaganda. Some of my friends in Ukraine sent me a list of Telegram channels. Telegram is a Russian-based messaging app that is getting more popular, especially among far-right groups in the United States, partly because it helps them evade the moderation of hate speech and other content that would have been flagged on more traditional social media platforms like Twitter or Facebook, which are more alert and aware of it. And it's very popular in Russia.

So now, in addition to Twitter, we also see a lot of these Telegram channels created to promote certain Russian propaganda. I've seen, for example, memes with Biden's face and oil-drilling equipment on the horizon behind him, captioned "make Russia rich again." A lot of this circulates on Russian propaganda websites, and we might see these as separate instances, but essentially they are all parts of the same strategy of distracting from the main issue.

Another distracting strategy that I've noticed recently, to my dismay and disappointment, appears on the other side of the spectrum. We talked about what the right, the GOP, is doing; on the left, we also see a lot of conversation among people supporting the Black Lives Matter movement about how Ukrainians are racist and we shouldn't be supporting them. A lot of them are also siding with the Russian propaganda narrative that there is a fascist government in Ukraine, or arguing that the war has been going on for eight years, so where have you been all these eight years? Those are all strategies used by Russian propaganda; they are just being translated into English and spread among Americans right now.

I've also looked at these Telegram channels used for Russian propaganda, and I saw there were even some tweets translated from English into Russian. One of them was from BLM supporters saying, "Let not Putin distract you from important issues. Ukrainian lives don't matter until black lives matter." That is a false choice, a logical fallacy: you don't have to decide between the two; this is an important crisis. We can obviously talk about instances where students of color trying to evacuate are not being treated equally at the Ukrainian border right now. That's a separate conversation, but there was also a lot of fake information circulating around it, and not everything was verified. Some of the instances I know for sure, because I got in touch with friends in Ukraine who were able to confirm them; a lot of it was not true.

So again, this narrative might have emerged organically, from instances where, say, Ukrainians were prioritizing women and children and those students of color, being men, were left for later, and that was interpreted as a racial issue. But it was then amplified, with the help of Russian trolls and Russian propaganda, into "oh, look, we only care about white people here," which is very, very upsetting. What this does, essentially, is create and exacerbate divisions in American society around the issue of Ukraine: should we support it or not? Oh, they're racist there, so we shouldn't support it, or we won't be as engaged in protecting this young democracy. So those are some of the examples, but I argue they fall very neatly into the distracting strategy of reflexive control.

14:27: Johannah James:

Can you provide us some insights on the increasing sophistication of these IRA campaigns and possibly highlight some ways to detect these disinformation strategies in future campaigns?

14:39: Dr. Larissa Doroshenko:

I think one part is, first of all, to be aware of this. We finally saw Russia Today banned from Facebook, and we saw that the company would no longer be able to purchase ads, on Facebook for sure and I think on Google as well. There are also a lot of other similar propaganda websites that are known to the U.S. government: a 2020 report from the Global Engagement Center at the U.S. Department of State listed many websites associated with Russia Today and with Russian propaganda. If we have started blocking Russia Today, then, as the saying goes, if you say A, you should say B: we should block all of them.

Similarly, blocking their official presence on social media is only one step. I, as a user, can still go to the website and post a link from it on my account. And if there are accounts created by Russian trolls, or even by regular Russians who want to present alternative information to their friends abroad, they can still do that: I can still post those links to Russia Today on my social media platforms, people will share them, and they will still spread. Maybe not as fast, but if you employ trolls, they will spread faster. So we should monitor those links as well: not just close the profiles, but monitor which links are being posted and by whom, gather information on those users, and try to figure out who they are connected to. Maybe we can detect a troll network this way.
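A minimal sketch of the link-monitoring idea Dr. Doroshenko describes might look like the following, assuming hypothetical post data and a placeholder flagged-domain list (the domains, account names, and URLs below are invented for illustration, not drawn from any real dataset):

```python
from collections import defaultdict
from itertools import combinations
from urllib.parse import urlparse

# Hypothetical flagged domains, of the kind listed in reports such as the
# Global Engagement Center's 2020 report; these entries are placeholders.
FLAGGED_DOMAINS = {"rt.com", "propaganda-mirror.example"}

def domain_of(url: str) -> str:
    """Extract the host from a URL, dropping a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def flagged_coshares(posts, min_overlap=2):
    """posts: iterable of (account, url) pairs.
    Returns pairs of accounts that shared links to at least `min_overlap`
    of the same flagged domains, a crude signal of possible coordination
    that merits a closer look at those accounts' connections."""
    shared = defaultdict(set)  # account -> flagged domains it linked to
    for account, url in posts:
        d = domain_of(url)
        if d in FLAGGED_DOMAINS:
            shared[account].add(d)
    suspicious = {}
    for a, b in combinations(sorted(shared), 2):
        overlap = shared[a] & shared[b]
        if len(overlap) >= min_overlap:
            suspicious[(a, b)] = overlap
    return suspicious

# Hypothetical usage; account names and URLs are invented.
posts = [
    ("user_a", "https://rt.com/news/1"),
    ("user_b", "https://www.rt.com/news/1"),
    ("user_a", "https://propaganda-mirror.example/x"),
    ("user_b", "https://propaganda-mirror.example/y"),
]
print(flagged_coshares(posts))
# {('user_a', 'user_b'): {'rt.com', 'propaganda-mirror.example'}}
```

A real monitoring pipeline would also weigh posting times, retweet chains, and follower overlap, but counting co-shared flagged links is a reasonable first-pass signal for surfacing candidate troll networks.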

Another part is, obviously, maybe a bit more controversial. We know that Russian propaganda has been using automated means, the bots, and semi-automated means, the trolls, to promote its agenda. If we are trying to win this information battle, why don't we use the same types of strategies, but instead of promoting propaganda, promote truthful sources of information? BBC, Deutsche Welle, and Radio Free Europe/Radio Liberty are all banned in Russia right now, because Russia realizes that they are presenting verified information about what's going on in Ukraine.

Those sources are trusted. I'm not even talking about more controversial outlets like CNN, where we could start these partisan battles within the country again; fine, let's go with the European sources, but amplify those voices on social media. Because right now we are fighting a fair fight, relying only on official accounts and organic user engagement, while Russia is relying on a bunch of bots and trolls to promote its propaganda. So if we are trying to win this battle, I would argue we should use similar measures to amplify truthful sources of information.

And one last thing: instead of looking at specific instances of fake news, look for patterns. If I see a certain type of information, say, all this talk that not all refugees are created equal and Ukraine is discriminating against refugees, my first reaction is: okay, that's really sad; let's verify whether this information is true. My second thought is: who benefits from this information right now? The obvious answer is Putin, because he tried to create an artificial refugee crisis just a couple of months ago on the border between the European Union and Belarus, bringing in large numbers of refugees from Syria and Iraq and pushing them to cross the border, with Belarusian border control aiding them rather than stopping them. It was completely artificially created and amplified to further divide Europeans, exactly at the time when the Biden administration received information about the Russian plan to invade Ukraine and was trying to rally and unite the European partners.

So all of these strategies, creating this type of refugee conversation, saying European countries are hypocrites, probing whether you can create tensions within European countries over how they treat refugees, all of that helps Putin. In other words: yes, those issues are important to address, but let us not get distracted by them. If we keep this goal in mind, if we ask what pattern a specific instance falls into and who benefits from this information, we will be much better equipped to detect disinformation campaigns that are more subtle than just fake news.

20:03: Johannah James:

Okay. It looks like we're out of time for today. We've been talking to Dr. Larissa Doroshenko, emerging scholar of political communications, about her research on Russian disinformation campaigns. Dr. Doroshenko, thank you so much for joining us today.

20:16: Dr. Larissa Doroshenko:

Thank you so much for inviting me. It's been a pleasure.
