
Nina Jankowicz: “The US is still playing whack-a-troll”

Posted on Aug 4, 2020 by FEED Staff

Nina Jankowicz is an American expert on the intersection of technology and democracy with a speciality in Eastern Europe, as well as the Disinformation Fellow at the Woodrow Wilson International Center for Scholars. Her debut book is called How to Lose the Information War: Russia, Fake News, and the Future of Conflict.

FEED: How did you come to write your book?

Nina Jankowicz: I started my career in the democracy support space and my degrees are all in Russian studies. After I left graduate school, I was working for an organisation called the National Democratic Institute, a non-profit NGO that works to support democratic activists in Russia and Belarus. This was as the United States Agency for International Development was getting kicked out of Russia. Since we were partially USAID funded, we left, too. The Russian government did their propaganda gambit against us and that’s where my interest in all this stuff came about.

While I was still working at NDI, the Ukraine crisis began, with Russia annexing Crimea and invading the Donbass. NDI was a fairly old-school organisation and they were happy to just stay out of it and let things be said about us. But I thought we should be taking a more proactive approach. In 2016-17, I got this Fulbright fellowship in Ukraine where I was advising the Ukrainian Ministry of Foreign Affairs on strategic communications – in the belly of the beast. I worked with the spokesperson at the Ukrainian MFA on messaging and on how to keep western attention on Ukraine as ‘Ukraine fatigue’ set in.
But more formative was the fact that the US election was going on while I was in Ukraine. Being able to see tactics that were already in use on the ground in Ukraine being deployed against the US information ecosystem was extremely alarming. As more came out through the US investigation into Russian interference, it became clear that this was not going to go away.

I was getting really frustrated with the way the US was looking at things, as if we were the first country this had happened to. People were ignoring the fact that this sort of manipulation had been happening in Eastern Europe since 2007. That’s what my book looks at: how five countries in Eastern Europe – Estonia, Georgia, the Czech Republic, Poland and Ukraine – responded to these influence operations.

It’s important to look not just at the disinformation, but at how countries responded and what the best practices for responding are. Right now, the US is still very much playing whack-a-troll. We’re not imposing enough of a cost to make foreign actors stop, as we’ve seen over the past couple of months with not only coronavirus disinformation, but with the George Floyd protests, which bad actors are certainly taking advantage of.

FEED: Are the problems solely caused by foreign actors? What about internal disinformation?

Nina Jankowicz: As I was researching and writing the book, it became an even bigger issue inside the US. It’s not just about foreign disinformation now. It’s also about domestic disinformation. If there’s one thing to understand about online disinformation, it’s that these tools are democratised and that anybody can use them. You don’t have to be a government, or even organised. One person or a small group of individuals can have a big impact.

But we’re now inadvertently – or sometimes knowingly – supporting the goals of malign foreign actors. That’s where we’re headed now – to ‘information laundering’, where, rather than placing ads and creating fake personas on the internet, bad actors can seed narratives in groups and private channels – encrypted messengers, for example – and those then get put out through authentic local voices. That’s a pattern I’ve seen in a couple of the Eastern European countries.

The most important takeaway is that we can’t fight foreign disinformation unless we recognise the domestic disinformation problem as well. That gets us into quite a quagmire in the US. President Trump is a source and amplifier of all of this disinformation and that has stopped a lot of the common-sense, easy solutions we could be implementing by politicising the entire concept.


FEED: How is this different from what governments have been doing for decades?

Nina Jankowicz: There’s a difference between what the Soviet Union did during the Cold War and what Russia does today. The USSR created spurious publications and fake experts to seed these narratives. If you look at, for instance, the fake story about the US creating AIDS, it got some traction, but compared to what can be done with social media using fake personas, it’s night and day. The tools that allow them to drill down and target the most vulnerable people are incomparable to what was available in the 1980s.

They also put out what the RAND Corporation calls a “firehose of falsehood”. It doesn’t matter if it’s supporting one ideological goal or another. We’ve seen support of candidates on both the left and the right in the United States. The same has been true in other countries, including Germany. It’s not necessarily in support of one ideology – it’s to create chaos so that we’re focused on our own internal problems and not paying as much attention to Russia’s adventurism abroad.

Every nation has influence campaigns. That will never go away. But in this case we’re talking about it being farmed out to a nominally non-governmental agency – in this case, the Internet Research Agency (IRA) – to create this air of plausible deniability. Then they go out and impersonate Americans to intervene in our democratic discourse.

FEED: So, how do these disinformation teams operate?

Nina Jankowicz: The IRA used to be in a five-storey building in St Petersburg – although they’ve since moved, so I’m not sure how many storeys it is now. It was uncovered by a bunch of Russian journalists who talked to whistle-blowers from within the organisation.

Originally there was a Ukraine unit, where they tested techniques that were later employed in the US: spamming comment boards, creating fake accounts on VKontakte – the Russian version of Facebook – and other social media networks, and creating fake news sites. That was stage one, in 2014. As all this was going on, they started the US unit, which involved hiring young people for good money – something like $900 a month, which in Russian terms, particularly after sanctions, was a lot. Usually they were folks with a journalism background and a good knowledge of English and American culture.

There were a bunch of units within the America team. Some were focused on fake Twitter accounts, some made memes. They employed two black men who made YouTube videos, which were very successful. This was back when the Black Lives Matter movement was first starting. They had a lot of money to throw at things. It’s not a very strategic communications plan; they’re just throwing spaghetti at the wall, seeing what sticks and redirecting resources.

People know that Russia bought some ads through the Internet Research Agency – $100,000 worth. But it was clear that there was a lot of organic engagement, too. They were creating communities that started out by using positive messaging. There was a Facebook page called ‘Being Patriotic’ that was a right-wing, jingoistic page, very pro-American. My favourite example from that page was a post of a golden retriever wearing an American flag bandana, and the text said: “Like if you think it’s going to be a great week!” That’s not really disinformation – I would probably like that picture, especially if it were posted around the 4th of July. It’s meant to engender community, positive feeling and trust between the people who liked the page and the page moderators.

Then, over time, they would have greater and greater asks of their followers. For the groups targeting black Americans, for example, as the Black Lives Matter movement gained steam, it was about signing petitions, or changing your profile picture in support of people who had been killed or arrested. And eventually, with both the groups on the right and the left, it turned into IRL protests. People were actually showing up to protests that had been organised by the Internet Research Agency. I talked to a guy who had a couple of hundred people show up to his protest after the IRA bought ads for him. He didn’t know it was the IRA.


FEED: How would you rate the UK’s and EU’s handling of the problem?

Nina Jankowicz: I think the EU unfortunately is a bit hampered by its need for consensus on big decisions. Every time sanctions on Russia come up for renewal with regard to Ukraine, there’s a big debate. I don’t give them a great grade. But it’s better than doing nothing at all, which is what the United States has been doing for a long time.

I do think the EU is doing a decent job in terms of using its bargaining power to try to influence the social media companies. I think GDPR is a good thing. I think that’s a win for consumers and users of the internet. It’s just a matter of implementation and making sure that it has teeth now.

The UK has been interesting. I think their approach to Russia has been really clear-eyed for the most part. I wish that it was not sometimes marred by politics. They also have a Counter Disinformation and Media Development fund that was established long before any of this was cool.

Also, in response to the social media companies, the UK parliament and DCMS have been more aggressive than any US regulator. I don’t know if that’s because these aren’t UK companies that they’re regulating. In the US, we have a kind of hubris that we can’t regulate them because we brought social media to the world! And if we regulate them, then we’re going to stifle innovation.

FEED: Is there incentive to regulate how disinformation and influence operations work if those techniques are also being used within a country?

Nina Jankowicz: Again, you can’t begin to address the foreign threat if you don’t recognise the domestic disinformation. In the UK, there are a lot of civil servants, and a couple of high-level politicians as well, who recognise there’s a domestic disinformation problem. We don’t have that in the US. If that behaviour is going to support your cause in the United States, most people aren’t going to call it out. There was a pledge among the Democratic candidates for president not to use disinformation, and not all of them signed it.

My sense is that it’s because it would put the Democrats on an equal footing with the Republicans, who are clearly using these tactics. And Trump has publicly embraced the fact that he’s doing so. I hope that in future presidential contests we see kind of a return to decorum. There’s always been some degree of lying and posturing in politics.

This Genius Interview was first featured in the August 2020 issue of FEED magazine.
