Putin’s Propaganda Isn’t What Russians Are Used To

Doing terrible things in the open is propaganda, too.

(Getty/Chris McGrath)

Welcome to Galaxy Brain—a newsletter from Charlie Warzel about technology, media, culture, and big ideas. You can read what this is all about here. If you like what you see, consider forwarding it to a friend or two. We’re still figuring things out in our new home so let me know what you think: galaxybrain@theatlantic.com.

This is a subscriber-only edition of the newsletter. You can subscribe to The Atlantic to get access to all posts. Past editions I’m proud of include: Guardians of the Internet, Don’t Alienate the Willing, and How to Spend 432,870 Minutes on Spotify in a Year.

In the Atlantic article I published on Tuesday (and sent to Galaxy Brain readers), I spoke to a lot of people about the need for nuance in discussing “the information war.” One person I quoted but didn’t end up speaking with was Peter Pomerantsev, a senior fellow at Johns Hopkins University’s SNF Agora Institute and a scholar of Russian propaganda. Peter had one of the more interesting observations about the limits of the term “information war.”

I reached out to him this week to talk a bit more about what he’s seen so far in this conflict.

Russian Info Warriors Caught Flat-Footed

Pomerantsev offered an explanation for why the Russian propaganda apparatus seemed so feeble in the early days of the conflict:

Clearly nobody told the Russian propaganda machine that the war was coming. It’s obvious that they had no time to prepare and they can’t do a comms campaign on the fly. Everyone is saying “Russian information operations failed,” but to me it is clear a lot of domestic propagandists just weren’t told. Putin moved forward with this invasion without the information and cyber brigades…and by the sound of things, without good parts of the army. But that doesn’t mean they won’t get back together.

"Propaganda of the deed"

Part of my job is to write about the ways that information travels. Watching the tragic footage coming out of Ukraine over the past few weeks has been a reminder of the limitations of this work. The digital side of this conflict and the narratives that form around it are obviously important, but they are secondary to what’s happening on the ground: the killing, the leveling of cities, the displacement of hundreds of thousands of refugees. When I said this to Pomerantsev, he responded with a term I found useful, and chilling: “propaganda of the deed.”

It’s not helping the Russians that they don’t have the information side activated right now. But Putin’s doing something else. He’s doing propaganda of the deed. He’s doing just terrible things right out in the open. Saying all the quiet parts out loud. Publicly entertaining the taboo topics of nuclear use and chemical weapons. He’s saying, “I’m here doing this and you’ll just stand there not doing anything and I’ll show the world you’re all a joke.” It’s propaganda of the deed instead of spin and narrative-setting.

He suggested that Russian troll farms like the Internet Research Agency, which got so much attention in the wake of the 2016 election, were “just a little cartoon — the ads and the shorts before the feature presentation.” Ultimately, Pomerantsev told me, the destruction we’re all bearing witness to is the real—and most powerful—narrative.

Although he did offer this caveat: “But Putin might be fucking up. Obviously, the war is not the elegant operation he anticipated, and the propaganda of the deed here could certainly and totally backfire, depending on how the war goes.”

The Open-Source Investigation Template

Last week, I spoke with Eliot Higgins, one of the founders of the open-source investigations project Bellingcat. The Centre for Information Resilience and the Conflict Intelligence Team have worked with Higgins’ team to build a sprawling set of resources to monitor, verify, debunk, and archive the battlefield footage that’s popping up on social media. Perhaps the best resource is the Russia-Ukraine Monitor Map, which allows any interested party to search these social posts (by region or type). (Warning: There is quite a bit of graphic footage in there, though it is labeled as such.)

This network, Higgins argued, is vital not only for penetrating the fog of war but also for tracking and archiving potential war crimes for accountability purposes later on. In our discussion, Higgins said that open-source investigators have been working in Ukraine since 2014, after the downing of Malaysia Airlines Flight 17 in Donetsk Oblast. These investigators developed a good understanding not only of the conflict but also of the region, and knew which feeds to check and how to evaluate the footage. Their years of work made documentation and verification that much easier when Putin launched his invasion.

Higgins urged the creation of similarly robust investigative networks in other regions of the world.

Higgins: This preexisting network in Ukraine is crucial to the efforts you’re seeing right now. And it could be the same in other places around the world, but we have to put in the work there. I’d rather not be in a situation where it takes a high-profile, terrible thing to happen in some country for the West to care enough about it to start monitoring. If we want the truth and accountability in these conflicts, we have to document them early.
Warzel: That sounds like a tall order. Do you think you can set up these networks?
Higgins: What we’ve seen with Bellingcat volunteers is that we have people who come in because they are interested in the specific region or a topic. There are also people who simply just like investigating and peeling back the information and following it. We need to identify more of these people, and we’re trying to build out our own Bellingcat volunteer community. We’ve been training people in this work and engaging with broader online communities—that’s what we’re doing in Latin America, where we’ve applied open-source investigation to work journalists there have been doing. We’ve taught some of them to do this work themselves. But we have to keep building this out.

Bellingcat’s website is something to check out—it has some educational sessions on the books, too.

Video Formats in Wartime

On Monday night, Ukrainian President Volodymyr Zelensky posted a video of himself in his office in Kyiv. A lot has been written lately about his effective style of communication with Ukrainian citizens and the world, but this video in particular stood out to me. It begins with Zelensky on his phone, camera facing out through his office window. Then he flips to the selfie view as he walks down the corridor toward his desk, talking to the camera. Once he sits down, the POV changes once again, to a fixed position inside the room, showing Zelensky holding the phone he was just filming with. He then begins a more formal style of address—similar to what we’ve come to expect from political speeches.

It’s a very simple edit, but a powerful one. The first part communicates Zelensky’s position—a kind of proof of whereabouts—but the self-shot footage also provides intimacy and authenticity. That is then spliced with a more authoritative, institutional form of communication (the desk shot). I found the blend remarkable in its attention to detail. The self-shot angle is humanizing, a reminder of his—and the Ukrainian people’s—vulnerability, while the speech itself is mostly given in an official capacity. When I tweeted about the clip, hundreds of people responded to remind me that Zelensky and his team have show-business experience. While that’s undoubtedly true, it’s still striking that they thought to put that experience to use amid the stress of wartime.

For more on this, I really enjoyed James Poniewozik’s column.

Rules on the Fly

This recent news—that Facebook is now "allowing war posts urging violence against Russian invaders"—reminded me of part of my conversation with Brandon Silverman last week.

Silverman, who founded CrowdTangle and spent a lot of time inside Facebook after it bought the company, warned about platform takedown processes and a lack of transparency on the digital side of this war. He’s worried that companies will respond quickly to events on the ground and remove accounts and channels without thinking through the decisions and—more importantly—without archiving the content. One bit from our conversation that didn’t make it into the piece:

What people generally underestimate about social media is the degree to which a lot of the problems we see are execution errors. There are decisions being made, but the process is not executed properly. For example, I’d bet there’s some state-controlled Russian media that’s viewable in the EU on these social platforms, because in their rush to take things down the first time around, somebody, somewhere on a platform forgot to include groups or widen the search parameters.

It’s an important reminder that so much of what we see from big social media companies isn’t the result of rigid rules and processes and value systems. It is, instead, reactive. “A lot of what happens in these nuanced situations is way more poetry than science,” he told me. “I think we tend to overestimate the philosophical coherence and think we have these set precedents that guide us for all future decisions. It turns out, it’s really hard to do it, and these platforms change precedents all the time because creating principled, universal guidelines for an increasingly complex world is hard.”

This is why Silverman is arguing for more transparency, both in how the platforms surface information and in how they make important moderation decisions. Not only is it helpful for outside evaluation by researchers, civil society, and even politicians—it also helps the platforms build a framework for learning from their mistakes.

Charlie Warzel is a staff writer at The Atlantic and the author of its newsletter Galaxy Brain, about technology, media, and big ideas. He can be reached via email.