On Wednesday, The Washington Post’s Taylor Lorenz reported that the Department of Homeland Security was “pausing” its Disinformation Governance Board just weeks after announcing the initiative. The board has drawn intense and broad criticism since the moment it was announced, and its head, the author and information researcher Nina Jankowicz, has been subjected to a ceaseless harassment campaign across the internet and scathing coverage across right-wing media and cable news (as Lorenz reports, in the week after the announcement “approximately 70 percent of Fox News’s one-hour segments mentioned either Jankowicz or the board”). Shortly after the story came out, Jankowicz resigned from the board.
Lorenz details how the Department of Homeland Security and the Biden administration hung Jankowicz out to dry during the attacks from the right, telling her not to engage. Employees inside DHS who spoke anonymously described the lack of support for Jankowicz and the board and what appears to have been a panic in the face of so much bad publicity.
The reaction to the news of the pause has been almost uniformly awful. On Wednesday, a Twitter search of the URL to the Post’s story was arguably one of the more depressing places on the internet. It will not surprise you that those who’ve obsessively tweeted about Jankowicz (her qualifications, both real and those that were misrepresented in bad faith; her TikTok presence) gleefully tweeted the story while also expressing their outrage that the Post’s framing of the story focused heavily on the harassment of Jankowicz. (This is not a bad example of my recent newsletter on the right’s “sore winner” complex!)
There’s also an avalanche of people quite upset that the Biden administration caved to the deluge from right-wing media—yet another example of an establishment organization (in this case, DHS; in other cases, mainstream media organizations) that doesn’t seem to understand how bad-faith campaigns work on the internet. Everyone is mad, a woman gets death threats and is forced into silence, and even the winners barely think they’ve won. Honestly, just another day in Politics Online.
The Disinformation Governance Board saga exemplifies the Cursed News Story, engaging and enraging many people while simultaneously destroying any complexity surrounding the actual events it is about. A good Cursed News Story is typically about an extremely polarizing, often poorly defined subject (in this case, disinformation). It involves divisive characters or institutions that most people have preconceived notions about (here, the Biden administration). But the essential ingredient is some underlying complexity that’s inevitably glossed over in the broader discourse.
The Disinformation Governance Board is a fraught concept that was poorly executed. It has a dystopian-sounding, bureaucratic name that conjures images of truth-judgment panels. It was rolled out casually, with few concrete details or an explanation of what the board would do. Early reports mentioned tackling Russian disinformation and issues around migrants but also included wildly vague statements that suggested the board would help “protect privacy, civil rights, and civil liberties.” Okay! But anytime you’re rolling out a government organization dedicated to focusing on the veracity of information, the details matter. As Techdirt’s Mike Masnick and Cathy Gellis wrote in April, the mishandling of the entire operation only served to cast doubt on the people running it:
Everything about the way this Disinformation Governance Board has been rolled out has been a disaster. The lack of clear information about what it is, what it does. The naming of it. The fact that the White House simply left this giant open void to be filled by the misinformation peddlers themselves, suggests that the White House itself is not at all comprehending how any of this works. And that, alone, does not bode well for this terribly named board.
Not good! Now, here’s how Lorenz described the governance board in her piece Wednesday:
The board was created to study best practices in combating the harmful effects of disinformation and to help DHS counter viral lies and propaganda that could threaten domestic security.
This, of course, seems more reasonable. A few people I’ve spoken with who have some knowledge of the board suggest that the group likely would have acted as a rapid-response group, identifying online campaigns spreading verifiably false information, then providing fact sheets to any organizations looking for guidance on the issue in question. Here’s a hypothetical example: Remember Sharpiegate—the idea that voters were having their ballots invalidated because of the pen they used? This fake scandal has gone viral a number of times on social media, and each time news outlets and fact-checkers reach out to government officials to ask about the Sharpies. Each time they get something like this:
Jenna Dresner, a spokeswoman for the Office of Election Cybersecurity, told Reuters via email that using a Sharpie did not invalidate ballots. She said there was no explicit law defining what type of writing instrument should be used.
Perhaps an organization like the Disinformation Board would have resources available to counter well-worn viral lies like this one in an expedited fashion. At least that’s my most generous interpretation. And yet there are also endless reasons such a board could go awry. It could make an error in one of its hypothetical fact sheets that looks partisan in some way. It could make a hasty judgment on something that seemed false, but ended up having a kernel of truth to it. One of its members could say something foolish or ill-considered in public—or in private, which is then leaked—and it could undermine the board’s credibility.
Perhaps most worrying is what might happen when the board changes hands under a new administration. Joe Biden’s Disinformation Governance Board may want to make fact sheets and help organizations with “best practices.” Donald Trump’s Disinformation Governance Board may expand the scope of its powers in ways that worry speech advocates across the political spectrum (a new Texas law is raising exactly those sorts of concerns right now).
Here is what the self-described “nonpartisan, nonprofit” organization Protect Democracy wrote in an open letter to DHS Secretary Alejandro Mayorkas after the board was (vaguely) announced:
Rather than building trust, demonstrating a commitment to civil rights, and explaining itself fulsomely, the Department’s public announcement of the Board has fallen short on those fronts. The lack of clarity is particularly concerning, given the Department’s poor track record when it comes to “monitoring” individual’s First Amendment–protected activities. From illegally surveilling Reverend Kaji Douša after she spoke out against the Trump Administration’s immigration policies, to gathering intelligence on reporters that cover the Department, to monitoring the social media accounts of visa applicants, the Department has demonstrated a readiness to cross the legal bounds of privacy and speech rights. Coupled with the Department’s checkered record on civil liberties, the Department’s muddled announcement of the Board has squandered the trust that would be required for the Board to fulfill its mission.
There’s clearly a lot to be concerned about when it comes to the Disinformation Governance Board. But none of that begins to excuse what Lorenz details about the threats to Jankowicz:
In response to one post on Gab featuring a video of Tucker Carlson discussing Jankowicz, users commented: “Time to kill them all.” Another post featuring Carlson’s coverage of Jankowicz was shared to a right-wing forum with the caption “This is the point where we have to draw the line.” Comments said Jankowicz should be “greeted with Mr. 12 Gauge Slugs.” An April 30 post on Gab featuring a tweet by Rep. Lauren Boebert (R-Colo.) telling her followers “this is the hill to die on” sparked replies that were flooded with threats to Jankowicz’s life. “It’d be easier if we had a large group of trained assassins to take a lot of the [government] bastards out first,” one user wrote.
These egregious examples are jarring, and yet they still don’t reflect the full scope of the harassment. To understand that, people need to see the way that networked harassment works online. (Lorenz, who was criticized by the right for her framing of the piece, knows this firsthand.) As I’ve written before—drawing on UNC associate professor Alice E. Marwick’s “Morally Motivated Networked Harassment”—a lot of online harassment is hard to see and even harder for platforms to police. Many of the worst offenders (big accounts, cable-news hosts like Tucker Carlson) don’t themselves break rules but instead amplify people who do:
Put another way: serial amplifiers—especially the savvy ones who don’t themselves harass but signal their disapproval to their large audiences knowing their followers will do the dirty work—tend to get away with launching these campaigns, while smaller accounts get in trouble. A good amplifier knows how to create plausible deniability around their behavior. Often they say they are ‘just asking questions’ or ‘leveling a fair critique at a person or public figure.’ Occasionally, this is true, and individuals inadvertently kick off networked harassment events with good faith criticisms. Sometimes, people with big accounts forget the size of their audience, whose behaviors they can’t control. These social media dynamics are messy because they vary on a case by case basis.
Jankowicz was undoubtedly experiencing this kind of harassment on multiple fronts. I’ve seen a number of people online debating whether it is actually coordinated (and there’s evidence to suggest a fair amount is automated or inauthentic). But it’s not all shadowy cabal stuff. Part of what makes networked harassment so awful is that bad actors don’t have to “coordinate” the mobs via some shady back channel—this stuff now happens at scale organically. As Marwick notes in her study, “Networked harassment is a tactic used across political and ideological groups and, as we have seen, by groups that do not map easily to political positions, such as conflicts within fandom or arguments over business.” The right-wing-media ecosystem happens to run these campaigns more efficiently and with a unique cruelty, but the pile-on dynamic is also how big parts of the social internet work.
And so, as Lorenz reports, DHS’s response to this kind of morally motivated networked harassment is extremely disappointing. I have no way to know exactly what the administration brass was responding to when it chose to silence Jankowicz and pause the board, but my suspicion is that it wasn’t the carefully considered blog posts or open letters from policy shops. As has happened in newsrooms time and again, leaders who are less familiar with the dynamics of social-media campaigns and how Fox News prime-time segments come together—but know enough not to want the bad press—freak out. As I wrote last June:
They see outrage building, realize it is pointed at their institution, and they panic. They panic, in part, because they don’t understand where it’s coming from or why it’s happening or what the broader context for the attack is. This is when leaders make decisions that play into the hands of their worst faith critics.
The irony of the disinformation expert being sidelined and unable to respond to a networked smear campaign against her is obvious. It is what makes the story enraging (to one group) but also broadly newsworthy. It is a picture of a government organization stepping in it in every direction. The entire saga seems to illustrate that those at the helm of the federal government are deeply concerned about and interested in information warfare but appear to have very little understanding of the battlefield.
But much of the nuance in this story will continue to be lost. Jankowicz will likely become a sort of icon to the far right. The shitty nicknames will follow her, and her appointment will become grist for the reactionary whataboutism mill anytime there is an overstep from the right on a speech issue (see again: Texas).
There will also be people glossing over the concerning parts of the board’s formation because the attacks on it have been so brutal and in such bad faith. I also predict that this story will keep popping up, like Hunter Biden’s laptop. Here’s how my colleague Kaitlyn Tiffany described the laptop’s legacy back in April:
[It] is an icon of our information ecosystem’s dysfunction. Some journalists relied prematurely and too much on popular frameworks when covering it. The story really was suppressed by tech giants. But it also really was complicated, and required time and resources to investigate.
As Tiffany wrote, the tech companies have either elaborated on their decisions concerning the laptop or, in Twitter’s case, apologized. She also notes that “if federal prosecutors indict Hunter Biden for possible financial crimes, it will not be solely on the basis of the man’s laptop, so the case could be made that the thing doesn’t matter much anymore. Yet it isn’t going anywhere. Why would it? It’s perfect!”
The laptop is a Cursed News Story in part because, as Tiffany writes, it has become “shorthand, and it makes an easy point.” It is, in other words, a meme. I see the same happening with the Disinformation Governance Board. Many seemingly conflicting things are true about it at once, and yet the memeification demands a flatness and simplicity.
There are lessons in this saga—about communication strategy, morally motivated networked harassment campaigns, government intervention in speech issues (or at least the perception of potential government intervention in speech issues), and the whole political/cultural information war. But I don’t for a minute believe that the right people—DHS or the legion of awful, bad-faith goons, to name a few—will learn much from this. We will, however, continue to fight about it.