Allow me, for a moment, to talk about NyQuil Chicken. In September the FDA issued a warning against viral TikTok social-media challenges involving medicines—specifically, a trend of boiling chicken in NyQuil as a sleep aid. The FDA cautioned that cooking chicken in NyQuil is “unsafe.” This warning prompted a lot of people in the media to assume that teens on TikTok were poisoning themselves in droves for internet clout.

Of course, they weren’t. TikTok told BuzzFeed News that there never was a NyQuil Chicken trend on the platform; the “sleepytime chicken” recipe originated long ago as an inside joke/shitpost on 4chan, and some screenshots went viral earlier this year on Reddit. But the FDA’s warning triggered a slew of breathless, incorrect news articles about TikTok and NyQuil Chicken, which—you guessed it—spurred a whole lot of TikTok searches for NyQuil Chicken. (If you want to read more about this, Ryan Broderick has a great explainer.)

News outlets and government organizations love a good social-media moral panic. But the big reason why something like NyQuil Chicken gets so much pickup is a general lack of understanding about what is actually happening on TikTok. The platform is, by social-media standards, newish. And because its user base is reasonably young, you have a lot of people writing or commenting about the platform who don’t really use it, let alone understand the customs, trends, and subcultures that bubble out of it.

Also: TikTok functions differently than other social networks in that its algorithmic recommendation engine and “For You” page supplant the traditional follower model. That big distinction, paired with the fact that the inner workings of TikTok’s (very good) algorithm are especially opaque, leads a lot of people to (wrongly) ascribe almost magical powers to the algorithm (like the notion that it is pied-pipering kids to make poisonous poultry).

There’s not tons of great qualitative research about what exactly is going on across the platform or how TikTok’s unique architecture influences the behavior of its users. Which is why I got pretty excited last week when Kevin Munger, Benjamin Guinaudeau, and Fabio Votta released a new paper looking at “TikTok and the Supply Side of Social Video.” Munger, a political science professor at Penn State, told me he wanted to look at the factors “that determine who makes (and how they make) the content that flows throughout the platform” with an eye toward political content.

Drawing on Munger’s prior work analyzing political channels on YouTube, the team was able to look at some of the differences between the ways that politically active users post and comment on the two platforms. I called Munger up this weekend to chat about the research. What comes next is a conversation that starts at TikTok and ends with Munger’s provocative theories on the not-so-slow death of literary culture in general. It’s long, but well worth your time.

Warzel: Let’s start out broadly: Why the focus on TikTok?

Munger: I started getting interested in TikTok during the pandemic and started learning how to scrape it. I am very interested in developing a theoretical framework that looks at the supply side for different social media and why users create the kind of content they do for the platform. With TikTok I think it’s important to focus on a nascent platform because if we ask the right questions about a social-media platform early on, it will have a big downstream impact on the research that comes later and the way we conceive of these platforms.

Warzel: What do you mean by that?

Munger: Take YouTube. If the only question people care about is Does YouTube’s algorithm radicalize people?, it sucks up all the oxygen with research and commentary. And it’s not the only interesting question about YouTube. What we wanted to do is to take the social context of the platform more seriously and think about how the site’s design interacts with the users rather than suggesting that the platform does shadowy stuff to the user. I think a focus on user agency is really important.

Warzel: What’s different about TikTok?

Munger: There are two big value propositions for TikTok users. First, there’s great, intuitive, and powerful editing software, which makes it very compelling for casual users who want to make something with minimal effort. Second is the recommendation-forward approach, which makes it much less of a social network than other platforms.

What’s important about this, from our perspective, is that if you start posting from scratch you’re guaranteed an audience, even if it’s just a small one. The allure of TikTok is that virality-from-nowhere idea—which means that, if you hit, you don’t have to go through the tedious process of generating a network, like you do on other platforms. And we hypothesized that this would mean we would see an increase in the percentage of the user base that’s posting, compared with YouTube.

Warzel: Did you see that?

Munger: Oh yeah. We looked at users who left comments on TikTok videos and searched to see if they’d also posted their own content—and the difference between TikTok and YouTube is stark. On YouTube, only 18 percent of commenters we looked at created five or more videos. On TikTok, the number was 78 percent.

Warzel: That’s fascinating. I wonder how that impacts the way people use the platform. My knee-jerk assumption would be that people might be less inclined to be assholes—or at least, more aware of their commenting behaviors—if they also know what it’s like to post content. On other platforms, I notice that some of the awfulness comes from commenters thinking of the platform as a kind of shitposting game, while on the other side you have creators who are trying to make a living. Maybe this changes up that dynamic?

Munger: I hadn’t thought of that, but it’s an interesting theory to test.

Warzel: So, if TikTok’s algorithm-forward approach is inspiring higher percentages of users to post, instead of lurk, what else is the algorithm incentivizing?

Munger: There’s this interesting strategic logic going on with TikTok. On other platforms, you have influencers who accumulate their own audiences over time and platforms get to the point where they have users with huge numbers of subscribers. And, essentially, these people start to have real influence over the platform.

Think of somebody like Joe Rogan. Once you have that level of fame, you can exist outside of the platform’s ability to match supply and demand. If you’re big enough, you create your own demand. But TikTok has the ability to subvert this and keep creators in a more precarious state with the recommendation algorithm–forward approach. We looked at variance on the number of views that users’ videos had, and it’s unsurprisingly much higher on TikTok than YouTube. On YouTube, creators have a more stable audience than most TikTok users do.

Thinking of Uber as a metaphor is helpful. Uber wants to maximize its number of consumers, so it wants to reduce the bargaining power of drivers in order to push more consumer surplus and value to riders. The same thing is true on TikTok. The platform wants creators to be as precarious as possible so they keep pushing and creating—both because the audience numbers are fickle and because that promise of seemingly random virality will incentivize more new users to post.

Warzel: You’re saying from the perspective of the platforms, this boosts user growth and prevents creators from having power? The Uber example is an interesting framework—and that idea of trapping users in that state of precarity is pretty grim. It makes me think of the different trade-offs between platforms. Obviously there are massive TikTok stars, so there’s definitely sustained levels of fame that people can achieve on TikTok, though maybe it is a smaller percentage of the user base compared to other platforms. But it’s interesting if TikTok is keeping a larger percentage of users from becoming too big for the platform and, as you put it, creating their own demand. I think when this happens on other platforms, you can run into a problem where the influencers start to get more and more extreme as a result of their audience capture. So maybe the precarity is better in some ways for the health of the platform. But then, of course, trapping people in this anxious state about the size and loyalty of their audience is an awful way to treat people. It doesn’t seem like either is a great option.

Munger: Yeah, I’m not sure we’re in the position to have that kind of reckoning about the trade-offs between platforms. But I take the point that there are downsides to both.

Warzel: What else did you take away from your research?

Munger: I think we’re probably underestimating how important an innovation the recommendation-forward algorithm is in media technology. The combination of the short video and the algorithm is something people really like, but, more importantly, it seems that people are really adept at using it. What I mean is that more people seem naturally able to put out a decent TikTok video than, say, can write a very good tweet. That sounds flippant, but I actually think it matters. Social video is taking off.

The memes are also different because they’re so much more embodied. Instead of an abstract set of symbols, the memes are connected to people’s physical bodies. Like, sometimes meme formats influence how people construct their background or environment or how they set up their camera or, in the case of dances, how they move their bodies. [The body is] so central on TikTok.

TikTok memes are both more embodied and more connected to identity. I think it really moves us further away from the Californian-ideology vision of an anonymous internet toward an emphasis on the identities and bodies creating the content—even more so than Instagram. And I think all of this is something that academics and writers and people who use words for a living don’t like, or don’t fully understand. It requires quite a bit more qualitative engagement to understand what’s going on. Quantitative analysis is good—it’s something I do. But, more than anything, you have to live in this world to understand it.

Warzel: What strikes me about your theory of embodied memes is that TikTok then kind of forces you to reckon with the fact that there’s a human being behind the content you’re interacting with. Sometimes on other platforms—I’m thinking Twitter especially—with text or stationary-image memes, the content can feel like an abstraction, or like it just kind of appeared out of nowhere. But with duets on TikTok, you’re usually responding to an actual human being. I have no idea whether that changes posting behavior for better or worse, but it feels like a meaningful change.

Munger: I think parts of it are good in terms of the social norms involved. People seem much more focused on attribution and credit on TikTok. There’s obviously a lot of meme-stealing, but I think there’s more awareness, generally. We didn’t look at this formally in the paper, but the duets are fascinating. As an initial finding, it seemed that the percentage of cross-partisan content is much higher than on other platforms. The duet feature (granted, a lot of what we looked at was content from 2020) showed us a lot of teens who were either quite Republican or quite Democratic arguing with each other in a way that doesn’t really happen on YouTube. On YouTube and other platforms you have a lot less direct cross-partisan arguing—yes, you have debate videos and such, but the partisan content is more siloed.

Also, I should note that, when I looked at a lot of these cross-partisan duets, personally, I’d say that the quality of these discussions was generally pretty terrible and uninformed. But, also, that’s not anything new. Teens have been having bad, uninformed arguments about politics forever!

But I think there’s a lesson there. It gets back to the broader hand-wringing of what political content is consumed. We get very obsessed, generally, about fake news or disinformation and the idea of really insidious stuff getting promoted into the feed. And, honestly, it’s probably more likely that what people are getting fed is actually just poorly argued content.

Warzel: Yeah, I think it’s always hard to overestimate the amount of garbage out there, and I think it’s important to care about it, because it’s often as influential as, or more influential than, say, the truly nefarious, polarizing stuff.

Munger: There’s this sense a lot of media commenters have that is still grounded in an idea of elite control of messaging. But I think the reality is that the era in which any elite group—whether it’s trying to inform or deceive—has control over messaging is really going away.

Warzel: This is actually something that I’ve found really interesting in your work. You’ve been writing for a bit now about what you view as a massive cultural shift relating to the written word. I’m going to quote an article of yours where you argue that “the implicit belief in the efficacy of writing, deliberation and democracy held by words-style people has become increasingly incorrect.” In that piece you talk about the way this relates to written journalism, which you argue is “collapsing,” and about the way that new media technologies fundamentally rearrange society and accelerate the death of literary culture. I think it’s safe to say your TikTok work is a way to look at that. Am I right?

Munger: Absolutely. We have a paragraph in the paper that speaks to this dynamic. We say that “We’re going to write a bunch of words about what TikTok is, but if you don’t understand it and haven’t used it…you won’t have a good intuition of what it is from reading this.” I think that’s part of the change.

To read the rest, subscribe to The Atlantic.