Welcome to Galaxy Brain -- a newsletter from Charlie Warzel about technology and culture and big ideas. You can read what this is all about here. If you like what you see, consider forwarding it to a friend or two. We're still figuring things out in our new home so let me know what you think: galaxybrain@theatlantic.com

I’ve been thinking a lot lately about programming dead zones on TV. I distinctly remember being sick and home from school in the early aughts and lying on the couch aimlessly trying to find something, anything, to watch on TV. All those channels and nothing on.

There was plenty of popular stuff on in the middle of the day (soap operas are an obvious example, and also everything from Ellen to Dr. Phil to Jerry Springer), but I am not sure you’d classify it as high-quality content. Between the morning shows and prime time and the local news, you had a lot of self-aware melodrama, vapid celebrity news, feel-good interview programs, and manufactured conflict (see again: Jerry Springer). Daytime TV is engineered, in part, to kill time and accompany the mundane tasks of home life. I’d classify the bulk of the content as emotionally stimulating, somewhat easy to produce in high volume, and easy on the brain.

The other dead zone that comes to mind is that time after about 1 o’clock in the morning, when the late shows are over and live sports have wrapped up. On the basic TV channels, you’re getting straight infomercials or very strange, local-TV, semi-sponsored travel programming. In the early 2000s, Girls Gone Wild somehow managed to buy blocks of time in the middle of the night to peddle their gross, exploitative, topless spring-break tapes. On cable channels, it wasn't much better. Networks like MTV and VH1 would just go on cruise control with whatever garden-variety reality show they had going at the time in the hopes of hooking somebody into an ill-advised insomnia binge. It’s 2 o’clock in the morning—time to run nine straight episodes of Real World/Road Rules Challenge.

I was reminded of this bit of TV nostalgia a few months ago when Facebook (or Meta, if you want to acknowledge the company’s rebranding effort) released its first transparency report detailing the most popular content on its platform. The report, as I’ve written, was weird. The most-viewed link on Facebook in the report’s three-month documentation period was for the website of a Wisconsin company that connects Green Bay Packers fans to former players of the team; a CBD company had the second-most-popular link. A closer investigation revealed that a couple of popular pages (one from a former Green Bay Packers kicker and one from ’90s TV star Jaleel White) were likely spamming the links out to their sizable followings via auto-scheduled posts. Facebook caught a good bit of flak for the report after The New York Times reported that the company shelved an earlier edition that included a conspiracy site among the top 20 links. It sure looked like the company was obscuring the bad popular stuff on the platform by highlighting the extremely peculiar popular stuff on the platform.

Facebook can, indeed, show its data selectively in order to make itself look good—but I couldn’t stop thinking about all the weird links. What if being surrounded by weird, garbage-y, spammy stuff is a more universal experience of using Facebook than being surrounded by awful, democracy-destabilizing content?

A few weeks ago, over at the newsletter Garbage Day, Ryan Broderick and Luke Bailey embarked on a little investigation. Using aggregated CrowdTangle data from Facebook posts with the highest number of interactions, the pair noticed the following:

There are a lot of right-wing pages up there—Ben Shapiro, Dan Bongino, Donald Trump For President, Fox News, etc.—but if you look closer, you may notice what Luke has been noticing. There are a lot of weird animals pages in there all of a sudden.

What are weird animal pages, exactly? The biggest one is called Love Meow, and it has 4 million followers, but, as Broderick and Bailey point out, there are also websites like Top13, which has the catchphrase “Pawsome Animal Stories!” Top13 made it into Facebook's top-performing posts when a piece published in 2015, “Shelter Pit Bull Made His Bed Every Day Until a Family Adopted Him,” inexplicably went viral this year. I’d urge you to read the Garbage Day post in full, as it’s a nice peek into the bizarre nature of Facebook’s dead-zone programming.

On Tuesday, Facebook released its second transparency report, which is full of similar and equally strange stuff. The top two links, which garnered 92.7 million and 64.9 million views, respectively, were the same Packers and CBD pages. Some of the top links make sense to me (a recipes website, vaccines.gov, one link with 35.8 million views that Facebook won’t show, because “This link was removed by Facebook for violating Community Standards”). But most of the links just lead to spammy, clickbait-y content.

Many of the pages seem to simply repost screen-grabbed photos of recycled memes (a tactic that’s very popular among local-radio-station Facebook pages). The most popular pages include celebrity-gossip sites (People), various cooking blogs, mom-focused content, the Australian branch of the popular viral dude-content site LADbible, and, of course, the Falun Gong–backed newspaper The Epoch Times, which doubled down on publishing right-wing misinformation during the Trump era. The most popular individual posts are almost all text cards with prompt questions like “Who can honestly say they never had a DUI? I’ll wait.” (94.3 million views) and “Name something that a lot of people like, but you can’t stand?” (82.4 million views).

Clicking through these pages can feel like flipping through the channels during a programming dead zone. Some posts are truly vapid, recycled, or low budget, like the 2 a.m. channel scroll. Other posts approximate the feel of listless daytime channel surfing: lots of time killers and “on in the background” content sandwiched between melodrama.

Importantly, lots of this content is not offensive in any way. There’s some worrying misinformation and propaganda in Facebook’s list; there are also some legitimately helpful resource pages. But the bulk seems to be this quickly published, clickbait-y grist for the viral Facebook mills. It’s not quite spam, because people engage with it, but it is created and published much like spam by content merchants who throw as much shit at the wall as possible to see what sticks.

Does all this viral grist clogging up our news feeds matter? It depends. The existence of popular, vapid meme pages is probably not a national crisis any more than, say, the proliferation of sudoku puzzles or Ole and Lena joke books. It’s not that interesting as moral panics go. It won’t attract nearly as much attention as the Facebook Papers’ revelations about the platform’s negative effects on teen body image or its ability to cause social unrest in developing regions of the world—nor should it. But it’s also worth thinking about the cumulative effect of a platform on its users when a meaningful amount of the content that flows through that system is dead-zone programming.

Last week, The Wall Street Journal reported that, according to an internal Facebook survey, “1 in 8 of its users report engaging in compulsive use of social media that impacts their sleep, work, parenting or relationships.” The researchers behind the study suggest that these issues “affect about 12.5% of the flagship app’s more than 2.9 billion users, or more than 360 million people.” The piece profiles a few individuals who have legitimately disordered relationships with the platform and may be, or at least feel, genuinely addicted to Facebook.

What I find more interesting are the edge cases—the people whose lives aren’t being ruined, but who feel like they’re spending more time on the platform than they’d like. From the study:

The researchers also wrote that they had a more detailed understanding of the aspects of Facebook that triggered the issues, which they said include getting too many notifications, videos that play automatically, uncertainty over whether they will see posts from the people they want to follow and ephemeral content that users felt compelled to watch before it disappeared, among others.

Many of these triggers are based on the platform’s algorithmic architecture, but some of them are also likely due to the content people are viewing. Banal viral grist is likely fine in small doses, but the cumulative effect generally feels pretty crummy. I’ve experienced this on numerous platforms (recently, it happens to me on TikTok), where some mindless scrolling morphs into an hour-long binge and I put down the phone feeling almost hungover. I feel overstimulated and a bit bummed out about how I spent my time. You can see how this gets bleak when it becomes habitual. It could be, as Facebook’s own research teams suggest, bad for you after a while.

At a time when lawmakers, journalists, whistleblowing tech workers, and regulators are examining Facebook’s role in our lives and culture, the conversation is focused almost exclusively on disinformation, conspiracy theories, and other deceptive content.

I get that these areas are quite serious and overlap nicely with politics, which makes them a popular and consequential subject of conversation. We shouldn’t turn our focus away from networked propaganda or the use of Facebook to help perpetuate authoritarian regimes across the globe. But if you’re actually interested in addressing problems with Facebook’s architecture, there’s an opening in the dead-zone programming arena. The empty-calorie viral posting may not be a problem of the same magnitude as networked propaganda, but it is a problem that’s born of the same algorithmic incentives. Both types of posts are the result of content farmers endlessly spamming and A/B testing low-quality posts designed specifically to trigger emotions and keep people engaged, no matter the cost.

Focusing on the glut of garbage content that rides Facebook’s algorithmic rails is inherently less political than targeting specific bits of political content. I’ve seen enough congressional hearings with Big Tech executives to know that these conversations generally go nowhere. Republicans complain of censorship while their content dominates these platforms, and Democrats look for tighter, more consistent enforcement policies. Both sides dislike the power wielded by Silicon Valley, but they can’t agree on the specific manifestations of this problem. Focusing on clickbait might not be the key to “fixing” the platforms, but it could be a place to start and build consensus about the ways that platforms incentivize low-quality content.

The historical comparison that feels most apt is Newton N. Minow’s famous “Vast Wasteland” speech, delivered in 1961. Minow, then President Kennedy’s newly appointed chairman of the Federal Communications Commission, delivered a scathing criticism of the broadcast-TV landscape. Here’s a little snippet:

When television is bad, nothing is worse. I invite each of you to sit down in front of your television set when your station goes on the air and stay there, for a day, without a book, without a magazine, without a newspaper, without a profit-and-loss sheet or a rating book to distract you. Keep your eyes glued to that set until the station signs off. I can assure you that what you will observe is a vast wasteland.
You will see a procession of game shows, formula comedies about totally unbelievable families, blood and thunder, mayhem, violence, sadism, murder, Western bad men, Western good men, private eyes, gangsters, more violence, and cartoons. And endlessly, commercials—many screaming, cajoling, and offending. And most of all, boredom. True, you’ll see a few things you will enjoy. But they will be very, very few. And if you think I exaggerate, I only ask you to try it.

The speech didn’t purge violence from television, and it didn’t turn every channel into PBS, but it did spark a regulatory conversation that not only advanced public broadcasting but also created the conditions in which independent production companies could flourish, most notably those of Norman Lear (All in the Family, Maude, The Jeffersons, and many more) and MTM Enterprises (Mary Tyler Moore, The Bob Newhart Show, Rhoda). Notice that these weren’t snobby, elitist shows; they were well-made shows that people enjoyed watching.

Minow’s tactic was not to focus on political content; the “fairness doctrine,” which required broadcasters to present controversial issues in an equitable and balanced way, had been in place since 1949. And he took pains to ensure that the FCC wouldn’t “muzzle or censor broadcasting.” But his aims were somewhat political. Minow believed that better media could help strengthen America’s democratic aims in the face of a Communist threat. “The old complacent, unbalanced fare of action-adventure and situation comedies is simply not good enough,” he said in the speech. He also understood that part of the reason the content was so, well, wasteland-ish was that the film studios, frantic to make up for the market they had lost to television, had figured out how to use their existing infrastructure to roll out miles of low-quality genre fare, which the networks were, in turn, eager to buy and broadcast. Without some sort of regulatory check, that would have been the future of television.

The television landscape of Minow’s time is not totally comparable to the scale and complexity of the internet. Facebook has billions of users; every person’s experience of the platform is different. But Facebook suffers from some of what Minow laid out 60 years ago—and his point about how the medium is both a product and a cause of boredom feels particularly apt.

What would a comparable regulatory approach for Facebook look like? Maybe it’s one that offers more stability for creators in the form of trade unions or guilds for influencers. A collective-bargaining organization might pave a path toward what the investor Li Jin calls “the creator middle class.” Instead of having to chase algorithmic incentives that tend to push content toward the lowest common denominator, creators would have the flexibility to produce higher-quality stuff. Perhaps more algorithmic transparency alone would give creators greater autonomy and help regulatory agencies codify standards for platforms that amplify content beyond a certain reach. What seems obvious, though, is that very little of the platform conversation right now focuses on what Minow identified: putting pressure on the distribution networks to incentivize creative types to make better and more enriching content.

I’m not personally offended by online garbage when I come across it—just like I’m not offended, per se, by nine hours of reality-television reruns. Any effort aimed at “cleaning up” content will inevitably be hijacked for political ends and likely used in problematic ways. Still, there’s something helpful in looking at Facebook through the vast-wasteland lens. One side effect of the constant media coverage and the parade of leaks, whistleblowers, and punditry about the platform is that Facebook takes on an exaggerated role as some kind of all-powerful, democracy-destroying machine. There’s some truth to that—but it’s still only part of the picture.

Facebook is not just a radicalization engine for Boomers and people who like to join multilevel marketing schemes. Facebook is also a platform that is struggling to attract younger users. The overwhelming amount of content on the platform likely isn’t rabid QAnon believers speaking in tongues; it is likely mundane postings and forgettable, plagiarized memes and mindless autoplay prank videos and heartwarming clips from animal shelters. It is Facebook Watch posts that proclaim, “EXPOSING MY CHEATING WIFE” and have 18 million views. It’s unlicensed clips of a seven-year-old scene from The Big Bang Theory posted with the comment “It’s International Bath Day and there’s literally nothing more appropriate to post than this” (29 million views). Yes, it’s your uncle posting a news article with an all-caps comment about critical race theory, but it’s also an image macro from a content mill that asks, “Marriages last for 8 years. How long have you been with your spouse?” (1.1 million likes, 7.5 million comments, 138,000 shares).

Facebook’s transparency reports are a form of corporate deflection—a means of diverting focus from the hyperpartisan pages and posts about vaccine disinformation. But in these reports, the company is also telling us something important about itself: that it is filled with mindless, recycled spam. What might the Facebook conversation look like if it more readily acknowledged Facebook for what it is: a vast algorithmic wasteland? Infinite channels, but nothing on.