The Bad Ideas Our Brains Can’t Shake

Why it's so hard to process new COVID information


Welcome to Galaxy Brain—a newsletter from Charlie Warzel about technology, media, culture, and big ideas. You can read what this is all about here. If you like what you see, consider forwarding it to a friend or two. We’re still figuring things out in our new home so let me know what you think: galaxybrain@theatlantic.com.

This is a free edition of the newsletter, but you can subscribe to The Atlantic to get access to all posts. Past editions I’m proud of include: Guardians of the Internet, Don’t Alienate the Willing, and How to Spend 432,870 Minutes on Spotify in a Year.

Before we begin: You may have noticed some crypto ads during the Super Bowl last night. One—featuring Larry David—stood out to me because it is almost a perfect example of the crypto FOMO dynamic I wrote about last week in this magazine. Here’s the clip. And here’s my article.


There is a tweet from Mike Caulfield, a researcher at the University of Washington’s Center for an Informed Public, that I’ve been thinking about. “The biggest info lesson for me re: COVID is that first information is just *ridiculously* sticky,” he wrote. “And I say this as someone who didn't realize you can put modern alkaline batteries in the trash until three weeks ago.”

What Caulfield is saying is that, for many, the information presented when we’re first introduced to a new subject or fact is hard to shake, even if we later find out it is wrong or in need of revision. Anecdotally, I feel like I see this all the time in my life with regard to COVID responses and procedures. We know COVID is an airborne virus with low risk of surface or object transmission, and yet stores, restaurants, and public places still engage in the hygiene theater of sanitizing tables and pens while ignoring more proven, substantial virus-mitigation efforts like, say, upgrading air-filtration systems. Two years into a pandemic, a nontrivial number of people are still under the mistaken impression that if they wash their hands and stand six feet apart indoors, they are mostly protected from the virus. Others still doubt the efficacy of masks or—I’ve encountered this one a lot—believe that masks don’t need to cover the nose.

Granted, plenty of COVID ignorance and misinformation is ideologically motivated or born of a genuine misunderstanding of how viruses work. But some of it might also be the result of this sticky-information problem, which is known in psychology circles as the “continued influence effect.” A recent paper in Nature described it this way:

When information is encoded into memory and then new information that discredits it is learned, the original information is not simply erased or replaced. Instead, misinformation and corrective information coexist and compete for activation.

I called up Maddy Jalbert, a postdoctoral scholar and Caulfield’s colleague at the University of Washington, to ask her about this. Jalbert studies how context and our daily experiences can shape our memory and also our decision-making abilities. “When you give humans a piece of information, we are very good at connecting it to things we already know,” she told me. “But if you retract that piece of information and people have already made these connections, you can’t go back and magically take that information out of a person’s head because then that whole understanding of the information they’ve connected it to is different. So people will then rely on their original understanding of things they’ve incorporated.”

What Jalbert is saying is that once we’ve yoked a new piece of information to something we already know and still believe to be true, the new piece of information becomes structurally important to our understanding of the world around us. It is load-bearing and thus not easily removed. It’s one reason why, in trial settings, even if a piece of evidence is ruled inadmissible, it may consciously or subconsciously sway a jury.

A hypothetical example of this effect might be the early and loud guidance on COVID that said, time and again, to wash your hands and avoid touching your face to prevent the spread of the virus. This information is easy to ingest, in part, because it connects to things most of us already know about germs and disease—washing your hands is sanitary and a good way not to get sick. As we know now, most COVID spread isn’t caused by fomites. Hand-washing is a good thing, but it’s by no means the most important COVID-mitigation tactic. Still, it might be difficult for some to shake this guidance, since the information is so readily tied to commonsense wisdom about hygiene.

Using similar logic, one can understand why early guidance on masks from public-health organizations and communicators like Dr. Fauci might have been especially disastrous. In mid-February 2020, Fauci told USA Today, “If you look at the masks that you buy in a drug store, the leakage around that doesn’t really do much to protect you. Now, in the United States, there is absolutely no reason whatsoever to wear a mask.” That type of guidance persisted for months and produced bold headlines to match.

Aside: I want to be clear that I’m not highlighting this information to say that we ought to give a pass to people who’ve behaved selfishly or recklessly over the past 18 months. This isn’t some plea for more compassion for anti-vaxxers, or a plea for anything, really. I am, however, interested in the ways we process and retain new information, and what that means for the way we ought to communicate going forward.

It’s not only first-heard information that is sticky. Details or facts that make you feel safe or in control can be naturally sticky, too. Information that is repeated frequently is more likely to be internalized as true, even if, deep down, you know it isn’t, Jalbert said. And one’s personal experiences and environment will also shape how persistent a morsel of new knowledge might be. Especially when a subject is polarized or politicized (like masking), an important determinant of stickiness is social norms. “When people hear new information and think, What should I do? most look around and copy people similar to them or those in their social circle. And when everyone around you is doing something one way, you develop a false sense of consensus around an idea.”

Some of what Jalbert describes is intuitive, and yet there’s still so much politicians and public-health communicators could learn from thinking about the continued influence effect and how to communicate risk in future crises. In the case of COVID, there’s been a particular challenge: Science and politics collided quickly on a global stage. Sometimes, as in the case of discouraging masking or the decision to delay booster shots, policy decisions designed to shape public behavior (preventing shortages of masks for health-care workers) were framed as purely scientific decisions. People were told to “trust the science,” but what officials were really saying was “trust specific scientific and political institutions, which are working off of science that is changing daily” (my newslettering colleague Yair Rosenberg had a fascinating interview about this last week).

Most of us are not used to seeing the sometimes messy, iterative side of science, in which hypotheses are tested, refuted, revised, and eventually confirmed. We’re used to that process happening out of view, and to having more definitive, fully formed conclusions presented to us. But when a novel virus spreads swiftly around the world, we’re forced to take in new information in real time. A lot of us aren’t used to this as news consumers, but, more important, our brains don’t exactly love it, either.

Jalbert told me that the way our brains work is quite utilitarian. In any situation—looking at a landscape, conversing with friends, reading the news—there’s too much information to take in at once. So we use practical tricks to process it. “We employ all these mental heuristics and shortcuts because otherwise we wouldn’t be able to do anything in our lives,” she said. “The idea behind people’s beliefs is that they help you perform tasks. But to do that doesn’t require you to deeply understand every single thing you learn. You’re drawing on shortcuts.” These shortcuts, Jalbert said, are incredibly useful, but they’re also a vulnerability, because if one of them is based on a piece of outdated information, it can steer you in the wrong direction. These mental heuristics, she told me, are the reason everyone is susceptible to believing wrong information.

It is possible to correct even sticky information that’s wrong, but it requires being deliberate. Jalbert told me that when you retract or debunk a piece of information, you leave a gap in a person’s cognition that needs to be filled—otherwise, the false information will just pop back in to take its place. “We need a coherent understanding of the gap,” Jalbert said, “which means if you’re going to correct some wrong information, you have to explain in very clear and simple language why you’re updating.”

Jalbert used the example of changing guidance around wearing N95s or hospital-grade masks instead of cloth ones during the Omicron wave. “The messaging was, ‘Okay, now we’re requiring better masks. Get on that.’ But a lot of that messaging was missing the reason why,” she said. “Why was it the case that cloth masks were okay before? And now what’s changed? Those things are really critical to get people to understand and receive new explanations.”

This may sound like common sense. But too often, institutions and leaders are too arrogant to explain themselves. Those doing the messaging rely on shared understandings (like, say, how airborne viruses behave) to frame their communications, without realizing that not everyone in their audience has the same background they do. Their messages are monolithic, not tailored to different audiences. People are told to trust the science, which is a fine message, but it doesn’t necessarily help fill the cognitive gap and make way for new information to stick. And there can be disastrous consequences when organizations update guidance without proper communication.

“If you create a feeling of deep uncertainty, it can give people the sense that something is truly unknowable,” Jalbert said. “When it feels like you can’t discern whether anything is true, you disengage from the information. This is likely happening to people when rules change in ways that don’t make sense to them. They say, ‘It’s not worth my time.’”

It’s plausible that many Americans have reached this stage. Some are frustrated or exhausted by the pandemic and lack the bandwidth to keep up with ever-changing information. But others have lost faith or trust in officials and institutions in part because new information has been communicated in a way that leads people to feel that keeping up with guidance is a fruitless endeavor.

Talking to Jalbert, I realized just how bad many large institutions have become at messaging effectively to wide swaths of human beings. I don’t just mean public-health organizations in the federal government, either. Jalbert and I discussed misinformation in the form of propaganda, and she argued that our human desire for shared understanding—especially among people who share our social norms—can often outweigh our desire for the truth, and that much of what we think of as misinformation serves a functional purpose, providing a shared connection in a given community.

If you think of misinformation this way, it’s easy to see how many debunking tactics employed by institutions and media organizations can fall flat. A simple “FACT CHECK: THING WRONG” is ultimately a shallow, almost half-assed attempt at a solution to a difficult problem. “It’s not good enough just to replace the information—you need to also account for the way that that piece of information serves as a shared connection in a given community,” Jalbert said. “To expect people to change their beliefs without an idea of how to replace it or foster other connections is missing the key component.”

I do not mean to rag on overworked and well-meaning public-health officials who are trying their best in the face of changing science and endless political and cultural roadblocks and saboteurs. Revision, and clear communication in general, is hard! But there’s a great deal that almost any public communicator can learn from experts like Jalbert. In any information crisis, we understandably pay close attention to what people think and why they think it, in an attempt to set the record straight. There’s something to that, of course. But we need to focus just as much on how people think.


Charlie Warzel is a staff writer at The Atlantic and the author of its newsletter Galaxy Brain, about technology, media, and big ideas. He can be reached via email.