How and Why Anti-Science Propaganda Works

I. Introduction

Imagine that you can only know what you discover by yourself through trial and error. There are no websites, no books, no teachers, not even other people to talk to. How much could you know about the world? Do you think you could figure out which chemical compounds make up your food? What was happening in another country? What the climate was like 30 years ago? How to hunt or farm? As the philosopher John Hardwig says, “we are irredeemably epistemically dependent on each other.” That is to say, just about everything we know about the world we learn from other people. We depend on other people for knowledge.

This is particularly true as the world gets more complex. There’s simply too much for one person to know on their own, and so people specialize in areas of knowledge. Hence, we have chemists, physicists, doctors, lawyers, geologists, mechanics, and accountants, to name but a few—and, of course, philosophers. Just about everything you know about the world you learned from someone else. This is a good strategy, too: there isn’t time or energy for you to get a PhD in every single domain of human knowledge.

But a problem sometimes emerges: it’s not always clear who the experts are on a particular issue, or two people who appear to be experts disagree. How does the non-expert identify the genuine experts? Who should they defer to?

In this post I do two things. First, drawing on a model from cultural anthropology, I explain the strategies that people use to identify experts. Second, I explain how anti-science propagandists manipulate these strategies to confuse the public.

[For more information and educational resources on science denialism and anti-science propaganda visit my separate growing website dedicated to the topic]

II. Heuristics for Identifying Experts

Boyd and Richerson’s (1985) model of cultural learning explains, among other things, how and when we defer to others. Their costly information hypothesis holds that

when accurate information is unavailable or too costly [for individuals to learn something on their own], individuals may exploit information stored in the behavior and experience of other members of their social group. (Henrich & McElreath 2003; my italics)

To place this in our contemporary informational environment: rather than earn a PhD in every domain, it’s much easier to investigate what the experts in a given domain believe.
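
To make the trade-off concrete, here is a minimal sketch of the decision the costly information hypothesis describes (my own illustration with made-up numbers, not Boyd and Richerson’s formal model): defer whenever learning something first-hand costs more than the expected cost of inheriting the group’s occasional errors.

```python
# A minimal sketch, assuming a simple cost comparison (my illustration, not
# Boyd & Richerson's formal model): defer to the group whenever learning
# something yourself costs more than the expected cost of the group's errors.

def should_learn_socially(individual_cost, group_error_rate, error_penalty):
    """Return True when deferring to the group is the cheaper strategy."""
    expected_social_cost = group_error_rate * error_penalty
    return individual_cost > expected_social_cost

# Acquiring the expertise yourself is very costly (think: a PhD), while a
# mostly reliable group errs 5% of the time at a penalty of 20 units.
print(should_learn_socially(individual_cost=100.0,
                            group_error_rate=0.05,
                            error_penalty=20.0))  # True: defer to the group
```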

Notice that the scope of ‘social group’ is rather nebulous. However, under conditions of high social sorting and polarization, we can expect ‘social group’ to be defined rather narrowly. The greater the distance between social groups, the less likely members of one group are to defer to members or institutions of another. For example, in the US, where the population is more socially polarized than at any other point in the nation’s history (Mason 2018), it’s unlikely that members of one political identity will defer (on a politicized issue) to a member or institution perceived as belonging to the other group.

For example, right-wing partisans are unlikely to defer to climate scientists since they don’t see them as members of their own group. This distrust shows up in attitudes toward universities and university professors: about 60% of Republicans think universities have a negative effect on the country (while 67% of Democrats think they have a positive effect). And 19% of Republicans have no confidence at all in university professors to act in the public interest, and another 33% have not too much confidence (while 26% of Democrats say they have a great deal of confidence and 57% a fair amount) (Pew 2018).

In our information environment, scientific knowledge is costly (in that it requires a lot of effort to obtain through individual trial and error). As I’ve said, a citizen cannot pursue a PhD in every field. It follows from the costly information hypothesis that on scientific matters citizens will engage in social learning and defer. Once they’ve decided to learn from others, various contextual cues bias them toward learning from one subgroup or individual rather than another. Adaptive information is embodied both in who holds ideas and in how common those ideas are. These in turn underpin the prestige bias and the frequency bias, respectively (Boyd & Richerson 1985).

The prestige bias is actually a proxy for the success bias (i.e., defer to the person most successful at a task). When ranking individuals by outcome in a particular skill or activity is too difficult, individuals “use aggregate indirect measures of success, such as wealth, health, or local standards” (Henrich & McElreath 2003; my italics). The fact that prestige is only an indirect measure of skill implies that it will often be unclear which of a revered individual’s many traits led to their (perceived) success.

To situate this in our current world, the fact that an individual has a media presence (prestige) may mislead many to believe that individual is an expert in a domain when in fact they aren’t. The prestige bias explains why so many (erroneously) defer to celebrities for health issues. Similarly, because prestige is defined by local standards, in a socially sorted and polarized society, people will likely not defer to experts outside their own group. It follows that, under such conditions, many will likely defer to the wrong experts on complex politicized empirical issues.

The prevalence of the success and prestige biases creates pressure for success-biased learners to pay deference to those they perceive as highly skilled (Henrich & McElreath 2003). The spread of deference-type behaviors means that naive entrants “may take advantage of the existing patterns of deference by using the amounts and kinds of deference different models receive as cues for underlying skill” (ibid.). So, local (i.e., ingroup) standards of prestige combine with patterns of deference to give (fallible) signals to non-experts about who the experts are.
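
To illustrate, here is a toy sketch of prestige-biased learning (my own example; the cue names and weights are invented, not taken from Henrich and McElreath): a naive learner who cannot observe skill directly copies candidates in proportion to indirect cues such as wealth, media presence, and the deference they already receive.

```python
import random

# Illustrative sketch of the prestige bias (the cues and weights below are
# assumptions for the example): skill is unobservable, so the learner copies
# candidates in proportion to aggregate indirect measures of success.

def choose_model(candidates):
    """Pick whom to copy with probability proportional to summed prestige cues."""
    def prestige(c):
        return c["wealth"] + c["media_presence"] + c["deference_received"]
    return random.choices(candidates, weights=[prestige(c) for c in candidates])[0]

candidates = [
    {"name": "ingroup celebrity", "wealth": 9, "media_presence": 9,
     "deference_received": 8, "actual_skill": 2},
    {"name": "outgroup scientist", "wealth": 3, "media_presence": 2,
     "deference_received": 1, "actual_skill": 9},
]
# The celebrity (prestige 26) gets copied over four times as often as the
# scientist (prestige 6), despite far lower skill in the relevant domain.
print(choose_model(candidates)["name"])
```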

Once again, we can see how these patterns instantiate themselves, and fail, in our current environment. On politicized issues, non-experts in a socially sorted and polarized society will defer to different experts and institutions based on ingroup prestige standards and patterns of deference. Few partisans, if any, will defer to individuals or institutions that their outgroup perceives as experts. On partisan issues where there is a consensus of genuine experts, one group will likely defer to the wrong individuals and institutions despite its perceptions to the contrary.

The success and prestige biases do not solve every costly information problem. In our current environment the problem emerges when two purported experts on either side of an issue both work at prestigious universities or institutions and/or both have a media presence. Which to believe?

The successful heuristic in these situations is to copy the behaviors, beliefs, and strategies of the majority (ibid.). In information-poor environments (with respect to who has relative prestige or success), the conformist/frequency bias is a successful strategy. The conformist bias is so pervasive that it is an even more common form of learning than vertical transmission (i.e., parent to child) (ibid.).

Again, like all heuristics, these can be maladaptive, depending on the environment. In a highly socially sorted and polarized society, the conformity bias will likely apply only to the behaviors and beliefs of one’s ingroup rather than those of the outgroup. If a majority in one group holds false or improbable beliefs, the frequency bias predicts that the group’s other members will defer to that majority.
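
A minimal sketch of conformist transmission under social sorting (again my own illustration, with made-up population numbers): the learner copies the most common belief but samples only its own group, so a group in which the false belief already predominates keeps passing it on.

```python
from collections import Counter

# Illustrative sketch of the conformist/frequency bias under social sorting
# (population numbers are invented): copy the most common belief, but only
# among one's own group.

def conformist_copy(population, my_group):
    """Adopt the most frequent belief within the learner's ingroup."""
    ingroup = [p["belief"] for p in population if p["group"] == my_group]
    return Counter(ingroup).most_common(1)[0][0]

population = (
    [{"group": "A", "belief": "consensus science"}] * 40 +
    [{"group": "A", "belief": "denialist claim"}] * 10 +
    [{"group": "B", "belief": "denialist claim"}] * 35 +
    [{"group": "B", "belief": "consensus science"}] * 15
)
print(conformist_copy(population, "A"))  # consensus science
print(conformist_copy(population, "B"))  # denialist claim: the heuristic misfires
```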

III. Belief Polarization and Trust

Using a mathematical model developed by Venkatesh Bala and Sanjeev Goyal, Cailin O’Connor and James Weatherall (2019) investigated a similar issue. They modeled how scientific communities converge on, or polarize over, beliefs in order to study how belief polarization occurs and misinformation spreads.

One motivation behind the project is that the scientific community adheres to rigorous epistemic norms (relative to lay people), so if some set of variables can cause belief polarization and misinformation even in these communities, it is bound to occur in the general population. The Bala-Goyal model is built on the Bayesian idea that we update our credence levels when others share new information with us.
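
For readers unfamiliar with the mechanics, here is the basic Bayesian move in miniature (a generic textbook example, not the Bala-Goyal machinery itself):

```python
# A one-step Bayesian update: revise credence in hypothesis H after a peer
# reports evidence E (generic illustration; the probabilities are made up).

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior credence in H after learning that E occurred."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start undecided (0.5); the reported evidence is twice as likely if H is true.
print(round(bayes_update(0.5, 0.8, 0.4), 3))  # 0.667: credence in H rises
```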

An important finding aligns with what I suggested might occur in a sorted and polarized society. Their models found that when subgroups within a community distrust each other, they appraise evidence differently depending on its source. That is, evidence from a trusted (i.e., ingroup) source can move credence levels one way, while the very same evidence from a distrusted source can move credence levels in the other direction!

This makes sense. If you believe that a lab or scientist is corrupt, then it is reasonable to assume they’ve fabricated or manipulated their results and to revise your credence levels in the other direction. The end result is stable belief polarization within the community: one subgroup converges on the correct view while the other converges on the false one. The greater the mistrust, the larger the faction that settles on the false belief. This occurs because “those who are skeptical of the better theory are precisely those who do not trust those who test it” (O’Connor and Weatherall 2019). The group converging on false beliefs becomes insensitive to countervailing evidence.
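
Here is a toy simulation in the spirit of these results (a drastic simplification: the update rule and every parameter are my own stand-ins, not O’Connor and Weatherall’s actual model). Agents who credit the better theory test it and share their results; listeners move toward evidence from trusted ingroup sources and away from evidence from distrusted outgroup sources.

```python
import random

# Toy polarization sketch (my own simplification, not the Bala-Goyal update
# rule). The "better theory" genuinely works 60% of the time. Believers test
# it and share results; listeners shift toward evidence from trusted ingroup
# sources and AWAY from distrusted outgroup sources, treating the latter's
# reports as likely fabricated.

random.seed(0)
TRUE_RATE, STEP = 0.6, 0.02

def update(credence, success, trust):
    direction = 1 if success else -1
    weight = 2 * trust - 1            # trust below 0.5 flips the sign
    return min(1.0, max(0.0, credence + STEP * weight * direction))

agents = ([{"group": "A", "credence": random.uniform(0.5, 0.8)} for _ in range(20)]
          + [{"group": "B", "credence": random.uniform(0.2, 0.5)} for _ in range(20)])

for _ in range(2000):
    speaker = random.choice(agents)
    if speaker["credence"] <= 0.5:         # skeptics never test the theory
        continue
    success = random.random() < TRUE_RATE  # one trial of the better theory
    for listener in agents:
        trust = 0.9 if listener["group"] == speaker["group"] else 0.1
        listener["credence"] = update(listener["credence"], success, trust)

for g in ("A", "B"):
    mean = sum(a["credence"] for a in agents if a["group"] == g) / 20
    print(g, round(mean, 2))  # group A ends near 1.0; group B gets stuck near 0.0
```

Even though the better theory genuinely works, the distrustful group never runs the test and updates away from every report of it, reproducing the stable polarization just described.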

Several important conclusions follow from their Bayesian models that incorporate social trust. First, “models of polarization […] strongly suggest that psychological biases [such as confirmation bias] are not necessary for polarization to result” (ibid., p. 76; my italics). Second, while distrust can cause us to dismiss relevant evidence, too much trust can also lead us astray, “especially when agents in a community have strong incentives to convince you of a particular view” (ibid., p. 77).

IV. How and Why Anti-Science Propaganda Works

If you’ve followed me so far, you should be starting to see how anti-science propaganda works. One important way is to manipulate trust through the prestige bias. Anti-science propagandists will present their experts as prestigious. But most importantly, for their propaganda to work, they must diminish the prestige of, and therefore trust in, genuine experts.

This is why every single anti-science propaganda campaign targets academic institutions, high-profile public scientists, and public regulatory institutions like the FDA, EPA, and CDC. Go through the list of prominent denialisms (anthropogenic global warming denial, anti-GMO activism, anti-vaccine activism, evolution denial) and you will quickly identify this pattern. People will only trust the denialist experts if they can be convinced to distrust legitimate experts.

[Note: Some people will invariably, and correctly, point to instances where the above institutions or a consensus of experts got it wrong. Yes, it’s true: institutions and experts make mistakes. However, what matters from the point of view of the non-expert is the relative error rate of whomever they defer to. Compared to other individuals or institutions, which is more likely to get it right? This is a long and complicated topic that I can’t do justice to here, but for now, thinking about relative error rates is one way to conceive of erroneous deferrals.]

Propagandists also manipulate people by manipulating the frequency bias. When the information environment is unclear with respect to what to believe (usually because it has been deliberately muddied by denialist propaganda), people will defer to the most common belief in their information environment. As a historical illustration, the tobacco industry for years took out full-page ads in major newspapers proclaiming that “the science isn’t settled” (sound familiar?) on the link between smoking and cancer. The intent was to increase the frequency of the idea in the information environment, causing people to defer.

Propagandists also manipulate the frequency bias by compiling lists of “experts” who contest the consensus view. For example, you’ll often see lists of scientists who question anthropogenic global warming, vaccine safety, GMOs, or evolution. The interest groups that compile these lists know that most people will see the word “scientist” and stop there.

However, if you look at the actual people on these lists, they come from a hodgepodge of different disciplines and have varying degrees of credibility. Although these lists occasionally contain a very small handful of relevant experts, most of the names are not experts in the domain at issue. We wouldn’t ask a mechanic for advice on our taxes, so why should we care what an engineer says about vaccines? The point of these lists is to create the illusion of expert disagreement by manipulating the frequency bias.

In our current information environment the situation is even more pernicious than it was at the time of the tobacco industry’s propaganda. Propagandists can mobilize bots to tweet and post links or comments, thereby increasing the frequency of their ideas. Our guard is down because bots resemble real people (again, a manipulation of trust). At least with the tobacco newspaper ads, people could be suspicious of the obvious vested interests. Even more pernicious is that people are now renting out their Facebook accounts. So propagandists may be manipulating the frequency and prestige biases through someone you trust or, at least, have no reason to distrust.
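
The arithmetic of this manipulation is simple (hypothetical numbers, my own illustration): a conformist learner estimates the common view from the messages it sees, so injected bot posts shift the perceived majority without persuading a single real person.

```python
# Hypothetical numbers: bot amplification hijacks the frequency bias by
# changing what looks like the majority view in one's feed.

def perceived_support(human_for, human_against, bot_against):
    """Fraction of visible messages endorsing the consensus view."""
    return human_for / (human_for + human_against + bot_against)

print(perceived_support(80, 20, 0))    # 0.8  -> the consensus looks dominant
print(perceived_support(80, 20, 150))  # 0.32 -> denial now looks like the majority
```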

V. Conclusion

My assessment of science denialists has changed a lot since I started writing my dissertation on anti-science propaganda 5 years ago. I used to think they were stupid and culpable. My position has changed 180 degrees. I now believe these people are victims of sophisticated and well-funded manipulation campaigns that prey on social trust and our necessary reliance on others for knowledge.

There also may be degrees of culpability–it seems like some people really should know better. Maybe I’ll post more on the ethics of belief another time. That said, I’ve found that reframing (lay) science denialists as victims of manipulation allows me to be much more patient and sympathetic with them in conversation. My research projects going forward all revolve around figuring out how to talk to and help people who have been targeted by propaganda.

Sources

  1. Boyd, R., & Richerson, P. J. (1985). Culture and the Evolutionary Process. Chicago, IL: University of Chicago Press.
  2. Henrich, J., & McElreath, R. (2003). The evolution of cultural evolution. Evolutionary Anthropology, 12(3), 123–135.
  3. Mason, L. (2018). Uncivil Agreement: How Politics Became Our Identity. Chicago, IL: The University of Chicago Press.
  4. O’Connor, C., & Weatherall, J. O. (2019). The Misinformation Age: How False Beliefs Spread. New Haven, CT: Yale University Press.


17 thoughts on “How and Why Anti-Science Propaganda Works”

    1. My pleasure. Thank you for taking the time to comment. This article is a subsection of a much longer paper I’m working on. I’ll post pieces of it each week.


  1. “There isn’t time or energy for you to get a PhD in every single domain of human knowledge.”

    What a ridiculous article. It equates having a PhD with being an expert. What folly! Some of the most obliviously incompetent people I’ve ever worked with had PhDs. That means nothing in the real world. The other ridiculous assumption is that you need a PhD to determine anything in reality. How horribly misguided. Lol..


    1. I agree that having a PhD doesn’t make you an expert in every domain, nor is it necessary. Some domains–like the trades–use other systems to identify experts. And, like you, I have worked with incompetent PhDs; however, they weren’t incompetent in their areas of specialization.


    2. I should add that this article is specifically about expertise in the sciences. And for that you do need a PhD. I don’t see how anyone could otherwise get the specialized training, mentorship, community, and access to facilities to conduct original research. I’m willing to be corrected.


  2. This is an interesting and important article.

    Sadly I find the language used is too academic - i.e., written for others in the same academic ‘fold’ :-).
    Just too challenging for it to be shared among many, like myself, who might also benefit by understanding why all of us believe what we believe, but tend not to read if the structure is not simple.

    It could be far more comprehensible and extremely useful if it could be rewritten in much shorter sentences, more gaps, and much of it in simple diagrams and drawings instead of words.

    Also phrases like ‘costly information hypothesis’ just CAN’T be thrown out there, if you want ordinary folks to comprehend what you mean, without a note. (e.g., Was the hypothesis expensive to buy? Wouldn’t ‘Information/cost hypothesis’ be less ambiguous?)


    1. Thank you for your valuable feedback. I actually agree with you. In my other posts I make an effort to avoid overly academic writing. This post was actually a small section of a much larger paper I’m writing–which explains the academic tone. Hopefully, you’ll forgive me this time around and find prior posts that are deliberately written for a broad audience. Once again, thank you for your valuable feedback. I’ll definitely take it into account for future posts.


  3. Philosophami, here’s a copy of a post I recently made on Facebook regarding your article. I’m glad you’re still working on a longer version of it. You might want to take my comments into account in any reply, especially my reading recommendation:

    This is an interesting and informative essay that brings up some excellent points about propaganda and social bias. However, as someone who has been studying the philosophy, psychology and history of science for several decades (I’m a non-fiction science book reviewer for the American Library Association in my spare time), it’s clear to me he doesn’t really understand science that well–at least not as well as I think he should. There’s a not so subtle implication here, for instance, that anyone who questions the safety and effectiveness of GMOs, vaccines or other “established” scientific paradigms has fallen victim to anti-science propaganda. Granted, I don’t think many or most so-called climate change “denialists” really understand or have even studied the complex evidence supporting the anthropogenic origins of climate change. However, there are some skeptics of these mainstream viewpoints who are very well informed about the science and still have serious questions. With respect to vaccines, there are many reputable scientists and health professionals (I count myself in the latter category) who question how safe and effective they really are, pointing to many flaws in the evidence such as the lack of safety studies for many vaccine ingredients. However, the knee-jerk response in the media, and by the author of the article you posted, is to call these people “anti-vaxxers.” But debate and skepticism are actually healthy features of good science, otherwise our scientific understanding of the world would never move forward and we would all still think the Earth is the center of the universe as asserted in Ptolemaic cosmology. Palmer makes light of the notion that experts can sometimes be wrong and doesn’t really understand the power of confirmation bias, cognitive dissonance and groupthink. He would do well to read a good book about science studies (a discipline that researches the sociology, psychology and history of science), such as “Dogmatism in Science and Medicine” by Henry H. Bauer, who is an “expert” on this topic.


  4. Thank you for sharing this. This information really needs to be read and understood by many, many people. I strive to understand those who I know think differently from me, and it’s a challenge I often do not meet, despite my efforts. I’d like to try to simplify this and offer it in an upcoming podcast episode. I have a podcast for English learners and others who are interested in the English language and world cultures. Of course, I will credit you, and if you would like, I could send you my script before I make the episode to ensure that I’ve got my facts straight.


    1. Hi Alex,
      Thank you for the positive feedback. You’re more than welcome to share the info on your podcast. Also, if you do interviews, I’d be happy to come on and discuss this as well as other aspects of the problem. Whatever works best for you!


      1. Thank you so much for that! I’m running this around in my head and trying to come up with a good context to use this in a way that would be as useful as possible. I’ll get back with you when I have a more solid idea. Right now I’m fighting a nasty cold, so it might take a little while. I appreciate your generosity! And thank you so much for the offer to be interviewed!

        Best,
        Alex

        Alex’s ESL World
        alexseslworld.com


      2. Hi,

        I’m all better and I wanted to let you know that I uploaded my podcast episode this past Thursday. I really wanted to interview you, but when it came down to it, I couldn’t figure out what to ask you. You see, my audience is mainly English learners. So, I have to carefully word things so that most people will understand. I also provide them with transcripts of my episodes with vocabulary lists for each one. Your article was written in academic English, far above what I would expect even an advanced learner to be able to read. I spent a lot of time trying to simplify what you wrote. I completely cut all of the references to the various biases, even though I understand that these are critical to getting the full message. I hope I haven’t overstepped my bounds. I also provided them with a link to your original article, so for those who can read that level of English, they have it available to them.

        I’m including a link here to my podcast episode, in case you want to hear me in my very simplified version of what you said. Thank you for allowing me to do this and I hope I haven’t made any egregious mistakes.

        https://www.buzzsprout.com/695233/2863009

        Alex

        Alex’s ESL World
        alexseslworld.com


      3. Hi Alex, I’m glad you’re feeling better and that you were able to find a format to successfully convey the information to your podcast audience. I’ll check out the podcast soon, once I get through editing an article I’m working on. Fun fact: I taught ESL in Venezuela, Argentina, and Japan. I have many great memories. Sometimes I miss it.


  5. This is a good article, and even though it’s written for an academic audience, it reminds me of a book I have yet to finish called Thinking, Fast and Slow. As humans we are more likely to attach ourselves to a certain group, as you mentioned in your post, to find guidance on what to believe, since we do not have enough time to find things out for ourselves. The other main issue is the fact that we live in a more polarized society, as you stated, and just as the book I referenced puts it, “what you see is all there is” from your mind’s standpoint. People are only seeking sources that agree with their group’s beliefs and automatically discrediting any other source. It’s a good start and it will be interesting to see what you discover as you do more research. This also took me back to my academic years, even writing this post lol.

