How and Why Anti-Science Propaganda Works

I. Introduction


Imagine that you can only know what you discover by yourself through trial and error. There are no websites, no books, no teachers, not even other people to talk to. How much could you know about the world? Do you think you could figure out which chemical molecules compose your food? What was happening in another country? What the climate was like 30 years ago? How to hunt or farm? As philosopher John Hardwig says, “we are irredeemably epistemically dependent on each other.” That is to say, just about everything we know about the world we learn from other people. We depend on other people for knowledge.

This is particularly true as the world gets more complex. There’s simply too much for one person to know on their own, so people start to specialize in areas of knowledge. Hence, we have chemists, physicists, doctors, lawyers, geologists, mechanics, accountants, to name but a few—and, of course, philosophers. Deferring to these specialists is a good strategy, too. There isn’t time or energy for you to get a PhD in every single domain of human knowledge.

Sometimes a problem emerges: There are cases where it’s not clear who the experts are on a particular issue or where two people who appear to be experts disagree. How does the non-expert identify who the genuine experts are? Who should they defer to?

In this post I do two things. First, drawing on a model from cultural anthropology, I explain the strategies that people use to identify experts. Second, I explain how anti-science propagandists manipulate these strategies to confuse the public.

[For more information and educational resources on science denialism and anti-science propaganda visit my separate growing website dedicated to the topic]

II. Heuristics for Identifying Experts

Boyd and Richerson’s (1985) model of cultural learning explains, among other things, how and when we defer to others. Their costly information hypothesis holds that

when accurate information is unavailable or too costly [for individuals to learn something on their own], individuals may exploit information stored in the behavior and experience of other members of their social group. (Henrich & McElreath 2003; italics mine)

To place this in our contemporary information environment: rather than earning a PhD in every domain, it’s much easier to investigate what the experts in that domain believe.

Notice that the scope of ‘social group’ is rather nebulous. However, under conditions of high social sorting and polarization, we can expect ‘social group’ to be defined rather narrowly. The greater the distance between social groups, the less likely members from one group are to defer to members or institutions from another. For example, in the US where the population is more socially polarized than at any other point in the nation’s history (Mason 2018), it’s unlikely that members of one political identity will defer (on a politicized issue) to a member or institution perceived as belonging to the other group.

For example, right-wing partisans are unlikely to defer to climate scientists because they don’t see them as members of their own group. This distrust shows up in attitudes toward universities and university professors: about 60% of Republicans think universities have a negative effect on the country, while 67% of Democrats think they have a positive effect. And 19% of Republicans have no confidence at all in university professors to act in the public interest and 33% have not too much confidence (while 26% of Democrats say they have a great amount of confidence and 57% have a fair amount) (Pew 2018).

In our information environment, scientific knowledge is costly (in that it requires a lot of effort to obtain through individual trial and error). As I’ve said, a citizen cannot pursue a PhD in every field. It follows from the costly information hypothesis that on scientific matters citizens will engage in social learning and defer. Once they’ve decided to learn from others, various contextual cues bias them toward learning from one subgroup or individual rather than another. Adaptive information is embodied both in who holds ideas and in how common those ideas are. These two features underpin the prestige bias and the frequency bias, respectively (Boyd & Richerson 1985).

The prestige bias is actually a proxy for the success bias (i.e., defer to the person most successful at a task). But when ranking individuals by outcome in a particular skill or activity is too difficult, individuals “use aggregate indirect measures of success, such as wealth, health, or local standards” (Henrich & McElreath 2003; italics mine). The fact that prestige is only an indirect measure of skill implies that it will often be unclear which of a revered individual’s many traits led to their (perceived) success.

To situate this in our current world, the fact that an individual has a media presence (prestige) may mislead many to believe that individual is an expert in a domain when in fact they aren’t. The prestige bias explains why so many (erroneously) defer to celebrities for health issues. Similarly, because prestige is defined by local standards, in a socially sorted and polarized society, people will likely not defer to experts outside their own group. It follows that, under such conditions, many will likely defer to the wrong experts on complex politicized empirical issues.

The prevalence of the success and prestige biases creates pressure for success-biased learners to pay deference to those they perceive as highly skilled (Henrich & McElreath 2003). The spread of deference-type behaviors means that naive entrants “may take advantage of the existing patterns of deference by using the amounts and kinds of deference different models receive as cues for underlying skill” (ibid.). So, local (i.e., ingroup) standards of prestige combine with patterns of deference to give (fallible) signals to non-experts about who the experts are.

Once again, we can see how these patterns instantiate themselves, and fail, in our current environment. On politicized issues, non-experts in a socially sorted and polarized society will defer to different experts and institutions based on ingroup prestige standards and patterns of deference. Few partisans, if any, will defer to individuals or institutions that their outgroup perceives as experts. On partisan issues where there is a consensus of genuine experts, one group will likely defer to the wrong individuals and institutions despite their perceptions to the contrary.

The success and prestige biases do not solve every costly information problem. In our current environment, the problem re-emerges when two purported experts on opposite sides of an issue both work at prestigious universities or institutions and/or both have a media presence. Which to believe?

The successful heuristic in these situations is to copy the behaviors, beliefs, and strategies of the majority (ibid.). In information-poor environments (with respect to who has relative prestige or success), the conformist/frequency bias is a successful strategy. The conformist bias is so pervasive that it is an even more common form of learning than vertical transmission (i.e., parent to child) (ibid.).

Again, like all heuristics, these can be maladaptive, depending on the environment. In a highly socially sorted and polarized society, the conformity bias will likely apply only to the behaviors and beliefs of one’s ingroup rather than those of the outgroup. If a majority in one group holds false or improbable beliefs, the frequency bias predicts that other members will defer to that majority all the same.
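
To make the mechanism concrete, here is a toy simulation of ingroup-restricted conformist learning. This is a minimal sketch of my own, not Boyd and Richerson’s formal model; the group sizes, sample size, and initial belief splits are all illustrative assumptions.

```python
import random

def run(group_a_pro=0.7, group_b_pro=0.3, group_size=500, sample_size=7, rounds=20):
    """Each agent repeatedly adopts the majority belief among a small random
    sample of peers, drawn only from the agent's own group."""
    groups = {
        "A": [random.random() < group_a_pro for _ in range(group_size)],
        "B": [random.random() < group_b_pro for _ in range(group_size)],
    }
    for _ in range(rounds):
        for name, beliefs in groups.items():
            # Odd sample size, so there are no ties to break.
            groups[name] = [
                sum(random.sample(beliefs, sample_size)) * 2 > sample_size
                for _ in beliefs
            ]
    for name, beliefs in groups.items():
        print(f"group {name}: {sum(beliefs) / len(beliefs):.0%} hold the belief")

random.seed(1)
run()
```

Even though the population as a whole starts out split roughly 50/50, group A converges on the belief and group B converges on its negation. Every agent faithfully “copied the majority,” but only the majority it could see.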

III. Belief Polarization and Trust

Using a mathematical model developed by Venkatesh Bala and Sanjeev Goyal, Cailin O’Connor and James Weatherall (2019) investigated a similar issue. They modeled how scientific communities either converge or polarize on beliefs in order to study how belief polarization occurs and misinformation spreads.

One motivation behind the project is that the scientific community adheres to rigorous epistemic norms (relative to lay people), and so if some variables can cause belief polarization and misinformation in these communities, then it is bound to occur in the general population. The Bala-Goyal model is built on the Bayesian idea that we update our credence levels when others share new information with us.

An important finding aligns with what I suggested might occur in a sorted and polarized society. Their models found that when subgroups within a community distrust each other, they appraise evidence differently depending on its source. That is, evidence from a trusted source (i.e., ingroup) can move credence levels one way while the same evidence from a distrusted source can move credence levels in the other direction!

This makes sense. If you believe that a lab or scientist is corrupt then it is reasonable to assume they’ve fabricated or manipulated their results and to revise your credence levels in the other direction. The end result is stable belief polarization within the community: One subgroup converges on the correct view while the other converges on the false one. The greater the mistrust, the larger the faction that settles on the false belief. This occurs because “those who are skeptical of the better theory are precisely those who do not trust those who test it” (O’Connor and Weatherall 2019). The group converging on false beliefs becomes insensitive to countervailing evidence.
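
To see how the same report can push credences in opposite directions, here is a toy version of the update rule. It loosely follows O’Connor and Weatherall’s use of Jeffrey’s rule; the specific likelihoods and the linear trust function below are my own illustrative assumptions, not the book’s exact parameterization.

```python
def jeffrey_update(credence, source_credence, mistrust):
    """Credence in H after a source reports one successful trial.

    H: the new theory is better, with P(success | H) = 0.6 and
    P(success | not-H) = 0.4 (illustrative numbers).
    """
    p_evidence = 0.6 * credence + 0.4 * (1 - credence)    # prior P(E)
    post_if_real = 0.6 * credence / p_evidence            # P(H | E)
    post_if_fake = 0.4 * credence / (1 - p_evidence)      # P(H | not-E)

    # Trust falls off linearly with the distance between the agent's and
    # the source's credences (an assumed, simplified trust function).
    trust = max(1.0 - mistrust * abs(credence - source_credence), 0.0)

    # Jeffrey's rule: weight the two posteriors by how strongly the agent
    # believes the reported outcome really occurred. When trust equals the
    # prior P(E), the update is a wash; below that, credence moves DOWN.
    return trust * post_if_real + (1.0 - trust) * post_if_fake

# The same evidence, the same prior, different trust in the source:
print(jeffrey_update(0.5, source_credence=0.6, mistrust=1.5))   # ~0.57, rises
print(jeffrey_update(0.5, source_credence=0.99, mistrust=1.5))  # ~0.45, falls
```

Run over a whole network of agents, updates like this are what produce the stable factions described above: the subgroup that distrusts the testers drifts away from the evidence those testers produce.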

Several important conclusions follow from their Bayesian models that incorporate social trust. First, “models of polarization […] strongly suggest that psychological biases [such as confirmation bias] are not necessary for polarization to result” (ibid., p. 76; italics mine). Second, while distrust can cause us to dismiss relevant evidence, too much trust can also lead us astray, “especially when agents in a community have strong incentives to convince you of a particular view” (ibid., p. 77).

IV. How and Why Anti-Science Propaganda Works

If you’ve followed so far, you should be starting to see how anti-science propaganda works. One important move is to manipulate trust through the prestige bias: anti-science propagandists present their experts as prestigious, which gets people to defer to the wrong experts. Most importantly, for anti-science propaganda to work it must diminish the prestige of, and therefore trust in, genuine experts.

This is why every single anti-science propaganda campaign targets academic institutions, high-profile public scientists, and public regulatory institutions like the FDA, EPA, and CDC. Go through the prominent denialisms (anthropogenic global warming denial, anti-GMO advocacy, anti-vaccine activism, evolution denial) and you will quickly identify this pattern. People will only trust the denialist experts if they can be convinced to distrust legitimate experts.

[Note: Some people will invariably and correctly point to instances where the above institutions or a consensus of experts got it wrong. Yes, it’s true: institutions and experts make mistakes. However, what matters from the point of view of the non-expert is the relative error rate of whomever they defer to. Compared to other individuals or institutions, which is more likely to get it right? This is a long and complicated topic that I can’t do justice to here, but for now, thinking in terms of relative error rates is one way to conceive of erroneous deferrals.]

Propagandists also exploit the frequency bias. When the information environment is unclear about what to believe (usually because it has been deliberately muddied by denialist propaganda), people defer to the most common belief in their information environment. As a historical illustration, the tobacco industry for years took out full-page ads in major newspapers proclaiming that “the science isn’t settled” (sound familiar?) on the link between smoking and cancer. The intent is to increase the frequency of the idea in the information environment, causing people to defer.

Propagandists also manipulate the frequency bias by compiling lists of “experts” who contest the consensus view. For example, you’ll often see lists of scientists who question anthropogenic global warming, vaccine safety, GMOs, or evolution. The interest groups that compile these lists know that most people will see the word “scientist” and stop there.

However, if you look at the actual people on these lists, they come from a hodgepodge of different disciplines and have varying degrees of credibility. Although the lists occasionally do contain a very small handful of relevant experts, most entries are not experts in the domain at issue. We wouldn’t ask a mechanic for advice on our taxes, so why should we care what an engineer says about vaccines? The point of these lists is to create the illusion of expert disagreement by manipulating the frequency bias.

In our current information environment the situation is even more pernicious than it was in the days of the tobacco industry’s campaign. Propagandists can mobilize bots to tweet and post links or comments, thereby increasing the frequency of their ideas. Our guard is down because bots resemble real people (i.e., they manipulate trust). At least with the tobacco newspaper ads, people could be suspicious of the obvious vested interests. More pernicious still, people are now renting out their Facebook accounts. So propagandists could be manipulating the frequency and prestige biases through someone you trust or, at least, have no reason to distrust.
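
A back-of-the-envelope sketch shows how little it takes. The bots don’t have to persuade anyone directly; they only have to change which view looks like the majority to a frequency-biased learner. All numbers below are made up for illustration.

```python
genuine_accounts = 1000       # real people, one post each
pro_consensus = 0.70          # 70% of genuine voices state the expert view
bot_accounts = 600            # bots amplifying the denialist view
posts_per_bot = 2             # bots post more often than genuine users

consensus_posts = genuine_accounts * pro_consensus
denialist_posts = (genuine_accounts * (1 - pro_consensus)
                   + bot_accounts * posts_per_bot)

share = consensus_posts / (consensus_posts + denialist_posts)
print(f"apparent support for the expert view: {share:.0%}")  # 32%
```

A 70/30 expert consensus among real people now reads, to anyone sampling their feed, as roughly a two-to-one denialist majority.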

V. Conclusion

My assessment of science denialists has changed a lot since I started writing my dissertation on anti-science propaganda five years ago. I used to think they were stupid and culpable. My position has changed 180 degrees: I now believe these people are victims of sophisticated and well-funded manipulation campaigns that prey on social trust and our necessary reliance on others for knowledge.

There may also be degrees of culpability; it seems like some people really should know better. Maybe I’ll post more on the ethics of belief another time. That said, I’ve found that reframing (lay) science denialists as victims of manipulation allows me to be much more patient and sympathetic with them in conversation. My research projects going forward all revolve around figuring out how to talk to and help people who have been targeted by propaganda.

For my free online critical thinking course go here: https://reasoningforthedigitalage.com/

Sources

  1. Boyd, R., & Richerson, P. J. (1985). Culture and the Evolutionary Process. Chicago, IL: University of Chicago Press.
  2. Henrich, J., & McElreath, R. (2003). The evolution of cultural evolution. Evolutionary Anthropology, 12(3), 123-135.
  3. Mason, L. (2018). Uncivil Agreement: How Politics Became Our Identity. Chicago, IL: University of Chicago Press.
  4. O’Connor, C., & Weatherall, J. O. (2019). The Misinformation Age: How False Beliefs Spread. New Haven, CT: Yale University Press.

41 thoughts on “How and Why Anti-Science Propaganda Works”

    1. My pleasure. Thank you for taking the time to comment. This article is a subsection of a much longer paper I’m working on. I’ll post pieces of it each week.


  1. “There isn’t time or energy for you to get a PhD in every single domain of human knowledge.”

    What a ridiculous article. It equates having a PhD with being an expert. What folly! Some of the most obliviously incompetent people I’ve ever worked with had PhDs. That means nothing in the real world. The other ridiculous assumption is that you need a PhD to determine anything in reality. How horribly misguided. Lol..


    1. I agree that having a PhD doesn’t make you an expert in every domain, nor is a PhD necessary for expertise. Some domains (like the trades) use other systems to identify experts. And, like you, I have worked with incompetent PhDs; however, they weren’t incompetent in their areas of specialization.


    2. I should add that this article is specifically about expertise in the sciences. And for that you do need a PhD. I don’t see how anyone could otherwise get the specialized training, mentorship, community, and access to facilities to conduct original research. I’m willing to be corrected.


        1. I have a PhD in chemistry, and I find it very difficult (if not impossible) to discuss chemistry with a layman. People with whom I socialize comment on how much I know. I am pretty sure that they would believe me when I talked about any subject. I can talk about glaciers lengthening or retreating in response to climate change. I do not do it because very few people understand what I am saying. Very few people have ever seen the end of a glacier in a valley and understand that it has retreated up the valley or advanced down the valley in historical times. I think that I could use my credentials to promote any cause that I liked, but that is not my personal style.


    3. Attacking a literary technique that frames a point not meant as fact is a strawman fallacy.

      It is granted to critical readers that PhD wasn’t meant literally, but as a stand-in for recognized accumulation of knowledge and skill (however they are gained and manifested).


    4. I earned a PhD at one of the most respected universities in America. In my 16 years of college-level classes, the most valuable thing that I learned was skepticism. Never accept anything unless you can independently confirm it. The best confirmation is to observe it myself, but that is not always possible. In that case, I must rely on somebody else’s confirmation. How do I decide who to trust? One way is personal knowledge of that person. For example, I would never accept confirmation from President Trump! I use the person’s academic credentials. Do you have a better idea?


  2. This is an interesting and important article.

    Sadly, I find the language used is too academic – i.e., written for others in the same academic ‘fold’ :-).
    Just too challenging for it to be shared among many, like myself, who might also benefit from understanding why all of us believe what we believe, but tend not to read if the structure is not simple.

    It could be far more comprehensible and extremely useful if it could be rewritten in much shorter sentences, more gaps, and much of it in simple diagrams and drawings instead of words.

    Also phrases like ‘costly information hypothesis’ just CAN’T be thrown out there, if you want ordinary folks to comprehend what you mean, without a note. (eg. Was the hypothesis expensive to buy? Wouldn’t ‘Information/cost hypothesis’ be less ambiguous?)


    1. Thank you for your valuable feedback. I actually agree with you. In my other posts I make an effort to avoid overly academic writing. This post was actually a small section of a much larger paper I’m writing, which explains the academic tone. Hopefully you’ll forgive me this time around and are able to find prior posts that are deliberately written for a broad audience. Once again, thank you for your valuable feedback. I’ll definitely take it into account for future posts.


    2. On the other hand, Google is a couple of clicks away. If there’s something you don’t understand, you can look it up. That’s how we learn and grow.


  3. Philosophami, here’s a copy of a post I recently made on Facebook regarding your article. I’m glad you’re still working on a longer version of it. You might want to take my comments into account in any reply, especially my reading recommendation:

    This is an interesting and informative essay that brings up some excellent points about propaganda and social bias. However, as someone who has been studying the philosophy, psychology and history of science for several decades (I’m a non-fiction science book reviewer for the American Library Association in my spare time), it’s clear to me he doesn’t really understand science that well–at least not as well as I think he should. There’s a not so subtle implication here, for instance, that anyone who questions the safety and effectiveness of GMOs, vaccines or other “established” scientific paradigms has fallen victim to anti-science propaganda. Granted, I don’t think many or most so-called climate change “denialists” really understand or have even studied the complex evidence supporting the anthropogenic origins of climate change. However, there are some skeptics of these mainstream viewpoints who are very well informed about the science and still have serious questions. With respect to vaccines, there are many reputable scientists and health professionals (I count myself in the latter category) who question how safe and effective they really are, pointing to many flaws in the evidence such as the lack of safety studies for many vaccine ingredients. However, the knee-jerk response in the media, and by the author of the article you posted, is to call these people “anti-vaxxers.” But debate and skepticism are actually healthy features of good science, otherwise our scientific understanding of the world would never move forward and we would all still think the Earth is the center of the universe as asserted in Ptolemaic cosmology. Palmer makes light of the notion that experts can sometimes be wrong and doesn’t really understand the power of confirmation bias, cognitive dissonance and groupthink. He would do well to read a good book about science studies (a discipline that researches the sociology, psychology and history of science), such as “Dogmatism in Science and Medicine” by Henry H. Bauer, who is an “expert” on this topic.


    1. So was this an attempt by you to cast doubt on the author’s credibility and thereby diminish others’ ability to trust that the information is relevant and correct? By not so subtly dismissing these ideas as okay but ill-informed, you seem to be doing exactly what the author suggested people would do to subvert their knowledge.


  4. Thank you for sharing this. This information really needs to be read and understood by many, many people. I strive to understand those who I know think differently from me and it’s a challenge I often do not meet, despite my efforts. I’d like to try to simplify this and offer it in an upcoming podcast episode. I have a podcast for English learners and others who are interested in the English language and world cultures. Of course, I will credit you and, if you would like, I could send you my script before I make the episode to ensure that I’ve got my facts straight.


    1. Hi Alex,
      Thank you for the positive feedback. You’re more than welcome to share the info on your podcast. Also, if you do interviews I’d be happy to come on and discuss this as well as other aspects of the problem. Whatever works best for you!


      1. Thank you so much for that! I’m running this around in my head and trying to come up with a good context to use this in a way that would be as useful as possible. I’ll get back with you when I have a more solid idea. Right now I’m fighting a nasty cold, so it might take a little while. I appreciate your generosity! And thank you so much for the offer to be interviewed!

        Best,
        Alex

        Alex’s ESL World
        alexseslworld.com


      2. Hi,

        I’m all better and I wanted to let you know that I uploaded my podcast episode this past Thursday. I really wanted to interview you, but when it came down to it, I couldn’t figure out what to ask you. You see, my audience is mainly English learners. So, I have to carefully word things so that most people will understand. I also provide them with transcripts of my episodes with vocabulary lists for each one. Your article was written in academic English, far above what I would expect even an advanced learner to be able to read. I spent a lot of time trying to simplify what you wrote. I completely cut all of the references to the various biases, even though I understand that these are critical to getting the full message. I hope I haven’t overstepped my bounds. I also provided them with a link to your original article, so for those who can read that level of English, they have it available to them.

        I’m including a link here to my podcast episode, in case you want to hear me in my very simplified version of what you said. Thank you for allowing me to do this and I hope I haven’t made any egregious mistakes.

        https://www.buzzsprout.com/695233/2863009

        Alex

        Alex’s ESL World
        alexseslworld.com


      3. Hi Alex, I’m glad you’re feeling better and that you were able to find a format to successfully convey the information to your podcast audience. I’ll check out the podcast soon, once I get through editing an article I’m working on. Fun fact: I taught ESL in Venezuela, Argentina, and Japan. I have many great memories. Sometimes I miss it.


  5. This is a good article, and even though it’s written for an academic audience, it reminds me of a book I have yet to finish called Thinking, Fast and Slow. As humans we are more likely to attach ourselves to a certain group, as you mentioned in your post, to find guidance on what to believe, since we do not have enough time to find out for ourselves. The other main issue is the fact that we live in a more polarized society, as you stated, and just like the book I referenced, “what you see is all there is” from your mind’s standpoint. People are only seeking sources that match their group’s beliefs and automatically discrediting any other source. It’s a good start and it will be interesting to see what you discover as you do more research. This also took me back to my academic years, even writing this post lol.


  6. Great article, and relatively free of the scholarly journalese that made me run screaming in terror from teaching at the college level. I particularly latched onto your final comment that “it seems like some people really should know better.” One thing that continues to fascinate and frustrate me is how little the average guy’s thought process has changed over the centuries. The techniques of manipulation get more sophisticated every day, but people are just as tribal, just as stubbornly “set in their ways” as they were ten thousand years ago.

    And frankly the attitudes of those who do “know better” haven’t improved either. We still cling to this idea that if we rationally explain to people, with charts, memes, or scholarly articles, how they are wrong, they will have an epiphany. They will wake up and say, “My God! What a fool I’ve been!” I gave up on that naïve view a long time ago.

    To me the best work ever written on this subject was Eric Hoffer’s 1951 opus “The True Believer.” It is as relevant today as it was then.


    1. “One thing that continues to fascinate and frustrate me is how little the average guy’s thought process has changed over the centuries. The techniques of manipulation get more sophisticated every day, but people are just as tribal, just as stubbornly “set in their ways” as they were ten thousand years ago.”
      You hit the nail on the head!

      I like Hoffer’s book and he gets a lot right but there’s been a lot of empirical research on populism and the psychology of populism since then. I think he’s a good starting point but, just like in any field, a lot of new research and insight can be produced in 70 years!


  7. Fantastic read! It was heavy going, and I am used to reading academic writing. It took me a few goes to really get my head around it, but it was well worth it. Really important and interesting stuff. I’ll definitely be bookmarking and reading more of your work.


    1. Thank you! It was a modified excerpt from an academic article I have under review right now, which explains the tone. I try to make my other posts more playful and reader friendly. Academic writing is a slog to read for anyone!


  8. Great article! Thank you for writing and posting. I do have a question, though, that you can probably answer. I’m wondering what would be the motivation for the anti-science propagandists to convince so many people of their non-expert views. For example, they claim that pharmaceutical companies just want money so they create and promote vaccines that are unsafe. But what would be their motivation for manipulating so many people’s opinions, if that makes sense? Do they have financial gain from that? Or something else entirely?


    1. Hi,
      Thank you for taking the time to read it and comment. This is a great question. The first thing I’d say is that not all anti-science campaigns serve the same purposes. In some cases they serve corporate interests. So, anti-climate change propaganda serves the interests of (and is funded by) the fossil fuel industry. The anti-GMO movement is mostly funded by the organic lobby. Other anti-science movements are motivated by a distrust of science, technology, and anything new. The anti-vax movement is an interesting one since it’s made up of a hodgepodge of demographics. Some are wealthy liberals but increasingly the anti-vax movement comes from right wing Christian evangelicals, right wing populism, and some strains of libertarianism. A lot of it is motivated by distrust of government, experts, and institutions like the FDA. Most anti-science now is political propaganda. It’s used to discredit and weaken any agency or organization that stands in the way of another group’s ideological goals. It largely (but not exclusively) comes from the right and is directed (and has been for a while) at the FDA, EPA, WHO, UN, and universities.
      Here are a couple of links that might interest you:
      https://mcgill.ca/oss/article/covid-19-pseudoscience/anti-vaccine-movement-2020?fbclid=IwAR2lhYAf8tZXpsV5370lwV4_VI-KhQSo3sh-enpqnO4cO54lPKDvufqbJCg
      https://journals.sagepub.com/doi/pdf/10.1177/0146167217741314
      https://www.forbes.com/sites/seanlawson/2020/04/21/right-wing-responsible-for-pushing-coronavirus-disinformation-on-twitter-worldwide-new-report-says/?fbclid=IwAR1MSoo0n7P7U2JkSHJhGBozlWSy3oZrNP5ZZ09bY2BpbEG2WFdZUuqcFLU#7d638132597f
      https://www.researchgate.net/publication/339508362_Anti-Intellectualism_Populism_and_Motivated_Resistance_to_Expert_Consensus


  9. Hello!
    Very interesting, I really enjoyed reading this article, thank you for writing and sharing it.

    At the very end of the article you mentioned that your “research projects going forward all revolve around figuring out how to talk to and help people who have been targeted by propaganda.”
    How to talk to people who believe different things than I do is one of the topics I’m most interested in as a skeptic, and one we can all agree is pretty important in tense and divisive times.

    That being said, in case you don’t know about it yet, can I humbly point you in the direction of Street Epistemology?

    Street Epistemology is a friendly and empathy-promoting conversational technique (as well as a movement to promote it) related to critical thinking, which consists of simply asking Socratic questions to help the other person evaluate for themselves how reliable their method of coming to a belief actually is.

    It is pretty exciting and is based on Peter Boghossian’s books “A Manual for Creating Atheists” and, more recently, “How to Have Impossible Conversations: A Very Practical Guide” (neither of which, I should confess, I have read… yet! 😛 ), but I’d start by recommending any of Anthony Magnabosco’s hundreds of street interview videos on YouTube.
    He is the absolute friendliest guy you can randomly find on the street questioning you on why you believe this or that, 😀 and he has really developed and promoted the technique.

    https://streetepistemology.com/

    Anthony Magnabosco’s Top20 videos YouTube playlist:
    http://tinyurl.com/PL-AM-SE-TOP20

    Thank you again and I’ll be sure to read up on your many other articles soon!


    1. Hi Henrique, thanks for taking the time to comment. Yes, I’m familiar with street epistemology. I’m a big fan of the method and use parts of it whenever I can. I’m getting the sense that it’s gaining popularity which is a good thing for the skeptical movement.


  10. I hadn’t got very far into your article before I ground to a halt with this sentence:

    “And 19% of Republicans have no confidence at all in university professors to act in the public interest and 33% have not too much confidence (while 26% of Democrats say they have a great amount of confidence and 57% have a fair amount).”

    Comparing a positive with a negative greatly confused me and, in order to make sense of it, I had to do the arithmetic:

    R: 100-(19+33) => 48% have fair amount of or high confidence
    D: 26+57 => 83% have fair amount of or high confidence

    R: 19+33 => 52% have no or low confidence
    D: 100-(26+57) => 17% have no or low confidence

    Conclusion: the Democrats have higher confidence in university professors. Clearly that was implied in the original sentence, but I do think it could have been expressed a little more clearly.


  11. What, if any, culpability in the current anti-intellectual/anti-science ambient environment around us do you think “experts” themselves should bear?

    For most of my youth “experts” were deferred to with awe that made the Bill Murray “Back off, man, I’m a scientist!” outburst in Ghostbusters seem representative. Then, in no particular order, we had radium poisoning, experiments of dubious ethical and scientific merit devastating lives (Tuskegee being almost the Platonic Form of these!), embarrassing failures in the realm of nutrition (is low-fat good or bad for you?), embarrassing failures in the realm of medicine (thalidomide, and the Therac-25 debacle being good exemplars here), an abiding disaster in economics that we’re still reaping the whirlwind from, etc. etc. etc. Not to mention the utter, nigh-farcical disaster that is my field (software).

    I think a lot of the current suspicion of experts is well-grounded in risk assessment. Certainly the incidence of such experts being GROSSLY wrong is rare, but because of their outsized influence on the world, when they actually are grossly wrong, the disaster is usually of a rather sizable magnitude. Low incidence, but high price of failure leads to wariness.


    1. This is a great question. I think it’s important to recognize how experts and expert institutions have failed in the past and to look at what they’ve done to ensure those errors/violations don’t occur again.
      As you point out, in most cases these instances are rare relative to the times experts act ethically. Also, in most cases, experts and expert institutions have responded to these cases with new and better practices and institutional safeguards.
      I think it’s unreasonable to expect perfection from experts and expert institutions because these are human beings and human institutions. We should instead look at how these humans and human institutions respond to failings RELATIVE TO other humans and human institutions. By that metric, experts and expert institutions perform quite well.


      1. I’m more interested, however, in the practical side of things, as in, “How do we get ourselves out of this mess we’ve found ourselves in?”

        The problem, as I see it, is that over the course of my life I’ve seen us swing from “trust the experts: they know what’s best” to “fuck the experts: look at the mess they’ve left us!”. And honestly, if I put myself in the shoes of the average person, I sympathize.

        Going with my field, experts have, frankly, made an unmanageable mess of things. Between the intrinsic laziness of computer programmers, the utter contempt in which their users (the people they’re making the software for!) are held, and the … dubious … practices which are held up as “how to do this” (Facebook’s infamous “move fast and break things” for example), the “experts” in that domain have made a field that is enormously wasteful, frustrating, a security nightmare, and now a social wasteland. We are paying costs for seeds that were planted in the 1980s (and probably much earlier: the ’80s is just when I began), and those costs are going to be measured in human lives (already have been: cf. “Therac-25” above and throw in the 737 Max). Yet, despite this, the reckless abandon in how software is made and funded and delivered continues at an accelerating pace.

        At some point should there maybe not be a bit of humility coming from the experts’ side? An acknowledgement that while, yes, they are experts, they’re experts in a very narrowly focused field (by necessity!), and that maybe, just maybe, what they do has an impact they’re not able to foresee outside of that field? That maybe people outside of their area of immediate expertise may just have reasons for disagreeing with experts and, *GASP!*, that they may actually even be right?

        I think a whole lot of this anti-science trend could have been avoided had experts stopped being arrogant jackasses for so many decades and sat down and listened (instead of impatiently waiting for their turn to speak), or, perhaps, even more importantly, sat down and communicated instead of lecturing.

        Is this anti-science trend troubling? Indubitably. Is it the fault of bad actors? Certainly. IN PART. It also seems, however, to my perspective of watching it unfold over five decades, to be very much also the fault of arrogant dismissal on the part of the experts.



  13. Firstly, I’m a chemical engineer and thoroughly enjoyed this essay. I have some thoughts about it and will work to present them concisely.

    Have you considered the sheer amount of nonsense that is being fed to the public as science by Universities and Professors that simply is not science?

    The entire field of “soft science” has to be sold as “soft science” because so much of it is not science whatsoever and is rejected as science by the scientists on university faculties. So they’ve gone the propaganda route and invented new terms to ride the respect that the “hard sciences” enjoyed with the public. Entire fields of “soft science” are nothing more than pseudoscience, and a good number of its practitioners will happily admit that. Many will not, and many of those who will not appear to have significant bias and agenda in their work. The result appears to have been a significant erosion of trust in real science, precisely because it has become more difficult (read: costly) for the layperson to even determine what is real science and what is not.

    When people see a man in a wig and a dress in full makeup standing before Congress and are told that he is a woman, they know better. They know that is not true, yet it is presented as fact and various institutions are demanding they acknowledge it as so. From that moment on their trust in experts and science and Universities and education and everything associated with it becomes tainted. This kind of situation is happening more and more and has been building to this point for a long time now.

    I’ll give an anecdotal example more to the point of the layperson vs. the expert: I had a conversation with someone with no post-secondary education whatsoever who absolutely believes that the whole of Sociology is science and scientific. Probably better to describe it as Science at this point, due to the cult-like religion that seems to be forming. When I asserted the opposite, that Sociology is predominantly non-scientific and mostly the statement of the opinion of an author, I was assured I was incorrect. I informed my compatriot that my assertion was actually what I had been taught in graduate-level sociology by my professor in university, and what I know to be absolute fact due to my own understanding of what science truly is as a chemical engineer.


  14. Hi there,

    Thank you for the read. From one angle, I found it very interesting, and it seemed well-structured in conveying the effect that distrust toward certain stakeholders can have on a person’s standards of evidence, and how that may be manipulated by bad-faith actors or others who might be well-intentioned but don’t know any better. Furthermore, I have a great appreciation for any source of information which alludes to what Friedrich Hayek called the “Knowledge Problem”: that no one person can know everything, which is one of the reasons we, as humans, are reliant on each other.

    From another angle, I would like to challenge you and say that academic papers such as yours have a role to play in how come there is distrust in the first place. That’s not to say we shouldn’t understand how social mistrust can be increased by manipulation, but any implication that manipulation is the primary reason why there is such distrust in the first place is not only insulting to those people but also strengthens, in my opinion, the actual reason why this social distrust has worsened.

    My hypothesis being, the primary reason that social distrust is at its current state, is due to one group projecting themselves as “enlightened” and expecting the other group(s) to relinquish their own views and embrace the “enlightened” group’s views. Which is further intensified when the “enlightened” group gives off not so subtle signals that the other group is “uneducated”, “victims of misinformation” etc.

    As one philosopher put it: “You can be right and I can be wrong, but together we will get to the truth.”

    Thanks for reading 🙂

    Dave

