I was going to write about my changing ideas on libertarianism, but a friend of mine requested I explain herd immunity. Of course, there is no shortage of literature on the topic, but I thought I’d discuss it in terms of critical thinking concepts. (Gotta have a unique perspective, otherwise I might as well just post a bunch of links to other people’s articles.)
Sidgwick’s Insight (Call Me Mr. Busdriver Cuz I’m Gonna Take You to School)
Before we get it stahted in ha, we’d do well to establish a baseline of common beliefs. This is what I, in my classes, have come to call “Sidgwick’s Insight”. I won’t bore you with why I call it that, but I will give you a brief explanation of the concept and why it is absolutely vital to critical thinking:
Imagine you’re a bus driver (fun, I know) and you want to get some people to a particular destination. Here comes the really dumb question: If the passengers never get on the bus, can you get them to the destination?
An argument with someone who holds an opposing viewpoint is very similar to the above scenario. The destination is your conclusion. Just as you can’t get your passengers to their destination if they never get on your bus, you can never lead an opponent to your conclusion if they never accept your premises. Conclusions follow from premises. Sidgwick’s insight is that you must always begin your argument with premises both you and your audience share.
Once your passengers are on the bus, all sorts of things can go wrong. You can run out of gas, you can disagree about whether your particular route will get you to the destination, or after a while the passengers can simply refuse to continue on the trip and get off the bus. I’m stretching the analogy, but you get the idea.
The main point is simply that your chances of leading an opposing audience to your conclusion go up dramatically if you begin with shared premises. A good arguer shows a hostile audience that his conclusion, not theirs, follows from evidence they already accept.
Germ Theory Denial, Straw Men, Inconsistency and Falsifiability
In the spirit of Sidgwick’s insight, I need to find some common ground with my anti-vaccine audience. Because the anti-vaccine community runs from the absolutely nutty to the intelligent-but-misinformed, and I don’t know exactly where my audience sits on this spectrum, I’m going to start by showing why the nuttiest view fails, so I can discount it and begin with a premise that everyone will share with me. I also want to address the nuttiest position because I want to avoid committing a straw man.
A straw man argument is committed when you distort your opponent’s position into a caricature of what he actually holds. It’s important not to commit this fallacy because by defeating a weaker version of an argument, you leave the door open for counter-replies (e.g., “that’s not what I meant…”), whereas if you defeat the strongest and most charitable version of his position, there is little chance of a rebuttal.
The premise I hope to begin with is that germ theory is correct, so let’s start there: In super-simplified form, germ theory is the idea that pathogens (bacteria, viruses, fungi, protists, or prions) cause infectious diseases. To be clear, germ theory does not say that all diseases are caused by pathogens, only the infectious ones. To suggest that germ theory says otherwise would be to commit the straw man fallacy (learning’s fun!).
Now, there are some loons out there who deny germ theory (that was an ad hominem, for anyone keeping track!). I’m not going to spend too much time on people who hold this view, but I’ll discuss their beliefs to illustrate another critical thinking principle: logical inconsistency.
One issue you’ll come up against while evaluating arguments is determining when you should or should not accept a premise. This can be particularly difficult when the topic is one you’re not too familiar with. One simple rule is that you should reject any argument that has mutually exclusive premises; that is, two or more logically inconsistent premises. With this rule, you don’t even need to know anything about the topic: if the premises are logically incompatible, you can reject the argument as a whole.
The loony end of the anti-vax movement provides a good example of logical inconsistency: Many in the loony camp deny germ theory. So far no inconsistency, just a blatant denial of almost 200 years of science. However, these same people will often say that the massive drop-off and virtual elimination of vaccine-preventable (i.e., infectious) diseases wasn’t caused by vaccines; it was caused by better diets, hygiene and sanitation. Did you spot the inconsistency?
If germs don’t cause infectious diseases, then why would sanitation and hygiene have any effect on their transmission and rates of prevalence? This is what we call a logical inconsistency. Now, to be fair, simply because we’ve shown an argument to be inconsistent, it doesn’t follow that the conclusion is false; it only means that that particular line of argument won’t work to support the conclusion. Nevertheless, eliminating a line of support for a conclusion weakens the overall case for it.
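The inconsistency test above is mechanical enough to automate. As a toy illustration (my own sketch, not part of the original argument), we can encode the claims as propositional formulas and brute-force every truth assignment; if no assignment satisfies all the premises at once, the set is inconsistent and the argument can be rejected wholesale:

```python
from itertools import product

# Toy propositional model of the position described above (my framing):
#   G = "germs cause infectious diseases"
#   S = "better sanitation/hygiene reduced infectious diseases"
# Background link: sanitation works by interrupting germ transmission,
# so S can only be true if G is (S -> G).
premises = [
    lambda G, S: not G,         # premise 1: germ theory is false
    lambda G, S: S,             # premise 2: sanitation did the work
    lambda G, S: (not S) or G,  # background fact: S implies G
]

def consistent(premise_set):
    """A premise set is consistent iff some truth assignment satisfies all of them."""
    return any(all(p(G, S) for p in premise_set)
               for G, S in product([True, False], repeat=2))

print(consistent(premises))  # False: no assignment makes all three true
```

Note that the code only checks joint satisfiability; as the paragraph above says, an inconsistent premise set tells us nothing about whether the conclusion itself is true.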
Another good heuristic for evaluating a position is its falsifiability. Falsifiability means that there is some way to set up an experiment or test to show that a position is false. For example, the hypothesis that vaccines do significantly diminish rates of infection and transmission is falsifiable.
I could conduct an experiment or look at historical data to test the hypothesis: I could look at rates of infection and transmission for a particular disease in a population before a vaccine was developed and then I could look at rates of infection and transmission after the vaccine had been administered to the population. I could also look at what happens to rates of infection and transmission when immunization rates fall. If there is a significant difference, I can infer a causal relation. If there is no significant difference, I can affirm that it is probably false that vaccines prevent infection and transmission of a particular disease; that is to say, the hypothesis has been falsified. Anyhow, if a hypothesis isn’t falsifiable (i.e., there’s no possible way to prove it false) then it’s weak.
[Note: I’m going to gloss over the philosophical issue involving the distinction between “in principle” and “in practice” falsifiability as well as the philosophical problems surrounding the falsifiability criterion. My claim is only that it is a good heuristic.]
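The before-and-after comparison described above is, at bottom, a test for a difference between two proportions. Here is a minimal sketch of how one might run such a test (my own illustration, with hypothetical infection counts, using only Python’s standard library):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two infection proportions.

    x1/n1: infections and population size before the vaccine.
    x2/n2: infections and population size after widespread immunization.
    Returns the z statistic and a two-sided p-value.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled proportion under the null hypothesis
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
    return z, p_value

# Hypothetical counts: 80 infections per 10,000 people before the vaccine,
# 5 per 10,000 after immunization. A tiny p-value means the drop is unlikely
# to be chance; no significant difference would count against the hypothesis.
z, p = two_proportion_z(80, 10_000, 5, 10_000)
```

This is exactly the falsifiability point in code: the hypothesis “vaccines reduce infection rates” makes a prediction that this test could, in principle, fail to confirm.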
In light of the notion of falsifiability, let’s evaluate some “alternative” theories to germ theory. There are people who believe that disease isn’t caused by germs but by poor alignment of your spine, chi, chakras, too much yin/yang and/or too much stress. As with most positions, there are varieties: Some say that germ theory is completely wrong; others hold a hybrid view that, yes, germs can cause diseases, but only in people who don’t adhere to a particular magic diet, lifestyle, philosophy, attitude, world-view, etc…
In other words, if people would simply change their lifestyle and worldview, eat organic bugabuga berries, pay for quantum healing sessions, etc., they’d never get an infectious disease because their immune system would be so strong. It’s only because of [insert name of your favorite “toxin” or psychological ailment attributed to modern society] that people’s immune systems are compromised. You might think this is a straw man, but alas, it is not. A little time on any “natural healing” website will disabuse you of your naiveté.
So, where does falsifiability come into all of this alt-germ theory? The purveyors of these schools of thought generally present their hypotheses in non-falsifiable forms. Here’s how the conversation typically goes: They claim that “the one secret THEY (i.e., the establishment) don’t want you to know” [choose your favorite alt-med treatment and/or new-age “philosophy”] will prevent you from ever being infected by an infectious disease (cancer included). You point to an example of someone who got the alt-med treatment and/or adhered to the new-age “philosophy” yet caught (or died from) an infectious disease. They respond by saying, “ah, they weren’t doing it quite right” (maybe it was the gluten?), but if they had done it right, they never would have gotten the disease.
No matter what counter-example (attempt to falsify their hypothesis) you point to, they will say that the person wasn’t truly doing it right (e.g., they ate GMO corn by accident one day). They never allow any counter-examples. The hypothesis is unfalsifiable, at least in practice, and also commits (bonus!) the No True Scotsman fallacy.
So, how do we deal with this? As you might have guessed, I have a solution. It’s called the “put your life where your mouth is” test. Before presenting it, I’d like to say that I don’t believe that, when push comes to shove, people really believe half the nonsense they say they do. Here’s the solution: Ask the proponent of alt-med treatment X/new-age “philosophy” Y to undergo whatever treatment/practice/therapy/”philosophy” they are recommending. They can do whatever they think makes them perfectly healthy and immune to infectious diseases. Eat organic acai berries, do yoga, meditate with Tibetan monks, get acupuncture, get adjusted at the chiropractor’s, uncover their repressed emotions…whatever. Then ask them if you can inject them with HIV.
If they hold either of the views that (a) micro-organisms don’t cause disease or (b) micro-organisms-only-cause-disease-if-you-don’t-buy-what-I’m-selling, then they should be happy to oblige. Of course, only the looniest of the loons will oblige…and if they do, ethical considerations dictate that preventing someone’s death through their own gullibility must come before winning the argument.
Ok, so maybe HIV is a bit much. Maybe ask them to rub an HPV-covered swab on their genitalia. I’m sure they’ll be happy to show you how well their treatment works. Probably they’ll just get reiki or simply will themselves back to health through positive thoughts. Please put it on video.
One last point regarding the consequences of non-falsifiability: When the anti-vaxer/proponent of alt-germ theory uses the ad hoc strategy of “ah ha! but they didn’t do it right,” we should consider that public health policy must take into account how actual people, living in this world, will behave, not how they might behave if they were perfectly rational and living in a perfect world. Regardless of its efficacy, if a practice being preached is so unattainable, it is not practical in a world of creatures who regularly act against their own self-interest, especially when it comes to their own health.
The False Dilemma
The false dilemma fallacy is committed when an arguer presents two options as though they are mutually exclusive and exhaustive when, in fact, they are not. A (very) moderate anti-vaxer might accuse me of committing this fallacy. But I would not consider such a position to be that of an anti-vaxer: Most anti-vaxers believe either that vaccines cause more health problems than they prevent or that vaccines have negligible efficacy compared to whatever treatment/lifestyle they’re recommending (correct me if I’m wrong).
It is the anti-vaxer who commits the false dilemma: Either you vaccinate and get sick, OR you do the treatment/live the lifestyle they’re selling and you won’t get sick.
But this is to present a false dilemma: Of course a healthy diet and a low-stress, active lifestyle will make you less susceptible and more resistant to disease than a poor diet and a high-stress, sedentary lifestyle. Nobody is disputing this (to suggest they are would be to commit a straw man). Aaaaaaaaaaaand, if you vaccinate as well, you decrease your susceptibility to infectious disease even more significantly (up to 22x vs. unvaccinated, depending on the disease: Glanz, J., et al. “Parental refusal of pertussis vaccination is associated with an increased risk of pertussis infection in children.” Pediatrics, 2009. DOI: 10.1542/peds.2008-2150).
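For what a figure like “22x” means: it is a relative risk, the infection rate in one group divided by the rate in the other. A quick sketch with hypothetical counts chosen purely to illustrate the arithmetic (these are not the actual Glanz et al. data):

```python
def relative_risk(cases_a, n_a, cases_b, n_b):
    """(cases_a/n_a) divided by (cases_b/n_b): how many times more likely
    group A is to get sick than group B. Written with integer products
    first to avoid floating-point rounding."""
    return (cases_a * n_b) / (cases_b * n_a)

# Hypothetical illustration of a 22x relative risk: 22 pertussis cases
# per 10,000 unvaccinated children vs. 1 per 10,000 vaccinated.
rr = relative_risk(22, 10_000, 1, 10_000)
print(rr)  # 22.0
```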
Back to Sidgwick and End of Part 1
I have learned from my formal and informal study of the psychology of reasoning and belief that deeply-held views are most often recalcitrant to evidence and reason, no matter how compelling. I don’t really expect to change anyone’s mind at this point. But, if we’re going to discuss the question of herd immunity as it pertains to vaccines, we need some premises that are held in common. The purpose of the above section was to try to establish at least one of those premises: that germs (microorganisms) cause infectious diseases.
If you find fault with how I have shown competing views to be improbable, please leave a comment in the comments section and I will do my best to address it.
Section 2, which I hope to write next week, will begin with the premise that the germ theory is correct. From that, I will attempt to show why it follows that we should prefer the well-established scientific notion of herd immunity instead of its denial.