In the true crime space, we have noticed the creep of misinformation and conspiratorial thinking. It's not always easy to spot, though. Critical thinking can help us all engage better with stories about law and crime. But how can we start engaging in critical thinking around our favorite genre?
Today's guest, Dr. Steven Novella, can help. Dr. Novella is a clinical neurologist at Yale University School of Medicine. He's also one of the hosts of the podcast The Skeptics' Guide to the Universe. We'll talk to him about how to spot a con job, our adversarial criminal justice system, Occam's razor, and the appeal of conspiracy theories.
Learn more about Dr. Novella and his podcast The Skeptics' Guide to the Universe here: https://www.theskepticsguide.org/
Check out the Science-Based Medicine website here: https://sciencebasedmedicine.org/
The Skeptics' Guide to the Future: What Yesterday's Science and Science Fiction Tell Us about the World of Tomorrow by Dr. Steven Novella, Jay Novella, and Bob Novella: https://bookshop.org/p/books/the-skeptics-guide-to-the-future-what-yesterday-s-science-and-science-fiction-tell-us-about-the-world-of-tomorrow-steven-novella/18588557?ean=9781538709542
Read the other books that Dr. Novella mentioned!
The Demon-Haunted World: Science as a Candle in the Dark by Carl Sagan: https://bookshop.org/p/books/the-demon-haunted-world-science-as-a-candle-in-the-dark-ann-druyan/6315853?ean=9780345409461
Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time by Michael Shermer: https://bookshop.org/p/books/why-people-believe-weird-things-pseudoscience-superstition-and-other-confusions-of-our-time-michael-shermer/11054136?ean=9780805070897
Nonsense on Stilts by Massimo Pigliucci: https://bookshop.org/p/books/nonsense-on-stilts-massimo-pigliucci/8283291?ean=9780226495996
Support The Murder Sheet by buying a t-shirt here: https://www.murdersheetshop.com/
Send tips to murdersheet@gmail.com.
The Murder Sheet is a production of Mystery Sheet LLC.
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
[00:00:00] Thank you so much to our sponsor, VIIA Hemp. These folks make delectable gummies for you to enjoy of the THC and THC-free CBD and CBN varieties. We love the CBD and CBN options. There's something for everyone, whether you want to relax, get high, become more productive,
[00:00:18] or boost your sleep routine. Whatever mood you're trying to create, VIIA's got you covered. These gummies are delicious and legal to ship to all 50 states. If you're like us, you might need some help settling down at night to go to sleep. We can be
[00:00:32] downright jittery after a day spent running around looking into disturbing stories about crime. I've found Xen to be a nice THC-free option. CBN and CBD help me unwind as I try to catch some Zs. VIIA's new Dreams formulations also can help with sleep.
[00:00:49] They allow for a fully customizable sleep journey, featuring 2, 5, and 10 milligram options, depending on the strength you're seeking. Head to VIIAHemp.com and use the code MCHEAT to receive 15% off, plus one free sample of their award-winning gummies for people age 21 and older.
[00:01:06] That's V-I-I-A hemp.com and use MCHEAT at checkout. Please support our show and tell them we sent you. Get the rest you deserve with Dreams from VIIA. This is an interesting moment in true crime. More and more people seem to be trying to spread
[00:01:23] elaborate conspiracy theories in an attempt to explain the world or try to diminish a favored defendant's guilt. We've begun to see this again and again, in case after case. And let's be clear, this wild talk is not just about true crime,
[00:01:41] and it is not just people in the audience who believe it. You may recall something we recently reported concerning Kara Wienecki, an attorney closely linked to the defense in the Richard Allen case. Ms. Wienecki has stated that she doesn't believe we landed on the moon.
[00:01:55] We laughed about that a bit on the program, but let's be honest, it is not funny when seemingly educated people believe and spread ridiculous nonsense. Let me make just one more point about that.
[00:02:07] When someone says that we didn't land on the moon, it is easy for most of us to immediately recognize that that person doesn't know what they are talking about. But the problem is that
[00:02:19] most nonsense and lies are not as easy to spot as that. So how can we, as consumers of true crime, identify what is true and what is not? For that to happen, we all need to brush up
[00:02:34] on our critical thinking skills. And we definitely include ourselves in that because as you'll hear in this episode, there have certainly been times when the two of us have failed as critical thinkers. To get some help with all of this,
[00:02:45] we decided to turn to someone whose work has taught me a great deal over the years. Dr. Steven Novella is an academic clinical neurologist at the Yale University School of Medicine, but I've come to know him best through his work in movements associated with skepticism
[00:03:01] and critical thinking. He hosts and produces the podcast The Skeptics' Guide to the Universe, and he's also written a book of that same name. We highly recommend both. He is also, as you will soon hear, an articulate and witty speaker. Critical thinking is a lifelong process,
[00:03:17] and frankly it isn't always easy. But it is better than the alternative. When we don't do it, we make it easy for people to fool us and take advantage of us. Looking at it that way,
[00:03:28] you can almost imagine critical thinking as a shield which can guard us against nonsense and lies. It is time we started wielding that shield as we consume content around true crime. My name is Anya Kane. I'm a journalist. And I'm Kevin Greenlee. I'm an attorney.
[00:03:45] And this is The Murder Sheet. We're a true crime podcast focused on original reporting, interviews, and deep dives into murder cases. We're The Murder Sheet. And this is The Skeptic's Guide to True Crime with Dr. Steven Novella.
[00:04:45] Well, let's start by asking an obvious first question. Can you tell us a little bit about yourself? Yeah. So I'm Steven Novella. I am the host and producer of a podcast called The Skeptics' Guide
[00:04:57] to the Universe. We're in our 20th year, actually. So we've been doing that for quite a bit of time. We talk about everything to do with science, with critical thinking, with pseudoscience, skepticism, weird fringe beliefs, conspiracy theories, all that stuff. You mentioned skeptics. What is a skeptic?
[00:05:17] So we use the term skeptic to refer to scientific skeptics, which is a term that was popularized by Carl Sagan and others of his ilk. So the idea is that we want to believe what's true, what's actually verifiably true. We follow the epistemology of science.
[00:05:40] We think that science is the best way to figure things out. And we promote not only scientific literacy and a scientific worldview, but also critical thinking. You have to understand something about how our brains work and how we deceive ourselves,
[00:05:56] how others can deceive us. So understanding critical thinking is critical as well. And also media savvy. We live in a multimedia universe now. So you have to understand how to access information, how to evaluate and assess information in order to get to some belief that's
[00:06:14] more likely to be true than not to be true. That's the ultimate goal: to believe things are probably true only when they actually are. Yeah, we obviously are a true crime podcast. And right now in the true crime world,
[00:06:29] there's a lot of talk about elaborate conspiracies. In one of the cases we're covering, some defense attorneys were saying that the murder was actually committed not by their client, but by a religious cult of Odinists. We see other sorts of conspiracies raised in the Karen Read case.
[00:06:47] What is it about people that draws us to these conspiracy theories? Yeah, that's a big question. It's a lot of things. And there's so many sub questions in there, particularly when I talk about conspiracy theories.
[00:07:02] So I do think that there's a little conspiracy theory in all of us. There are some basic human psychological elements that do attract us to conspiracy theories. And a big one is something that psychologists call apophenia, which is basically a technical
[00:07:20] term for we see patterns. Our brains function basically by noticing and picking out patterns. Our brains also have another function though. I mean, literally this is parts of your brain that do this. So we are constantly sifting through all of the data out there in the universe.
[00:07:40] And this could be visual, it could be auditory, it could just be ideas, abstract, whatever, events, connecting the dots. Our brain's really good at that. But we also have a reality testing filter. So as best as we can tell, neuroscientists think that our brains
[00:08:01] evolved to really over-see patterns. We see patterns everywhere. We have a very sensitive pattern recognition. It's primed to see patterns that aren't even there. And then we filter out the ones that are probably not true with the reality testing circuitry in our brain.
[00:08:22] So conspiracy thinking, part of it is, again, and this is actually a big research question. Do people who tend to believe in conspiracy theories, are they over-seeing patterns or are they under-filtering them? At which end is the problem occurring? I don't know that there's
[00:08:41] a definitive answer to that. It kind of depends on your research paradigm and how you look at it. But some preliminary evidence does suggest it's probably under-filtering. We all see lots of patterns, but they don't filter out the not-real ones. So
[00:08:57] you might notice some bizarro coincidence. I've seen the number seven so many times today. And most people will recognize that as just a coincidence. But some people will be like, the universe is speaking to me. There's some magical thing going on because it can't be a
[00:09:14] coincidence. I see this pattern and the pattern speaks to me at a primal level. And so I think that's kind of where conspiracy theories come from. The truth will set you free.
[00:09:25] We live by that on the murder sheet. We're always looking to get at the truth when we cover criminal cases, when we're parsing through legal documents and stories from survivors and detectives and attorneys just trying to get the full picture.
[00:09:38] So you can imagine why we love to listen to Brittany Ard's Quest for the Truth on the new podcast, You Probably Think This Story's About You. Britt is all about getting an answer to a deeply personal question.
[00:09:51] What if the person you thought was your soulmate never really existed? After a chance meeting on the Hinge dating app, a man named Kanan stole Britt's heart. She fell hard for him, but he ended up dragging her into a web of lies.
[00:10:06] The Kanan she came to love was an invention, a ghost. Britt's journey to piece together this disturbing mystery isn't just compelling. It's a raw look at self-discovery and the power of coming together to form a community through shared grief and trauma.
[00:10:22] Listen and follow You Probably Think This Story's About You wherever you listen to podcasts. But there's a number of other psychological sort of self-reinforcing aspects to it as well. You know, conspiracy theories perpetuate themselves because by design they are insulated
[00:10:40] from disproof, right? So once you're inside a conspiracy, any bit of evidence that contradicts the conspiracy, well, that was planted, right? That's a false flag operation. And any bit of evidence that's missing, well, that's a cover-up. So there's literally no way you can get out of the
[00:11:02] conspiracy theory once you're fully in it. It's an insulated belief system. You know, that's why they tend to perpetuate, like why do people still believe we never landed on the moon when the
[00:11:13] evidence is overwhelming? I mean, it's overwhelming that we landed on the moon. And then you ask them, well, just come up with a coherent explanation for how it all happened.
[00:11:24] And they really can't. It's just, they do a couple of things. They do what we call anomaly hunting, right? So they look for things that are weird and then they just weave some sinister story around
[00:11:36] these apparent anomalies. Why is the flag waving? Or whatever else they come up with. And again, that's where the apophenia then comes in, where they connect those dots. Then you have your conspiracy theory. And then once they're inside the conspiracy theory,
[00:11:52] they're done, right? Then there's no way out. Yeah. We've actually seen instances of cases where people say the evidence here is too good. It has to be a frame job. It definitely seems like a self-fulfilling prophecy for some of these folks. I'm curious, you know, broadly speaking,
[00:12:07] you've been part of the sort of skeptical world for years. In your kind of general view, do you feel that people are becoming more skeptical or less skeptical? Yes. I do think it's both at the
[00:12:20] same time. There's objective evidence to back that up. I mean, if you look at, you know, at any component of skepticism, scientific literacy, people are actually getting more scientifically literate. It's still bad. I'm not telling you that, I'm not saying it's good,
[00:12:35] but it's like, oh, we've gone from 18% to 24% of people in this specific test of scientific literacy that was done over 20 years. So it's incrementally getting a little bit better. People's IQs have actually been increasing by about three points per decade for the last
[00:12:53] 60 years, as long as we've been measuring them. Not even sure why that is, but, you know, but that's happening as well. I think people have a lot more information, right? We process a lot more information. And so people, you know, I think are generally more savvy,
[00:13:09] but at the same time, the forces that are trying to deceive us are also getting better, getting really good at creating a narrative and then fine tuning that narrative so it pushes all of our emotional buttons and then creating an information ecosystem that locks people into
[00:13:32] those narratives, right? So you get to the point where you don't even realize that the information that you consume is being curated for you by, you know, massive organizations and people. Which sounds like a conspiracy theory, right? But this is actually happening because, you know,
[00:13:51] the market forces and media forces are real, you know? I don't think it's one guy in a room doing all this. Some of it is organic. Some of it is just, this is what market forces do. But sometimes,
[00:14:04] yeah, you do have like the head of Fox News who has a very specific editorial policy that has a very specific purpose behind it and literally curating the news for an audience and for a
[00:14:18] narrative. And if you're not aware of that, you think it's just the news, you don't realize that your reality is being created for you, right? This happens in totalitarian countries obviously, but even in an open country like the United States or, you know, most Western countries,
[00:14:35] we still have people feeding us information. Unless you really make an effort to get a wide variety of sources of information and double, triple check everything, right? Unless you're really actively skeptical of everything you come across, chances are you are being herded
[00:14:53] into a certain narrative by people who want to sell you something, people who want you to vote a certain way, people who want you to feel or believe a certain way. And again, I think a lot
[00:15:04] of it is organic. It just emerges from the culture, from society and the institutions and people doing their own little thing, but it still can be a very powerful force. And so that's why
[00:15:17] you get to the point like in the United States today, we think things are really polarized. And this has been again, well-established. It's not just my opinion. This has been researched. It's well-established that people of certain political end of the spectrum literally consume
[00:15:32] different information with almost no overlap, like no overlap compared to people of the opposite political persuasion. Like we're literally consuming two completely different subsets of information. So of course we're polarized and neither side could understand the other.
[00:15:49] Both sides think the other side's crazy, right? Because how could they possibly believe that when there's all this information that is going in the other direction? It's like, yeah, but... And both think that the other side's being lied to. I think this existed hundreds of years before
[00:16:08] social media, but social media ramped it up by an order of magnitude. And we'll see what happens. I hope things will sort themselves out and reach some sort of equilibrium point, but nobody knows at this point in time. We're just sort of taking it as it comes.
[00:16:24] What you say about people subscribing to and getting their information from completely different sources with different versions of reality, we see a lot of that in the true crime world as well. And I guess I'm curious, with all these sources of information people have,
[00:16:40] how can they use critical thinking or the principles of skepticism to figure out where the truth is? Yeah. So I wrote an entire book about that. So the Skeptics' Guide to the Universe book
[00:16:53] is a primer on how to do that. How do we know what's real in the world today with so many sources of information? One of my favorite examples is Steve Jobs. One of the richest men
[00:17:04] in the world, top of an information company, had every resource to everything in the world, diagnosed with a very treatable, very survivable form of cancer, and yet made personal health decisions based on misinformation and ultimately died. Now we don't know for sure what would have
[00:17:23] happened, whether it would have made a difference. I can't say that that affected the outcome, but it absolutely could have. And at the very least, we know that he delayed proper treatment and science-based treatment, and that may have been critical in his outcome. How did that happen?
[00:17:40] How did Steve Jobs not have access to the right information? And it's because, again, we're living in this world of different narratives. And once you follow one narrative, it seems like the truth
[00:17:53] to you. So there's a few good rules of thumb if you want to be more skeptical. Again, I can't tell you how to do that. It's a lifelong project. But there's a few good tips on how to be skeptical,
[00:18:04] I guess. So one is question everything, obviously. Don't take anything at face value. If it's important, the more important it is, the more you should question it. And how do you question it? You should especially question things that fit your narrative. That's the thing that people don't
[00:18:21] do. We instinctively believe and accept things that fit our narrative, our beliefs, and question things that don't fit our beliefs. That's confirmation bias. Confirmation bias is powerful, massively powerful. It makes us think that there's so much evidence to support my view.
[00:18:38] It's like, yeah, that's because you're only looking for and accepting the evidence that does, and you're rejecting or overlooking or dismissing the evidence that doesn't. So do the opposite. Be especially skeptical of your own beliefs. And when information confirms what you believe,
[00:18:55] that's the information you need to be most skeptical of. And then in terms of sources, again, the big rule of thumb is always go back to the primary original source. Don't accept somebody else's summary of what a piece of information is or whatever. Go back to the
[00:19:12] primary source always as much as you can. It should be a reflex. I hear something in the news. I'm like, oh, I wonder where that information came from. You trace it back to whatever the original
[00:19:21] source of the information is. You often find it's something very different than this narrative that was created for you or the way it's being used. It's never the same, right? It's always different. It's always more complicated. It's never as clean or as simple as is being presented.
[00:19:39] But you'll find that, okay, now I have a much clearer idea of what actually is going on. I'm seeing where the consensus of opinion is, try to get to objective sources as much as possible.
[00:19:48] Find out what the other side is saying and why and see who has the better evidence, the better argument. It's a process. There's no one trick to it. It's a process. Again,
[00:19:58] the more important it is to you, the more effort you should put into it. If you're making a life or death health decision, I would put a lot of effort into trying to figure out how reliable
[00:20:08] the information is that you're seeing. So if you at least have a process, don't just go with the flow of this is my tribe, this is what we believe. This is the information that supports
[00:20:19] what we believe. Everyone else is crazy and that's good enough for me. Chances are you're probably not getting to reliable conclusions if that's your process. Just believe whatever the tribe says, whatever the narrative is, probably didn't just happen to fall into the one true narrative in
[00:20:37] the world, right? Yeah, that would be pretty lucky. One thing you mentioned I want to go back to, I thought was really interesting with Steve Jobs. Obviously, we kind of all regard him
[00:20:50] as a very smart man, a very visionary man with technology business. So is being a good critical thinker the same thing as being intelligent and smart or can you be an intelligent and smart person and terrible at critical thinking? Yeah, that's a great question. So intelligence
[00:21:04] encompasses many different things and again, neuroscientists, psychologists, whatever, don't even like to use the term very much because it's a loaded term. IQ testing, what are you actually measuring? What is it? And we recognize that it's multifaceted
[00:21:20] and you absolutely could be intelligent in one way and not in another because there's different skill sets that we would all think of as, quote unquote, intelligence. Some psychologists will use terms like emotional intelligence to represent your ability to pick up on social
[00:21:39] cues and think about interpersonal relationships and things like that. And certainly there are people who may have very high engineering, pragmatic, mathematical, whatever intelligence who have very low emotional intelligence. And absolutely critical thinking is its own skill set.
[00:21:55] It is a very involved skill set. It's not easy. It's very hard to do. It takes vigilance. It takes the ability to look at your own beliefs and go, I could be wrong. I am wrong. I know I'm wrong.
[00:22:07] The question is how wrong am I? And in what way am I wrong? And how can I become a little bit less wrong by trying to be open to new perspectives, new information, whatever. Does that answer your
[00:22:19] question? It did. Yeah. It's important. Correct me if I'm wrong, but it almost sounds like critical thinking is a bit like a muscle. If you're not using it and applying it again and again,
[00:22:30] it can atrophy and not be super helpful. It's something you should be applying across your life, which may be hard for some people. Yeah. It's a pattern of behavior too. It's what we call metacognition. So you're thinking about your own thought process. You're thinking about it
[00:22:45] critically. You're always trying to second guess yourself and prove yourself wrong in a way. And those ideas which survive your attempts to prove them wrong, all right, those have some value. I can't immediately disprove it. If you really want to go on this skeptical journey,
[00:23:03] it's not just saying, all right, I'm going to be skeptical. I'm going to question things. It's a massive skill set. Psychologists have been studying this for 100 years. Neuroscientists now we know a lot more about how the brain works. There's a lot of literature. You can go
[00:23:18] get deep into the weeds just on conspiracy thinking, for example, or just on how do you tell real science from pseudoscience or what's the actual difference about the philosophy of science, neuropsychological humility, like how your brain deceives itself and constructs reality
[00:23:35] and how your memory is flawed. There's so many different sub areas of information and you could fall victim to any one of them. One of the things as skeptics, we remind each other and ourselves,
[00:23:46] the ultimate bias, the ultimate way that you deceive yourself is thinking that you're a critical thinker and therefore you can't be deceived. Once you think, oh, I've arrived, I'm a critical thinker, I'm a skeptic. No, you can't fool me. That's the easiest person to
[00:24:03] fool is the one who thinks they can't be fooled. Ask any magician that, they will tell you that. The person who thinks they're too smart to be fooled, that's the easiest person to fool.
[00:24:13] They love scientists as an audience. We see this all the time where you have a scientist who's brilliant, a brilliant scientist who falls for complete chicanery because it's not their skill set. It's a different skill set. They're used to studying nature, which doesn't actively try to deceive them.
[00:24:35] So they don't know how to deal with that. It's 100 skill sets and any one of them could trip you up if you're not aware of it. It's a journey you will never get to the destination of. You have
[00:24:49] to constantly be questioning and trying to improve how you process information and questioning your own biases. Even for somebody who's been doing this my entire adult life, there are just biases in the way that we look at reality and the value judgments that we make.
[00:25:07] It's really hard to weed that completely out of your thought process. We're biased in ways we're not even aware of. So you always have to be first and foremost open to the possibility that
[00:25:20] you're wrong and just always try to get a slightly more sophisticated version or nuanced version of what you think about things. Obviously, from what you're saying, it's clear that it's not easy to be
[00:25:33] a critical thinker. It's a process. It takes a lot of hard work. So I'm sure some people in our audience might be wondering, what makes it worth it? What's the harm if I want to believe
[00:25:45] we never went to the moon? Yeah. So believing in magic is a really dangerous thing. One aspect of organized skepticism, a massive aspect of it is consumer protection. There's many layers to that as well. Ultimately, again, I mentioned Steve Jobs. Steve Jobs arguably died prematurely because he
[00:26:06] fell for this natural is better narrative that sounds good superficially. But it's total BS when you really dig deep into it from a scientific and critical thinking perspective. But it sounds good
[00:26:18] and it's very pervasive, especially in California. So it fit with the culture that he was embedded in. So we make decisions individually as a family, as a group, as a society all the time, informed by science and critical thinking. Should we be investing in green energy or is
[00:26:41] global warming a hoax and a conspiracy? Can I trust institutions? Can I trust the FDA to evaluate medicine? Should I take medications or are they all poisons that are going to kill me because the pharmaceutical industry is trying to keep me sick? Which one of these things do
[00:27:00] you believe? So we make these life or death decisions all the time or decisions that have massive effects on our life, on our prosperity, on our society where critical thinking and scientific literacy is absolutely critical. So if you are not... People ask that question, well, what's the
[00:27:20] harm? I say, well, what's the alternative? The opposite of being a skeptic, of being a critical thinker, is being gullible. But nobody will say, yeah, I'm gullible. I'm okay with being gullible. But that's really what you're saying, right? If you're not a critical
[00:27:35] thinker, you are gullible. That's what the word means. And that means you're vulnerable to any con artists out there. And there's lots of con artists out there. There's 8 billion people in this world. 1% of them are psychopaths or sociopaths. That's a lot of people. There's
[00:27:53] a lot of people out there that have no compunction about stealing all of your money. We're swimming in scams right now, right? I mean, I get scam emails and texts every day, like everybody does, and phone calls. It's constant. I'm constantly being assailed by people attempting to con me
[00:28:11] in one way or another. You have to be skeptical in today's world. You have to be, otherwise you're vulnerable as hell, right? For my next question, I'm going to out our embarrassing critical thinking failures. I know when I was... We all have them.
[00:28:27] I'm going to ask you yours, but I want to share ours first so you don't feel put on the spot. When I was just a consumer of true crime, I would agree with whatever podcast or documentary I was
[00:28:38] watching unless it was really egregiously bad. And over time, I'm just taking in this information uncritically. I'm a journalist. I'm a history major. I would never assess anything like this at work, but it's like, I like this podcaster, so they must be right. So that's my embarrassment.
[00:28:54] You want to tell them yours? Years ago, I used to be... I fell prey to Kennedy assassination conspiracies. Huge waste of time, but analyzing them did teach me a lot about
[00:29:04] critical thinking. So I want to ask, is there a time in your life where you've kind of failed in the critical thinking space? Oh yeah. I mean, when I was younger, like before I was a teenager and into my 20s, I believed in everything. I believed in UFOs,
[00:29:20] Bigfoot, the whole thing. Basically, if it was on In Search Of with Leonard Nimoy, I believed it. Spock is telling me that this is correct. And the thing is, I was a total science documentary junkie.
[00:29:36] I would watch any science show on TV. And the pseudoscientific ones were just as slick, just as compelling. It was adults with some authority figure saying these ancient astronauts made these lines in the desert, which I believed all of it until I started to learn more science
[00:29:54] and more critical thinking. And the first one to fall was the whole UFO thing, like aliens were visiting the earth. And then once the first domino goes and you realize, all right, this is bullshit.
[00:30:06] And yet there's this whole infrastructure, a whole ecosystem of belief in it. So if that could be a lie, that could all be just self-deception and pseudoscience. How do I know that any of these
[00:30:21] other things that I've been believing are correct? Then you start to examine things one by one and all of the pseudoscience just collapses one after the other. But then you get to more and more
[00:30:32] nuanced things. So we start off as what we in the community call Bigfoot skeptics, which is meant to be a derogatory term, but it's really perfectly legitimate. It's just that's kind of where you start off. The low-hanging fruit is, yeah, there's no breeding population of giant
[00:30:50] primates in North America that we have not been able to find for the last 60 years. Maybe you had a point when it was brought up in the 1960s, but I mean, come on, here we are,
[00:31:03] how many years later, and there's this seven-foot primate living in Oregon and no one's been able to find a single piece of convincing evidence? It boggles the mind, right? There are still shows about finding Bigfoot, though. It's amazing. So anyway, you start to think about things like that
[00:31:24] and you realize how deception gets built into the culture in so many ways. But those were mine. But it was a good experience. I think I'm a better skeptic because I was on the other side of it at
[00:31:37] some point. I kind of know how people think and how they get into that and how you sort of cocoon off your beliefs and dismiss skepticism until you can't anymore. But also just the ability,
[00:31:51] it's very liberating. It's very freeing once you realize I don't have to believe anything. I can decide for myself what to believe based upon logic and evidence. And in a way, it is extremely freeing. You use the word pseudoscience in case some members of our audience aren't
[00:32:11] familiar with that. What is pseudoscience and how is it different from science? Yeah, so yeah, it's another good question. So first of all, there's what we call, what philosophers call the demarcation problem, which is a fancy way of saying there's no sharp line
[00:32:25] between science and pseudoscience. It's a continuum, right? And so it's more like what are the features of good science versus pseudoscience? And the more features of pseudoscience you have, the more you are towards that end of the spectrum. But sometimes even legitimate science
[00:32:41] will use some kind of squirrely techniques. And you know what I mean? So it's not these two clean, sharp categories. But the features of pseudoscience are basically they're going through the motions or like pretending to do science, but they're not doing the real spirit of science.
[00:32:58] So the big thing is that pseudosciences generally start with the conclusion and they work back from there, right? Unfortunately, like in the legal system, like the true crime area, that's a lawyer's job, right? The lawyer's job is to start with the conclusion and to work backwards.
[00:33:13] So in a way, like it doesn't surprise me that a lawyer is defending a conspiracy theory. They don't care. They don't even have to believe it. That's not their job. Their job is to make
[00:33:22] whatever defense they can for their client, right? But if you're doing science, you can't do that, right? You can't start with the conclusion and work backwards. They do things like they only look for evidence that supports their hypothesis rather than trying to disprove their hypothesis,
[00:33:36] which is what a legitimate scientist would do. They do things like dismiss evidence that doesn't support their hypothesis or they find reasons to ignore it or dismiss it. And their methods are terrible, right? They just don't use good double-blind controlled methods. They will
[00:33:53] use terms differently in different contexts. They won't be measuring things properly. They make so many mistakes that basically their data is meaningless or uninterpretable or they can twist it to say whatever they want it to say, right? So they're not really asking
[00:34:12] questions. They're just twisting the whole process and the data to fit their conclusion that they want to have. I was curious. This is something that sort of I think when I look at history sometimes you have instances where experts, people who are trusted either in politics or
[00:34:32] science or medicine, any field really, do betray trust or betray the public's trust or just make a bad call. I think of pushing opioids in the early 2000s, right? They were kind of a miracle
[00:34:45] drug and then all of a sudden we have an epidemic. So people I think right now especially are very skeptical or maybe skeptical is the wrong word but very much dismissive of experts. How when we're being skeptical, when we're applying critical thinking, can we hear out
[00:35:03] experts without necessarily dismissing them but also leaving room for I guess not believing everything wholeheartedly? I mean what's the balance? It's kind of a big question I suppose but what are your thoughts on that? Yeah. That's the trick isn't it, right? Is knowing how to
[00:35:17] respect expertise but not idolize an individual expert, right? So one way is to not invest authority in one person, right? Or one group or one institution. You want to as much as possible rely upon a consensus of opinion among many different individuals as much as possible,
[00:35:41] right? So any person could be wrong and even any scientist could make a bad call, could have a bias, you know, could have a conflict of interest, could get desperate and cheat because he thinks he knows
[00:35:53] what the answer is but he's having a hard time getting the data to do what he wants it to do, whatever happens. So there's never any authority in one individual. But if you have a broad consensus
[00:36:05] of many different individuals bringing to bear independent lines of evidence, right? So like at the other end of the spectrum, right, you have hundreds of experts with hundreds of studies, thousands of studies, multiple independent lines of evidence all pointing towards the same
[00:36:22] conclusion. That's pretty rock solid, right? That's pretty reliable. And then of course there's everything in between. It's a spectrum and you have to make a decision like what are people saying about this? How controversial is the claim? How politically charged is it, right? If it's a
[00:36:37] pretty mundane claim that's not a political football at the moment and you have somebody who is clearly a recognized expert who's making pretty ordinary claims about something, they're probably accurately reflecting the evidence. Although even then you're getting their view
[00:36:52] of the science, right? You'd always want to know do other experts agree with you or, you know what I mean? And that could be on anything. Even there's lots of just pure scientific controversies that don't deal with pseudoscience or anything else. Just did an asteroid wipe out the
[00:37:07] dinosaurs or was it something else? There's experts who have the minority opinion that no, it was the volcanic eruptions in the Deccan Traps and the asteroid was incidental or whatever. But 95% of scientists are saying that it was the
[00:37:24] asteroids. So maybe that's probably true. It's always good to know what the minority opinion is and to recognize that there is one and there's debate about it. So I say it's tricky but again, big rule of thumb is trust in consensus more than any individual because individual opinions
[00:37:40] could be quirky, they could be biased, they could be flawed. Anybody could be wrong on any given day and you just got to make a judgment call based upon that but also be open to change as the data
[00:37:51] changes, that's expert opinion changes. But also avoid the temptation to reject an expert because you don't like what they have to say because it conflicts with your narrative, your tribe, your worldview. Don't cherry pick your experts. It's very easy because there's so many
[00:38:07] people out there with varying degrees and levels of expertise. You can find a quote-unquote expert to say anything. So you shouldn't just pick the ones, don't start with the conclusion and then pick your expert to support that conclusion. That's not going to get you to the right answer.
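As a quick aside from the conversation: the earlier point about a consensus resting on many independent lines of evidence can be sketched with a toy probability calculation. The numbers and the independence assumption below are invented for illustration; real lines of evidence are rarely fully independent.

```python
# Toy sketch (invented numbers): why convergence of independent evidence
# beats any single expert. Assume each line of evidence is misleading,
# on its own, with probability 0.2, and that the lines err independently.
p_wrong = 0.2

for n in range(1, 6):
    # Under independence, all n lines mislead together with probability p_wrong**n.
    p_all_wrong = p_wrong ** n
    print(f"{n} independent line(s), chance all are misleading: {p_all_wrong:.5f}")
```

With five independent lines, the chance that every one of them points the wrong way is 0.2^5 = 0.00032, which is the arithmetic behind "pretty rock solid."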
[00:38:26] You want to just say, start with the experts. What are they saying? Is there a consensus? How solid is it? Who disagrees? Why do they disagree? Maybe we just don't know the answer at this point
[00:38:38] in time. But again, there's a process. If you follow a scientific, objective process, you're more likely to get to an answer than if you say, this is what I want to believe, this guy over here agrees with
[00:38:50] me, well, there you go. I have an expert who agrees with me so I'm right. Yeah, it's scary because that can lead to wrongful convictions. We actually recently covered a case of a man who was accused
[00:39:00] of a crime. The prosecutor thought he did it. They found an expert to testify about blood spatter. Turns out it's just total pseudoscience. He was eventually acquitted after a series of appeals, but it's actively dangerous when it gets applied to the legal system. I also wanted to ask you
[00:39:17] about something else which is logical fallacies. I find it useful when evaluating people's argument to look and see if they're using any logical fallacies. What are some of the logical fallacies that people use? Yeah, there's a chapter in our book just on logical fallacies. There's a lot of
[00:39:34] them. We have an article on our site, the top 20 logical fallacies. But there's a lot of them and there's more or fewer depending on whether you're a lumper or a splitter. But there are some basic ones. The mother of all logical fallacies is the non-sequitur which just means
[00:39:51] that the logic doesn't follow. Anytime you make a conclusion that does not follow from the premises, it's a non-sequitur. To clarify, these are informal logical fallacies. Informal logical fallacies mean they don't say anything absolutely about the conclusion. It's just a good rule of
[00:40:09] thumb as opposed to the formal logical fallacies where they're 100% always incorrect. If you say one equals two and two equals three, therefore one equals three. That's a formal logical fallacy. It's math. It's always wrong. But informal logical fallacies more like
[00:40:28] it's just a clean way to think versus a sloppy way to think. If you say, for example, this is true because this one expert over here says it's true, we call that an argument from authority.
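To make the formal/informal distinction from a moment ago concrete: a formal fallacy is invalid purely by its shape, whatever the content. A standard textbook contrast, added here for illustration rather than taken from the conversation, is modus ponens (valid) versus affirming the consequent (a formal fallacy):

```latex
\underbrace{\frac{P \to Q \qquad P}{\therefore\ Q}}_{\text{valid: modus ponens}}
\qquad\qquad
\underbrace{\frac{P \to Q \qquad Q}{\therefore\ P}}_{\text{formal fallacy: affirming the consequent}}
```

An informal fallacy like the argument from authority has no such fixed invalid shape; its strength depends on the content, which is why it works as a rule of thumb rather than a proof of error.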
[00:40:41] That one expert could be wrong. You can't say it has to be true because this one expert says that it's true. Or you could say, well, this may be wrong but that person's also doing it
[00:40:51] wrong, so it's okay. That's the tu quoque logical fallacy. You can't say that. You could both be wrong. The fact that some other guy is doing it wrong doesn't mean that it's magically right for
[00:41:01] you to do it. You could make what we call an argument from final consequences. This is wrong because if it were true, that would be bad. It's like, well, it would be bad. It doesn't mean it's
[00:41:10] not true. Anyway, there's a huge list of them. The one thing I always like to remind people about that is they use the logical fallacies as a tool, not a weapon. So when people learn the logical fallacies, the first thing they do is use them as a
[00:41:29] weapon against other people. But that's not really what they're for. They're for you to police your own thinking, to make sure that you're thinking in a clear and logical manner, that you're not falling for these mental shortcuts that may superficially make sense but are not
[00:41:45] logically valid. There's actually a fallacy for that, called the fallacy fallacy. It's like, oh look, I can frame what you said as if it was a fallacy, therefore you're wrong. It's like, no, you could twist anything. These are informal logical fallacies. I could say,
[00:42:04] well, 97%, 98% of scientists think that the planet's warming because of man-made releases of CO2, and someone else says, that's an argument from authority. Well, not really. The scientists are basing their opinion on evidence and analysis. I'm just telling you what all the experts think,
[00:42:23] but their thinking is based upon evidence. Anyway, that's a legitimate reference to authority as opposed to an argument from authority fallacy, which is this one guy that I found agrees with this opinion. Or an ad hominem is another logical fallacy where you're wrong because
[00:42:43] your breath smells, whatever. But sometimes saying this guy's a convicted con artist is a legitimate thing to point out. I wouldn't put any trust in this guy in what he's saying. He was
[00:42:55] convicted of lying to con people out of things. But you could say that's an ad hominem attack. Again, you can frame anything as a fallacy if you try hard enough. That's why these are
[00:43:08] informal logical fallacies. It is okay to put his history into context as long as you're not saying he's wrong because he's a con artist. You should say he's wrong and he's a con artist, which is
[00:43:22] probably why he's saying this. But you want to then say, here's the reason, the factual reasons why I think he's wrong. Logical fallacies are tricky to use. They're easy to, again, deceive yourself into thinking, I'm a critical thinker because I could name logical fallacies.
[00:43:38] The best way to approach this is to use them as tools to help you think more clearly. Don't just use them as a weapon, because it's so easy to abuse when your only point is a gotcha
[00:43:50] in a debate. One more point on this is that when you are having a discussion with somebody about something, if you take the debate approach, I'm going to prove you wrong, that's not really going
[00:44:04] to get you very far. Because again, if that's your goal, if you're going to lawyer the topic, if that's your goal, you can frame everything as a fallacy in a sinister way, whatever.
[00:44:18] But if your goal is let's both figure out what our common ground is, try to build what we actually know together, examine both of our positions to see where the facts align and where maybe
[00:44:32] we disagree and then figure out what the right answer is, not who's right, but what's right. It's a much more useful approach. I'm curious, you mentioned some of the areas where trying to
[00:44:45] adopt more critical thinking could even be a bit of a pitfall for some people because they're almost adopting the trappings rather than the real core ethos. And I'm curious, do you have any tips for
[00:44:56] somebody who wants to get started trying to apply more critical thinking in their lives? Where to begin without falling into those traps? Yeah, first you buy my book, The Skeptics' Guide to the Universe, and read it twice. No, I mean, it's a primer. We wrote the book because
[00:45:11] people ask us that question, how do I start thinking critically? It's like, well, here's your primer. This will lay it all out for you. And there's other ones. The Demon-Haunted World was
[00:45:19] a good one to start with. We kind of wrote our book as an updated version of The Demon-Haunted World. Why People Believe Weird Things by Michael Shermer is still a great sort of primer book.
[00:45:30] That's more from, I think, the 90s. So every now and then somebody writes a book about this. Nonsense on Stilts by Massimo Pigliucci. There's a lot of great books out there that go over science versus pseudoscience, critical thinking skills, basic skills packaged in slightly different ways.
[00:45:46] They're all good. There's many good books out there. So that's always a good place to start. There's like a book level, here's everything. There's a lot of activists, science communicators and skeptics out there who are breaking down the news, science, critical thinking, pseudoscience
[00:46:04] from many different perspectives. So that's good. I find it very useful just to get into discussions with people. And again, but with the approach of let's figure out what's right.
[00:46:17] How do we know what's right? And let's go through a process and try to figure out if we can figure that out together. I especially love talking to people with whom I disagree. It's kind of boring to talk with people that agree with everything that I think.
[00:46:32] But if I talk with somebody who has a completely different viewpoint, I want to know why do you think that? What thought process led you to that? Why do you think I'm wrong? Can we make any progress
[00:46:45] figuring, resolving our differences? You learn a lot. Sometimes even if you know something is correct, it doesn't mean you could defend it against a dedicated attack or a dedicated attempt at proving it incorrect. Framed in another way, knowing the science of something doesn't mean
[00:47:04] you know the pseudoscience automatically. So historically, a great example of this is that there were several creationists who tried to make their careers debating evolutionary biologists about creationism and evolution.
[00:47:22] Duane Gish is the most infamous of them. As a debater, he tended to kick their ass. The scientists lost because they would go into it thinking, well, I know way more about evolutionary
[00:47:35] biology than this guy does, so I could handle anything. But what they didn't know was the pseudoscience of creationism. They didn't know what arguments were going to be levied against them and
[00:47:46] how facts were going to be twisted and how logic was going to be subverted. So they weren't prepared for that, and they just got overwhelmed by that. But if you actually get into a conversation with
[00:47:59] them about it, you realize, oh, this ... It's almost like an investigation onto itself. It's a forensic examination. Where is their thought process going wrong? Or where is my thought process going wrong? If that's your approach, you learn a lot of critical thinking from doing that.
[00:48:18] Yeah, I love that kind of collaboration. I'll tell you, I mean, I've kind of distrusted debate ever since I was in a college course where we had a debate, set in the Byzantine Empire, on
[00:48:29] whether or not the art should be destroyed. We were on the anti-art side. It was like, how are we going to win this? This is a class where we all love
[00:48:37] Byzantine art. I just got in their faces and just started accusing them of writing biblical fan fiction, and we're all going to be punished by God. We won somehow just because I was yelling louder. It just kind of underscored, obviously, basically whoever's louder, savvier, slicker,
[00:48:55] is not necessarily the person who's ... Yeah, debate is its own skill set. You could be a really good debater even if you don't have facts and logic on your side. It's a performance more
[00:49:07] than anything else. The courtroom is very much a performance as well. I've had many interactions with the legal system myself as an expert witness. I was sued at one point for an article that I
[00:49:19] wrote. We won. We got a judgment in our favor, though I think I had to actually pay most of my legal fees. But you learn a lot about the legal system through those various interactions. The way the legal
[00:49:31] system is set up, as you guys know, it's not like let's all figure out together what the truth is. It's an adversarial system. You do everything you can to prove this guy guilty. You do everything
[00:49:42] you can to prove him innocent. There are strengths and weaknesses to that system. The weakness, I think, is that it sort of encourages that adversarial approach. And I've sat across from lawyers in depositions or whatever where they made arguments like,
[00:49:58] I know you know that that's bullshit. They don't believe that for a second, but it's the argument to make for their side. So they do it. The way they rationalize that, it's like, well,
[00:50:11] the jury or the judgment, they'll sort it out. We're just doing our part. And it's true. The system is set up that way. I'm not blaming them. That's the system. We need to decide if that's the
[00:50:22] system we want. I don't know that that's the optimal system, but it's the system we have. So within that system, they're playing their role. The good thing about the legal system, though, the thing that's really a strong point and the reason why it works is because there are
[00:50:35] rules of evidence. So all the logical fallacies that I'm talking about and all the evidentiary stuff, there are very strict rules of evidence in a courtroom. Again, they may not be perfect.
[00:50:48] They may not be complete. I think they have a lot of issues with how science is introduced into the courtroom. But at least there's rules of evidence. You can't just bullshit your way
[00:50:59] through a case. You have to have sources for your claims, whatever. You can't introduce ideas that have not already been established, whatever. There's a lot of shenanigans that you cannot do
[00:51:13] that a competent judge would not allow you to get away with or a competent attorney on the other side will know where to object. They're breaking the rules of evidence. You can't do that.
[00:51:23] So that, I think, is the strength of the system. The adversarial part is kind of a plus minus. And the relationship with science is, I think, weak. It needs to be strengthened. I want to underscore what you said there about attorneys sometimes knowingly making
[00:51:40] arguments they know are false just because that's their job. And I think people need to remember that and keep that in mind when they hear arguments from attorneys. My attorney told me that. It's like, I don't believe this, but this is the point that we could—
[00:51:58] And he said, I don't have to believe it. He said, from a legal ethical point of view, I don't have to believe it personally in order to say it in court. It just has to
[00:52:08] be reasonable. Somebody might believe this or this. It's a reasonable approach to take. I wouldn't personally endorse it. I don't have to. That's not my job. That's why they could say, even though I think you're guilty, it doesn't matter. I'm presenting a case, and it's for
[00:52:23] other people to decide if you're guilty or not. Another area where people often fall prey to things that aren't true, probably because of wishful thinking, is in areas of health, because we all like to believe in miracle cures or what have you. And this is a leading question,
[00:52:43] but what can people do if they want to look and find accurate information about science-based medicine? Yeah, I run a website called Science-Based Medicine. A very leading question. There's a very complicated relationship between science and the practice
[00:53:00] of medicine. And that's exactly what we explore there: how to optimize that relationship, how to make decisions based upon the best science and evidence available. It's complicated, is the short answer. But as a consumer, again, there's a process you can go through. And
[00:53:15] unfortunately, you have to make health decisions unless you are a physician. And in fact, unless you are an expert in whatever the specific field is that's relevant to your condition, you have to rely on other people who know more than you. That's just like these days,
[00:53:34] no one is an expert on everything. You drive over bridges. Did you investigate the engineering of that bridge to make sure that all the ratios and tolerances were right? Of course not. You
[00:53:44] trust that some civic engineer knew what they were doing, that the regulatory agency made sure that they knew what they were doing before they licensed them, and that whatever commissioned the bridge made sure that they found experts, whatever. You trust, you have faith in the process,
[00:54:00] in the transparency and the whatever, in the expertise of the people involved. The same is true in medicine. There's a process. You go to medical school, you get licensed, you get board certified, you get privileges at a hospital. These are all multiple different layers
[00:54:15] of trying to say that, yeah, this person is competent, knowledgeable, and ethical. Those are the three big things. If you violate that, you can get sued, you can get your license taken away
[00:54:27] by the state. There's remedies for people who fall below the standard. As a consumer, you have to have a certain amount of faith in that system. If you don't have any faith in that system,
[00:54:37] you're living in a very dark world and I don't know how you get through your day. Again, this doesn't mean it's perfect. There are clunkers out there, absolutely, but at least there's a process. Again, how important is the ... Do you have a cold or
[00:54:53] you have terminal cancer? How serious is the illness? For really big decisions, get a second and a third opinion. People should know how to evaluate at least the background of a physician. Are you board certified in this specialty? That's like a first layer of do you
[00:55:13] have sufficient expertise? Then if you think that there's ... You don't feel comfortable with the decision or whatever, you want to make sure that you're making the right decision. If someone's recommending surgery or whatever, get a second opinion or get a third opinion.
[00:55:30] I also tell people, if the doctor starts saying crazy stuff, you might want to walk away. If they're selling homeopathy out of their office, leave. That's not somebody I would trust. Go through that same kind of process of evaluating experts. You can get to the point where you're
[00:55:47] like, yeah, this is pretty much ... Everyone's telling me the same thing, even very credentialed experts, so it's probably correct. Sometimes patients fall into this trap of doctor shopping where it's like, again, pick your expert, keep going until you find somebody who gives you
[00:56:03] the answer you want. If that's your process, that answer is probably not reliable. It's probably just what you want to hear. Then you end up like Steve Jobs. Then you end up doing the thing that they're
[00:56:14] telling you to do because it sounds good, and it may not give you the best outcome. We know this scientifically because you can study it. You can say, when people do this process, what outcomes do they have? The more you go outside the lines, the worse your outcome.
[00:56:31] It actually does affect the medical outcome. I want to ask you something just because this is a term that gets thrown around a lot in true crime especially, that the concept of Occam's razor, the simplest solution is often the best. I guess as a skeptic, as someone who
[00:56:47] practices critical thinking, what do you think about Occam's razor? What are the flaws or is it a pretty good paradigm? It's a good paradigm, but you misstated it because everybody misstates it.
[00:57:00] It's not that the simplest answer is the most likely to be true, because sometimes the real answer is very complicated. You could invent a simple answer that's complete horseshit. It's a lost-in-translation kind of thing, because he wasn't writing in English; it was in Latin. The real translation
[00:57:16] is, and I'm just going to paraphrase it, the answer that introduces the fewest new assumptions is more likely to be true. That's a critical difference because you could say, well, aliens did everything. That's my simple one answer for everything. You're coming up with this complicated
[00:57:34] explanation for every different thing. It's like, yeah, but you're introducing this massive new assumption that there are aliens on earth and I'm not introducing any new assumptions. I'm just going by things that we know exist. That's the real way to approach it. Are you introducing a
[00:57:51] new assumption, assuming the existence of a new element? That's what Occam's razor tells you to avoid or to minimize. If you could explain something using stuff we already know, it's more likely to be true than if you were saying, well, maybe there was this unknown
[00:58:11] thing that is happening. It's okay to hypothesize that, but then you've got to test it. Maybe there is an unknown element and that's now a hypothesis, but that doesn't become your conclusion. You can't skip over the whole testing part of it. But just because you can
[00:58:29] weave a narrative that's complicated or that introduces random elements doesn't make it true. Ad hoc is another good concept. Ad hoc means you're introducing an element as needed, or special pleading is another term that we use. You're making up an explanation ad hoc as needed to explain anything
[00:58:48] that you need to explain. We're really good at that. People are really creative. We're very good at that. Again, if that's your process, you could defend anything. But Occam's razor is part of a process saying, nope, we're going to stick with the evidence that's been established, facts that
[00:59:03] are established, see if we can explain it without introducing anything too complicated or anything new, any new assumptions. Those explanations are more likely to be true because you're not introducing a bunch of new stuff. In medicine, it's the same thing. Can I explain your symptoms
[00:59:19] with your known diseases, or do I have to introduce a new disease? Now, maybe they do have a new disease. But what you don't want to do is introduce three new rare diseases. You have
[00:59:30] three rare diseases. What are the odds of that, versus, well, there's one disease that could explain everything? It's not about simplicity. It's about introducing new elements. But sometimes patients do have three diseases. But we know that they have them, or they have one disease
[00:59:49] that leads to all the other ones. Well, you have diabetes, which causes heart disease and neuropathy. So I'm not really giving you three things. I'm giving you one thing, which I know you have,
[00:59:58] and all of the complications of that disease. That's fine. Occam's razor is okay with that, even though I'm giving you multiple explanations for your symptoms. It all flows from what we know is happening already without willy-nilly just throwing in some completely new random disease
[01:00:16] that we have no evidence for. That's what that means. That makes a lot of sense. Thank you for correcting me, because I'd always heard it the simplest. But I think that makes even more sense in a true crime setting. Although, as you said, the evidence
[01:00:30] has to be ultimately the end all, be all. I can think of one case we did where we interviewed a couple. They had a crazy story. The girlfriend was abducted, and the man was told, you're being
[01:00:42] monitored by this camera. It just sounded like something that was completely made up. It was true, though. When police actually investigated it, they found, no, this is exactly what happened. So it's important to remember that, obviously, in our legal system, the evidence has
[01:00:57] to carry things. Yeah. Sometimes people do have rare diseases. Not often. By definition, they're rare, but not never. Sometimes really weird shit happens. You have to be able to pick up those cases
[01:01:10] as well, as long as you have a process, as long as it's flowing from the evidence and it's not just ad hoc. Absolutely. I wanted to ask you one thing. I think I know the answer, but I'd be curious what
[01:01:25] your take is. Is being a critical thinker the same thing as being a cynic? Are there pitfalls that you could fall into if you take the cynicism approach to everything? Yeah. Being a cynic is actually being anti-critical thinking. Because you're basically rejecting things just to reject
[01:01:43] them. That's your process. I don't believe in that because I don't believe in anything or whatever. Sometimes we use the term contrarian. It's like, well, everybody thinks this, so it must not be
[01:01:53] true. The mainstream media thinks this, so it's got to be wrong. It's like, well, that's the opposite of the argument from authority or the ad hominem. It's just that I reject anything mainstream or I reject whatever, anything that's institutional. If the government says it,
[01:02:13] the government lies, therefore everything they say is a lie. Those are also logical fallacies. Again, that's not a skeptical critical thinking process. It's just a negative process. Skeptics are not cynics. We are open to anything, wherever the evidence and logic lead. Wherever it
[01:02:31] leads, that's where we will follow. Sometimes the mainstream media is correct. Sometimes the government's not lying to you. By definition, a cynic, that's a bias. That's a filter. It's not following the evidence. It's assuming something bad about people or the contrarian
[01:02:51] version is just assuming that whatever is mainstream is wrong. Same thing with what we would call denialism. Denialism is when it's pseudo-skepticism, just like pseudoscience is to science, denialism is to skepticism. You're taking something that you don't want to believe and
[01:03:08] you're trying to frame it as skepticism, but you're denying evidence or changing definitions. You're doing all the stuff that pseudoscientists do, but rather than saying that ESP is real, it's saying that global warming is not real. They're both pseudosciences, just in different
[01:03:23] directions. Sounds like cynicism and denialism are just gullibility dressed up in a black leather jacket, smoking a cigarette. It looks cooler, but it's basically the same thing. Sometimes there's gullibility, people following the narrative of their tribe, but sometimes
[01:03:40] you're the fossil fuel industry, you have a pretty strong motivation. They're not gullible. I don't think they're gullible. I think they know exactly what they're doing. If you're selling something, it's not necessarily gullibility. You have powerful motivation.
[01:03:55] Most people, I think just like we are victims and perpetrators at the same time, I think anti-vaxxers are sincere, but they were convinced by a pretty package and they are passing it forward. They're now deceiving the next person down the line in the same way that they were
[01:04:15] deceived. I don't think there's any cynical reason for it. I think they're sincere. They're just suffering from misinformation and a lack of critical thinking. Most con artists are themselves deceived. Then in the mix are the real sharks who are taking
[01:04:35] advantage of the whole thing to prey upon people, but most of us are just paying it forward. Just whatever deceptions we've been victimized by, we pass on to other people. This has been a great conversation. I really want to thank you for taking the time.
[01:04:52] Before we go, I just want to emphasize how great science-based medicine is on a personal level. There was a time in my life when I had a relative with some pretty serious health problems.
[01:05:02] That was a place to go to get clear explanations of different treatments and such. I believe there were writers there like David Gorski and Harriet Hall. It was really very helpful. I would encourage people to check that out. Where else can people find you and your work?
[01:05:21] Yeah. To make it easy, if you go to theskepticsguide.org, that's the portal into everything that we do. The last question we always ask is, is there something we didn't ask that we should have asked that you wanted to mention? You guys asked a lot of great questions.
[01:05:35] I think we really covered a lot of territory. Awesome. Thank you so much, Dr. Novella. It was really great talking to you. Yeah, it's been a lot of fun, guys. Thanks for having me.
[01:05:44] We would like to close by once again thanking Dr. Steven Novella for taking the time to speak with us. We highly recommend his podcast and his book, and we will link to both in our
[01:05:55] show notes. Thanks so much for listening to The Murder Sheet. If you have a tip concerning one of the cases we cover, please email us at murdersheetatgmail.com. If you have actionable information about an unsolved crime, please report it to the appropriate authorities.
[01:06:17] If you're interested in joining our Patreon, that's available at www.patreon.com slash murdersheet. If you want to tip us a bit of money for records requests, you can do so at www.buymeacoffee.com slash murdersheet. We very much appreciate any support.
[01:06:40] Special thanks to Kevin Tyler Greenlee, who composed the music for The Murder Sheet, and who you can find on the web at kevintg.com. If you're looking to talk with other listeners about a case we've covered,
[01:06:54] you can join the Murder Sheet discussion group on Facebook. We mostly focus our time on research and reporting, so we're not on social media much. We do try to check our email account,
[01:07:06] but we ask for patience as we often receive a lot of messages. Thanks again for listening.
