6 Ways to Completely FAIL at Scientific Thinking, Part 1

When I was a kid, I was taught that the scientific method is a matter of developing hypotheses, testing them, and using the observations from those tests to revise the hypotheses. Very straightforward, but overly simplistic. My teachers rarely, if ever, talked about the crucial strategy of multiple working hypotheses: coming up with every plausible way to explain our observations before we start trying to test any of them. But the most important thing that was never taught was how to think explicitly and clearly. Logical, clear thinking is the heart and soul of science. In fact, there is no decision that cannot be improved by thinking clearly about the question and the available data.

We just celebrated Independence Day in the United States. It is time we celebrate our independence from fuzzy, ill-defined, and confused thinking. In the last post, I discussed the critical importance of clearly defining a problem in terms of actionable questions. The first step is to understand a problem well enough that it can be clearly articulated and defined. Then all the factors that contribute to the problem can be clarified. But you can’t stop there. Once you have a list of known factors, you have to decide which ones you can actually do something about and not waste time arguing about those you can’t change. Focusing on the definable and workable factors produces results; wasting time on things you can do nothing about is counterproductive.

In this post, I am going to briefly discuss the first three of six general mistakes that EVERYONE makes from time to time. You can never be completely rid of them, but you can be aware of them and try to reduce their influence in your life. If you do this, I promise you will make better decisions. You will improve your life and the lives of those you touch. These six mistakes are outlined and fully discussed in the book Don’t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking, by Thomas Kida. I highly recommend you get this book and read it.

1. We prefer stories to statistics. People are terrible at statistics, even people who really should know better, so bad in fact that they make them up to sound smart. You can easily find numerous variations of the statement, “80% of all statistics are made up on the spot, including this one.” Or, as often (likely incorrectly) attributed to Mark Twain, “there are three kinds of lies: lies, damned lies, and statistics.” So it’s no wonder that people suck at them and prefer stories. There are abundant studies illustrating how our brains are wired to listen to stories and how personal stories influence our behavior more than statistics, such as this one, or this one. Statistics happen to abstract groups; stories happen to identifiable people. We even prefer to dress up our information to make it more personal and more interesting, but the very act of storifying information makes that information less likely to be true. It is much more likely that Bill robbed Peter than it is that Bill robbed Peter AND paid Paul. The more complicated a story gets, the less likely it is to be true.

This mental shortcut can cause serious problems, as the anti-vaccination scaremongering going around well illustrates. The whole anti-vax movement can really be traced to one 1998 report by Dr. Andrew Wakefield claiming a link between vaccines and autism, research that has since been completely discredited and proven fraudulent. Numerous studies have since looked into the alleged link and found nothing, such as this one. But no matter how many studies find no link, many people hear Jenny McCarthy talk about her autistic son, they hear others talk about their autistic children, and they conclude that all the studies must be wrong, because the stories carry more weight with them. Disregarding the millions of children who get vaccines and never develop autism, people focus only on the stories of those who claim otherwise. Thus, thousands of children are getting sick and dying because of a belief in stories over statistics.
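The Bill-and-Peter point is just the conjunction rule of probability: adding a detail to a story can only keep its probability the same or lower it, never raise it. A minimal sketch, with numbers invented purely for illustration:

```python
# Conjunction rule: P(A and B) = P(A) * P(B given A) <= P(A).
# All probabilities here are hypothetical, chosen only to illustrate.
p_robbed = 0.10             # P(Bill robbed Peter)
p_paid_given_robbed = 0.30  # P(Bill paid Paul, given he robbed Peter)

p_robbed_and_paid = p_robbed * p_paid_given_robbed

print(f"P(robbed)          = {p_robbed:.2f}")
print(f"P(robbed AND paid) = {p_robbed_and_paid:.2f}")
assert p_robbed_and_paid <= p_robbed  # the richer story is never more likely
```

However vivid the extra detail makes the story feel, the conjunction can never be more probable than the bare claim alone, which is exactly why storified information is less likely to be true.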

2. We seek to confirm, not to question. Have you ever read something you disagreed with and instantly dismissed it, or conversely, have you ever accepted evidence simply because it agreed with what you already thought? If so (and you have; everyone does), you are guilty of confirmation bias. Confirmation bias causes people to seek out and overweight information that already agrees with their point of view and to disregard evidence that contradicts it, without ever really analyzing the data. If you get all your news from either FOXNews or MSNBC, you will rarely, if ever, hear contrary points of view, thereby limiting your input to only that with which you already agree. People who do so will weigh that evidence in favor of their preconceptions and will assume that their view is more prevalent than it really is. If one gets all one’s information about evolution from the Institute for Creation Research, one will never get accurate information about the theory, as the ICR is founded on the belief that evolution is false and so seeks only information that discounts it. The only way to avoid this is to seek out diverse news outlets. While you read them, remind yourself that you will suffer from confirmation bias, so that you may (hopefully) be able to give evidence from all sides a thorough critique.

Lest you think that only untrained laymen fall into this trap, confirmation bias is rampant in science as well, and it is a serious problem. Even the ivied halls of Harvard do not protect one from poor thinking and confirmation bias, as this article by Neuroskeptic clearly illustrates: he takes a fellow neuroscientist to task for not recognizing the fallacy of looking only for confirmatory results. There is also a publication bias in the scientific literature toward positive results; negative results get mentioned much less often. While this is true in all fields, it is particularly important in medical research, and psychology research has been hit particularly hard lately.
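How publication bias distorts a literature is easy to see with a toy simulation (all numbers below are invented for illustration): suppose many honest, noisy studies estimate the same true effect, but only results crossing a “positive” threshold get published. The published average then overstates the effect even though no individual study did anything wrong.

```python
import random
import statistics

random.seed(42)

# Hypothetical setup: 2000 noisy studies of a true effect of 0.2.
true_effect = 0.2
estimates = [random.gauss(true_effect, 0.5) for _ in range(2000)]

# Crude publication filter: only clearly "positive" results see print.
published = [e for e in estimates if e > 0.5]

print(f"mean across all studies:   {statistics.mean(estimates):.2f}")
print(f"mean of published studies: {statistics.mean(published):.2f}")
# The published mean sits well above the true effect of 0.2 -- the
# filter alone distorts the literature, with no fraud by any author.
```

This is one reason negative results matter: reading only what got published is itself a form of confirmation bias baked into the literature.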

Next post, I will cover the next two common mistakes. Stay tuned.