All of the mistakes discussed so far are universal among humans to a greater or lesser degree. These last two are also universal and extremely common, leading to a world filled with pain and suffering, bigotry, and misunderstandings on a grand scale. I should warn you that this discussion will make many people uncomfortable because it cuts into the core of how people view themselves. People define themselves through the memories of their experiences and we tend to remember sound bites better than the complexities of reality, which makes for a dangerous combination.
5. We tend to oversimplify our thinking.
Of all the mistakes, this one has likely caused the most problems. When we still lived as hunter-gatherers in small bands, this tendency was a benefit, and it still can be in some situations. When you live in an environment filled with potentially life-ending threats, you need to recognize and react to them quickly. When that rustle in the bush may be a Smilodon about to attack, you can’t afford to weigh all the different options, because if you do, you are dead. But most of us no longer live in that sort of environment. We can take the time to think. We just have to fight the natural instincts that are hardwired into our brains. It’s tough, I realize that. It’s impossible to do all the time. But I hope you will see why it is so important that we try.
It is at the core of stereotypes and the “us vs. them” mentality that drives everyone to some extent. Any time you hear someone say, “Blacks are…,” or “Muslims are…,” or insert any group you want, that person is oversimplifying their thinking. It does not matter what comes after that first phrase; it will not accurately describe all members of that group. All “Blacks” are not actually black, nor do they share the same heritage, culture, language, or anything else. Nor are all the people labeled as Muslims by those using that stereotype in fact Muslim. I say this because almost invariably, when non-Muslims refer to Muslims in a stereotypical fashion, they are conflating Arab (or anyone from the Middle East) with Muslim. Muslims and Arabs, like any large group, do not all share the same beliefs and culture.
In the first post in this series, I mentioned the anti-vaccine movement. It all started from ONE paper (since thoroughly discredited and debunked) that referred to only ONE specific vaccine. The whole point of the paper was to discredit that specific vaccine so the author could sell his own version. But no one in the anti-vaccine movement seems to remember that, and they have simplified the topic to ALL vaccines.
In science, this sort of thinking causes people to read a single set of experiments (or even a single experiment) on a specific target and then try to apply the result to everyone. This mistake is rampant in the medical field. A study will be published saying that a group of rats showed a result, and instantly the media reports that all humans will show the same result. Fortunately, scientists are well aware of the differences between rodents and humans. A result in rats and mice often does not carry over into humans, which is why all drugs have to go through human trials after they pass animal trials.
Even if a drug works in a small sample of humans, that sample is not truly representative of all humans. You may have heard that science has proven that vitamins are pointless and may even be harmful? The studies that indicated vitamins had no benefit were all done on healthy volunteers who mostly had good diets. So yes, if you are healthy and are getting everything you need from your diet, you don’t need vitamins, and the excess can actually hurt you. Unfortunately, most people do not fall into this category, so for them, taking vitamins can indeed help. (This is just another example of how eating right and maintaining a healthy lifestyle will prevent many of the health problems most people have and will save you money in the long run. Exercise is almost always preferable to pills, and it is free.) Even healthy humans are incredibly variable and have different metabolisms. The same drug will not work the same on everyone.
All those internet memes with a picture of someone and a saying on it? Fabulous examples of oversimplification. The internet is full of examples of overly quick and thoughtless thinking. Here is a tip: if anyone can boil the essence of a social problem down to one pithy statement, it is almost guaranteed to be WRONG. I have heard more than one person say that because Muslims flew planes into the World Trade Center, all Muslims are evil and should therefore all be killed, because “they all want to kill us anyway.” To any rational person, this statement is clearly, insanely wrong. You may wonder why I have mentioned Muslims a few times. That is because right now, it is the most prevalent and dangerous stereotype I know of, and one that is familiar to everyone: people either hold that view themselves or know many who do.
I could go on and on about how people oversimplify for the rest of my life, but it gets seriously depressing rather quickly, so I will stop here. I hope you get the point: oversimplification and overgeneralization have led to the wrongful deaths of hundreds of millions of people and are the source of much of the hatred in the world. Be aware of just how common this mistake is and STOP DOING IT.
So how do you avoid this problem? Never take one study or one source as truth. It is fine to keep an open mind about something, but don’t put your faith in it unless you can verify it through other reliable sources. Wait for other studies that confirm the results, because the first study may simply have been wrong. Avoid overgeneralizing: just because something worked once does not mean it will work every time. Always, always, always keep the parameters of a study in mind and respect its limitations. A result from one mouse in one situation has little to do with results from many people under all sorts of variable conditions. Do not extrapolate beyond the data without clearly understanding that the extrapolation is purely speculative guesswork and may not hold up in reality.
6. We have faulty memories.
One way our memories are faulty is through the confirmation bias discussed in the previous post. You can see this problem in everyone who gambles, be it at a casino or in the stock market. Most people remember their successes far more readily than their losses, and people can lose fortunes this way. Casinos are masters at exploiting this mistake. If gamblers win early, they tend to continue playing long after they have lost their winnings and more. Every time they win, they remember that one win and forget all the losses before it. Some people do the opposite, focusing on their failures and minimizing their successes, which leads to the problems therapists deal with every day.
Science, particularly medical science, has a form of institutionalized faulty memory. It is much easier to publish positive results than negative ones. Therefore, experiments that didn’t work tend to be glossed over and forgotten in favor of the ones that succeeded. Of course, if those successes are due to chance or faulty experimental design, ignoring the negative results leads the whole field astray. How serious is the problem? A paper in 2012 found that only 6 out of 53 “landmark” papers in haematology (the study of blood) and oncology (cancer research) could be replicated. This sort of bias toward publishing positive results can cause profound problems. It may sound as though this means science can’t be trusted, but what it really means is that it is critically important never to jump on the bandwagon and follow the advice of a single new study. Wait until it can be confirmed by other research. Science is all about throwing hypotheses out there and testing them to see if they really work. One test doesn’t do it. Multiple tests are needed, and you cannot forget the failures.
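A toy simulation can show why this matters. Assume, purely for illustration, a field in which every tested effect is actually null and the usual 5% false-positive threshold is applied; if only the “positive” results get published, the literature still fills up with findings that replication will later overturn:

```python
import random

random.seed(3)

# Hypothetical field: 1000 experiments, every underlying effect is null,
# and a result counts as "positive" with the standard 5% false-positive rate.
n_experiments = 1000
significant = sum(random.random() < 0.05 for _ in range(n_experiments))

print(significant)  # roughly 50 "positive" results arise from pure chance
# If only these chance hits are published and the ~950 negative results are
# forgotten, the published record is dominated by effects that do not exist.
```

The numbers here are invented for the sketch, but the mechanism is the same one behind the low replication rates the paragraph above describes.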
Where faulty memory really comes into play is in just how easy it is to change our memories. Simply hearing another person’s experiences can change our own. My favorite study showing this interviewed people about their experiences at Disney World. The participants watched an ad showing people interacting with Bugs Bunny at Disney World. The fact that this event is impossible (Bugs Bunny is not a Disney character and so could never have appeared at Disney World) did not keep many of the participants from saying that they had fond memories of seeing Bugs Bunny there.
Kida discusses a study in which researchers asked students where they were when they first heard of the space shuttle Challenger exploding. They asked shortly after the event and then again two and a half years later. Despite claiming that their memories were accurate, none of the students recalled the event entirely correctly, and some were wildly off. Yet the students insisted that they were right and disavowed the record of their earlier recollections. There are several studies like this, and they all say the same thing: our brains do not faithfully record our experiences, and those memories change both over time and through suggestion by others.
So, what does this mean for us? It means that we have a bad habit of misattributing things, combining memories, or making them up out of whole cloth. Criminal psychologists are deeply aware that eyewitness testimony is the least reliable evidence that can be brought into court, despite the fact that most people consider it the most reliable. People commonly say, “I’ll believe it when I see it,” and “I saw it happen with my own two eyes!” We put a lot of stock in our perceptions and our memories. But, as is quite clear from decades of research, neither our perceptions nor our memories are at all reliable.
So what are we to do if we can’t rely on our own experiences? Make records, take pictures, write things down. Compare experiences with other people. There is some truth to the now common statement: “Pics or it didn’t happen.”
And so we conclude the introduction to the six basic errors in thinking we all make to a greater or lesser extent. These mistakes are universal; they happen repeatedly, on a daily basis, and they have grave consequences. In science, we have ways to try to avoid them. We record data. We share it with others and let them try to poke holes in it. We do not trust a single example; we demand verification. Scientists make these mistakes all the time, but by being aware of the mistakes and having procedures in place to deal with them, we can minimize the problems.
Last post, I covered two of the six most common mistakes people make in their thinking. Today I will cover the next two: failing to appreciate the role of chance and misperceiving the world around us. Both of these are huge topics, so as Inigo Montoya said, “Let me ‘splain. No, there is too much, let me sum up.”
3. We rarely appreciate the role of chance and coincidence in shaping events.
Last time we discussed just how much people hate and misuse statistics. Another way our inborn antipathy for statistics comes into play is in not understanding the role of chance. People seem to need a cause for everything. If something goes wrong, something must be to blame. We hate to admit that anything is left to chance (which, considering that quantum physics makes everything in the universe a matter of probability, may explain why people don’t understand it). Even Einstein said, “God doesn’t play dice with the world.” However, given enough time or occurrences, even rare events occur. I once had a geology professor who told me that given enough time, events that are almost impossible become likely and rare events become commonplace. This is quite true: over enough time and with enough attempts, even the rarest events will happen. Sometime in your life, you are almost certainly going to see something that is incredibly, inconceivably rare. People talk about one-in-a-million chances as too rare to be worth thinking about, but a one-in-a-million event will happen to over 7,000 people worldwide, and to roughly eight people in New York alone. Occasionally, you will be among those 7,000.
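The arithmetic behind that claim takes only a few lines to check. The population figures below are round assumed values (about 7 billion people worldwide, about 8 million in New York City), used purely for illustration:

```python
# "One in a million" sounds inconceivable, but multiply it by a large
# population and the expected number of affected people is not small at all.
odds = 1_000_000           # a one-in-a-million event
world_pop = 7_000_000_000  # assumed world population (round number)
nyc_pop = 8_000_000        # assumed New York City population (round number)

print(world_pop / odds)  # 7000.0 people worldwide, on average
print(nyc_pop / odds)    # 8.0 people in New York alone, on average
```

These are expected values, not guarantees, but they show why “inconceivably rare” events are witnessed somewhere by someone nearly all the time.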
Have you ever called someone only to find out they were trying to call you at the same time? I have done that with my wife. Many people invoke something mystical, a psychic connection that made the two people call at exactly the same moment; after all, what are the chances? However, I talk to my wife on the phone far more often than I talk to anyone else. Considering the number of times my wife and I try to contact each other, it is almost inevitable that sooner or later we would try at the same time. There are also those who seem to be consistently lucky or unlucky. If it were really just random chance, then it should all even out and everyone should be equally lucky (or not), right? Here again, with enough people, some will just randomly be consistently luckier than others, no supernatural force required. There will always be outliers that don’t follow the typical pattern, purely through random chance.
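A quick simulation makes the “consistently lucky” point concrete. The numbers are made up for illustration: 10,000 people each make ten fair 50/50 bets, with no skill or supernatural force anywhere in the model:

```python
import random

random.seed(1)

# 10,000 people each make 10 fair 50/50 bets.  Nobody has an edge,
# yet someone almost always ends up looking remarkably "lucky".
n_people, n_bets = 10_000, 10
wins = [sum(random.random() < 0.5 for _ in range(n_bets))
        for _ in range(n_people)]

luckiest = max(wins)
print(luckiest)  # typically 10: someone wins every single bet by chance alone
```

Any single person winning all ten bets is about a one-in-a-thousand event, but with ten thousand players, a perfect record is almost guaranteed to appear somewhere.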
You can see the same type of mistake when people incorrectly tie independent chances together. When flipping a coin, a string of 20 heads does nothing to change the chance that the next flip will be heads or tails. This sort of mistake is often seen in gamblers and people playing sports who believe in winning and losing streaks, in which the results of a series of chance occurrences are thought to affect the odds of future events. So how do we avoid this mistake? Never put much stock in one occurrence. Look at the accumulated data and weigh occurrences accordingly.
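This independence is easy to test in code. The sketch below simulates a long run of fair coin flips, finds every streak of five heads (five rather than twenty, an arbitrary choice just so streaks occur often enough to count), and checks what happens on the flip right after each streak:

```python
import random

random.seed(0)

# Simulate fair coin flips: True = heads, False = tails.
flips = [random.random() < 0.5 for _ in range(200_000)]

# Collect the flip that immediately follows every streak of 5 heads.
after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                if all(flips[i:i + 5])]

print(len(after_streak))                      # thousands of 5-head streaks occur
print(sum(after_streak) / len(after_streak))  # fraction of heads next: ~0.5
```

However long the streak, the next flip stays at 50/50; the coin has no memory, and neither does the roulette wheel.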
Probably one of the biggest fallacies people make in this regard is mistaking correlation for causation. Just because two events occur at the same time does not necessarily mean they are connected. Wearing a specific shirt when you win a game does not make it lucky. It will not influence the outcome of any other game except in how it affects your thinking. I can think of no better example of this than the Hemline Theory, which states that women’s skirt lengths are tied to the stock market. Sadly, despite such an absurd premise, it is still commonly believed, and one can still find articles debating the merits of the hypothesis. Needless to say, even if the two do tend to cycle together, it would be foolish to say that the stock market is controlled by what skirts women are wearing. What might be plausible is that both are influenced by some common factor. Thus, any study which claims to have found a correlation between two events or patterns has only taken the first step. Once a correlation has been found, it is then necessary to demonstrate how one affects the other. Often, it is found that there is no direct connection, but both may be influenced by an altogether different factor. Check out the site Spurious Correlations to see almost 30,000 graphs showing correlations between totally random occurrences, such as the graph showing that increased iPhone sales are correlated with a drop in rainfall in Mexico, or that US STEM spending is associated with the suicide rate. How to avoid this problem? Look for multiple lines of evidence and a causal mechanism that explains how one could affect the other. Without that mechanism, you can only say that two things have something in common; avoid saying that one thing caused the other until you can point to a direct connection.
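Spurious correlations like those are not flukes; they fall out of randomness whenever you look at enough pairs of trending series. The sketch below (all sizes and counts are arbitrary illustration values) generates pairs of completely independent random walks and reports the strongest correlation it stumbles across:

```python
import random

random.seed(2)

def random_walk(n):
    # Independent random walk: each step moves +1 or -1 with equal probability.
    x, walk = 0, []
    for _ in range(n):
        x += random.choice((-1, 1))
        walk.append(x)
    return walk

def correlation(a, b):
    # Plain Pearson correlation coefficient, computed from scratch.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Compare 200 pairs of unrelated 100-step walks; keep the strongest match.
best = max(abs(correlation(random_walk(100), random_walk(100)))
           for _ in range(200))
print(round(best, 2))  # frequently above 0.9: strong correlation, zero causation
```

Every pair here is independent by construction, yet trawling through enough of them reliably turns up a “compelling” correlation, which is exactly what sites like Spurious Correlations do with real-world data.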
4. We sometimes misperceive the world around us.
Many people assume that their eyes work like cameras, faithfully recording everything in their field of view, and that the brain accurately stores everything that comes in. Unfortunately, this is not true. Our senses are imperfect: they neither record all the information available, nor does the brain assemble a complete image of what is around you. Simply put, you cannot trust your senses. Magicians count on this. One of the best I have seen is Derren Brown, who uses a mixture of psychology and good old-fashioned stage magic to perform his tricks. Visual and aural illusions abound. Take our eyes, for example. Unlike a video camera, which records the whole scene within its lens at once, we put together images from fragments. We rapidly move our eyes all around our field of view in what are called saccades, focusing on one small bit, then another. The light enters the eye and is picked up by the retina, with rods detecting the intensity of light and cones detecting color. Signals from these receptors do not enter the brain as a picture. They are filtered through specialized cells, some of which detect boundaries to sharpen focus, some of which detect movement, and so on. All of these separate signals get sent to the brain, which puts together a patchwork image, an image with a lot of gaps. We don’t usually see these gaps because our brains fill them in with what past experience tells them to expect. This is a really important point: past experience affects what we see. Our hearing works this way as well.
The fact that past experience affects what we see plays out in various ways. We overlook things that change between eye movements. We fill in the gaps with what we expect to see. Thus, how we view the world depends in part on what we expect to see, and our expectations are based on our experiences. People with different experiences may view the same thing very differently. This happens so much that when we see something that does not fit our expectations, our brains can even go to the point of overwriting the visual input with our prior expectations. And it gets worse. If we focus on something, this tendency to be blinded to other things increases. Most people have heard of ignoring the elephant in the room. A couple of researchers at Harvard did what they call the invisible gorilla experiment. People were asked to observe a group wearing shirts of two different colors and told to count the number of times a ball was passed between members wearing the same color shirt. Most people could do this successfully. However, many missed the man in a gorilla suit who walked into the middle of the group, paused to look at them, and then walked off.
How could someone miss such an obvious thing? They were focused on the ball and missed the bigger picture. This problem is called selective attention, or “inattentional blindness.” The experiment has also been done with hearing, in which participants were told to listen to only one of two conversations going on at the same time. Part way through, someone started saying, “I’m a gorilla,” multiple times. If simply told to listen to the recording, everyone could hear it easily. But if told to listen carefully to only one conversation, most people never heard the gorilla. There are many, many examples of selective attention like this, and it is exactly why eyewitness accounts in trials are not worth very much.
You might hope it stopped at this level, but it doesn’t. Even if we accurately see what is there, our prejudices will affect our interpretation. Different colors affect our moods and perceptions. Religious or political beliefs affect our perceptions to the point that we will literally see things differently, even our views of sports games. Space allows only a cursory mention here, but it is easy to find many, many studies, books, and shows that demonstrate just how unreliable our personal observations are.
So how do we avoid this? To begin with, we recognize that our perceptions are fallible. Thus, the more independent observations we can make, and the more people who observe something, the more likely it is to be valid. Make recordings that can be viewed and listened to at different times. Try this out for yourself: watch a movie with other people, and have someone prepare a list of questions in advance about a particular scene. After everyone has viewed it, have everyone answer the questions on their own and then compare answers. Most likely, you will find some things people answered differently and other things some people did not see at all. Or just listen to the responses from political leaders after any speech by any President.
The best way to get around this problem is through multiple, independent observations. Never trust just one observation and always question the biases of the observer.
Next post we will wrap up this series with the last two common mistakes. Stay tuned.
When I was a kid, I was taught that the scientific method is a matter of developing hypotheses, testing them, and using the observations from the tests to revise the hypotheses. Very straightforward, but overly simplistic. My teachers rarely, if ever, talked about the crucial strategy of multiple working hypotheses: coming up with every imaginable explanation for our observations before we start trying to test them. But the most important thing that was never taught was how to think explicitly and clearly. Logical, clear thinking is the heart and soul of science. In fact, there is no decision that cannot be improved by thinking clearly about the question and the available data. We just celebrated Independence Day in the United States; it is time we celebrate our independence from fuzzy, ill-defined, and confused thinking.
In the last post, I discussed the critical importance of clearly defining a problem in terms of actionable questions. The first step is to understand a problem well enough that it can be clearly articulated and defined. Then all the factors that contribute to the problem can be clarified. But you can’t stop there. Once you have a list of known factors, you have to decide which ones you can actually do something about and not waste time arguing about those you can’t change. Focusing on the definable and workable factors produces results; wasting time on things you can do nothing about is counterproductive.
In this post, I am going to briefly discuss the first two of six general mistakes that EVERYONE makes from time to time. You can never be completely rid of them, but you can be aware of them and try to reduce their influence in your life. If you do, I promise you will make better decisions. You will improve your life and the lives of those you touch. These six mistakes are outlined and fully discussed in the book Don’t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking, by Thomas Kida.
I highly recommend you get this book and read it.
1. We prefer stories to statistics.
People are terrible at statistics, even people who really should know better; so bad, in fact, that they make statistics up to sound smart. You can easily find numerous variations of the statement, “80% of all statistics are made up on the spot, including this one.” Or, as often (and likely incorrectly) attributed to Mark Twain, “There are three kinds of lies: lies, damn lies, and statistics.” So it’s no wonder that people dislike statistics and prefer stories. There are abundant studies illustrating how our brains are wired to listen to stories and how personal stories influence our behavior more than statistics, such as this one, or this one. Statistics happen to abstract groups; stories happen to identifiable people. We even prefer to dress up our information to make it more personal and more interesting, but the very act of storifying information makes that information less likely to be true. It is much more likely that Bill robbed Peter than it is that Bill robbed Peter AND paid Paul. The more complicated a story gets, the less likely it is to be true. This mental shortcut can cause serious problems, as the anti-vaccination scaremongering going around illustrates well. The whole anti-vax movement can really be traced to one report by Dr. Andrew Wakefield in 1998 that found a correlation between vaccines and autism, research that has since been completely discredited and proven fraudulent. Numerous studies have since looked into the alleged link and found nothing, such as this one. But no matter how many studies find no link, many people hear Jenny McCarthy talk about her autistic son, hear others talk about their autistic children, and conclude that all the studies must be wrong, because the stories carry more weight with them. Disregarding the millions of children who get vaccines and never develop autism, people focus only on the stories of those who claim otherwise.
Thus, thousands of children are getting sick and dying because of a belief in stories over statistics.
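The Bill-and-Peter example above is the conjunction rule of probability: a story with more details can never be more likely than its simplest part. A tiny sketch, using hypothetical numbers chosen purely for illustration:

```python
# Conjunction rule: P(A and B) = P(A) * P(B given A) <= P(A).
# The probabilities below are invented for illustration only.
p_robbed = 0.10             # assumed P(Bill robbed Peter)
p_paid_given_robbed = 0.30  # assumed P(Bill paid Paul, given the robbery)

p_both = p_robbed * p_paid_given_robbed
print(round(p_both, 2))  # 0.03: the richer, more vivid story is LESS likely

# Adding detail multiplies by a number no larger than 1, so the
# combined story can never be more probable than either part alone.
assert p_both <= p_robbed
```

Yet in experiments on the conjunction fallacy, people routinely rate the detailed, story-like version as the more probable one, which is exactly the pull of stories over statistics.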
2. We seek to confirm, not to question.
Have you ever read something you disagreed with and instantly dismissed it, or conversely, accepted evidence simply because it agreed with what you already thought? If so (and you have; everyone does), you are guilty of confirmation bias. Confirmation bias causes people to seek out and favor information that already agrees with their point of view and to disregard evidence that disagrees with them without ever really analyzing it. If you get all your news from either FOX News or MSNBC, you will rarely, if ever, hear contrary points of view, thereby limiting your input to that with which you already agree. People who do so weigh that evidence in favor of their preconceptions and assume that their view is more prevalent than it really is. If people get all their information about evolution from the Institute for Creation Research, they will never get accurate information about the theory, as the ICR is founded on the belief that evolution is false and seeks only information that discounts it. The only way to avoid this is to seek out diverse news outlets. While you read them, remind yourself that you suffer from confirmation bias, so that you may (hopefully) be able to give evidence from all sides a thorough critique.
Lest you think that only untrained laymen fall into this trap, confirmation bias is rampant in science as well, and it is a serious problem. Even the ivied halls of Harvard do not protect one from poor thinking and confirmation bias, as this article by Neuroskeptic clearly illustrates, wherein he takes a fellow neuroscientist to task for not recognizing the fallacy of looking only for confirmatory results. There is a publication bias in the scientific literature toward positive results; negative results are mentioned much less often. While this is true in all fields, it is particularly important in medical research, and psychology has been hit particularly hard lately.
Next post, I will cover the next two common mistakes. Stay tuned.