Common Sense: A Manifesto

Recently, I filed a post about food allergies in the schoolyard, which was driven by catching myself in several common logical fallacies. You see, before writing the article, I had a general sense that school policies designed to protect allergic kids from peanut exposure were getting out of hand. After all, back when I was a kid, we didn’t have them and I can’t remember a single anaphylactic reaction. So when I overheard the mother of an allergic child in my son’s school complaining that a new school policy was overdoing it, it seemed pretty clear I was right all along.

Of course I wasn’t. My belief was based on anecdotal data and an appeal to antiquity, all bathed in the “warm bath of confirmation bias”, as Dan Gardner so eloquently put it. Once I’d actually looked at the data and thought about the math, policy responses like Sabrina’s Law seemed much more reasonable. Until that point, though, I was pretty certain they weren’t, based on what seemed like common sense.

Now common sense is a popular notion. Here in Ontario, we had a whole revolution based on it. It also underpinned the (slightly more significant) American Revolution, and deep thinkers like Glenn Beck continue to espouse it. It’s what every parent needs and what Lehman Brothers lacked. It’s clearly an awfully good thing.

Yet it’s also somewhat curious, for a couple of reasons. First, while everyone thinks they have it, not everyone agrees with one another — a fact which causes shockingly little cognitive dissonance. And second, though this is certainly a quibble: we know for a fact that sense — meaning the ability to consistently reason soundly — is not at all common. It’s an unnatural skill for humans, and as a species we’ve only been trying to do it for a few thousand years, with many of the most significant advances coming in the last few hundred. Indeed, to use “sense” as we know it today requires deliberate training of a sort that is most uncommon.

Some Thoughts on Thinking

To illustrate this, let’s take a fantastically oversimplified look at the history of human thought. In the beginning, as any first-year philosophy student can tell you, there were logic and rhetoric. Rhetoric dates back to the beginnings of written history (~3000 BCE), and was already recognized as a mixed blessing in Plato’s time, because it can manipulate as well as elucidate. (The Sophists did the former, Socrates the latter.) Logic, first codified by Aristotle, was the purer tool of thought — or at least one branch of it was. Deductive reasoning, the process of carefully deriving new knowledge from a set of known premises, became the underpinning of human thought for the next 2000 years.

Now Aristotle identified a second branch of logic called inductive reasoning, which goes in the opposite direction, forming a general conclusion based on the observation of specific instances, and he recognized it as useful in science. But he didn’t do much more than that, and while it was taken up periodically over the ensuing millennia, including by such heavyweights as the Persian philosopher Avicenna, there were no real advances in that field until around the 17th century. In fact, much of the attention it did receive was devoted to arguing whether it should truly be part of logic at all. After all, the conclusions of induction could be false even when the premises were true and no logical flaw existed, so how could one derive reliable knowledge from such a process?

The problem was that there was no way of knowing just how closely a sample matched the population — at least until the development of statistics, which itself had to wait for the development of probability. And probability really only started to be codified with Fermat and Pascal in the mid-17th century, with antecedents no more than a century older than that. Statistics and probability gave induction its own language, and with Bacon and others also giving induction a second look as the Age of Reason began, the scientific method was born.

There is no way to overstate the importance of the scientific method. After 2000 years of relying solely on deductive reasoning, we realized that there was another way to obtain knowledge. Instead of all knowledge being a product or recombination of what we already know, we now had a mechanism to learn genuinely new things — by direct measurement and a determination of the precise likelihood that our measurements were reflective of the natural world. This was an immense leap forward in human thought, expanding the realm of things we could “know” exponentially. Our world today would be unimaginable without it — it set the stage for the Enlightenment and the natural rights we now enjoy, and underpinned the largest creation and widest distribution of wealth in human history by speeding the pace of, and access to, technological innovation.

The Numeracy Problem

But while deductive reasoning at least seemed innate to humans (more on that below), it would be hard to call this new kind of sense common. In fact, it’s completely counter-intuitive, which is why it took thousands of years of human thought to develop. We had to create two branches of mathematics, and learn how to use them — and numeracy is a big hurdle for any mode of thought aspiring to common use. To illustrate just how uncommon this sort of thinking is, consider a few of the many, many studies that document our essential innumeracy:

  • On the Lipkus Numeracy Scale, an 11-question test of numeracy, respondents routinely do dramatically worse on questions that involve simple non-integers represented as decimals — even when those respondents are highly educated.
  • Data from the U.S. National Assessment of Educational Progress show that mathematical literacy among 17-year-old students has barely budged in the last 25 years, and remains in the lower half of the basic range. (Here are some sample questions, in case you think the test might be too hard.)
  • A recent study by the Federal Reserve Bank of Atlanta found that financial literacy is highly correlated with basic numeracy, and that “limited numerical ability played a non-trivial role in the sub-prime mortgage crisis”. The short quiz they gave the survey respondents tests little more than basic arithmetic.

So nearly 400 years after the Age of Reason began, humans still can’t speak the language of science, at least without significant training. And that training requirement has only grown in the ensuing centuries, for two reasons. First, science itself has continued to become more complex (and thus rarefied) as the field of statistics has developed over the last century or so. Today, hardly a study is conducted without the aid of a computer stats package. And second, a “third wave” of human thought has since begun.

Wave Goodbye to Reason?

By the late 19th century, Charles Sanders Peirce — one of the main codifiers of the scientific method and an early statistics pioneer — was already calling attention to the impact of biases in scientific analysis, and recommending randomization and control as important tools for eliminating them. Peirce was concerned mostly with statistical biases (e.g., sampling bias), but a century later we would begin to realize that the biases that formed the biggest obstacles to science were not mathematical, but cognitive.

What I’ll call the third wave of human thought — after deduction and induction — is the understanding of human cognitive processes pioneered in the 1970s and codified by Kahneman, Slovic and Tversky (1982) in Judgment under Uncertainty: Heuristics and Biases. This work has had a major impact on a wide range of fields: Kahneman, a psychologist, won the Nobel Prize in economics, and cognitive biases are now routinely taught in science courses.

The main reason it’s so important is that it categorically refutes the notion, upon which the entire prior history of human thought is based, that the bulk of human thought is rational. To be sure, earlier thinkers recognized that humans were subject to irrational thought, even deliberate manipulation — witness Socrates against the Sophists — but until the recent advances in cognitive psychology, there was a belief that rational thought (particularly deductive reasoning) was our native mode. We now know this to be patently untrue.

Thirty-plus years of research has shown that humans have two cognitive processes — one rational and one heuristic-based — that essentially debate decisions before a final judgment is made. The problem is, the rational part takes a lot of coffee breaks, and lets the heuristic part run unchecked unless the decision is deemed particularly important and time allows for careful analysis. Even then, heuristic judgments are viewed as important inputs to the rational process, meaning that not only are most of our day-to-day decisions intuitive, but even the rational ones are subject to influence by the heuristic process.

The heuristic process uses circuitry in the oldest part of our brain, which we inherited from earlier species, and it has certainly served us well. In most cases, and under most conditions, it allows us to speed up our decision-making process dramatically by substituting easier questions for harder ones. Take the availability heuristic I’ve mentioned before. If I want to know whether the water in a well is safe to drink, I’ll try to recall instances where people drank from it and got sick. I’m answering a different question, but it’s a reasonable way to get a quick, approximate answer if I’m thirsty.

The problem is not that this method is an approximation and thus sometimes wrong. It’s that it’s subject to systematic biases that can make us consistently wrong in certain circumstances. Even when the rational brain does step in, these biases can be very hard to overcome. Some of the early research on conjunction fallacies and the representativeness heuristic was done with social science graduate students — and even stats and psych professors — who still made elementary errors because of the pull of the heuristic. That particular heuristic has also been associated with the base rate fallacy, the regression fallacy, and the gambler’s fallacy.
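
To make the damage concrete, here is a minimal sketch of the base rate fallacy in Python, using hypothetical numbers chosen purely for illustration: a screening test that is 90% accurate, for a condition affecting 1% of the population. The heuristic nudges most of us toward “a positive result means you probably have it”; Bayes’ theorem says otherwise.

    # Base rate fallacy: a minimal sketch with hypothetical numbers.
    prevalence = 0.01           # P(condition) -- the base rate
    sensitivity = 0.90          # P(positive | condition)
    false_positive_rate = 0.10  # P(positive | no condition)

    # Bayes' theorem: P(condition | positive)
    p_positive = (sensitivity * prevalence
                  + false_positive_rate * (1 - prevalence))
    p_condition_given_positive = sensitivity * prevalence / p_positive

    print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")  # ~8.3%

The intuitive answer overweights the test’s accuracy and ignores how rare the condition is to begin with, which is exactly the kind of systematic, predictable error the research describes.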

We think of ourselves as rational creatures, yet study after study shows us relying on heuristics for the bulk of our decision making, exposing us to the myriad cognitive biases they engender. That reliance has been hard-wired by evolutionary forces that have disproportionately rewarded quick decision making (“Rhino Charging: Run!”) over careful thought. If we’re honestly assessing what’s most common in human thought, it’s not using our sense at all.

Toward an Uncommon Sense

So for those of us aspiring to think and reason soundly, what’s really required? Here’s my list:

  1. The strength of will not to be mentally lazy — to overcome our default heuristic-based thought process and use our rational processes more of the time.
  2. Enough of a grounding in deductive reasoning to engender careful thought and the avoidance of logical fallacies.
  3. Solid numeracy skills, with a focus on probability and statistics.
  4. An understanding of the scientific method — the way in which knowledge is proposed, tested and affirmed — and its limitations.
  5. An awareness of the myriad common cognitive biases identified in the research, the ability to spot them in one’s own thinking, and practice at debiasing.

Have I missed anything? I’d be appreciative of any additions in the comments.

12 Responses to “Common Sense: A Manifesto”

  1. Blondin says:

    Great article, Erik. It ties in very nicely with the book I am currently re-reading, Thomas Kida’s “Don’t Believe Everything You Think”.

    I imagine how much better off the entire world would be if more attention were given to understanding how our minds work and the little traps we set for ourselves in our everyday thinking. I think we all have the tools to analyze data critically and apply logic but we seem to be blind to our own biases and assumptions. There really should be more emphasis on skeptical reasoning in early education. I think Richard Feynman said it best: “The first principle is that you must not fool yourself – and you are the easiest person to fool”.

  2. badrescher says:

    Great piece. It’s a nice summary of the major cognitive biases and you started with a terrific example of why it is important that we understand and overcome them.

    I have one puzzle piece I think puts things into perspective in regard to this frustrating fact:

    So nearly 400 years after the Age of Reason began, humans still can’t speak the language of science, at least without significant training.

    The reason that we need science in the first place is that we are not objective reasoners. If we (people, on average) were capable of understanding the language of science without significant training, we would not need the scientific method laid out for us to be able to determine what is true.

  3. Kenneth says:

    Hi, came across this link on CFI. Great article. However, I don’t think your recommendation to be more rational and avoid cognitive biases is universally applicable. This paper finds that some cognitive biases have evolved to serve a social purpose: http://willem.frankenhuis.org/papers/Haselton2009.pdf. In other words, becoming more rational may hurt your social life!

    • Erik Davis says:

      Kenneth — thanks for the link. I’m encouraged that the sociologists finally seem to be talking to the psychologists in this field.

      One of the things that Gilovich et al. lament in Heuristics and Biases (the 2002 update to Judgment under Uncertainty) is that much of the coverage of the H&B research in both the academic and mainstream press has been negative, focusing mostly on the biases. The view they attempt to bring back to the fore is that the dual-process (“two brains”) model serves humans particularly well in most cases, and that most of the research should be focused here. They even point out that we employ the heuristics consciously as well as unconsciously, using them where the rational brain would normally step in but lacks sufficient data to do its job effectively.

      The biases are a byproduct of the generally useful heuristics, and the areas to worry about are the ones where systematic errors occur, especially when the stakes are high. Gardner also points out in Risk that in today’s world of 24/7 media, those biases are kept well-fed by marketers, politicians, activists, and the media — sometimes inadvertently but often with manipulative intent. There’s a great example of this in Playboy this month – “Rogues of K-Street: Confessions of a Tea Party Consultant”:

      A good piece of mail gets its message across in 10 seconds. Television gives you 30 seconds, maybe. We’re playing to the reptilian brain rather than the logic centers, so we look for key words and images to leverage the intense rage and anxiety of white working-class conservatives. In other words, I talk to the same part of your brain that causes road rage. Ross Perot’s big mistake was his failure to connect his pie charts with the primordial brain. Two years after Perot’s first White House run the GOP figured this out, and thus was born the “angry white man” and with him a 54-seat swing in the House of Representatives.

      • Kenneth says:

        Indeed, you’re right it is a great idea to practice debiasing in most cases (and especially, like you said, in high-stakes situations). I just wanted to make the case that cognitive biases can also be adaptive sometimes (in my experience, in social or romantic situations).

      • Erik Davis says:

        What one might call affairs of the gut…

  4. I am mostly interested in politics now, and I followed the link to the Conjunction Fallacy; it appears to me to be incorrect, which is doubly concerning since the bottom of the page lists a formal proof.

    The probability that a bill with provision A passes is less than 50%, but by adding a second provision, B, the bill can be passed, because some of the opponents of A have always wanted B. This happens all the time in legislatures, where bills are amended until they can be passed.

    I had most of an undergraduate degree in math, lucky me, and, generally, really like the school of informal logic of fallacies.

    • Erik Davis says:

      Politicians are illogical? Owww, the confirmation bias hurts so good.

    • Kenneth says:

      I don’t think they are analogous. The conjunction fallacy is about probabilities while the example you give is about politics and legislation and choice (with no probabilities).

      • I must agree with Kenneth. I’m not a glamourous highly-paid professional logician or anything, but it seems to me that Joshua’s example differs because the second proposition modifies the first — the two are no longer an apples:apples comparison. Rather than comparing X versus X+Y, we’d be comparing X versus Z, where Z is the product of Y modifying X.

        Apologies if my thinking is broken here.

        Yours,
        Chester Burton Brown

      • Erik Davis says:

        I think Joshua was being wry in his analogy.
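
        Just to spell out the math for anyone following along: the conjunction rule says that for fixed propositions A and B, P(A and B) can never exceed P(A). In Joshua’s example, adding provision B changes what the legislators are voting on, so we’re no longer comparing the same proposition with and without B, which I take to be Kenneth’s and Chester’s point.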

Trackbacks/Pingbacks

  1. [...] Just how common is common sense? Erik Davis has a manifesto. [...]


  • Erik Davis

    Erik is a technology professional based in Toronto, focused on the intersection of the internet and the traditional media and telecommunications sectors. A reluctant blogger, he was inspired by the great work Skeptic North has done to combat misinformation and shoddy science reporting in the Canadian media, and in the public at large. Erik has a particular interest in critical reasoning, and in understanding why there’s so little of it in the public discourse. You can follow Erik's occasional 140 character musings @erikjdavis