Today’s guest blog comes courtesy of computer scientist Gurmukh Mongia. Anyone interested in writing a guest blog post should submit a proposal to the editor, skepticnorth [at] gmail [dot] com
How we define a social problem is vitally important for any effort we make to understand it. Advocacy groups naturally feel that their issues should blip prominently on our radar screens. Thus are we bombarded with calculations and claims from all directions aiming to convince us that certain problems are dire enough to warrant our attention.
And, of course, for the most part there is valid reason for concern. But sometimes, even with the best of intentions, advocacy groups can find themselves focusing more on their message than the facts. They make choices in how they define a problem that do not correspond well with the way it’s typified. Usually there’s no malice involved, just sloppy handling of the facts.
I would like to share with you two statistics I came across where the numbers just didn’t add up. In both cases, the advocacy groups were working towards admirable goals, and their efforts should not be belittled. However, the claims themselves seemed to be off the mark. An exploration of exactly how they went wrong can help us to identify other times when definitions can mislead us.
Are Our High School Students Chronically Undernourished?
A while back I found a small plastic card displayed in a supermarket which advanced an interesting claim:
The number stopped me in my tracks. What could they possibly mean by that? The group making the claim was supporting a school breakfast program, in which underprivileged students are provided with a free breakfast. Surely this number can’t mean what it seems to at first glance: that 62% of Canadian students are from families in such dire straits that they’re unable to provide breakfast for their children. The poverty rate is nowhere near that high, so what’s going on here?
Perhaps they mean that 62% of students skip breakfast on occasion, such as when they’re running late. That still seems high, but it makes a little more sense. The problem is that it’s a substantially different kind of claim. If that’s the case, then rather than supporting students in financial need, you’re instead being asked to support students who occasionally sleep in too late. Of course, a school supplied breakfast would still benefit those students, but would people still feel that the cause was as urgent if the problem was portrayed to them in that way?
I decided that I had to investigate this claim and figure out where it came from. The problem was that while numerous websites repeated the claim, none actually gave its source. This was starting to look suspiciously like a figure that had been repeated by word of mouth and casually accepted without question. Once an advocacy group accepts a statistic this way, it becomes the authoritative source for the statistic, and nobody bothers to look further and find out where the number actually came from.
I finally managed to find a reference that listed the source as the Massachusetts General Hospital. This allowed me to track down the original study, and it turned out to be completely different from what I’d imagined. Massachusetts General Hospital, in association with Harvard Medical School, performed a study to determine how participation in school breakfast programs relates to academic performance. One of the figures mentioned in this study was that 62% of students “ate a school-supplied breakfast rarely or never”. This study never said that these students didn’t eat breakfast at all, just that they didn’t get their breakfast from the school. Presumably, the vast majority of those students had a full breakfast at home.
This is what sociologist Joel Best, author of the “Damned Lies and Statistics” books, calls a “mutant statistic”. Obviously, this study would have been of interest to advocates of school breakfast programs. It seems that somebody saw the number and misunderstood or misremembered its intent. That person passed the information on to colleagues, and eventually it became common, unquestioned knowledge. Even though the statistic doesn’t make much sense on its face, when people are presented with an authoritative-sounding number, they tend to accept it uncritically.
Is Canada Experiencing A Literacy Crisis?
According to a claim that I found online, it is. It states that almost half of all Canadians lack the basic literacy skills necessary in order to get along in the world.
This claim is stated in much clearer terms than the last one. The problem is that the number seriously stretches credulity. The big question here is: how exactly are they defining and measuring “basic literacy skills”?
This statistic comes from a report based on the International Adult Literacy and Life Skills Survey (IALLS). It turns out that what they’re talking about is “prose literacy”, which attempts to measure not simply whether people can read and write, but how well they are able to do so. There are five categories of literacy in their definition, with category 1 the lowest. A test was given to a sample of Canadians, and the 42% result is based on adding up the percentages of those who scored in categories 1 and 2.
It seems, though, that the common convention is to only apply the label of “below basic” prose literacy skills to those whose test results place them in category 1. If you limit this statistic to only those who fit category 1, the number drops to 14.6%. People whose tests place them in category 2, according to the description, are perfectly able to read and write at a basic level, but may have trouble with challenges such as learning new job skills. To me, it doesn’t seem like this definition warrants the description “lacks basic literacy skills”.
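The gap between the two definitions is easy to make concrete. Here is a minimal sketch in Python of how the choice of cutoff produces either number; the category 1 share (14.6%) and the combined total (42%) come from the figures above, while the category 2 share is simply inferred as the difference, so treat it as an illustration rather than survey data:

```python
# Approximate shares of test-takers by prose literacy category (percent).
# Category 1 (14.6) is from the report as cited; category 2 is inferred
# as 42.0 - 14.6 for illustration only.
prose_literacy_shares = {
    1: 14.6,  # lowest proficiency: the conventional "below basic" group
    2: 27.4,  # can read and write, but may struggle with new job skills
}

# Strict definition: only category 1 counts as lacking basic literacy.
below_basic_strict = prose_literacy_shares[1]

# Broad definition used in the claim: categories 1 and 2 combined.
below_basic_broad = sum(prose_literacy_shares.values())

print(f"Category 1 only:    {below_basic_strict:.1f}%")
print(f"Categories 1 and 2: {below_basic_broad:.1f}%")
```

Same survey, same respondents; only the definition moves the headline figure from 14.6% to 42%.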
Obviously the people who tabulated this statistic feel differently. They believe that anything below category 3 is inadequate for functioning in today’s world. Perhaps there’s some room for argument, but what’s clear is that they’re using a non-standard definition of “basic literacy skills”, which makes the statistic seem much more alarming than it would otherwise be.
And there has been further analysis done on these numbers. The University of British Columbia released a brief on the subject noting that most of the people found to have major difficulties reading in this test were, in fact, non-native English speakers. The biggest problem in prose literacy may be in how well we’re teaching immigrants English as a second language. While this is a concern, I don’t believe most people would classify it as a “literacy crisis”.
What’s The Moral Of The Story?
Obviously, I’m not trying to discredit any organization here. I think that school breakfast programs are a wonderful thing. And as for the literacy claim, it was being used to bolster support for a program providing needy schools with new books. As an avid reader, I give this program my full endorsement.
But I don’t think we should be advocating any cause, no matter how worthy, with statistics that mislead the public. When your definitions don’t correspond well with how your audience typifies the problem, then I believe you’ve done your audience a disservice.
Gurmukh Mongia is a computer scientist working in the field of web development. He is a graduate of Niagara College of Applied Arts and Technology, where his interest in critical thinking led him to take additional courses in statistics and public disinformation. He currently runs a blog on the topic of critical thinking entitled The Dumbasses Guide To Knowledge.