Statistics are Rad

Let’s talk about normal.

I know there’s a big push from a range of people to stop calling things or people “normal”. Their argument (and it’s a good one) is that there’s so much diversity among people that no one can really be “normal”. You know what? I completely agree. People who say that are absolutely correct, and we’d all be a lot better off if we took that fact to heart.

The problem I have is that we can’t simply remove the concept of normal from our dialogs. It’s a useful concept, and those dialogs are important. Yes, no individual is “normal” per se, but if we follow that road to its inevitable conclusion, we just end up in relativistic hell. I personally insist on using the word normal, and when I do, I’m referring to the statistical normal.

[Figure: standard bell curve]

Hey baby, I like your curves.

It’s true that among individuals, each person is different. That is to say, each falls somewhere different on a normal distribution bell curve. There are, mathematically, infinitely many positions you could occupy on the bell curve. An individual could fall on any one of them, and possibly be the only person in the history of the universe to fall there! So yeah, basically everyone is different, and even math is cool with that. But still, the idea of “normal” is useful. I’m of the opinion that when you’re dealing with a large set (like people) and discussing complicated attributes (again, like people’s), chances are you’re going to end up describing a bell curve like the one pictured above at some point.

There’s a mean (μ) and the concept of a standard deviation (σ) from that mean. If you take everybody who falls within one standard deviation of the mean, you’re taking 68.2% of the entire population. That’s already a majority, but I wouldn’t quite call it normal; 31.8% is still a huge chunk of people. So instead we take two standard deviations. Now we’re talking about 95.4% of the population. That’s not just a majority; that’s normal.
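(If you want to check those coverage numbers yourself, here’s a minimal sketch in Python. It assumes you have scipy installed, and it just evaluates the standard normal CDF at one, two, and three standard deviations out from the mean.)

    from scipy.stats import norm

    # Area under a standard normal curve within k standard deviations of the mean.
    # norm.cdf(k) is the cumulative probability up to k, so the symmetric band
    # from -k to +k covers cdf(k) - cdf(-k) of the population.
    for k in (1, 2, 3):
        coverage = norm.cdf(k) - norm.cdf(-k)
        print(f"within {k} standard deviation(s): {coverage:.2%}")

    # within 1 standard deviation(s): 68.27%
    # within 2 standard deviation(s): 95.45%
    # within 3 standard deviation(s): 99.73%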

This is such a useful concept. You can objectively say whether or not someone is normal. Before you start grabbing pitchforks and torches, let me explain the crucial factor here. “Normal” is not (never, ever, ever) a judgment of validity, acceptability, a reason to discriminate, or anything else. It’s an objective, unbiased mathematical concept. Someone being normal or not normal means about as much as whether or not they like The Simpsons. Some people do, some people don’t. It doesn’t really say anything about them as people. If 95.4% of the population liked The Simpsons, then I’d say liking The Simpsons was normal and not liking The Simpsons was not normal. It’s useful for me to be able to say that. It’s a useful statistical fact about the population, and I can use that language to have a dialog with someone else about the population. Do I care that 4.6% of people aren’t normal and don’t like The Simpsons? Not really. It’s just a fact of the population.
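To make that “within two standard deviations” test concrete, here’s a small sketch in plain Python. The function name, the idea of measuring Simpsons fandom in hours watched, and the numbers themselves are all made up for illustration, and it quietly assumes the attribute is roughly bell-shaped:

    import statistics

    def is_statistically_normal(value, population):
        """Hypothetical helper: True if value falls within two standard
        deviations of the population mean (the ~95.4% band)."""
        mu = statistics.mean(population)
        sigma = statistics.pstdev(population)  # population standard deviation
        return abs(value - mu) <= 2 * sigma

    # Made-up data: hours of The Simpsons watched last month by ten people.
    hours = [4, 5, 6, 5, 7, 4, 6, 5, 5, 6]
    print(is_statistically_normal(5, hours))   # True  -- comfortably near the mean
    print(is_statistically_normal(40, hours))  # False -- way outside two sigma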

Playing Devil’s Advocate, I will say that many people have a hard time separating the objective, unbiased normal vs. not normal from the subjective, biased good vs. bad. If you think that may describe you, then you might consider not using the word normal until you can get that notion of good vs. bad out of your head. I personally don’t have a problem with the idea of statistical normal. Math is my lover. We’ve been BFFs since Sesame Street. Yes, we had a rough patch with discrete mathematics and theory of computation, but it’s never lied to me, and I expect it never will. It has no agenda, no bias, and it’s as close to absolute truth as humans may ever get. That being the case, I think I’ll continue to stick to using words and concepts I understand that are backed up by mathematics and objective logic.