Unless you are allergic to talking about politics and religion with people, or you consider “arguing” a dirty word, you have probably at some point faced the accusation, “You think you’re always right.” Obviously this is a truism. As a friend of mine once deduced:
“Yes, I think that what I think is right. And if I learn that something isn’t, then I change my mind…and I’m still right.”
In the book How Not to Be Wrong by Jordan Ellenberg, I came across the probabilistically paradoxical case of Nate Silver. After living off online poker and then writing on baseball metrics, Silver took up political analysis. In 2012, he set out to call the presidential election by accurately predicting the results of all fifty states. Silver consistently, and unconventionally, included honest margins of error in his predictions, which critics decried as “hedging.” In the end, he correctly called every single state.
Ellenberg points out that if Silver’s critics had really wanted to trap him, they could have posed him the question: How many states do you expect to get wrong?
No Dice
Let’s think about probability. If you flip a (fair) coin, about 50% of the time it will come up heads, and about the other 50% of the time it will come up tails. Likewise, with one (fair) six-sided die, you’ll roll a 1 about 1/6 of the time, a 2 about 1/6 of the time, and so forth.
(Fun party question the next time you’re out with friends: What is the only word in the English language that you pluralize by adding a ‘c’ to the middle of it?)
Let’s say that you have one such die and wager with someone that it won’t come up a 6. You both risk exactly $1. Your friend collects the $2 pot if it rolls a 6, and you collect if it’s anything else. Moreover, like at a casino, your friend is willing to play this “game” with you over and over so long as you are. At any time, you can temporarily swap the winning criterion with him: If you want to go one round betting your dollar on the 6, you may.
Clearly, you’ve got a good thing going. The die is five times more likely to come up in your favor than in his, meaning that you win every time that the die doesn’t roll a 6. So, for which round(s) should you switch the winning conditions?
The answer is ‘None of them.’ Every time you are about to roll the die, the next roll has about an 83% chance of coming up in your favor. Even if you know that in the long run about 1/6 of the rolls will come up against you, each and every particular roll will overwhelmingly (5:1) come up in your favor.
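If the intuition needs a nudge, the arithmetic of a single round settles it. Here is a minimal sketch in Python, using only the stakes and probabilities of the game described above:

```python
from fractions import Fraction

# One round of the $1-a-side wager with a fair six-sided die.
p_not_six = Fraction(5, 6)   # probability the die shows anything but a 6

# Betting against the 6: win $1 with probability 5/6, lose $1 with probability 1/6.
ev_against_six = p_not_six * 1 + (1 - p_not_six) * (-1)

# Betting on the 6 for a round: the same numbers, reversed.
ev_on_six = (1 - p_not_six) * 1 + p_not_six * (-1)

print(f"Expected winnings betting against the 6: {ev_against_six} dollars")  # 2/3
print(f"Expected winnings betting on the 6:      {ev_on_six} dollars")       # -2/3
```

Every round you switch trades an expected gain of two-thirds of a dollar for an expected loss of the same size, so the best number of switches is zero.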
In practice, not everyone intuits this logic. Another mathematical author, John Allen Paulos, tells the story of an experiment in which, as in roulette, people pick ‘red’ or ‘black’ over a series of one hundred trials. Unbeknownst to the participants, the experimenters weighted the red/black division about 70/30. People would pick up on the slanted balance and distribute their own guesses along approximately the same 70/30 divide, even though resolving to hit ‘red’ every time would virtually guarantee them higher accuracy.
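A quick simulation makes the gap concrete. This is only a sketch of the setup Paulos describes, keeping the 70/30 weighting from the passage; the trial count is inflated well past one hundred purely to smooth out the noise:

```python
import random

random.seed(0)
TRIALS = 100_000  # far more than the experiment's hundred, just to reduce noise

# The hidden 70/30 weighting toward red.
outcomes = ["red" if random.random() < 0.7 else "black" for _ in range(TRIALS)]

# Strategy 1: "probability matching" -- spread your own guesses 70/30 as well.
matched = ["red" if random.random() < 0.7 else "black" for _ in range(TRIALS)]
matching_accuracy = sum(g == o for g, o in zip(matched, outcomes)) / TRIALS

# Strategy 2: always guess the more common color.
always_red_accuracy = sum(o == "red" for o in outcomes) / TRIALS

print(f"matching the 70/30 split: about {matching_accuracy:.0%}")   # ~0.7*0.7 + 0.3*0.3 = 58%
print(f"always picking red:       about {always_red_accuracy:.0%}")  # ~70%
```

Matching the split wins about 58% of the time; stubbornly calling ‘red’ wins about 70%.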
Aggregate Error
Because of Nate Silver’s margins of error, his final predictions built in an expectation of how probable or improbable each individual call was. That means there was a straightforward and objective mathematical way of determining exactly how many states he “expected” to get wrong: add up, across all fifty states, each prediction’s chance of missing. By that measure, Silver’s margins of error implied that he expected to miss about three.
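The arithmetic is just linearity of expectation: the expected number of misses is the sum of each state’s probability of being called wrong. Silver’s actual state-by-state probabilities aren’t reproduced here, so the numbers below are invented purely to show the shape of the calculation:

```python
# Hypothetical confidence levels for fifty state calls -- NOT Silver's real figures.
# A handful of genuine toss-ups, some leaners, and a long tail of near-locks.
state_confidence = [0.70] * 5 + [0.90] * 10 + [0.985] * 35

# Every call clears 50%, so none of them would be changed...
assert all(p > 0.5 for p in state_confidence)

# ...and yet a few misses are expected somewhere in the mix.
expected_misses = sum(1 - p for p in state_confidence)
print(f"Expected number of wrong calls: {expected_misses:.1f}")  # about 3
```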
Which three? Well, none of them in particular – that’s the point. If you questioned him on each and every state, he would stand fast with his prediction. There was not a single one that he would have changed. Nonetheless, overall, he would have expected about three of them to let him down. It’s an apparent paradox, where you hold fast to each particular but admit likely failure somewhere in the mix.
Most of us actually have experience with this paradox. Think back to predicting your grade on an exam. You studiously triple-check every single question. You are unwilling to change any answer, but you still feel enough aggregate uncertainty to expect to score about 98/100.
As quoted by Ellenberg, the philosopher W. V. O. Quine concluded, “A reasonable person believes, in short, that each of his beliefs is true and that some of them are false.” Or, as Ellenberg himself put it, “You find that you always think you’re right, but you don’t think that you’re always right.”
How to Be Wrong
I realized that this phenomenon explains the complaints against “always thinking you’re right.” My knee-jerk reaction in the past had been like my friend’s: Naturally I believe X, since I think that it’s right. If I didn’t think that, then I wouldn’t believe it. This truism recalls the line from Aristotle concerning some unlikely but real event being possible, namely, “If it were impossible, it would not have happened.”
It is, however, unreasonable to believe oneself above all mistake. You may firmly and with good reason defend your beliefs, but the humble man will accept that, in all probability, he has some portion of falsehoods mixed in there. Since he recognizes himself as fallible and therefore given to err at least some of the time, he should not categorically oppose seriously questioning his ideas. After all, even if you rate your accuracy at 99%, you’re mistaken about one thing for every hundred you believe, and your beliefs, opinions, and known facts might number in the millions. At that rate, a million beliefs harbor about ten thousand errors.
It’s unlikely that anyone would ever ask you, “How many of your beliefs do you think are wrong?” Nevertheless, your hypothetical answer says something important about your character. If you make the illegal move from “Each of my beliefs is probably correct” to “All of my beliefs are probably correct,” then your very reasoning itself is liable to err. You habitually look at your beliefs as correct by virtue of their being yours, rather than for any underlying reasons. There are other paths to this miserable self-deception, but this is a likely one.
If you hear someone defend a contention with “because I believe it” or “because it is,” then beware: Intellectual peril is afoot. More commonly, people say, “It’s the standard/normal/catechetical answer to the question,” or just, “Of course it is.” Such spiraling theories, often found in groups closed to the critiques or questions of the outside world, give rise to lunacy.
Each individual belief of yours should be subject to scrutiny. For that to be possible, you first need to dismantle the impregnable membrane that protects your beliefs en masse for the single very good reason that they are, after all, yours. You must expect to be wrong sometime, and recognize that the mistake might be lurking anywhere.
We should grow accustomed, in the long run, to being wrong and being persuaded to change our minds. This subtle understanding and noble attitude surfaces, with so much other wisdom, in George Washington’s Farewell Address:
Though in reviewing the incidents of my Administration, I am unconscious of intentional error – I am nevertheless too sensible of my defects not to think it probable that I may have committed many errors. Whatever they may be, I fervently beseech the Almighty to avert or mitigate the evils to which they may tend.