Saturday, March 17, 2012

I, Idiot

In a new paper called "Diverging Opinions," James Andreoni and Tymofiy Mylovanov perform an experiment to examine why disagreements persist even when people are able to share their information and choices. This is one of those questions where, when you first hear about it, you're prone to think you already know a bunch of answers. It's easy to call to mind responses like "well, people are just inherently subjective and make decisions based on internal opinions," or "people are 'irrational' and you can't hope to understand how they make decisions." In a sense, these attitudes are correct: behavioral psychology shows us that a descriptive picture of human choice-making is very different from the normative picture that the rules of logical and statistical inference would paint.

But on the other hand, there are well-known mathematical results (e.g. Aumann's Agreement Theorem, which says roughly that rational agents with common priors and common knowledge of one another's beliefs cannot agree to disagree) that give us reason to think that if humans behave at all like normative theory predicts, then they should exhibit certain tendencies whenever information sharing is available. Simply noting that humans don't always display these tendencies, while perhaps accurate, isn't very helpful for understanding why, how much, or in what direction they deviate from the normative expectation.

In the "Diverging Opinions" paper, the authors want to shed light of the why of the matter. The experimental set up is a game where people do not compete against others; they merely try to have the highest personal success they can. At certain points, they are allowed to see the choices that peer game-players have made, either in an entirely public, anonymous manner, or specifically linking the choices with successes or failures of peers. Thus, we should expect that when information is shared, people will use it to come to a common belief about what the right game strategy is. After all, this isn't a matter of opinion or arbitrary belief; it is a perfectly quantifiable game involving drawing differently colored chips from a container. Any unwillingness to update beliefs according to the public is a matter of irrationality or stubbornness here.

Quoting from the paper:
"How do we address the question of the persistence of differing views in the presence of common awareness of disagreement? In the final round in each session of our experiment, we provide subjects with information about the actions of others in all of the previous rounds. If the subjects believe the reasoning of others is similar to their own, that is, if there is common knowledge of rationality, this information is sufficient to infer their private information and should eliminate disagreement. Surprisingly, we find that, despite giving subjects the common information they need to reach full agreement, a sizable minority of our subjects maintain their opposing views.

Why do people remain in disagreement? Agreement requires two ingredients. One is sufficient rationality, that is, individuals must be reacting to their own information so that their choices reveal what they know. Second, agreement requires a common (or at least sufficient) knowledge of this rationality. We find evidence that both of these may be missing. First, we look at choices in round 1, when individuals should still maintain common priors, being indifferent about the true state. Nonetheless, we see that about 20% of the sample erroneously disagrees and favors one point of view. Moreover, while other errors tend to diminish as the experiment progresses, the fraction making this type of error is nearly constant. One may interpret disagreement in this case as evidence of erroneous or nonrational choices. Next, we look at the final round where information about disagreement is made public and, under common knowledge of rationality, should be sufficient to eliminate disagreement. Here we find that individuals weigh their own information more than twice that of the five others in their group. When we look separately at those who err by disagreeing in round 1, we find that these people weigh their own information more than 10 times that of others, putting virtually no stock in public information. This indicates a different type of error, that is, a failure of some individuals to learn from each other. This error is quite large and for a nontrivial minority of the population. Setting aside the subjects who make systematic errors, we find that individuals still put 50% more weight on their own information than they do on the information revealed through the actions of others, although this difference is not statistically significant."
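
The "more than twice the weight" finding is easy to picture with a small worked example. The sketch below reuses the illustrative two-container setup from above (again, not the paper's actual parameters) and treats the weight as a multiplier on one's own signal in log-odds space; the specific numbers are hypothetical, but they show how overweighting your own draw keeps you in disagreement with the group.

```python
import math

# What "weighting your own information more than others'" does, in the same
# illustrative two-container setup as above (not the paper's exact design).
# Updating is done in log-odds, where a weight is a multiplier on a signal.

LOG_LR = {"red": math.log((2 / 3) / (1 / 3)),   # log-likelihood ratio favoring container A
          "blue": math.log((1 / 3) / (2 / 3))}

def posterior_A(own_chip, peer_chips, own_weight=1.0):
    """P(container = A) when my own draw gets own_weight times the weight of a peer's."""
    log_odds = own_weight * LOG_LR[own_chip] + sum(LOG_LR[c] for c in peer_chips)
    return 1 / (1 + math.exp(-log_odds))

peers = ["red", "red", "red", "red", "blue"]  # what five peers' actions reveal

print(posterior_A("blue", peers))                 # 0.80: defer to the pooled evidence
print(posterior_A("blue", peers, own_weight=2))   # ~0.67: my own draw counts double
print(posterior_A("blue", peers, own_weight=10))  # ~0.01: effectively ignoring everyone else
```

In the experiment, the factor-of-two overweighting was the average; the subjects who erred by disagreeing in round 1 behaved more like the last line, putting almost no stock in public information.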


The reason I became interested in this result is a remark from Robin Hanson's post about the experiment. Quoting from his Overcoming Bias entry:
"... So in this experiment there is a bottom quintile of idiots, and everyone else seems roughly accurate in discounting the opinions of a pool of others containing such idiots. So in this experiment it seems the main reason people think they are better than others is that everyone, even idiots, don’t think they are idiots. I wonder how behavior would change if everyone was shown clearly that the idiots were no longer participating."


This may seem a very harsh way of putting it. Our society does not tend to use words like 'ignorant' or 'idiot' in their literal senses, and instead loads them with negative or demeaning connotations. They are almost always treated as insults rather than as actual, useful assessments of states of knowledge. But this is a more or less accurate way to summarize the experimental results. It appears that we humans are very sensitive to the possibility that information about what everyone else is doing is chock full of idiot choices and, crucially, we each assume that we are never the idiot.

When I think about this experimental result, my personal takeaway is that I should be more aware of the times when I am the idiot, and there are plenty of them. Even though overconfidence will make me believe that I am immune to the decision-making error this research article points out, I should try hard to account for the fact that I will be in the bottom quintile of idiots in many scenarios in my life. There are many issues I encounter where I am not the expert, and if I am ever required to support one position or another on such issues, I will be prone to discount the publicly available decisions of others and biased to think that I am not the idiot.

The life lesson here is not to be down on myself because of the label "idiot" or the fact that there are knowledge domains within which I live in the lowest quintile, but instead to notice that I can bootstrap myself out of ignorance by being more willing to acknowledge that the decisions lots of other people make can give me good information about what the right decision might be. It's also worth thinking carefully about the proper use of humility, because the takeaway here is not the same thing as giving credence to bad ideas. In the end, this reminds me of an apt Bertrand Russell quote:
"The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt."