The wisdom of the crowd?

I’m sure I often sound hypocritical when I talk about my perspective on scientific inquiry and my skepticism of religion. On the one hand, I believe that something is more likely to be true when the consensus among people points to that fact — for example, when many different people have performed an experiment and observed similar results. On the other hand, I don’t think that the prevalence of supernatural beliefs should be considered evidence for the supernatural. There is a meaningful difference between these two types of consensus … even though I know it sometimes doesn’t sound like it when I’m explaining my stance to religious friends.

John Timmer at Ars Technica, in a summary of a recent article in the Proceedings of the National Academy of Sciences, pinpoints this difference:

The “wisdom of the crowd” has become a bit of a pop cliché, but it’s backed up by real-world evidence. When groups of people are asked to provide estimates of obscure information, the median value of their answers will often be remarkably close to the right one, even though many of their answers are laughably wrong. But crowds rarely act in the absence of social influences, and some researchers in Zurich have now shown that providing individuals information about what their fellow crowd-members are thinking is enough to wipe out the crowd’s wisdom.

It’s an amazing experiment. When people were asked to estimate an obscure value based on their (limited) prior knowledge, the median of their answers was close to correct. But when they were given information about what others had estimated, the range of their answers shrank, often zeroing in on an incorrect value. Worse still, everyone’s confidence in the accuracy of their own answer increased:

The authors, however, also detected a purely social influence, which they termed the “confidence effect.” As the range narrows and more of the participants are close to the typical answer of their fellow panelists, their confidence that the answer they’re giving is roughly correct goes up. In other words, when someone sees that the rest of the crowd is giving an answer close to their own, it gives them greater confidence that their answer is likely to be right.

In some ways, this is reminiscent of the Asch conformity experiments, in which participants were willing to give obviously incorrect answers to questions about the length of a line when they were surrounded by other people who had already done the same.

The takeaway message is clear: it’s useful to consider others’ opinions, but only under limited circumstances. It works best when each person first answers a question for themselves, based on their own knowledge and reasoning, and only then compares answers with others. This is why consensus is a powerful tool of science: each scientist performs her own controlled experiments and aggregates those results with what others have found before. If many independent experiments agree, that is a good indication that their results are correct.

Something else entirely is happening in the case of religion. The vast majority of the time, no separate investigation is performed. Each individual hears from many others that they believe (and, often, that those who don’t believe face punishments worse than our minds can imagine). It’s no surprise that such an individual would then profess belief themselves. That isn’t an indicator of truth. It’s more like peer pressure.
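The contrast between independent estimates and socially influenced ones is easy to see in a toy simulation. This is my own sketch, not the PNAS study’s actual protocol: I assume guesses with multiplicative (lognormal) noise, and I model social influence crudely as everyone nudging their estimate toward the current group mean each round.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 1000.0
N = 1000

# Independent guesses with multiplicative noise: individual answers can be
# laughably wrong, yet the *median* of the crowd stays near the truth.
guesses = [TRUE_VALUE * random.lognormvariate(0, 0.8) for _ in range(N)]
print(statistics.median(guesses))  # close to 1000

# Crude model of social influence: each round, everyone shifts their
# estimate partway toward the current group mean.
estimates = guesses[:]
for _ in range(20):
    mean = statistics.mean(estimates)
    estimates = [0.7 * e + 0.3 * mean for e in estimates]

# The spread collapses -- everyone now agrees, which is the raw material
# for the "confidence effect"...
print(statistics.pstdev(estimates))  # tiny
# ...but the group has merely converged on the mean of the original
# guesses, which for lognormal noise sits well above the true value.
print(statistics.median(estimates))  # noticeably above 1000
```

The design choice doing the work here: averaging toward the group mean preserves that mean while destroying the spread, so the crowd ends up unanimous and confident without getting any closer to the truth than its initial (biased) aggregate.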


4 Comments

  1. Seth R.

     /  May 26, 2011 at 10:16 am

    The problem is – creating the control conditions of isolation seems to be a practical impossibility in most instances.

  2. Ubi Dubium

     /  May 26, 2011 at 11:25 am

    There’s a good related article over at the “You are not so smart” website on “Anchoring Bias”, the tendency to be influenced, or “anchored” by a number you have just heard, even if that number is entirely random and has no relation to the question being asked. For instance, in a study a group was asked to write down the last two digits of their SSN with a dollar sign, then participate in a simulated auction. They were asked, on a series of items, if their bid would be higher or lower than the dollar amount they had written. Then they were asked for their actual bid. The people with the lower starting numbers tended to bid lower, likewise with the higher numbers bidding higher, even though their anchor number was randomly chosen.

    So the skewed “crowd wisdom” results may partially be simple anchoring bias, the fact that they were initially given any answer at all shifts their final answers, regardless of whether anybody actually considered that anchor the right answer.

    I wonder if there has been a study that found a way to account for this effect.

  3. @Seth: That’s a very good point. The best we can usually do is to try to be conscious of the influence of others’ opinions and try not to give them more weight than they’re worth. Most cognitive biases can’t be totally avoided, but the more we know about them, the more we can try to correct for their effects.

    @Ubi: Thanks for the tip! Crazy stuff. (Here is a link to that article, in case anyone else is interested.) It does sound like anchoring bias is probably involved here, but it wouldn’t go so far as to explain the increased confidence in their answers — I think that means they must be thinking that aggregating other people’s answers = more correct. The really weird thing to me is, they’re not wrong! But it still leads to wrongness….

  4. Good stuff!

    As you say, there is great value in “Triangulations” — ouch, shameless self-plug, sorry. :-)

    The human mental habit that generates conformity has survival benefits. But like all evolutionary heuristics, “listening to the crowd” is only good enough — evolution does not build the best. This heuristic works well enough over the long run even though it generates lots of bad outcomes — or at least it did at one time in the evolutionary picture. But it is loaded with problems, as you suggest.

    With science we have built many methods to see past the blind spots of the supposed wisdom of the crowd. But to do this, we need not only the method but also free speech, freedom from persecution and free property rights. That way, ideas that differ from the crowd can be heard (no matter how apparently crazy), tested, dragged out of old records and retested (in case the old testing was flawed), announced, and freely embraced and examined again.

    The value of Freedom is often underestimated: She is the necessary sister of Science.
