I’m sure I often sound hypocritical when I talk about my perspective on scientific inquiry and my skepticism of religion. On the one hand, I believe that something is more likely to be true when the consensus among people points to that fact — for example, when many different people have performed an experiment and observed similar results. On the other hand, I don’t think that the prevalence of supernatural beliefs should be considered evidence for the supernatural. There is a meaningful difference between these two types of consensus … even though I know it sometimes doesn’t sound like it when I’m explaining my stance to religious friends.
The “wisdom of the crowd” has become a bit of a pop cliché, but it’s backed up by real-world evidence. When groups of people are asked to estimate some obscure quantity, the median of their answers will often be remarkably close to the right one, even though many of the individual answers are laughably wrong. But crowds rarely act in the absence of social influences, and some researchers in Zurich have now shown that providing individuals with information about what their fellow crowd-members are thinking is enough to wipe out the crowd’s wisdom.
It’s an amazing experiment. When people were asked to estimate an obscure value based on their (limited) prior knowledge, the average of their answers was close to correct. But when they were given information about what others had estimated, the range of their answers shrank, often zeroing in on an incorrect value. And worse, everyone’s confidence in the accuracy of their own answer increased:
The authors, however, also detected a purely social influence, which they termed the “confidence effect.” As the range narrows and more of the participants are close to the typical answer of their fellow panelists, their confidence that the answer they’re giving is roughly correct goes up. In other words, when someone sees that the rest of the crowd is giving an answer close to their own, it gives them greater confidence that their answer is likely to be right.
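A toy simulation makes the dynamic concrete. This is my own sketch, not the Zurich study’s actual protocol: each person starts with a noisy independent guess at a true value, then repeatedly pulls their estimate halfway toward the group’s running average. The independent crowd’s median lands near the truth; the socially influenced crowd collapses into a tight, confident cluster around a biased value.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 1000  # the (hypothetical) quantity the crowd is estimating

# Independent crowd: each guess is noisy and often far off,
# but the errors are multiplicative and roughly symmetric in log space,
# so the median of many guesses sits near the true value.
independent = [TRUE_VALUE * random.lognormvariate(0, 0.5) for _ in range(1000)]

# Social influence (a deliberately simple model): in each round,
# everyone moves halfway toward the current group mean.
influenced = list(independent)
for _ in range(10):
    group_mean = statistics.mean(influenced)
    influenced = [0.5 * est + 0.5 * group_mean for est in influenced]

# The independent median is close to TRUE_VALUE; the influenced crowd
# has converged on a much narrower range, centered on the skewed mean
# rather than the truth.
print("independent median:", statistics.median(independent))
print("independent range:", max(independent) - min(independent))
print("influenced range:", max(influenced) - min(influenced))
```

Because the guesses here are log-normally skewed, the group mean everyone converges toward sits above the true value, so the crowd ends up both tightly agreed and wrong: the “confidence effect” without the accuracy.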
In some ways, this is reminiscent of the Asch conformity experiments, in which participants were willing to give obviously incorrect answers to questions about the length of a line when they were surrounded by other people who had already done the same.
The takeaway message is clear: it’s useful to consider others’ opinions, but only under limited circumstances. It works best when each person answers a question first for themselves, based on their own knowledge and reasoning, and only then compares answers with others. This is why consensus is a powerful tool of science: each scientist performs her own controlled experiments and compares those results with what others have found before. If many distinct experiments agree, that is a good indication that their results are correct. Something else entirely is happening in the case of religion. The vast majority of the time, no separate investigation is being performed. Each individual hears from many others that they believe (and, often, that those who don’t believe will be punished in ways worse than our minds can imagine). It’s no surprise that such an individual would then profess belief themselves. That isn’t an indicator of truth. It’s more like peer pressure.