Here's a thought in the early stages of baking in my mind...
In his 2009 book, The Big Switch, Nicholas Carr describes an experiment in which researchers brought together one group of politically liberal-minded folks and an off-setting group of conservatives. They surveyed each participant beforehand to understand his pre-existing views on topics such as same-sex marriage, affirmative action, and global warming. This was the baseline. Then they put the liberals together in one room and the conservatives in another and basically said, "Talk amongst yourselves."
When discussion time expired, the researchers surveyed the individuals again, asking the same questions as before. How did the discussion with like-minded participants influence the initial views as expressed in the baseline questionnaires?
In short, people's views became more extreme and more entrenched on all three issues. The liberals came out more liberal, and the conservatives came out more conservative.
Deliberation thus increased extremism...every group showed increased consensus, and decreased diversity, in the attitudes of its members.
The researchers came to call the effect ideological amplification. It's one of those funny wiring glitches of the human brain, and its effects go far beyond matters of politics.
Throughout the weekend I found myself thinking about this tendency and its potential to create trouble. If you're in a profession that requires complex and nuanced logical thought - think scientific discovery, philosophical truth-seeking, medical diagnosis, investing - it's imperative that you find ways to root out the biases that inevitably creep into your thinking. You must design mechanisms for identifying them and then summon the discipline to root them out.
Yet this is hard to do. Very, very hard. It's so difficult to challenge your own ideas. Our natural tendency is to find ways to support what we're thinking, not to disconfirm it. And this becomes more true the more we develop the idea, especially if we begin promoting it to the world.
And since we're social creatures, too, we often take our ideas to the world in search of support. It's a rare person who takes his ideas to groups of people likely to shoot them down. More often we take our ideas to Confirmation Marketplaces...supportive family members and like-minded colleagues. Or industry conferences, email listservs, and online forums whose tendencies we already know to be aligned with the bent of our existing thoughts.
What happens here, and I don't think we sufficiently account for this in our thinking process, is ideological amplification. These groups and forums become places for us to feel better about what we're thinking. To confirm our existing thoughts - and often to promote them - rather than challenge them.
Such confirmation is probably fine if your idea is already well threshed-out and true. But more often than not, our ideas require a healthy dose of intellectual pummeling to confirm or deny the reasoning behind them. We need disconfirming challenges to offset our natural confirmation biases. If we seek out the echo chamber of confirmation marketplaces, we're not likely to get that.