Tuesday, 2 August 2011

Critical mass ... or not

Imagine a population, 90% of whom are truth-seekers who generally believe B to be true but have weak priors, and 10% of whom are committed to the view that A is true. The 90% cannot distinguish other truth-seekers from advocates. The equilibrium then has to be that everyone converts to believing A is true: if you're a truth-seeker and you meet someone claiming better knowledge that A is true, and you believe his knowledge claims, you upweight A.
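
The mechanics of that verbal argument are simple enough to mock up. Here is a minimal sketch in Python of one reading of it; the agent counts, prior odds, and the `claim_strength` likelihood ratio are my own illustrative assumptions, not anything from the paper. The driver is that truth-seekers can't tell advocates from genuine knowers, so confident claims only ever push beliefs toward A.

```python
import random

def drift_to_A(n=1000, committed_frac=0.10, claim_strength=1.5, rounds=200, seed=0):
    """Toy sketch of the argument above (all numbers are illustrative).
    Truth-seekers start with a weak lean toward B. Committed agents
    confidently claim superior knowledge that A is true; truth-seekers
    only voice uncertainty, which carries no evidential weight here.
    Because truth-seekers cannot tell advocates from genuine knowers,
    every confident claim is taken at face value and multiplies their
    odds on A by `claim_strength`; nothing ever pushes the odds back."""
    rng = random.Random(seed)
    odds = [0.8] * n  # each truth-seeker's prior odds in favour of A (weak lean to B)
    for _ in range(rounds):
        for i in range(n):
            # Meet a random member of the population; committed_frac of them advocate A.
            if rng.random() < committed_frac:
                odds[i] *= claim_strength  # upweight A after a confident claim
            # Meeting another truth-seeker conveys no confident claim: no update.
    return sum(o > 1.0 for o in odds) / n  # fraction now favouring A

print(drift_to_A())  # prints 1.0 here: every truth-seeker ends up favouring A
```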

Pretty trivial. But a few folks whom I'd thought otherwise sensible have read a bit too much into this kind of result.

Here's the original paper by a couple of physicists showing that in a world similar (but not identical*) to the one characterised above, the transition to everyone believing A is really fast if 10% are committed A-believers. Fair enough. But it has nothing to say about anything interesting in the world, like how beliefs might be updated if there is also a similar proportion of committed B-believers, or if the truth-seekers can identify the committed.

Folks seem to be taking the result as saying something like "If only me and the few folks like me keep advocating really hard, eventually everyone will agree with us!" Give your heads a shake.

*It's not quite a Bayesian framework. Agents randomly meet and express an opinion from a list. If you hold opinion B and meet an A agent, you then hold AB. If you then meet another A agent who says A, you hold A; if you instead meet a B agent who says B, you hold B. If you hold AB, you voice A or B at random at your next meeting. But if you are committed, you only ever voice A. Repeat interactions until everyone believes A. This is the nonsense that happens when physicists try social science.
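
For concreteness, here is a minimal simulation of the model as the footnote describes it: listener-only updates, which is slightly simpler than the paper's rule (where the speaker also collapses its list on a match). Population size and step budget are arbitrary choices of mine.

```python
import random

def naming_game(n=100, committed_frac=0.10, max_steps=200_000, seed=0):
    """Minimal sketch of the footnote's model (parameters are illustrative).
    Uncommitted agents start holding B; a committed fraction holds A, only
    ever voices A, and never updates. Only listeners update, as described
    in the footnote."""
    rng = random.Random(seed)
    n_committed = int(n * committed_frac)
    # Opinion sets: indices below n_committed are the committed A-agents.
    opinions = [{'A'} for _ in range(n_committed)] + [{'B'} for _ in range(n - n_committed)]

    for step in range(max_steps):
        speaker, listener = rng.sample(range(n), 2)
        # Committed agents only ever voice A; AB-holders voice one at random.
        voiced = 'A' if speaker < n_committed else rng.choice(sorted(opinions[speaker]))
        if listener >= n_committed:  # committed listeners never change
            if voiced in opinions[listener]:
                opinions[listener] = {voiced}   # already held it: collapse to it
            else:
                opinions[listener].add(voiced)  # new to you: hold both (AB)
        if all(op == {'A'} for op in opinions):
            return step  # number of meetings until everyone believes A
    return None  # no consensus within max_steps

print(naming_game())
```

With 10% committed A-agents and no committed B-agents, this toy hits all-A consensus quickly. Add a matching block of agents who only ever voice B and full consensus on A becomes impossible, since they never convert, which is roughly the complaint above.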
