## 5/24/12

### bayesian truth serum

I took 100 yes/no questions and asked 20 turkers to answer each question. I also asked each turker to predict how other people would answer each question.

You can see these questions to the right. If you answer one, and predict how other people will answer it, then you can see how the turkers and other blog readers answered the question. You also get points: 100 points if your prediction is exactly right, going down to 0 when your prediction is 50 percentage points away from the actual percentage. New questions appear as you answer these.
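The scoring above can be sketched in a few lines. This is just a guess at the formula: the post only says the score falls from 100 at an exact match to 0 at 50 percentage points of error, so I'm assuming a linear falloff, and the function name is made up.

```python
def prediction_score(predicted_pct, actual_pct):
    """Score a prediction of what percentage of people answered 'yes'.

    100 points for an exact match, falling linearly to 0 at 50
    percentage points of error (the linear shape is an assumption).
    """
    error = abs(predicted_pct - actual_pct)
    return max(0, 100 - 2 * error)

# e.g. predicting 70% yes when 60% of people actually said yes:
print(prediction_score(70, 60))  # 80
```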

Sometimes you'll see that the average answer and average prediction are pretty different, suggesting cases where people have a distorted view of reality. I'd like to make the data public and say something interesting about it, but first I want to let people make guesses, and try to get a good score, without already knowing the answers.

For me, the idea of asking people to predict what other people will do comes from 'bayesian truth serum', though that paper proposes some fancy math which I am not using here. In particular, bayesian truth serum gives you a score based not only on your prediction, but also on your answer, which would seem to incentivize people to answer dishonestly to maximize their score; but for some math reason, they claim it doesn't. Here, I'm just giving a score based on your prediction, which is simpler to understand.

This idea also came up again for me when listening to a talk by Justin Wolfers at the Collective Intelligence conference about forecasting elections. Wolfers has some interesting data showing that asking 'who do you think will win?' is more predictive than 'who will you vote for?', which sort of makes sense: you're essentially asking people to take the average vote of all the people they know, which may effectively increase your sample size. However, the effect is stronger than I expected, since he also has data showing that asking a biased sample of people, like all Republicans, 'who do you think will win?' is better than asking an unbiased sample of people 'who will you vote for?'.

Anyway, long story short, I wanted to play around with the idea of asking 'how will other people answer this question?'. Hopefully a lot of people answer the questions in this prototype, and I can write a subsequent blog post about that data.