honesty

20May09

So when people play a game like the Prisoner’s Dilemma, what do they actually do?

According to Peter Lunn in Basic Instincts, about half of people “co-operate”.
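To see why half co-operating is surprising, recall the one-shot game’s logic: whatever the other player does, defecting pays more. A minimal sketch, using the standard textbook payoff numbers (an assumption of mine, not figures from Lunn’s book):

```python
# One-shot Prisoner's Dilemma with conventional payoffs:
# (my move, their move) -> my payoff
PAYOFFS = {
    ("C", "C"): 3,  # mutual co-operation
    ("C", "D"): 0,  # I co-operate, they defect (the "sucker's payoff")
    ("D", "C"): 5,  # I defect, they co-operate (the "temptation")
    ("D", "D"): 1,  # mutual defection
}

def best_reply(their_move):
    """Pick the move that maximises my payoff against a fixed opponent move."""
    return max("CD", key=lambda mine: PAYOFFS[(mine, their_move)])

print(best_reply("C"), best_reply("D"))  # D D
```

Defection is the best reply in both cases, so a narrowly “rational” player should always defect – which is exactly what half of real players don’t do.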

He mentions how honesty boxes work on a similar principle (see BBC article).

Is this a kind of “superrationality” at work? – that we know we’ll get more out of it if we co-operate, and know that others know that too?

Or is it that, either over evolution or within our own lifetimes, we have played the prisoner’s dilemma in various forms again and again, and that in the iterated version it pays to co-operate?

As Wikipedia puts it, the successful strategies are:

Nice
The most important condition is that the strategy must be “nice”, that is, it will not defect before its opponent does (this is sometimes referred to as an “optimistic” algorithm). Almost all of the top-scoring strategies were nice; therefore a purely selfish strategy will not “cheat” on its opponent, for purely utilitarian reasons.
Retaliating
However, Axelrod contended, the successful strategy must not be a blind optimist. It must sometimes retaliate. An example of a non-retaliating strategy is Always Cooperate. This is a very bad choice, as “nasty” strategies will ruthlessly exploit such players.
Forgiving
Successful strategies must also be forgiving. Though players will retaliate, they will once again fall back to cooperating if the opponent does not continue to defect. This stops long runs of revenge and counter-revenge, maximizing points.
Non-envious
The last quality is being non-envious, that is not striving to score more than the opponent (impossible for a ‘nice’ strategy, i.e., a ‘nice’ strategy can never score more than the opponent).
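All four qualities show up in the classic tit-for-tat strategy from Axelrod’s tournaments: co-operate first, then copy the opponent’s last move. A minimal sketch of an iterated game (the payoff numbers are the standard textbook values, an assumption on my part):

```python
def tit_for_tat(my_history, opp_history):
    # Nice: co-operate on the first move.
    # Retaliating and forgiving: thereafter, mirror the opponent's last move.
    return opp_history[-1] if opp_history else "C"

def always_defect(my_history, opp_history):
    return "D"

def always_cooperate(my_history, opp_history):
    return "C"

# (player A move, player B move) -> (A's payoff, B's payoff)
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(a, b, rounds=10):
    """Run an iterated Prisoner's Dilemma and return total scores."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = a(hist_a, hist_b)
        move_b = b(hist_b, hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        hist_a.append(move_a)
        hist_b.append(move_b)
        score_a += pay_a
        score_b += pay_b
    return score_a, score_b

print(play(tit_for_tat, always_cooperate))  # (30, 30)
print(play(tit_for_tat, always_defect))     # (9, 14)
```

Against a defector, tit-for-tat loses the first round and then stops being exploited; note that it never out-scores its opponent in any single match (it is non-envious), yet strategies like it topped Axelrod’s tournaments on total points.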

