The Evolution of Cooperation

The Evolution of Cooperation by Robert Axelrod

After I finished my sermon yesterday, before I had even stepped down from the chancel, a church member rushed up to me excitedly and pushed this book into my hand. She just happened to be reading it and saw that it fit remarkably well with the issue from my sermon. She suggested I read the first chapter and then give it back to her tomorrow. Instead, I read the whole book today.

I had been preaching about the Tragedy of the Commons, the idea that when shared spaces are unpoliced by an outside authority, self-interest will inevitably direct the individual actors to consume more than their share, leading to the destruction of the resource to the detriment of all. Obviously, that notion has grave consequences for climate change (my sermon was for Earth Day). But I had observed hopefully that spiritually mature folks act from an expanded sense of self that includes relationships with others, and therefore a form of enlightened self-interest might lead to cooperation. This book, working up from mathematical models grounded in game theory, shows how cooperation actually is the most successful strategy and would be selected for evolutionarily, even for organisms that have no higher consciousness.

The model begins with the game called the Prisoner's Dilemma. I was familiar with the Prisoner's Dilemma and the optimistic conclusion that under certain, but rather broad, real-world circumstances, cooperation is the best strategy. The game is simply this: two suspects involved in the same crime are arrested and will be interviewed separately. Four outcomes are possible. If they both keep their mouths shut, they "cooperate" and each receives a reduced sentence, say one year each. If one squeals, or "defects," while the other stays silent, then the ratted-on criminal gets five years and the rat gets off free. If both criminals talk, they each get three years. What's the best strategy?

If the first criminal chooses to stay silent, he will receive a sentence of either one year or five years, depending on the choice of his confederate. If he decides to talk, he will receive either three years or no years. Together they will receive either two years (both are silent, one plus one); five years (one talks, one doesn't, zero plus five); or six years (both talk, three plus three). The best strategy for both of them together is to stay silent. But the best strategy for either one of them individually is to talk. That's the dilemma. If I could only trust my partner to stay silent, I would, too. But how can I trust him? And if he thinks I'm going to be silent, then his best choice is to talk. Each criminal has to make his decision independently of the other, so rationally they both talk, and they end up with the severest combined punishment: six years.
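The arithmetic of that paragraph can be checked mechanically. Here is a minimal sketch, using the sentences from the example above (the dictionary layout and names are my own illustration, not notation from the book), showing that whatever the partner does, talking always yields the shorter sentence:

```python
# Sentences in years, indexed by (my_move, partner_move),
# where "C" = stay silent (cooperate) and "D" = talk (defect).
SENTENCE = {
    ("C", "C"): 1,  # both silent: one year each
    ("C", "D"): 5,  # I stay silent, partner talks: I serve five years
    ("D", "C"): 0,  # I talk, partner stays silent: I go free
    ("D", "D"): 3,  # both talk: three years each
}

# For each possible move by the partner, find my sentence-minimizing reply.
for partner in ("C", "D"):
    best = min(("C", "D"), key=lambda me: SENTENCE[(me, partner)])
    print(f"If partner plays {partner}, my best move is {best}")
```

Both lines come out "D": defection dominates in the one-shot game, even though mutual silence (one plus one = two years total) beats mutual defection (three plus three = six).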

But that's only the outcome if the game is played just once. In a situation where the game is played multiple times by the same two players, and we reverse the outcome to offer rewards rather than avoid punishments, a different strategy maximizes benefits. The possibility for trust and cooperation emerges. It actually becomes wiser to treat the other player as a partner, not a competitor, and to use your behavior to encourage cooperative behavior in your partner, while also signaling that you won't be a sucker. The optimal strategy is to begin by showing willingness to cooperate. Axelrod calls this a "nice" strategy. (A mean strategy would defect in the first meeting.) Then, in every succeeding round, do what the other player did in the round before. If the other player cooperates, you do, too. If the other player defects, you immediately follow with your own defection, proving you won't tolerate being taken advantage of. If the other player returns to cooperation, you immediately forgive and go back to cooperating also.

Axelrod calls this Tit for Tat, but he also notes you could call it An Eye for An Eye, and he's quick to point out circumstances where equal-force responses can lead to endless feuds. He's careful not to call Tit for Tat an ideal strategy, because in some circumstances it may need to be modified by, say, giving back only half of the punishment one received. But the general principle of encouraging cooperation while refusing to be taken advantage of is always correct.
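The rule is simple enough to state in a few lines of code. Here is a sketch of an iterated game using the payoff values from Axelrod's tournaments (3 points each for mutual cooperation, 1 each for mutual defection, 5 for exploiting a cooperator who gets 0); the function names and the ten-round length are my own choices for illustration:

```python
# Points awarded as (my_points, other_points) for each pair of moves.
PAYOFF = {
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # I'm the sucker
    ("D", "C"): (5, 0),  # I exploit a cooperator
    ("D", "D"): (1, 1),  # mutual defection
}

def tit_for_tat(my_history, other_history):
    # Be "nice" on the first move; afterwards echo the other player's last move.
    return "C" if not other_history else other_history[-1]

def always_defect(my_history, other_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (30, 30): cooperation is sustained
print(play(tit_for_tat, always_defect))  # (9, 14): exploited once, never again
```

Against another nice player, Tit for Tat locks into steady mutual cooperation; against a relentless defector, it loses only the opening round and then refuses to be a sucker thereafter.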

After demonstrating his theory with the results of computer game tournaments in the first several chapters, and providing some real-world examples from the behavior of opposing armies in First World War trench warfare and from biology (a chapter written in conjunction with William D. Hamilton), Axelrod then gives some general advice for folks participating in any situation akin to a multiple-round Prisoner's Dilemma:

“Don’t be envious.” That is, measure your success not against the other player but against what a person playing a different strategy would have earned in your circumstances. “Don’t be the first to defect. Reciprocate both cooperation and defection. Don’t be too clever” (p. 110).

He sums these up in an earlier chapter (in a passage highlighted by my parishioner) as:

“The analysis of the data from these tournaments reveals four properties which tend to make a decision rule successful: avoidance of unnecessary conflict by cooperating as long as the other player does, provocability in the face of an uncalled for defection by the other, forgiveness after responding to a provocation, and clarity of behavior so that the other player can adapt to your pattern of action” (p. 20).

In other words: be nice; unfailingly enforce consequences for bad behavior; forgive quickly; and be transparent in your decision-making. Good rules for living.

The final chapter is some advice about how to introduce cooperative behavior into situations where competition rules. Axelrod’s suggestions are:

"Enlarge the shadow of the future" (p. 126). That is, help folks to see that present interactions will continue to have ramifications in the future, and give those future realities increased weight.

“Change the payoffs” (p. 133). That is, adjust the penalties and rewards so that the incentives clearly make good behavior the optimum choice. He uses examples of EPA regulations that could become burdensome if too heavy or worthwhile ignoring if too light.

“Teach people to care about each other” (p. 134). This is where I was going with my sermon.

“Teach reciprocity” (p. 136). Neither a bully nor a sucker be. Give as good as you get, both positive and negative, but not for revenge but to encourage cooperation.

"Improve recognition abilities" (p. 139). For cooperation to emerge as the best strategy, the players need to know that they are in long-term relationships with others. If you know you're never going to have to deal with the same person again, the best one-time strategy is to take what you can. Understanding that we are not ever really strangers to each other leads to cooperation as a form of self-interest, which was another message from my sermon.

The book is highly repetitive. The chapters tend to hammer home rather than expand the conclusion which is apparent in the introduction. The book was originally published in 1984, drawing on articles previously published in 1980 and 1981. I read a revised edition from 2006.
