

REVIEW: The Evolution of Cooperation

Metta Spencer (reviewer) — November 1984

The Evolution of Cooperation, by Robert Axelrod. New York: Basic Books, 1984. $24.95 in Canada.

Imagine that you’re a thief. You and your partner have been caught and put in two separate jail cells. The Crown Attorney doesn’t have enough evidence to convict, though, so she offers both of you the same deal to elicit a confession. If you both confess, you’ll each get 8 years in prison. If neither of you confesses, you’ll each get one year (since she’ll book you on some minor charge for which she can get sufficient evidence to convict). But if only one of you confesses, the one who does will get a six-month jail sentence and the one who doesn’t will get twenty years. Now then, what are you going to do?

Twenty years! Wow. If your partner double-crosses you, you stand to lose a heck of a lot. Fortunately, he stands to gain only a little by doing so: He’ll get a six-month sentence instead of the year you would both get by keeping quiet.

But on the other hand, he stands to gain a lot by squealing if he can’t trust you to keep quiet too. In fact, whether you can be trusted will determine whether he can afford to stay silent; if he clams up and you don’t, he loses big. Since both of you realize that fact, you may decide that neither of you can afford the risk. One or both of you may squeal, with an outcome that neither of you would have preferred.

This little drama is an example of a “mixed motive game.” In a way, your interests and those of your partner are harmonious: both of you can win if you trust each other. In another way, however, your interests are in conflict: one of you can benefit from the misfortune of the other. Your dilemma is that you have reasonable grounds both for cooperation and for conflict.

The scientific study of “decision theory” is based largely on studies involving this kind of situation, which is called “Prisoners’ Dilemma.” It is important in peace research because it illustrates some of the problems involved in coordinating activities among actors who have mixed motives — which means all of us. Almost all human relationships involve a mixture of competing and compatible interests.
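The sentencing scheme described above can be written out as a small payoff table. The sketch below is my own illustration, not anything from the book: it encodes the sentences in years (lower is better) and checks the logic of the dilemma, namely that confessing is the better reply to whatever your partner does, yet mutual silence beats mutual confession.

```python
# Jail sentences in years, indexed by (your choice, partner's choice).
# Entries are (your_years, partner_years); lower is better.
# The dictionary layout is illustrative, not from Axelrod's book.
SENTENCES = {
    ("confess", "confess"): (8, 8),
    ("silent",  "silent"):  (1, 1),
    ("confess", "silent"):  (0.5, 20),
    ("silent",  "confess"): (20, 0.5),
}

def my_years(me, partner):
    """Your sentence, given both choices."""
    return SENTENCES[(me, partner)][0]

# Whatever the partner does, confessing gives you a shorter sentence...
for partner in ("confess", "silent"):
    assert my_years("confess", partner) < my_years("silent", partner)

# ...and yet mutual silence is better for both than mutual confession.
assert my_years("silent", "silent") < my_years("confess", "confess")
```

This is the heart of the dilemma: each prisoner’s individually safest move produces a joint outcome worse than the one both could have had by trusting each other.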

Thus two (non-nuclear) enemy nations are in the same dilemma as the prisoners mentioned above. If both can trust each other, they need not arm at all, and thereby save lots of money. But if one arms and the other does not, the well-armed nation can afford to attack the defenseless one and totally destroy it. For this reason, both nations may arm to “deter” the other, even though this is not the outcome that either of them would prefer. This explains arms races, which, as we know, generally result in warfare.

Game theorists experiment with these situations by having teams of players choose between cooperative and competitive strategies in a series of games where they win or lose points on the basis of the joint decision. In this way, the experimenters can observe the outcomes while controlling the costs and benefits of cooperating or competing. They can vary other factors as well (for example, the sex of the players). This has turned out to be a very fruitful scientific field, since the situations that it models have so many counterparts that result in conflict in real life. We can develop mathematically exact answers to such questions as “Does honesty pay?” and “How can we best elicit the cooperation of others?”

Better yet, we do not need real players to experiment with the game. We can get computers to play according to specified strategies and, by keeping score over a long series of games, determine which strategies produce the best effects. That is precisely what Robert Axelrod has done: He invited players to submit specific strategies of play, then pitted every contestant’s strategy against every other one in a computerized tournament.
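A toy version of such a tournament might look like the following sketch. The point values and the two sample strategies are illustrative stand-ins, not Axelrod’s actual scoring or the actual tournament entries. Each strategy sees the opponent’s past moves and returns “C” (cooperate) or “D” (defect).

```python
from itertools import combinations

# Standard-looking point values (higher is better): 3 for mutual
# cooperation, 1 for mutual defection, 5 for a lone defector, 0 for
# the player who cooperates while the other defects. Illustrative only.
PAYOFF = {("C", "C"): (3, 3), ("D", "D"): (1, 1),
          ("C", "D"): (0, 5), ("D", "C"): (5, 0)}

def always_cooperate(opponent_moves):
    return "C"

def always_defect(opponent_moves):
    return "D"

def match(strat_a, strat_b, rounds=200):
    """Play an iterated game; return each side's total points."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

def tournament(strategies, rounds=200):
    """Pit every strategy against every other; return total scores."""
    totals = {s.__name__: 0 for s in strategies}
    for s1, s2 in combinations(strategies, 2):
        sa, sb = match(s1, s2, rounds)
        totals[s1.__name__] += sa
        totals[s2.__name__] += sb
    return totals
```

Running `tournament` over a list of submitted strategy functions and sorting the totals is, in miniature, what the computerized tournament did.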

To win at Prisoners’ Dilemma, as in a potential arms race with an enemy nation, what is necessary is not to defeat the other player, but to improve his or her score at the same time as one improves one’s own. Competitive strategies lose: The trick of winning is to play cooperatively and prompt the other team to do likewise consistently.

The winning strategy in this tournament was the simplest one: It was entered by Toronto’s Anatol Rapoport (See the interview with him in this issue). Rapoport calls his system tit for tat. It amounts to this rule: Always begin by playing cooperatively. After that, play exactly the way the other team played on the last round. If they double-crossed you, then on this round, double-cross them. If they cooperated, then you should cooperate with them. Your reciprocity will reward or punish them appropriately and prompt their cooperation better than anything else.
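Rapoport’s rule is simple enough to state in a few lines of code. Here is a minimal sketch (the function name and the “C”/“D” move encoding are my own, not from his entry):

```python
def tit_for_tat(opponent_moves):
    """Cooperate first; thereafter, copy the opponent's previous move."""
    if not opponent_moves:
        return "C"              # always begin by playing cooperatively
    return opponent_moves[-1]   # then mirror whatever they did last

# The rule rewards and punishes exactly as described: it answers
# cooperation with cooperation and a double-cross with a double-cross.
assert tit_for_tat([]) == "C"
assert tit_for_tat(["C"]) == "C"
assert tit_for_tat(["D"]) == "D"
assert tit_for_tat(["D", "C"]) == "C"
```

Note that the whole strategy needs only the opponent’s last move; its strength lies not in cleverness but in being transparent, provocable, and forgiving.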

What is involved is the building up of trust over a series of games. When the interaction lasts only for one encounter, this is not possible. In games involving hundreds of rounds played by teams of several people, it is usual for players to begin playing competitively, then improve their scores gradually by establishing their trustworthiness and mutual cooperation.

Axelrod’s book amounts to an examination of the moral and political implications of Rapoport’s successful strategy. It applies insights from the strategy to explanations of cases when peaceful, cooperative interactions were developed in the midst of conflict. For example, during World War I, the Allied and German troops lived in adjacent trenches and were supposed to be fighting all the time. Actually, however, they built up certain understandings amounting to “live and let live.” Only because their interactions were repeated over a period of weeks could they demonstrate their trustworthiness to each other and elicit cooperation.

Books of this kind are usually awfully dull, but this one is such a good read I’d recommend it to anyone.