The prisoner's dilemma is a two-player game in which each player separately and simultaneously decides whether to cooperate with the other or to defect. It is analogous to the situation a criminal might find himself in: either testify against his accomplice in return for a lighter sentence, or keep quiet and hope the other guy does the same. The highest joint payoff results if both parties cooperate, but defection is the dominant strategy: whatever the other party does, a player scores more by defecting. Ironically, if each player follows his individually best strategy, both will defect rather than cooperate, and both end up worse off than if they had cooperated.
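With the conventional (purely illustrative) payoff numbers, the dominance argument is easy to check mechanically. A minimal sketch:

```python
# Conventional illustrative payoffs: (my move, their move) -> my points.
# C = cooperate, D = defect.
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

# Whatever the other player does, defecting pays strictly more:
for their_move in ("C", "D"):
    assert PAYOFF[("D", their_move)] > PAYOFF[("C", their_move)]

# Yet mutual defection (1 point each) is worse for both than mutual
# cooperation (3 points each) -- hence the dilemma.
```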
Clearly the prisoners would be better off if they could trust each other. But how does trust arise? An academic at the University of Michigan (Robert Axelrod) has been running prisoner's dilemma tournaments between software agents (programs) for some time. Each competitor programs an agent, and the agents play against each other in a round-robin tournament whose key feature is that they meet repeatedly and can remember what happened in previous rounds. The most successful strategy - that of the winning agents - seems to be a kind of benevolent "tit for tat," in which an agent cooperates when it first meets another, but punishes a defection by defecting at their next meeting. This allows altruistic cooperation to develop, and deters bad behavior. I am oversimplifying here because, obviously, the success of a strategy depends quite a bit on the overall distribution of algorithms in the pool of agents. However, it seems to be a fairly robust fact that tit-for-tat styles of play tend to do well. This observation has been taken as evidence that there might be evolutionary selection for altruism: humans who lack at least some instinct for cooperation will have lower survival and reproductive probabilities, and the nice (but stern) genes will eventually predominate.
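To make the dynamic concrete, here is a minimal sketch of an iterated match, with the same illustrative payoffs as above and two simple strategies; the strategy names are my own, not taken from any particular tournament:

```python
# Iterated prisoner's dilemma: agents meet repeatedly and can
# remember the opponent's past moves.
C, D = "C", "D"
PAYOFF = {  # (my move, their move) -> my points
    (C, C): 3, (C, D): 0,
    (D, C): 5, (D, D): 1,
}

def tit_for_tat(my_history, their_history):
    # Cooperate on the first meeting; afterwards mirror the
    # opponent's previous move, punishing each defection once.
    return C if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return D

def play_match(strat_a, strat_b, rounds=200):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(hist_a, hist_b)
        b = strat_b(hist_b, hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

if __name__ == "__main__":
    print(play_match(tit_for_tat, always_defect))  # (199, 204)
    print(play_match(tit_for_tat, tit_for_tat))    # (600, 600)
```

Note the structure of the outcome: against a pure defector, tit-for-tat loses the first round and then holds even, but two tit-for-tat agents cooperate on every round and vastly outscore a pair of defectors. In a round-robin over a mixed pool, that mutual-cooperation bonus is what lets the "nice but stern" strategy come out on top.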
It's a nice story, and perhaps even has a kernel of truth. Of course, as anyone who has worked in a large organization can attest, there are other, nastier, niche strategies that work well too. (Like pretending to be a nice tit-for-tat person while defecting as much as you can get away with.) Interestingly, I find that in the Silicon Valley world of entrepreneurs, VCs, engineers, and salespeople, there is a very strong tendency toward benevolent tit-for-tat behavior. To a first approximation, I can email anyone without ever having met them, introduce myself, and, by making a sufficiently logical case for it, elicit some amount of cooperation. This routinely happens along the lines of "I am researching an idea in this space, and wonder if you know anyone who would talk to me about the (market, competitors, technology, etc.)" It is understood that other agents in this virtual world would do the same under similar circumstances. Sometimes you end up wasting your time, or having your time wasted, but other times you develop very useful additions to your trusted network. A business ecosystem lacking this culture of cooperation would have a hard time competing with Silicon Valley in innovation or in the development of new companies.