Friday, October 2, 2009

Prisoners of Our Own Device

One of the themes that tends to recur in my blog-essays is "trust." In a social world, we are constantly making decisions about how we accept or reject what other people say, which is largely a function of how much we trust them. (I suppose we can trust someone to always say something unacceptable: that's pretty much how I feel about Glenn Beck, or how Glenn Beck feels about Barack Obama, but that's another issue.)

Game theory has had an amazing impact on how we approach life. A relatively recent variation on the kind of puzzles with which mathematicians have been testing each other for as long as there have been mathematicians, game theory actually gets close to questions of how people act in real life. In fact, game theory underlies much of the academic and political approach to economics, political science, business, even biology.

(This would be the perfect place to get into the whole Chicago School of economics and the fundamentally flawed assumptions that helped the tribe of Friedman (and Greenspan) to drive our economy into the ditch. But no.)

Suffice it to say that game theory often rests on an assumption of "norms of rationality" -- that people can be counted on to make choices that optimize their success, especially when success is measured in money. Not everyone agrees that a rational norm is so easily described, but that has been the dominant assumption of the last several decades.

One of the competing approaches to norms of rationality can be found in the posing of the "Prisoner's Dilemma." This math game goes back to 1950 and a pair of RAND researchers, Merrill Flood and Melvin Dresher. I first became aware of it in Douglas Hofstadter's Metamagical Themas column in Scientific American in the early '80s. He laid out the parameters of the game in terms of two prisoners, kept apart and incommunicado, who each had something the other wanted. Because each passed the same spot in the prison at a different time each day, they worked out a way to exchange items. The challenge was that the first prisoner had to trust the other -- no point in leaving the goods unless he trusted the other to leave his -- and the second could just take the goods if they were there and leave nothing. So the set-up had to be that it would take several exchanges to transfer everything each prisoner wanted -- an iterative game.

Hofstadter's column went on to describe the tournaments that political scientist Robert Axelrod had organized, in which computer programmers submitted strategies meant to maximize a prisoner's gains, or profits. In the classical approach to the game, where on each move you can either "cooperate" or "defect," the strategy that best reflects normative self-interest in a single round is to defect: give as little as possible and take whatever is offered. There were already those who did not agree that that strategy reflected the way real people make decisions.
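If you want to see why defection dominates a single round, here's a quick sketch in Python. The payoff numbers (temptation 5, reward 3, punishment 1, sucker 0) are the conventional ones from Axelrod's tournaments; the code itself is mine, not anything from the column:

    # One-shot Prisoner's Dilemma payoffs (the conventional values from
    # Axelrod's tournaments). Each entry maps (my_move, their_move) to my score.
    PAYOFF = {
        ("C", "C"): 3,   # both cooperate: reward
        ("C", "D"): 0,   # I cooperate, they defect: sucker's payoff
        ("D", "C"): 5,   # I defect, they cooperate: temptation
        ("D", "D"): 1,   # both defect: punishment
    }

    # Whatever the other player does, defecting pays more in a single round:
    for their_move in ("C", "D"):
        print(their_move,
              "-> cooperate:", PAYOFF[("C", their_move)],
              " defect:", PAYOFF[("D", their_move)])
    # C -> cooperate: 3  defect: 5
    # D -> cooperate: 0  defect: 1

Either way the other guy goes, you come out ahead by defecting -- in a single round, that is. Iteration is what changes the math.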

For one thing, there is evidence that people -- some people, at least some of the time -- act in the way they want other people to act. You could look at this in terms of the Golden Rule or Kant's categorical imperative, but it seems to be a part of human life.

Axelrod's tournament rules recognized that at least some (but not all) prisoners would act that way, and the challenge to the programmers was to devise the most successful program, the one strategy a prisoner could use to maximize his gains. Over repeated tournaments the winning strategy was almost always the simplest, "tit for tat": start by cooperating on the first move, then let every subsequent act reflect the partner's previous action -- defect after a defection, cooperate after cooperation.
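Tit for tat is simple enough to fit in a few lines. Here's a toy iterated match, reusing the PAYOFF table from the sketch above (the function names and the ten-round length are my own choices):

    # A minimal sketch of an iterated match, reusing PAYOFF from above.
    def tit_for_tat(their_moves):
        # Cooperate first; after that, mirror the opponent's last move.
        return "C" if not their_moves else their_moves[-1]

    def always_defect(their_moves):
        return "D"

    def play(strategy_a, strategy_b, rounds=10):
        score_a = score_b = 0
        moves_a, moves_b = [], []
        for _ in range(rounds):
            a = strategy_a(moves_b)   # each player sees only the other's history
            b = strategy_b(moves_a)
            score_a += PAYOFF[(a, b)]
            score_b += PAYOFF[(b, a)]
            moves_a.append(a)
            moves_b.append(b)
        return score_a, score_b

    print(play(tit_for_tat, always_defect))   # (9, 14): suckered once, then even
    print(play(tit_for_tat, tit_for_tat))     # (30, 30): mutual cooperation

Against a pure defector, tit for tat gets burned exactly once and never again; against itself, both sides collect the full cooperative payoff.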

Okay, great. That's an argument for trust. But didn't I just say in a previous blog, "Trust no one"?

I did say that, and I still do. The lesson from the Prisoner's Dilemma is that you do best by starting with cooperation, but the cooperation is only provisional. Be prepared to defect. Two or ten or a hundred cooperative acts do not "prove" trustworthiness in the future.

I want to trust other people. Or rather, I want people to earn my trust. So I have to demonstrate trust. When it's appropriate. I also have to demonstrate that I'm paying attention to what they actually do and will answer a defection with a defection. And cooperation with cooperation. I think, in reality, we have a bigger social problem from trusting people who have not earned it, who have, in fact, defected.
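To put a number on that with the toy simulator above: suppose a hypothetical "late betrayer" (both the name and the five-round cutoff are my invention) cooperates just long enough to build trust and then turns:

    # A hypothetical "late betrayer": cooperates for five rounds to build
    # trust, then defects forever. Uses play() and PAYOFF from above.
    def late_betrayer(their_moves):
        return "C" if len(their_moves) < 5 else "D"

    print(play(tit_for_tat, late_betrayer))
    # (19, 24): five rounds of mutual cooperation (15 each), one sucker's
    # round (0 vs. 5), then mutual defection (1 each) the rest of the way.

The betrayal buys exactly one round's advantage, because tit for tat answers it immediately. After that, the cooperation -- and the profit -- stops.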

That approach kind of takes the idea of normative rationality and refigures it as provisional, or contextual, rationality (my term, as far as I know). It's based on a moral decision (what I do matters, and what I want is a community based on trust) that gets applied to a world of social interactions that are largely unpredictable and sometimes dangerous. The rational basis of my decisions reflects a constantly changing reality -- the world as it really is right now. And it is informed by a desire to see things get better. We may not be prisoners (even if it is the Hotel California), but we're all looking for a jail break.


Leave a comment. Tweet me @WillBurd.