Morality Pays Off

Ethical behavior is defined by honesty, fairness, and equity in interpersonal, professional, and academic relationships, and it respects the diversity, dignity, and rights of individuals and groups of people. Put simply, it is the standard you hold yourself to in every aspect of life: responsibility, honesty, and how you treat others in any situation.

Coupling two games from game theory can shed light on how moral norms evolve. Selflessness and cooperation cannot be taken for granted: Mohammad Salahshour of the Max Planck Institute for Mathematics in the Sciences (now at the Max Planck Institute of Animal Behavior) used a game-theoretic approach to show why it can be beneficial for individuals to put self-interest aside.

One of humanity’s most fundamental questions is, “Why do we behave morally?” It is far from obvious why, under certain conditions, we put our self-interest aside and devote ourselves to the service of a group, sometimes to the point of self-sacrifice. Many theories have been developed in an attempt to solve this moral quandary. Two proposed solutions are well known: individuals help their relatives so that their shared genes survive (kin selection), and people follow the “you scratch my back, I’ll scratch yours” principle, so that when they help one another, everyone benefits in the long run (the principle of reciprocity).

Prisoner’s dilemma combined with a coordination game

Because game theory studies how people make rational decisions in situations of conflict, Mohammad Salahshour of the Max Planck Institute for Mathematics in the Sciences in Leipzig, Germany, used its tools to explain the emergence of moral norms. Salahshour’s first question was, “Why do moral norms exist in the first place?” And why do we have different, if not opposing, moral standards? For example, while some norms, such as “help others,” appear to encourage self-sacrifice, others, such as dress codes, appear to have little to do with curbing selfishness.

To answer these questions, Salahshour coupled two games. The first is the classic prisoner’s dilemma, in which two players must each decide whether to cooperate for a modest shared reward or betray the other for a larger individual payoff. It is a typical social dilemma: the success of the group as a whole requires individuals to behave selflessly. If too many members of a group act selfishly, everybody loses out compared to a scenario in which everybody acts altruistically; yet if only a few individuals act selfishly, they can outperform their altruistic team members.

The second is a game that captures typical group decisions, such as coordinating on a task, distributing resources, choosing a leader, or resolving a conflict. Many of these problems can ultimately be classified as coordination or anti-coordination problems.
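
To make the structure of the two games concrete, here is a minimal sketch in Python with hypothetical payoff numbers (the study’s actual values are not reproduced here): in the prisoner’s dilemma, defecting earns more no matter what the other player does, while in the coordination game, payoff comes only from matching the other player’s choice.

```python
# Illustrative payoff tables for the two games; the numbers are hypothetical
# and chosen only to show the standard structure, not taken from the study.

# Prisoner's dilemma: payoff to "me" given (my move, other's move).
# C = cooperate, D = defect.
PD = {
    ("C", "C"): 3,   # mutual cooperation: both do reasonably well
    ("C", "D"): 0,   # I cooperate, the other defects: I am exploited
    ("D", "C"): 5,   # I defect against a cooperator: the temptation payoff
    ("D", "D"): 1,   # mutual defection: everybody loses out
}

# Coordination game: payoff to "me" given (my choice, other's choice).
# The labels stand for two arbitrary conventions; only matching matters.
COORD = {
    ("A", "A"): 8,
    ("A", "B"): 0,
    ("B", "A"): 0,
    ("B", "B"): 8,
}

# Defection dominates the prisoner's dilemma on its own:
for other in ("C", "D"):
    assert PD[("D", other)] > PD[("C", other)]

# ...whereas in the coordination game the only way to earn anything
# is to match the other player's choice:
assert COORD[("A", "A")] > COORD[("A", "B")]
```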

Viewed separately, the two games pull in opposite directions. In the prisoner’s dilemma, cooperation does not pay off: from the individual’s perspective, self-interested behavior is the better choice, at least as long as enough others act selflessly. Individuals who act selfishly, on the other hand, are unable to solve coordination problems efficiently and waste a significant amount of resources by failing to coordinate their activity.

The situation can look completely different when the outcomes of the two games are taken together and moral norms are at work that favor cooperation: now cooperation in the prisoner’s dilemma can suddenly pay off, because the gain in the second game more than compensates for the loss in the first.
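
A back-of-the-envelope check of this claim, using the same hypothetical numbers as above (not values from the paper): suppose a norm prescribes “cooperate in the prisoner’s dilemma and play the shared convention in the coordination game,” and suppose anyone who ignores the norm defects and picks a convention at random. Against a norm-follower, following the norm then earns more in total than deviating.

```python
# Hypothetical payoffs (not from the paper): standard prisoner's dilemma
# values plus a bonus for matching the other player's convention.
T, R, P, S = 5, 3, 1, 0   # temptation > reward > punishment > sucker's payoff
BONUS = 8                 # coordination payoff when both pick the same convention

# Prisoner's dilemma alone: defecting against a cooperator beats cooperating.
print(T > R)                       # True -> 5 > 3, so selfishness wins in isolation

# Both games together, facing a norm-follower (who cooperates and always
# plays the shared convention):
follow_norm = R + BONUS            # cooperate and match the convention: 3 + 8 = 11
ignore_norm = T + BONUS / 2        # defect and match only half the time: 5 + 4 = 9

print(follow_norm > ignore_norm)   # True -> the coordination gain outweighs the PD loss
```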

From self-interest to coordination and cooperation

Through this process, not only cooperative behavior but also a social order emerges. All individuals benefit from it, so moral behavior pays off for them. “In my evolutionary model, there were no selfless behaviors at first, but as the two games were coupled, more and more moral norms emerged,” Salahshour reports. “Then I noticed a sudden transition to a system with a lot of cooperation.” In this “moral state,” a set of coordination norms arises that helps individuals coordinate their activity better, and it is precisely out of these that broader social norms and moral standards can emerge.
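
As a very rough illustration of that transition, here is a toy replicator-style calculation (my own sketch, using the hypothetical payoffs from above, not Salahshour’s actual simulation): starting from a mixed population, the norm-following strategy dies out when the two games are played separately, but takes over once they are coupled. The real model is richer, since there the norms themselves emerge through evolution rather than being assumed.

```python
# Toy replicator-style dynamics for a population of "norm followers"
# (cooperate in the PD and play the shared convention) versus "selfish"
# players (defect and pick a convention at random). All numbers are
# hypothetical; this is not Salahshour's actual simulation.

T, R, P, S = 5.0, 3.0, 1.0, 0.0   # prisoner's dilemma payoffs
BONUS = 8.0                        # payoff for matching conventions

def expected_payoffs(x, coupled):
    """Expected payoff of each strategy when a fraction x follows the norm."""
    b = BONUS if coupled else 0.0
    follower = x * (R + b) + (1 - x) * (S + b / 2)
    selfish = x * (T + b / 2) + (1 - x) * (P + b / 2)
    return follower, selfish

def evolve(coupled, x=0.5, rounds=200):
    """Let the better-earning strategy grow, starting from a mixed population."""
    for _ in range(rounds):
        f, s = expected_payoffs(x, coupled)
        mean = x * f + (1 - x) * s
        x = x * f / mean           # replicator update for two strategies
    return round(x, 2)

print("games separate:", evolve(coupled=False))  # ~0.0 -> selfishness takes over
print("games coupled: ", evolve(coupled=True))   # ~1.0 -> norm-followers take over
```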

These coordination norms, however, also favor cooperation: cooperation turns out to be a rewarding behavior for the individual as well. Mohammad Salahshour: “A moral system behaves like a Trojan horse: once established out of the individuals’ self-interest to promote order and organization, it also brings self-sacrificing cooperation.”

Salahshour hopes to gain a better understanding of social systems through his work. “This has the potential to improve people’s lives in the future,” he says. “However, my game-theoretic approach can also be used to explain the emergence of social norms in social media. People exchange information while also making strategic decisions, such as who to support or what cause to support.”

He stated that two dynamics are at work at the same time: information exchange and the emergence of cooperative strategies. Their interaction is still poorly understood, but perhaps game theory will soon shed new light on this topic as well.