The Research-Proven Way To Win More Cooperation

Our social nature is the basis for marketing public and social goods. How effective we are in social interactions directly influences the success of our marketing and ultimately whether we succeed in our social mission. Here’s the research-proven way to win more cooperation:

  • Always start a social exchange or relationship by offering cooperation.
  • Don’t envy the other person’s success, as long as you’re also succeeding.
  • Assume that you’ll meet each other again.

How do we know this? By giving algorithms the ability to choose, interact, and compete with each other.

The Prisoner’s Dilemma

Economists use a game called the Prisoner’s Dilemma to explore cooperation in various situations. At its essence, it’s simple. You and your accomplice must decide whether to cooperate with each other to avoid lengthy jail sentences. If you both cooperate by staying silent and refusing to implicate the other, you both receive light punishment. If you both confess and implicate each other, you both get lengthy sentences. And if only one of you defects from the partnership and implicates the other, the defector gets off easy while the other does hard time. The police are keeping you in separate cells, so neither of you knows what the other is doing.

The obvious, short-term incentive is to not cooperate. If you cooperate, you risk your partner defecting, in which case you go to prison while your partner gets off lightly.

We face similar choices all the time. As individuals, we have little incentive to cooperate with others in any particular moment. We think, “Why should I go out of my way, risking my time, energy, and money, with no guarantee of repayment? Let the other person give first.” This attitude is warranted when you consider that another person can rob, injure, or even kill you. And if we can get the other person to offer us something for nothing, then we win the exchange.

Marketing has the same temptation. “Why should I go out of my way to help someone with free advice, free products, or great service, without a guarantee that they’ll become my client?”

Tournaments of Social Algorithms

Robert Axelrod, in his book The Evolution of Cooperation, describes competitions he ran among computer algorithms programmed to make choices and respond to another algorithm’s choices over multiple rounds of the Prisoner’s Dilemma. In the competition, an algorithm starts a series of Prisoner’s Dilemma rounds by choosing to either cooperate or defect. The second algorithm then responds with either cooperation or defection, based on its strategy and the other algorithm’s choice. The choices go back and forth for a set number of rounds.

Algorithms earn points based on the pair of decisions in each round. Defecting when the other side cooperates earns you the most points. Both sides cooperating earns the second most points. Both sides defecting comes in third. Cooperating when the other side defects earns you the least amount of points.
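That point ordering can be sketched in a few lines of Python. The specific values below (5, 3, 1, 0) are the conventional ones Axelrod used in his tournaments; the function name is my own illustration:

```python
# Payoffs keyed by (my_move, their_move); "C" = cooperate, "D" = defect.
# Values follow Axelrod's convention: 5 > 3 > 1 > 0.
PAYOFF = {
    ("D", "C"): 5,  # I defect while you cooperate: best outcome for me
    ("C", "C"): 3,  # we both cooperate: second best
    ("D", "D"): 1,  # we both defect: third
    ("C", "D"): 0,  # I cooperate while you defect: worst for me
}

def score_round(my_move, their_move):
    """Return (my_points, their_points) for one round of the dilemma."""
    return PAYOFF[(my_move, their_move)], PAYOFF[(their_move, my_move)]
```

For example, `score_round("D", "C")` returns `(5, 0)`: the defector gets the most points and the lone cooperator the least.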

The algorithm with the most points at the end of the tournament wins.

And the Winning Social Algorithm Is…

In Axelrod’s tournaments, the clear winner was an algorithm called TIT FOR TAT. This algorithm always started by offering cooperation and then matched whatever the other algorithm did. This means it never defected more than its opponents, never tried to take advantage by defecting first, stopped cooperating whenever the other algorithm stopped, and started cooperating again if the other algorithm did. In other words, this algorithm played nice.
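TIT FOR TAT is simple enough to write in a couple of lines. Here is a minimal sketch of the strategy and a repeated-round match, using Axelrod’s conventional 5/3/1/0 payoffs; the function names and the ALWAYS DEFECT opponent are my own illustration, not Axelrod’s code:

```python
# Payoffs keyed by (my_move, their_move); "C" = cooperate, "D" = defect.
PAYOFF = {("D", "C"): 5, ("C", "C"): 3, ("D", "D"): 1, ("C", "D"): 0}

def tit_for_tat(opponent_history):
    """Cooperate on the first round; afterward, copy the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    """A hypothetical 'mean' opponent that defects every round."""
    return "D"

def play_match(strategy_a, strategy_b, rounds=10):
    """Play repeated rounds; each strategy sees only the other's past moves."""
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(moves_b)  # A decides based on B's history
        b = strategy_b(moves_a)  # B decides based on A's history
        moves_a.append(a)
        moves_b.append(b)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
    return score_a, score_b
```

Run against ALWAYS DEFECT, TIT FOR TAT loses exactly one round’s worth of points and then matches defection thereafter, so it never out-scores any single opponent. Its tournament wins come from racking up mutual-cooperation points against every strategy willing to cooperate.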

This is a paradoxical result. Defecting when the other party cooperates yields the biggest payoff, yet TIT FOR TAT never made this move and still won.

As Axelrod says,

TIT FOR TAT won both rounds of the tournament, but it never received more points in any game than the other player. Indeed, it can’t possibly score more than the other player in a game because it always lets the other player defect first, and it will never defect more times than the other player does. It won, not by doing better than the other player, but by eliciting cooperation from the other player. In this way, TIT FOR TAT does well by promoting the mutual interest rather than by exploiting the other’s weakness (p. 137).

Three Lessons for Winning More Cooperation

Axelrod’s computer-based research into the nature of cooperation has at least three lessons for marketing the social good.

Lesson One: Always Start by Cooperating

For marketers, starting with cooperation means offering great customer service, free items, and true partnership. By cooperating, you’ll find those who want to cooperate with you. Together, you can build a group of cooperators that is mutually beneficial. You may not score the big points from one-sided defection, but like TIT FOR TAT you’ll win in the end.

Lesson Two: Don’t be Envious

If you’re cooperating with others, you’ll likely see them benefiting from your cooperation. As your clients prosper, it’s easy to become envious of their success and tempted to defect, to be less helpful, perhaps by giving less or charging more.

Don’t defect out of envy. Your success rests on theirs. As Axelrod says,

“In a non-zero-sum world you do not have to do better than the other player to do well for yourself. This is especially true when you are interacting with many different players. Letting each of them do the same or a little better than you is fine, as long as you tend to do well yourself. There is no point in being envious of the success of the other player, since in an iterated Prisoner’s Dilemma of long duration the other’s success is virtually a prerequisite for your doing well yourself” (p. 112).

This matches with what I’ve written in other posts regarding fairness: It’s okay for some to have more than others, as long as everyone has enough.

Lesson Three: Assume You’ll Meet Again

Axelrod’s tournaments used multiple rounds of the Prisoner’s Dilemma. This matches our reality as social beings. We encounter the same people repeatedly in our families, jobs, and communities. Cooperation springs from the likelihood of encountering the same person again, and from the impact a choice not to cooperate will have on those future encounters.

This becomes the basis for fairness. We expect to be treated as if we will be encountered again, as if we matter.

Because we humans can socialize with strangers and at large scale, I expand this likelihood beyond encountering the same individual. We often encounter individuals who are similar to others with whom we interact, or who are in similar situations. Making a broader assumption about future meetings helps you build a good reputation among a broader audience of people.

What’s your game plan for winning cooperation in the social arena, and how is it working for you?


The Evolution of Cooperation and other books mentioned in this blog are available in the bookstore.
