Mechanical Turk, Dynamic Networks, and Cooperation

In a fun paper recently published in PNAS, Dynamic social networks promote cooperation in experiments with humans, Dave Rand, Nicholas Christakis, and I explored how a dynamic social network affects cooperation. Scientists have long used the public goods game to understand people's tendency to cooperate with others. Recently, research has explored whether arranging people in a network that looks like a real-world social network fosters cooperation. However, static network structure on its own does not seem to increase cooperation.
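The incentive structure of the public goods game is easy to see in code. Below is a minimal sketch of one round's payoffs; the endowment and multiplier values are illustrative defaults, not the parameters used in the experiment.

```python
def public_goods_payoffs(contributions, endowment=10, multiplier=2.0):
    """Payoffs for one round of a public goods game.

    Each player contributes part of an endowment to a common pot;
    the pot is multiplied and split equally among all players.
    (Parameter values are illustrative, not those from the paper.)
    """
    pot = sum(contributions) * multiplier
    share = pot / len(contributions)
    # A player keeps whatever they did not contribute, plus an equal share.
    return [endowment - c + share for c in contributions]

# Four players: three full cooperators and one defector.
payoffs = public_goods_payoffs([10, 10, 10, 0])
# Pot = 60, share = 15: cooperators earn 15 each, the defector earns 25.
```

This is the core tension: the group does best when everyone contributes, but any individual does better by free-riding.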

We examined whether the secret ingredient for creating cooperation was a dynamic social network: we gave people the ability to change the structure of their own social network. Someone screwing you over? No need to simply defect rather than cooperate. You can now just stop interacting with them. Theory predicts that if you allow enough rewiring, actions have consequences, and you get a lot of cooperation. And that is exactly what we found. To quote our abstract:
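The rewiring mechanism can be sketched as follows. This is a hypothetical simplification, not the paper's exact protocol: each round, some existing ties are revisited, ties to defectors may be cut, and cooperators may form new ties with one another.

```python
import random

def rewire_round(links, behavior, p_rewire, rng):
    """One round of link updating in a dynamic network (illustrative sketch).

    links:    set of frozenset pairs {i, j} representing undirected ties.
    behavior: dict mapping player -> 'C' (cooperate) or 'D' (defect).
    p_rewire: probability that any given tie is revisited this round.
    """
    links = set(links)
    # Break: with probability p_rewire, cut a tie that involves a defector.
    for link in list(links):
        if rng.random() < p_rewire and any(behavior[x] == 'D' for x in link):
            links.discard(link)
    # Form: cooperators preferentially create new ties with other cooperators.
    cooperators = sorted(x for x in behavior if behavior[x] == 'C')
    for i in cooperators:
        for j in cooperators:
            if i < j and frozenset((i, j)) not in links and rng.random() < p_rewire:
                links.add(frozenset((i, j)))
    return links

# Two cooperators each tied to a defector; with full rewiring, both cut
# the defector loose and link up with each other instead.
behavior = {0: 'C', 1: 'C', 2: 'D'}
links = {frozenset((0, 2)), frozenset((1, 2))}
new_links = rewire_round(links, behavior, p_rewire=1.0, rng=random.Random(0))
```

Under this sketch, defection carries a cost beyond the round's payoff: defectors lose partners, which is the "actions have consequences" mechanism the theory points to.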

Human populations are both highly cooperative and highly organized. Human interactions are not random but rather are structured in social networks. Importantly, ties in these networks often are dynamic, changing in response to the behavior of one’s social partners. This dynamic structure permits an important form of conditional action that has been explored theoretically but has received little empirical attention: People can respond to the cooperation and defection of those around them by making or breaking network links. Here, we present experimental evidence of the power of using strategic link formation and dissolution, and the network modification it entails, to stabilize cooperation in sizable groups.

And we did this all on Amazon Mechanical Turk, a great place to run social science experiments. As Dave (my co-first author) notes:

“Lab experiments are incredibly valuable, because they let you very tightly control the experimental conditions, which you need to demonstrate causality,” Rand said. “But the thing about lab experiments is they tend to be very time-consuming and expensive, because it’s difficult to get people to come into the lab. The Internet offers an amazing opportunity for streamlining the process. But the problem has been: Where do you get the people, and how do you set these systems up?”

Developed several years ago, Mechanical Turk is an online labor market where employers can hire workers to perform what Amazon calls "human intelligence tasks": simple, repetitive tasks that are easy for humans, such as describing the content of a picture, transcribing audio, or translating text from one language to another, but frustratingly difficult to program computers to perform.

“What we’re doing is crowd-sourcing experimental social science,” Rand said. “We are now an ‘employer’ on Mechanical Turk, but instead of asking people to label images, we’re hiring them to take part in our experiments.

“From a philosophical perspective, I think this is an amazingly important technology for the social sciences, because it’s democratizing,” Rand continued. “You no longer need to be at a university that has a big lab, with a huge research budget and someone maintaining a subject pool. Now anyone who has an idea can spend a day building a survey online, post it on Mechanical Turk, and see what happens.”

The paper has a number of other interesting results, which can be seen here, or read about at the Harvard Gazette.