Researchers have developed a series of different games, such as the ultimatum game, the trust game, the dictator game, and the Prisoner's dilemma, to investigate choices and decision making in social interactions and negotiations. In the ultimatum game, for example, two people are randomly assigned to one of two roles: the proposer and the responder. The proposer is first granted some money, such as $100. This individual is then encouraged to offer a certain percentage to the other person, designated as the responder. The proposer might, for example, offer $20 or $40 to the responder. The responder can then either accept or reject the offer. If the offer is rejected, both individuals receive no money.
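The payoff rule of the ultimatum game is simple enough to sketch in code. The following is a hypothetical illustration--the function name and dollar amounts are assumptions chosen for exposition, not part of any experimental protocol:

```python
# Sketch of ultimatum-game payoffs; names and amounts are illustrative.
def ultimatum_payoffs(endowment, offer, accepted):
    """Return (proposer_payoff, responder_payoff)."""
    if not 0 <= offer <= endowment:
        raise ValueError("offer must lie between 0 and the endowment")
    if accepted:
        return endowment - offer, offer
    return 0, 0  # a rejection leaves both individuals with nothing

# A $20 offer from a $100 endowment, accepted versus rejected.
print(ultimatum_payoffs(100, 20, accepted=True))   # (80, 20)
print(ultimatum_payoffs(100, 20, accepted=False))  # (0, 0)
```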
These games uncover some interesting insights about the preferences and inclinations of individuals during economic exchanges. For example, in the ultimatum game, proposers tend to offer between 40% and 50% of the amount they are granted. Nevertheless, if the proposer offers only 20% or so, over 50% of responders will reject the offer. That is, individuals often prefer to redress an injustice rather than receive a small amount of money.
These games have also clarified some of the neural mechanisms that underpin negotiation, cooperation, and competition. For example, the medial prefrontal cortex--a region that underpins the capacity to adopt the perspective of someone else--seems to facilitate cooperation in some contexts.
The trust game, also called the investor game, was developed by Berg, Dickhaut, and McCabe (1995). In the trust game, one person, called the investor or sender, is granted some money, such as $10. This person can then transfer a certain amount, such as 50%, to a second person, called the trustee or receiver. The transferred amount is tripled, and the trustee can then return a certain percentage to the investor. The investor, if trusting, will transfer most of the money to the second person so that this amount is tripled. If mistrusting, the investor will retain most of the money.
Typically, investors tend to send about 50% of their endowment to the trustee. The trustee, then, usually returns the same amount.
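These mechanics can be expressed as a short sketch, assuming the tripling rule described above; the function name and figures are illustrative assumptions:

```python
# Sketch of trust-game payoffs, assuming transfers are tripled en route.
def trust_game(endowment, sent, returned, multiplier=3):
    """Return (investor_payoff, trustee_payoff)."""
    if not 0 <= sent <= endowment:
        raise ValueError("investor cannot send more than the endowment")
    pot = sent * multiplier  # the transferred amount is tripled
    if not 0 <= returned <= pot:
        raise ValueError("trustee cannot return more than the tripled amount")
    return endowment - sent + returned, pot - returned

# Investor sends 50% of a $10 endowment; trustee returns that same amount.
print(trust_game(10, 5, 5))  # (10, 10)
```

Note that when the trustee returns exactly the amount sent, the investor breaks even while the trustee keeps the surplus created by the tripling.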
A binary version of the trust game is also often utilized in which each person is granted only two alternatives (e.g., Peters & Kashima, 2007; Scharlemann, Eckel, Kacelnik, & Wilson, 2001). In this variant, the investor is granted two options. They can either retain $10--in which case the other person receives $5--or they can distribute all the money to the trustee. The trustee can either distribute $8 to the investor and retain $13 or ask the investor to reach a final choice. At this point, the investor can either distribute $10 or $12 to both parties. The first decision by the investor is the primary measure of trust.
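The decision tree of this binary variant can be sketched as follows; the payoffs mirror the description above, while the function and argument names are illustrative assumptions:

```python
# Sketch of the binary trust game; payoffs follow the description above.
def binary_trust_game(investor_trusts, trustee_shares=None, final_split=None):
    """Return (investor_payoff, trustee_payoff)."""
    if not investor_trusts:
        return 10, 5       # investor retains $10; the other person receives $5
    if trustee_shares:
        return 8, 13       # trustee distributes $8 and retains $13
    if final_split not in (10, 12):
        raise ValueError("the investor's final choice is $10 or $12 each")
    return final_split, final_split  # investor's final choice applies to both

print(binary_trust_game(False))                                       # (10, 5)
print(binary_trust_game(True, trustee_shares=True))                   # (8, 13)
print(binary_trust_game(True, trustee_shares=False, final_split=12))  # (12, 12)
```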
The dictator game is a simpler variation of the ultimatum game. In particular, as in the ultimatum game, the proposer is granted some money. The proposer is then asked to indicate the amount of money to be transferred to the other person. The proposer, for example, might transfer $20 from an endowment of $100 to this individual. However, in this game, the other person does not respond. The dictator game, therefore, merely assesses the altruism of the proposer and, in essence, is not actually a game but a simple exchange.
Usually, proposers are not as generous in the context of a dictator game as in the context of an ultimatum game. In the dictator game, the proposer is not susceptible to the risk that a small offer could be rejected.
The impunity game, like the dictator game, is really a variant of the ultimatum game. The only difference between the impunity game and the ultimatum game is that any decision of the responder does not affect the proposer (for a comparison, see Bolton & Zwick, 1995).
Specifically, in the impunity game, two people are randomly assigned to one of two roles: the proposer and the responder. The proposer is first granted some money, such as $100. This individual is then encouraged to offer a certain percentage to the other person, designated as the responder. The proposer might, for example, offer $30 to the responder. The responder can then either accept or reject the offer. If the offer is rejected, the responder foregoes the money: This person would not receive the $30 that was offered, for instance. The proposer retains the money that was not offered to the responder--in this example $70.
Occasionally, the responder will indeed reject this offer. Unlike in the ultimatum game, in the impunity game this rejection does not equalize the outcomes. Instead, this rejection may represent a signal of protest.
In the prisoner's dilemma, participants need to decide whether they will cooperate with or defect against another person. For example, they may need to decide whether they will invest some money into a scheme, a form of cooperation, or not invest any money, a form of defection. The complication is that their reward depends on the response of their opponent.
If both individuals cooperate--for example, if they both invest--they both earn a large reward. If only one individual cooperates, the defector earns the largest reward of all, whereas the person who cooperated or invested earns the smallest profit. If neither individual invests, they both earn no reward. Hence, individuals are tempted to defect, but both fare better if they cooperate--provided they feel the other person will cooperate too. The coordination game is similar, except the parties earn the largest reward if they undertake the same response as one another (for evidence of the brain regions that are activated when people engage in these games, see Emonds, Declerck, Boone, Vandervliet, & Parizel, 2011).
The name of this game evolved from the classical variant in which two male prisoners have been accused of conspiring to commit a crime. Each prisoner is interviewed separately. A prisoner will be released if he testifies against the other person and this other person remains silent. In this instance, the person who remains silent, a form of cooperation, is sentenced to 10 years. If both prisoners remain silent, however, they each receive only six months in jail. If both prisoners testify against each other--that is, they both defect--they each receive a five-year sentence.
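The sentences in this classic story can be laid out as a payoff table; the dictionary below simply restates the numbers above (the labels are illustrative):

```python
# Prisoner's-dilemma sentences, in years served (lower is better).
# Keys are (prisoner_a_action, prisoner_b_action).
SENTENCES = {
    ("silent", "silent"):   (0.5, 0.5),  # both cooperate: six months each
    ("silent", "testify"):  (10, 0),     # lone cooperator serves 10 years
    ("testify", "silent"):  (0, 10),     # lone defector is released
    ("testify", "testify"): (5, 5),      # mutual defection: 5 years each
}

# Whatever the other prisoner does, testifying yields a shorter sentence.
print(SENTENCES[("testify", "silent")])   # (0, 10)
print(SENTENCES[("testify", "testify")])  # (5, 5)
```

Comparing the first prisoner's sentence across rows shows why defection dominates: 0 beats 0.5 years against a silent partner, and 5 beats 10 years against a partner who testifies--yet mutual silence beats mutual testimony.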
If prisoners are certain the other person will testify against them, they will obviously testify as well, to reduce the sentence from 10 to 5 years. If prisoners are not certain the other person will testify against them, they might consider remaining silent. Hence, their behavior will depend on whether they expect their co-accused to testify or not.
Several variations of these games have been utilized. First, in most instances, pairs of individuals only complete these exchanges once with each other. In other instances, however, the pairs of individuals might engage in these games several times with one another. In these contexts, the participants may be governed by additional motivations, such as the need to enhance their reputation and promote trust to facilitate future exchanges.
Second, participants can engage in this game either with a person or with a computer. For example, if they play the ultimatum game, they might be informed the computer is programmed to either accept or reject the offer: The probability the computer will accept, rather than reject, the offer increases as the amount of money that is offered rises.
Third, levels of empathy can be manipulated. For example, in the standard dictator game, the dictator is assigned their role before deciding the amount of money they would like to transfer to the other person. In contrast, to evoke empathy, the individuals are not assigned the role first. Instead, they are asked to indicate the amount of money they would transfer to the other person if they were the dictator. Next, they are assigned a role. This protocol ensures each person can appreciate the role of everyone else, representing a form of empathy (e.g., Stahl & Haruvy, 2006).
Many other variations have been introduced. These variations revolve around the amount of money, the formation of teams, restrictions on the range of alternatives that individuals can select, and the provision of advice.
In some variants of the trust game, a portion of participants are especially cooperative: That is, they transfer a significant amount of money to the other person, trusting this individual will return a significant portion. Other people are not as cooperative.
Interestingly, fMRI studies show the medial prefrontal cortex is especially active in these cooperative participants (McCabe, Houser, Ryan et al., 2001). Nevertheless, this region is intensely activated only when participants complete this game with a human instead of with a computer.
The medial prefrontal cortex, comprising Brodmann areas 25 and 32 but sometimes considered part of the anterior cingulate, also mediates theory of mind--that is, the capacity to consider the perspective of another person. Hence, the capacity to understand a context from the perspective of someone else seems to facilitate cooperation.
The basal ganglia, a set of nuclei that are integral to Parkinson's disease and Huntington's disease, comprises four main structures: the striatum, the pallidum, the substantia nigra, and the subthalamic nucleus. The striatum is traditionally divided into dorsal and ventral regions. The dorsal striatum entails the caudate nucleus and the putamen. The ventral striatum entails the olfactory tubercle and the nucleus accumbens.
The dorsal striatum is associated with the experience of reward. Interestingly, this region is also activated when individuals are granted an opportunity to punish someone who has acted unfairly (de Quervain, Fischbacher, Treyer, et al., 2004).
Specifically, de Quervain, Fischbacher, Treyer, et al. (2004) administered a variant of the trust game. In this variant, however, the first person or investor was granted an opportunity to punish the trustee. That is, if the trustee did not return any money, the investor could then penalize this person. In one condition, the investor would also incur a cost if they punished the trustee. In another condition, the investor would not incur a cost.
This punishment coincided with activation of the dorsal striatum. Indeed, when this region was especially active, investors were willing to incur appreciable costs to punish the other person. Hence, punishment of transgressions seems to be rewarding in some sense (de Quervain, Fischbacher, Treyer, et al., 2004).
As Guroglu, van den Bos, van Dijk, Rombouts, and Crone (2011) showed, two cortical neural networks seem to shape the behavior of responders during the ultimatum game. One of these networks, comprising the insular cortex and the dorsal anterior cingulate cortex, enables individuals to override personal norms or tendencies. The second network, comprising the temporoparietal junction, coupled with the dorsolateral prefrontal cortex, enables individuals to reflect upon the perspective and intentions of the proposer more effectively and then change their goals accordingly.
Specifically, in this study, the participants, aged between 10 and 20, were designated the role of responders in the ultimatum game. On each trial, the proposer allocated 10 coins between the two individuals via computer. However, the proposer could select only one of two options. In one condition, for example, the two options were 5 and 5 or 8 and 2. In another condition, both options were identical: 8 and 2. In this latter condition, an unequal division was unfair but not intentional, because the proposer had no alternative.
Regardless of age, whenever participants rejected offers that were unfair but unintentional--or accepted offers that were unfair and intentional--the insular cortex and dorsal anterior cingulate cortex were activated. On these trials, participants seemed to override their natural inclination to, for example, accept an unintentional unfair offer.
In participants aged 13 or more, in response to unfair but unintentional offers, the temporoparietal junction and dorsolateral prefrontal cortex were activated. In this condition, participants needed to reflect upon the intentions or perspective of the proposer, which was unclear. Furthermore, they needed to reflect upon how the proposer might respond in this ambiguous context--reflections that presumably entail the temporoparietal junction and develop with age. The dorsolateral prefrontal cortex may enable individuals to override their natural response and accommodate the other person.
When individuals engage in games, they do not always attempt to maximize their investment. For example, in the dictator game, people often choose to transfer some of their money to another person--money that cannot be returned.
Many models have been proposed to explain the motivations of these transfers. People might want to enhance their reputation or increase the public goods--that is, the total amount of money--that is available.
One motivation is called the warm glow effect. That is, individuals might experience a positive state after they behave fairly. These individuals experience this positive state when they, and not someone else, behave fairly.
This warm glow effect can explain some interesting patterns of observations. Specifically, in most games, as the number of participants increases, individuals become less generous. To illustrate, suppose that a set of people are permitted to transfer money anonymously to one another, a variant of the dictator game. Some individuals will indeed transfer money, even though such generosity does not increase the likelihood they will receive additional money later. These individuals might transfer money merely to evoke a positive state or warm glow.
Nevertheless, as the number of participants increases, each individual feels their impact on everyone else diminishes. They know the amount of money that each individual receives is less contingent upon their own actions. Consequently, in groups of five rather than two, for example, individuals become less generous (Stahl & Haruvy, 2006).
Stahl and Haruvy (2006) uncovered an important exception. In one variant of the game, on each trial, all the individuals chose the amount of money they would like to transfer to other people. However, only one of these choices was randomly selected and implemented. The other choices were not implemented. In this instance, generosity does not diminish as conspicuously, and occasionally increases, with group size. Presumably, individuals experience a warm glow regardless of whether their choice is selected. However, they will lose money only if their choice is selected. Therefore, the expected loss of money is low. If they behave generously, they will definitely enjoy warm glow but are unlikely to lose money.
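A back-of-the-envelope calculation clarifies this logic. If only one of the group's choices is implemented, each member's choice takes effect with probability 1/n, so the expected cost of a generous transfer shrinks as the group grows, whereas any warm glow from choosing generously is felt regardless. The figures below are illustrative assumptions, not data from Stahl and Haruvy (2006):

```python
# Expected monetary cost of a generous choice when only one of the
# group's choices is randomly selected and implemented.
def expected_cost(transfer, group_size):
    return transfer / group_size  # choice implemented with probability 1/n

for n in (2, 5, 10):
    print(n, expected_cost(10.0, n))  # cost falls: 5.0, 2.0, 1.0
```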
Numerous studies have examined the factors that determine the responses of participants in these games. For further examples, see trust.
Anderson and Dickinson (2010) examined the effects of sleep deprivation over one night on behavior during the ultimatum game, dictator game, and trust game. After deprivation of sleep, participants were more likely to reject low offers from the other person. That is, in this state, the individuals were more concerned with justice than financial gain. Perhaps, when tired, individuals need to rely more on just systems than on personal resources.
Furthermore, as Anderson and Dickinson (2010) showed, deprivation also curbed the likelihood that investors would transfer money to the trustee, indicating limited levels of trust. This caution perhaps also indicates a reluctance of individuals to expose themselves to the possibility of exploitation and injustice.
According to Kugler, Connolly, and Kausel (2009), when participants complete these games, they seldom calculate the probabilities and consider the possible outcomes. Instead, some of their prevailing inclinations, such as their tendency to trust, will often dictate their decisions.
Consistent with this possibility, even a few modest prompts to consider the future consequences of any decisions were sufficient to reduce the money that investors would send to the other person. That is, if individuals were instructed to predict the likely response of the other person--or to imagine their regret if the money they sent was not returned--they became more reluctant to send money.
After people are exposed to cues that prime business or economics, they become less trusting and cooperative. For example, when people complete the prisoner's dilemma, they are more inclined to defect when the task is referred to as the Wall Street game than when the task is referred to as the Community game (Liberman, Samuels, & Ross, 2004). Likewise, after people are exposed to boardroom tables, briefcases, and other objects that epitomize business, their proposals during the ultimatum game diminish, implying a reduction in trust (Kay, Wheeler, Bargh, & Ross, 2004).
Anxiety has been shown to affect responses on the ultimatum game. In this game, on each trial, one person, called the proposer, is granted a sum of money, such as 10 pounds. The proposer offers a certain percentage, such as 10% or 20%, to another person, called the receiver. The receiver can either accept the offer or reject the offer, in which case both individuals receive nothing. In this study, the participants completed the ultimatum game on computer. They saw a photograph of the supposed proposer before each trial, and the photograph was different on each trial. Compared to other individuals, participants who had been diagnosed with anxiety disorders were more likely to accept very unfair or low offers, such as 5% (Grecucci et al., 2013). This tendency was pronounced in people with generalized anxiety disorder but not in people with panic disorder.
When people experience generalized anxiety disorder, they are especially sensitive to the possibility of rejection. They are, therefore, more attuned to the problems, instead of the benefits, of assertive behavior. They will thus be reluctant to reject unfair offers and tend to behave cautiously in negative interpersonal situations. In this study, they did not differ from other participants in response to fairer offers, partly because their concerns are not primed as strongly in positive interpersonal situations. In contrast, in response to unfair offers, they were not as likely as other participants to report anger, either because they are not assertive or because of their pessimistic expectations (Grecucci et al., 2013). This limited anger tends to diminish the likelihood of rejection.
Interestingly, if participants had been administered selective serotonin reuptake inhibitors, their behavior was more similar to people who were not diagnosed with anxiety (Grecucci et al., 2013). Selective serotonin reuptake inhibitors have been shown to promote both cooperation and social dominance.
Almakias and Weiss (2012) examined whether or not attachment style affects trust, as manifested in the ultimatum game (see attachment theory). Individuals who report an anxious attachment style are sensitive to rejection and strive incessantly to reinforce their closest friendships or relationships Individuals who report an avoidant attachment style attempt to shun close relationships, usually because they had been disappointed in the past. Individuals who report a secure attachment style do not exhibit either of these tendencies.
In this study, participants completed the ultimatum game. The participants were not acquainted with one another. In addition, the Experiences in Close Relationships scale was administered to gauge attachment style.
In general, proposers who reported an anxious attachment style offered considerable sums of money to the responder. Presumably, this offer may represent an attempt to prevent rejection. Responders who reported anxious attachment tended to accept most offers, also to please the other person and prevent rejection.
In contrast, proposers who reported avoidant attachment offered negligible sums of money to the responder. These individuals shun situations in which they can be exploited or rejected and, therefore, do not offer large sums. That is, if their offer is low, they do not need to ascribe a rejection to exploitation.
Behavior in the trust game is, at least partly, explained by genetics, as Cesarini, Dawes, Fowler, Johannesson, Lichtenstein, and Wallace (2008) showed with samples from the United States and Sweden. That is, in this study, individuals participated in the trust game. These individuals were members of monozygotic twin pairs or dizygotic twin pairs. Monozygotic twins were more likely than dizygotic twin pairs to demonstrate similar inclinations: One twin would often send almost the same amount to the trustee as did the other twin. They also returned similar amounts to the investor. These findings imply that genetics, and not merely social norms or parental socialization, can affect cooperative or trusting behavior. Wallace, Cesarini, Lichtenstein, and Johannesson (2007) also showed that genetic variability partly determines decisions on the ultimatum game.
Jensen, Call, and Tomasello (2007) showed that chimpanzees, unlike humans, do not punish unfair offers. That is, in the ultimatum game, they tend to accept any offers. Specifically, these researchers developed a system in which the proposer could shift one of two trays towards the responder by pulling a rope. In particular, the proposer could choose one of two options. One of the options was unfair: the proposer would receive 8 raisins and the responder would receive 2 raisins, for example. The other option was fair. The responder could accept or reject the offer by pulling one of two ropes. The responder almost always accepted the offer, however.
In the standard dictator game, the dictator is merely assigned money, usually fortuitously, by the experimenter. However, in one variant, the dictator might earn the money. When dictators earn the money, they feel an additional right over these funds. They are not as willing to transfer money to the other person (Oxoby & Spraggon, 2008).
Specifically, in a study conducted by Oxoby and Spraggon (2008), participants needed to complete some questions, distilled from the Graduate Management Admissions Test (GMAT) and the Graduate Record Examination (GRE). If participants answered many questions correctly, they were granted more money.
After completing these questions, some participants were then assigned the role of dictator. That is, they were asked to specify the amount of money they would like to transfer to another person. If they had earned this money, as a consequence of their performance on the exam, they were more inclined to retain the entire sum.
Investors who are granted the right to veto the responses of investees tend to be more trusting. In one study, conducted by Kanagaretnam, Mestelman, Nainar, and Shehata (2012), pairs of individuals participated in the investment or trust game. The investor was granted 100 Francs. This person then chose to distribute a certain percentage to the investee. The money that was distributed to the investee was tripled. The investee then returned a specific amount. The individuals repeated this procedure several times.
Some participants, however, were exposed to some variants of this game. In one variant, the investor could veto the response to demonstrate their displeasure--in which case neither individual receives any money. In another condition, the investor could also veto the response; but, on this occasion, only the investor received some money: 100 Francs. Both of these variants increased the level of trust that investors demonstrated. That is, when these provisions were included, the investor distributed a larger amount.
According to Kanagaretnam, Mestelman, Nainar, and Shehata (2012), people do not like to be manipulated by someone else. The prospect of this manipulation, called betrayal aversion, diminishes trust. When investors are empowered with the opportunity to veto the response of investees, this concern diminishes and trust thus increases.
Some studies have examined whether interactions before these games affect the behavior of participants. For example, in one study, conducted by Servatka, Tucker, and Vadovic (2011), participants completed the trust or investor game. In one condition, the receiver was permitted to send a message to the sender before the game. In this message, the receiver often promised to return enough money to ensure that both players would benefit. In another condition, the receiver donated $10 to the sender before they played the game--an amount that was equivalent to the money the sender was assigned by the experimenter. In a third condition, both of these measures were included. In a control condition, none of these measures were included.
Both the message and the donation increased the trust of the sender: That is, the sender transferred more money to the receiver when one or both of these measures were included. Nevertheless, the message was more likely to promote trust than was the donation. Indeed, the message was more effective when no donation was offered. Conceivably, financial transfers, in contrast to messages, might evoke some memories or associations with distrust as well as demonstrate some goodwill.
As Meleady, Hopthrow, and Crisp (2013) showed, after individuals actually discuss why they should collaborate--or even imagine themselves discussing why they should collaborate--they become more likely to choose cooperative, rather than competitive, courses of action in the prisoner's dilemma. That is, after these actual or imagined discussions, the possibility that other people may collaborate seems more vivid and salient. Because of the availability heuristic (Tversky & Kahneman, 1973), people overestimate the likelihood of alternatives that seem vivid or salient. They presume that other people will be cooperative, increasing their own inclination to collaborate.
In this study, participants were about to work in teams of 6 or so people. They learnt they could either cooperate or compete on a social dilemma. If everyone cooperates, they will all receive a reasonable outcome. If only a couple of people cooperate, the individuals who compete will prevail but the people who do not compete will be significantly disadvantaged.
In the control conditions, participants merely listed the reasons they might cooperate. In the imagined condition, participants answered a series of questions, designed to foster vivid images of a group discussion about the benefits of cooperation. They were asked to imagine a discussion that revolves around the main principles of this social dilemma, different perspectives on the best solution, and the risks of each solution. Next, they imagined everyone conceding that cooperation is the best solution and discussing ways to trust everyone and promote commitment. Finally, in one condition, participants were prompted to actually convene this discussion.
If participants imagined a discussion in which everyone agreed to cooperate, they were more likely to cooperate than if they neither imagined nor convened this discussion. Yet, actual discussions were still more effective than imagined discussions. Subsequent studies replicated this finding, showing that even people who are usually selfish will cooperate, although this behavior depletes more of their mental energy, as gauged by measures of Stroop interference. The results also showed that the subjective probability of cooperation mediated the benefits of these discussions on behavior, consistent with the availability heuristic.
Almakias, S., & Weiss, A. (2012). Ultimatum game behavior in light of attachment theory. Journal of Economic Psychology, 33, 515-526. doi:10.1016/j.joep.2011.12.012
Anderson, C., & Dickinson, D. L. (2010). Bargaining and trust: The effects of 36-h total sleep deprivation on socially interactive decisions. Journal of Sleep Research, 19, 54-63.
Bahry, D. L., & Wilson, R. K. (2006). Confusion or fairness in the field? Rejections in the ultimatum game under the strategy method. Journal of Economic Behavior and Organization, 60, 37-54.
Bellemare, C., & Kroger, S. (2007). On representative social capital. European Economic Review, 51, 183-202.
Berg, J., Dickhaut, J., & McCabe, K. (1995). Trust, reciprocity and social history. Games and Economic Behavior, 10, 122-142.
Boarini, R., Laslier, J., & Robin, S. (2009). Interpersonal comparisons of utility in bargaining: Evidence from a transcontinental ultimatum game. Theory and Decision, 67, 341-373.
Bolton, G. E., & Zwick, R. (1995). Anonymity versus punishment in Ultimatum Game bargaining. Games and Economic Behavior, 10, 95-121.
Buchan, N. R., & Croson, R. T. A. (2004). The boundaries of trust: own and others' actions in the US and China. Journal of Economic Behavior and Organization, 55, 485-504.
Buchan, N. R., Croson, R. T. A., & Dawes, R. M. (2002). Swift neighbors and persistent strangers: A cross-cultural investigation of trust and reciprocity in social exchange. American Journal of Sociology, 108, 168-206.
Burks, S. V., Carpenter, J. P., & Verhoogen, E. (2003). Playing both roles in the trust game. Journal of Economic Behavior and Organization, 51, 195-216.
Burns, J. (2006). Racial stereotypes, stigma and trust in post-apartheid South Africa. Economic Modeling, 23, 805-821.
Camerer, C. F. (2003). Behavioral game theory: Experiments in strategic interaction. Princeton, NJ: Princeton University Press.
Cesarini, D., Dawes, C. T., Fowler, J. H., Johannesson, M., Lichtenstein, P., & Wallace, B. (2008). Heritability of cooperative behavior in the trust game. Proceedings of the National Academy of Sciences, 105, 3721-3726.
Corbae, D., & Duffy, J. (2008). Experiments with network formation. Games and Economic Behavior, 64, 81-120.
Coricelli, G., Morales, L. G., & Mahlsted, A. (2006). The investment game with asymmetric information. Metroeconomica, 57, 13-30.
Cox, J. C. (2004). How to identify trust and reciprocity. Games and Economic Behavior, 46, 260-281.
Danielson, A. J., & Holm, H. J. (2007). Do you trust your brethren? Eliciting trust attitudes and trust behavior in a Tanzanian congregation. Journal of Economic Behavior and Organizations, 62, 255-271.
DeBruine, L. M. (2002). Facial resemblance enhances trust. Proceedings of the Royal Society of London Series B-Biological Sciences, 269, 1307-1312.
de Quervain, D. J.-F., Fischbacher, U., Treyer, V., et al. (2004). The neural basis of altruistic punishment. Science, 305, 1254-1258.
Eckel, C. C., & Grossman, P. J. (1996). Altruism in anonymous dictator games. Games and Economic Behavior, 16, 181-191.
Eckel, C., & Wilson, R. (2003). The human face of game theory: trust and reciprocity in sequential games. In: Ostrom, E., Walker, J. (Eds.), Trust and reciprocity: Interdisciplinary lessons from experimental research (pp. 245-274). New York, Russell Sage Foundation.
Eckel, C., & Wilson, R. (2006). Judging a book by its cover: Beauty and expectations in a trust game. Political Research Quarterly 59, 189-202.
Emonds, G., Declerck, C. H., Boone, C., Vandervliet, E. J., & Parizel, P. M. (2011). Comparing the neural basis of decision making in social dilemmas of people with different social value orientations, a fMRI study. Journal of Neuroscience, Psychology, and Economics, 4, 11-24.
Engle-Warnick, J., & Slonim, R. L. (2004). The evolution of strategies in a repeated trust game. Journal of Economic Behavior and Organization, 55, 553-573.
Fershtman, C., & Gneezy, U. (2001). Discrimination in a segmented society: An experimental approach. Quarterly Journal of Economics, 116, 351-377.
Glaeser, E. L., Laibson, D. L., Scheinkman, J. A., & Soutter, C. L. (2000). Measuring trust. The Quarterly Journal of Economics, 115, 811-846.
Grecucci, A., Giorgetta, C., Brambilla, P., Zuanon, S., Perini, L., Balestrieri, M., Bonini, N., et al. (2013). Anxious ultimatums: how anxiety disorders affect socioeconomic behaviour. Cognition and emotion, 27, 230-244. doi:10.1080/02699931.2012.698982
Gunnthorsdottir, A., McCabe, K., & Smith, V. (2002). Using the Machiavellianism instrument to predict trustworthiness in a bargaining game. Journal of Economic Psychology, 23, 49-66.
Guroglu, B., van den Bos, W., van Dijk, E., Rombouts, S. A. R. B., & Crone, E. A. (2011). Dissociable brain networks involved in development of fairness considerations: Understanding intentionality behind unfairness. Neuroimage, 57, 634-641. doi:10.1016/j.neuroimage.2011.04.032
Haley, K., & Fessler, D. (2005). Nobody's watching? Subtle cues affect generosity in an anonymous economic game. Evolution and Human Behavior, 26, 245-256. doi:10.1016/j.evolhumbehav.2005.01.002
Ho, T. H., & Weigelt, K. (2005). Trust building among strangers. Management Science, 51, 519-530.
Holm, H., & Nystedt, P. (2005). Intra-generational trust--A semi-experimental study of trust among different generations. Journal of Economic Behavior and Organization, 58, 403-419.
Jensen, K., Call, J., & Tomasello, M. (2007). Chimpanzees are rational maximizers in an ultimatum game. Science, 318, 107-109. doi: 10.1126/science.1145850
Kanagaretnam, K., Mestelman, S., Nainar, S. M. K., & Shehata, M. (2012). The impact of empowering investors on trust and trustworthiness. Journal of Economic Psychology, 33, 566-577. doi:10.1016/j.joep.2011.11.002
Kay, A. C., Wheeler, S. C., Bargh, J. A., & Ross, L. (2004). Material priming: The influence of mundane physical objects on situational construal and competitive behavioral choice. Organizational Behavior and Human Decision Processes, 95, 83-96.
Kiyonari, T., Yamagishi, T., Cook, K. S., & Cheshire, C. (2006). Does trust beget trustworthiness? Trust and trustworthiness in two games and two cultures. Social Psychology Quarterly, 69, 270-283.
Knafo, A., et al. (2008). Individual differences in allocation of funds in the dictator game associated with length of the arginine vasopressin 1a receptor (AVPR1a) RS3 promoter region and correlation between RS3 length and hippocampal mRNA. Genes, Brain, and Behavior, 7, 266-275.
Kugler, T., Connolly, T., & Kausel, E. E. (2009). The effect of consequential thinking on Trust Game behavior. Journal of Behavioral Decision Making, 22, 101-119.
Kugler, T., Kocher, M., Sutter, M., & Bornstein, G. (2007). Trust between individuals and groups: Groups are less trusting than individuals but just as trustworthy. Journal of Economic Psychology, 28, 646-657.
Liberman, V., Samuels, S. M., & Ross, L. (2004). The name of the game: Predictive power of reputations versus situational labels in determining prisoner's dilemma game moves. Personality and Social Psychology Bulletin, 30, 1175-1185.
McCabe, K., Houser, D., Ryan, L., et al. (2001). A functional imaging study of cooperation in two-person reciprocal exchange. Proceedings of the National Academy of Sciences, USA, 98, 11832-11835.
Meleady, R., Hopthrow, T., & Crisp, R. J. (2013). Simulating social dilemmas: Promoting cooperative behavior through imagined group discussion. Journal of Personality and Social Psychology, 104, 839-853. doi: 10.1037/a0031233
Oxoby, R. J., & Spraggon, J. (2008). Mine and yours: Property rights in dictator games. Journal of Economic Behavior & Organization, 65, 703-713.
Peters, K., & Kashima, Y. (2007). From social talk to social action: Shaping the social triad with emotion sharing. Journal of Personality and Social Psychology, 93, 780-797.
Scharlemann, J. P. W., Eckel, C. C., Kacelnik, A., & Wilson, R. K. (2001). The value of a smile: Game theory with a human face. Journal of Economic Psychology, 22, 617-640.
Servatka, M., Tucker, S., & Vadovic, R. (2011). Words speak louder than money. Journal of Economic Psychology, 32, 700-709. doi:10.1016/j.joep.2011.04.003
Solnick, S. J. (2007). Cash and alternate methods of accounting in an experimental game. Journal of Economic Behavior and Organization, 62, 316-321.
Stahl, D. O., & Haruvy, E. (2006). Other-regarding preferences: Egalitarian warm glow, empathy, and group size. Journal of Economic Behavior and Organization, 61, 20-41.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207-232. doi: 10.1016/0010-0285(73)90033-9
Wallace, B., Cesarini, D., Lichtenstein, P., & Johannesson, M. (2007). Heritability of ultimatum game responder behaviour. Proceedings of the National Academy of Sciences, USA, 104, 15631-15634.
Last Update: 7/18/2016