

Author: Dr Simon Moss


Dopamine is a neurotransmitter, activating five classes of receptors, called D1 to D5, and is produced in several regions of the brain, including the ventral tegmental area and the substantia nigra. Dopamine also acts as a hormone: released by the hypothalamus, it inhibits the secretion of prolactin from the anterior pituitary. Furthermore, dopamine supplied as medication can increase heart rate and blood pressure.

Abnormally elevated levels of dopamine are assumed to be related to psychosis and schizophrenia. Indeed, amphetamine and cocaine, which increase dopamine levels, are related to psychosis. Antipsychotic medications, therefore, are often designed to block dopamine function.


Dopamine underpins many neural functions, especially processes that relate to motivation, reward, activity, sleep, attention, and learning. For example, dopaminergic neurons tend to be activated when unexpected rewards arise. Stimuli that tend to be paired with these rewards also provoke this activation. Accordingly, dopaminergic neurons seem to be involved in maximizing and predicting rewards, especially when these rewards are uncertain.


Dopamine is involved in the responses to potential rewards. Specifically, as described by Tobler, Fiorillo, and Schultz (2005), when individuals anticipate a possible reward, the midbrain is activated. The level of activation is proportional to both the magnitude and probability of some reward. Dopaminergic projections from the midbrain then activate the ventral striatum, as well as prefrontal regions, particularly the ventromedial prefrontal cortex and the dorsolateral prefrontal cortex.
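The claim that activation scales with both the magnitude and probability of a reward can be illustrated with a small sketch. The functions and the binary-reward framing below are illustrative assumptions, not taken from Tobler et al. (2005):

```python
def expected_value(p, magnitude):
    """Expected value of a probabilistic reward: probability times magnitude."""
    return p * magnitude

def reward_variance(p, magnitude):
    """Variance of an all-or-nothing reward -- a simple index of
    uncertainty, which is maximal when p = 0.5."""
    ev = expected_value(p, magnitude)
    return p * (magnitude - ev) ** 2 + (1 - p) * (0 - ev) ** 2
```

In this toy model, doubling either the probability or the magnitude of a reward doubles its expected value, matching the claim that activation is proportional to both. The variance term peaks when the reward is maximally uncertain (p = 0.5), consistent with the earlier observation that dopaminergic involvement is strongest when rewards are uncertain.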

Attention and memory

A diminution in the concentration of dopamine in the prefrontal cortex can impede the distribution of information from other neural regions, ultimately disrupting attention and perhaps underpinning attention deficit disorder.

Several studies attest to the role of dopamine in attention. Moderate, compared to low, levels of dopamine have been shown to improve the capacity of individuals to switch attention efficiently between tasks (Dreisbach & Goschke, 2004). Furthermore, moderate levels of dopamine seem to direct attention more efficiently to stimuli that are relevant to ongoing tasks (e.g., Drabant, Hariri, Meyer Lindenberg, Munoz, Mattay, Kolachana, et al., 2006). Finally, dopamine facilitates the maintenance or retention of information that is germane to ongoing tasks (e.g., Colzato, Van Wouwe, & Hommel, 2007).

Similarly, dopamine seems to be involved in the functioning of working memory--a system that retains and manipulates information to facilitate planning, thinking, and comprehension (e.g., Baddeley, 2000). In particular, moderate, rather than low, levels of dopamine seem to enhance performance on tasks that seem to utilize working memory (Floresco & Phillips, 2001; Kimberg, D'Esposito, & Farah, 1997).


Dopamine might also enhance the capacity of individuals to reject negative thoughts. Specifically, when dopamine levels are limited, activation of brain regions tends to be more diffuse than concentrated (e.g., Bush, 2010). Presumably, if dopamine levels are low, brain activation is significantly dependent upon random noise--that is, fleeting states, thoughts, or conditions--rather than more sustained patterns of cognition. Therefore, many regions of the brain are activated marginally. In contrast, if dopamine levels are high, specific cognitions prevail, and activation is more confined to particular regions.

Diffuse rather than concentrated activation can undermine the efficiency of some brain regions. To illustrate, the left inferior frontal gyrus, which primarily overlaps with the ventrolateral prefrontal cortex, partly underpins the inhibition of unwanted thoughts. As Berman et al. (2011) showed, when activation in this region was diffuse rather than concentrated in specific locations, people could not inhibit negative words as effectively. Presumably, this diffuse activation implies that individuals cannot focus on specific classes of stimuli efficiently. They attempt to inhibit a broad range of stimuli, compromising efficiency.

In this study, on each trial, four words were presented. Two of these words were red, and two of these words were blue. On some trials, participants were told to disregard only the red words. Next, a word was presented in black. Participants were instructed to press one button if the black word was one of the target blue words. They were instructed to press another button otherwise, even if the black word corresponded to one of the red words. To complete this task effectively, participants needed to ignore the red words.

In general, participants could not readily disregard the red words that were negative in tone. That is, on trials in which a negative word--that is, a word that should be disregarded--appeared, reaction time was often protracted. Interestingly, if activation of the left inferior frontal gyrus was diffuse, participants were especially unable to disregard the negative words.

This limitation could underpin the rumination that characterizes depression. In particular, dopamine is often limited in depressed individuals (e.g., Hasler et al., 2008). Thus, activation of specific regions, such as the left inferior frontal gyrus, may be more diffuse. Negative thoughts, therefore, cannot be as readily inhibited. Consistent with this possibility, Berman et al. (2011) showed that activation of this region was more diffuse in depressed participants. Furthermore, the capacity to disregard the negative words was also impaired in depressed participants.


Many researchers claim that dopamine may underpin extraversion. Wacker, Mueller, Stemmler, and Hennig (2012), however, showed that this association between dopamine and extraversion is complex. In particular, dopamine is positively associated with only one facet of extraversion: agency.

This possibility emerged from a theory that was proposed by Depue and Collins (1999). According to this theory, extraversion comprises three main facets: agency, affiliation, and sensation seeking. Agency encompasses characteristics such as dominance during social interactions, assertive behavior, elevated levels of activity, and goal achievement. In contrast, affiliation refers to the formation of warm, affectionate bonds, whereas sensation seeking refers to the pursuit of risky, novel, bold, and impulsive ventures.

Depue and Collins (1999) maintained that agency is underpinned by activation of the behavioral facilitation system--a hypothetical circuit that corresponds roughly with the mesocorticolimbic dopamine system. The mesocorticolimbic dopamine system entails dopaminergic projections from the ventral tegmental area to the nucleus accumbens, medial orbito prefrontal cortex, as well as a variety of other cortical and subcortical regions. This system may amplify the salience of rewards and thus could underpin agency.

According to this model, the mesocorticolimbic dopamine system does not underpin the other facets of extraversion. Affiliation, for example, is more dependent on oxytocin, and sensation seeking seems to depend on a heterogeneous array of systems.

Genes seem to influence levels of dopamine in the mesocorticolimbic dopamine system. Specifically, the Met allele of the catechol-O-methyltransferase gene increases levels of dopamine in the prefrontal cortex but diminishes levels of dopamine in the mesocorticolimbic dopamine system. Conversely, the Val allele of this gene decreases levels of dopamine in the prefrontal cortex but increases levels of dopamine in the mesocorticolimbic dopamine system. Consequently, the Val allele should be positively associated with agency, after controlling for affiliation and sensation seeking.

This possibility was confirmed by Wacker, Mueller, Stemmler, and Hennig (2012). Carriers of two Val alleles demonstrated elevated levels of agency, but only after controlling for measures of affiliation and sensation seeking.

Determinants of dopamine levels

Arousal and activation

Active emotions and states, such as anger and excitement, seem to augment levels of dopamine. Indeed, some researchers argue that such states enhance working memory by increasing levels of dopamine and related neurotransmitters such as noradrenalin (e.g., Ashby, Valentin, & Turken, 2002; Flaherty, 2005).

Theories of dopamine

Role of the mesolimbic pathway

Many studies have examined and contested the role of this mesolimbic pathway, and the function of dopamine in particular, in the experience of rewards. Certainly, this pathway is involved in the experience of reward. However, the precise role of this pathway is contentious (for a review, see Berridge, 2007). In general, to examine these roles, researchers have attempted to clarify the function of dopamine.

The first hypothesis is called the activation sensorimotor hypothesis. According to this perspective, dopamine underpins effort, arousal, and activation. That is, dopamine increases the likelihood that individuals devote effort and energy to important and rewarding activities.

Although this role of dopamine is accepted, this explanation is not precise. That is, the activation sensorimotor hypothesis does not specify how dopamine promotes this effort. The three other prevalent hypotheses differentiate the precise role of dopamine and the mesolimbic pathway.

The second hypothesis is the hedonic argument. According to this argument, the dopamine in the nucleus accumbens elicits feelings of pleasure. Thus, any actions that elicit this increase in dopamine are reinforced and will be repeated in the future. This hypothesis is often proposed to explain the pleasurable effects of many illicit drugs, most of which increase levels of dopamine in this region.

Many findings, at first glance, seem to confirm this hypothesis. That is, many pleasurable activities, including food and sex, do increase levels of dopamine in the nucleus accumbens. Furthermore, antagonists of dopamine reverse the reinforcing effects of these activities, inducing anhedonia. In addition, subjective ratings of pleasure correlate with dopamine levels in the ventral striatum.

Nevertheless, some findings challenge this hypothesis. For example, objective indices of liking, such as particular facial expressions in response to sweet food, are observed even after dopaminergic projections are removed in rats. Even after over 99% of dopamine in the nucleus accumbens is removed, these positive hedonic responses are maintained. Amphetamine microinjections into the nucleus accumbens do not increase these indices of liking either. In addition, when monkeys learn to expect a reward, the expected reward no longer activates dopaminergic neurons. Similar results have been observed in humans.

Certainly, the nucleus accumbens, but not dopamine, may be involved in these hedonic reactions. For example, opioid transmission from the nucleus accumbens does amplify facial manifestations of pleasure in response to sucrose.

The next hypothesis revolves more around learning. According to this hypothesis, the mesolimbic pathway, and dopamine in particular, may facilitate or amplify the learning of associations. For example, when dopamine levels are elevated in the nucleus accumbens, the association between a stimulus and some response is more likely to be learnt efficiently.

Many researchers maintain that dopamine in general, and the nucleus accumbens in particular, signal reward prediction errors. That is, when some action generates a greater reward than expected, activation in the nucleus accumbens is more pronounced. Presumably, the activation of this nucleus accumbens indicates the individual had previously underestimated the potential rewards from this action. This activation thus reinforces this action in the future.

A diversity of studies support this supposition. For example, rewarding stimuli will tend to activate dopaminergic neurons only if unexpected. The same stimuli when expected may not activate these neurons (Waelti, Dickinson, & Schultz, 2001).
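This pattern--a strong response to unexpected rewards that fades as the reward becomes predicted--is commonly formalized as a prediction error. A minimal Rescorla-Wagner-style sketch follows; the learning rate and trial count are arbitrary choices for illustration, not values from the studies above:

```python
def prediction_errors(alpha=0.2, trials=50):
    """Track the prediction error (delta = reward - V) across trials
    in which a cue is always followed by a reward of 1.0. Delta is
    large when the reward is unexpected and shrinks toward zero as
    the learned value estimate V approaches the actual reward."""
    V = 0.0                      # learned value of the cue
    deltas = []
    for _ in range(trials):
        reward = 1.0             # the reward reliably follows the cue
        delta = reward - V       # prediction error (the putative dopamine signal)
        V += alpha * delta       # value update driven by the error
        deltas.append(delta)
    return deltas
```

On the first trial the error equals the full reward, because nothing was expected; after many trials the error is near zero, mirroring the finding that fully expected rewards no longer activate dopaminergic neurons.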

Despite these findings, Berridge (2007) maintains that dopamine does not enhance learning. That is, other mechanisms seem to underpin learning. Dopamine may correspond to unexpected rewards, called prediction errors, but not affect learning. That is, even in response to unexpected rewards, dopamine may not be responsible for the formation of associations between events, such as stimuli and responses.

Certainly, some evidence directly confirms the learning hypothesis. When D1 dopamine receptors are obstructed in the nucleus accumbens, attempts to shape specific behaviors with classical conditioning are not maintained later. Memory consolidation seems to be disrupted (for a review, see Berridge, 2007). Indeed, dopamine might facilitate the formation of stronger habits--that is, habits that are maintained even after the rewards are removed (e.g., Everitt & Robbins, 2005).

Similar to this proposition, many researchers maintain that dopamine might elicit excessive learning. To clarify, drugs that increase levels of dopamine imply a significant prediction error. That is, individuals feel the reward was significantly better than expected. Because of this prediction error, they form a very strong association between the stimulus, such as the surrounding environment, and their response, in this instance consuming a drug. In the future, these surroundings will evoke a powerful need to seek these drugs (e.g., Berke, 2003).

Nevertheless, many studies question the proposition that dopamine causes improvements in learning. Mice that cannot produce dopamine because of a genetic mutation, for example, still demonstrate classical conditioning, similar in magnitude to control mice (e.g., Cannon & Palmiter, 2003; for similar findings, see Berridge and Robinson, 1998). Similarly, mice that produce excess dopamine do not seem to learn more rapidly than other mice (e.g., Yin, Zhuang, & Balleine, 2006). These mice show elevated levels of reward seeking but not enhanced learning: excessive learning, therefore, cannot readily explain the reward seeking behavior, contrary to some models of addiction, for example.

The final hypothesis relates more to wanting (Berridge and Robinson, 1998). Specifically, dopamine may increase the salience of rewards associated with some object or event. That is, when dopamine levels are adequate, individuals perceive these objects or events as more salient and attractive.

To clarify, according to this hypothesis, dopamine is not needed to learn the relationships between an object and a reward, as the previous hypothesis assumed. Furthermore, dopamine is not even needed to like the object--a liking that does not always coincide with a motivation to seek this stimulus. Instead, dopamine is needed to translate this learning and liking into a motivation to approach the object or event. Berridge (2007) maintains that most studies can be reconciled with this hypothesis.

The dual-state model of prefrontal cortex dopamine function

Many scholars have maintained that dopamine in the prefrontal cortex facilitates working memory and thus may enhance intelligence. Yet, as Durstewitz and Seamans (2008) show, the association between dopamine and intelligence is complex. Specifically, according to their dual-state model of prefrontal cortex dopamine function, dopamine may increase some facets of intelligence and compromise other facets of intelligence.

In the prefrontal cortex, two classes of receptors can be differentiated: D1 and D2. As Durstewitz and Seamans (2008) argued, the relative activation of these receptors affects intelligence. When D1 activation prevails, patterns of activation in the prefrontal cortex tend to remain stable. That is, higher levels of energy are needed to override the existing state. Consequently, the contents of working memory are readily maintained. That is, working memory is more proficient, enhancing fluid intelligence and reasoning ability.

In contrast, when D2 activation prevails, patterns of activation in the prefrontal cortex tend to shift more frequently, facilitating flexibility but undermining working memory. Consequently, fluid intelligence may dissipate. Nevertheless, facets of intelligence that demand flexibility, perhaps including verbal fluency, might improve.

Both genes and levels of dopamine in prefrontal regions affect whether D1 or D2 activation will prevail. Specifically, one gene, called the catechol-O-methyltransferase gene, affects dopamine functioning in the prefrontal cortex. In this region, dopamine is usually eliminated by degradation in extrasynaptic areas rather than by transport. At codon 158 of this gene, methionine sometimes replaces valine; this variant is called the Met allele. This allele reduces the degradation of dopamine and, therefore, increases levels of dopamine in prefrontal regions.

In carriers of the Met allele, levels of dopamine in the prefrontal cortex tend to be moderate, and these moderate levels of dopamine have been shown to increase activation of D1 receptors and thus enhance fluid intelligence but not flexibility. In carriers of the Val allele, levels of dopamine tend to be low: very low or very high levels increase activation of the D2 receptors and thus enhance flexibility but impair fluid intelligence.
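The dual-state model can be summarized as a toy mapping from dopamine level to the dominant receptor regime. The 0-to-1 dopamine scale and the thresholds below are invented purely for illustration:

```python
def dominant_state(dopamine_level):
    """Hypothetical mapping from prefrontal dopamine level (0-1 scale,
    arbitrary units) to the dominant regime in the dual-state model:
    moderate levels favour a D1-dominated, stable state (proficient
    working memory, better fluid reasoning), whereas very low or very
    high levels favour a D2-dominated, flexible state."""
    if 0.35 <= dopamine_level <= 0.65:
        return "D1-dominant: stable maintenance, enhanced fluid intelligence"
    return "D2-dominant: flexible shifting, enhanced fluency"
```

The key qualitative feature is the inverted-U: both extremes of dopamine level land in the flexible D2 regime, so Met carriers (moderate levels) and Val carriers (low levels) fall on opposite sides of the D1 window.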

Wacker, Mueller, Stemmler, and Hennig (2012) undertook a study that confirms this theory. In this study, to assess fluid intelligence and reasoning ability, participants completed four subtests of Cattell's Culture Fair Test: series, classifications, matrices, and conditions. In addition, to assess other facets of intelligence, participants also completed tests of numerical and verbal knowledge. If participants were carriers of one or two Met alleles, their performance on fluid intelligence tests was elevated, but only after controlling for knowledge. Furthermore, their performance on the knowledge tests was impaired.


Dopamine cannot traverse the blood-brain barrier. However, levodopa, or L-DOPA, a precursor to dopamine, can traverse this barrier. Hence, to increase dopamine levels in the brain, L-DOPA is sometimes administered to patients with Parkinson's disease.

Dopamine is a member of the catecholamine family, with the chemical formula C6H3(OH)2-CH2-CH2-NH2. The enzyme dopamine beta-hydroxylase converts dopamine to noradrenaline. Cocaine and amphetamines inhibit the re-uptake of dopamine.

Principal sites in which dopamine is released

Dopamine release throughout many regions of the brain maintains basal levels of happiness (Esch & Stefano, 2004). Furthermore, rewarding or pleasurable activities, such as the consumption of appetizing food, evoke the release of dopamine in many brain regions.

One of the major targets of dopamine is the ventral striatum--and, especially, a segment called the nucleus accumbens. First, this region is replete with dopamine and opioid receptors. Second, together with the ventral tegmental area, the nucleus accumbens increases activity in response to dopamine and opioid transmission, a response called reward anticipation. Accordingly, the nucleus accumbens and ventral tegmental area seem to mediate responses to reward and, for example, might underpin pathological gambling or excessive eating (Kelley & Berridge, 2002). In particular, the nucleus accumbens, together with the limbic system, primarily mediates the anticipation of rewarding experiences, whereas the amygdala and orbitofrontal cortex are largely involved in ongoing rewarding experiences (Burgdorf & Panksepp, 2006).

Dopamine projections from the nucleus accumbens and ventral tegmental area project significantly to the orbitofrontal cortex. The orbitofrontal cortex mediates cognitive processes that evaluate the rewards, determining whether future outcomes are likely to be positive or negative. Accordingly, the orbitofrontal cortex regulates the appreciation of rewards (Rolls, 2000).

The orbitofrontal cortex can also heighten the subjective experience of rewards by integrating distinct sensory pathways. This region, for example, can integrate the sense of smell and taste to augment the experience of pleasure in response to food.

The nucleus accumbens

The nucleus accumbens is part of the striatum--a region in the basal ganglia. To clarify, the basal ganglia are a cluster of nuclei, located towards the center of the brain, near the amygdala for instance. Overall, the basal ganglia comprise two large areas, the striatum and the globus pallidus, and two smaller areas, the substantia nigra and the subthalamic nucleus.

The striatum can be divided into the dorsal and ventral regions. The dorsal striatum comprises the caudate and the putamen, which are divided by the internal capsule. These regions facilitate the selection and initiation of actions. The ventral striatum comprises the olfactory tubercle and nucleus accumbens.

The nucleus accumbens comprises two distinct structures: nucleus accumbens core and the nucleus accumbens shell. Furthermore, the nucleus accumbens is part of the mesolimbic pathway, a pathway that is often, but not universally, assumed to be the key underpinning of rewards.

In particular, this pathway progresses from the ventral tegmental area, in the midbrain. These neurons produce dopamine, GABA, and glutamate. The dopaminergic neurons activate the nucleus accumbens. Furthermore, neurons from the amygdala, hippocampus, and medial prefrontal cortex that release glutamate also project onto the nucleus accumbens. The nucleus accumbens then projects axons that release GABA, an inhibitory neurotransmitter, onto the ventral parts of the globus pallidus. The nucleus accumbens also projects onto the substantia nigra. Most addictive drugs, such as cocaine, increase the production of dopamine in the nucleus accumbens.

The nucleus accumbens seems to be vital to the placebo effect. For example, in some studies, participants are informed they will receive either a drug or placebo, but actually receive only a placebo. Next, participants are asked to evaluate the extent to which they feel the drug will be effective. If participants expect a significant improvement as a consequence of the drug, the nucleus accumbens tends to release more dopamine and be activated to a greater extent. Accordingly, the nucleus accumbens seems to correspond to the expectation of positive effects, perhaps initiating neural and physiological responses that facilitate these changes.


Alloway, T. P., Gathercole, S. E., Kirkwood, H., & Elliott, J. (2008). Evaluating the validity of the Automated Working Memory Assessment. Educational Psychology, 28, 725-734.

Ashby, F. G., Valentin, V. V., & Turken, A. U. (2002). The effects of positive affect and arousal on working memory and executive attention: Neurobiology and computational models. In S. Moore & M. Oaksford (Eds.), Emotional contagion: From brain to behavior (pp. 245-287). Amsterdam, the Netherlands: Benjamins.

Baddeley, A. (1996). Exploring the central executive. Quarterly Journal of Experimental Psychology Section A: Human Experimental Psychology, 49, 5-28.

Baddeley, A. D. (2000). The episodic buffer: A new component of working memory? Trends in Cognitive Science, 4, 417-423.

Baddeley, A. D. (2003). Working memory: Looking back and looking forward. Nature Reviews, Neuroscience, 4, 829-839.

Baddeley, A. D. (2007). Working memory, thought, and action. Oxford, UK: Oxford University Press.

Baddeley, A. D., & Hitch, G. J. (1974). Working memory. In G. A. Bower (Ed.), Recent advances in learning and motivation (Vol. 8, pp. 47-89). New York: Academic Press.

Baddeley, A. D., & Larsen, J. D. (2007). The phonological loop unmasked? A comment on the evidence for a "perceptual-gestural" alternative. Quarterly Journal of Experimental Psychology, 60, 497-504.

Baddeley, A. D., Baddeley, H. A., Bucks, R. S., & Wilcock, G. C. (2001). Attentional control in Alzheimer's disease. Brain, 124, 1492-1508.

Baddeley, A. D., Chincotta, D., & Adlam, A. (2001). Working memory and the control of action: evidence from task switching. Journal of Experimental Psychology. General, 130, 641-57.

Baddeley, A. D., Grant, W., Wight, E., & Thomson, N. (1975) Imagery and visual working memory. In: Attention and Performance V (eds P. M. A. Rabbitt & S. Dornic), pp. 205-17. Academic Press, London.

Baddeley, A.D. (1986). Working memory. Oxford: Clarendon Press.

Berke, J. D. (2003). Learning and memory mechanisms involved in compulsive drug use and relapse. In J. Q. Wang (Ed) Drugs of abuse: neurological reviews and protocols (Methods in Molecular Medicine) (pp. 75-101). Totowa, NJ: Humana.

Berman, M. G., Nee, D. E., Casement, M., Kim, H. S., Deldin, P., Kross, E., ... & Jonides, J. (2011). Neural and behavioral effects of interference resolution in depression and rumination. Cognitive, Affective, and Behavioral Neuroscience, 11, 85-96. doi: 10.3758/s13415-010-0014-x

Berridge, K. C. (2007). The debate over dopamine's role in reward: the case for incentive salience. Psychopharmacology, 191, 391-431.

Berridge, K. C., & Robinson, T. E. (1998). What is the role of dopamine in reward: Hedonic impact, reward learning, or incentive salience? Brain Research Reviews, 28, 309-369.

Botvinick, M. M., Huffstetler, S., & Mcguire, J. T. (2009). Effort discounting in human nucleus accumbens. Cognitive, Affective, & Behavioral Neuroscience, 9 , 16-27. doi:10.3758/CABN.9.1.16

Breiter, H. C., Aharon, I., Kahneman, D., Dale, A., & Shizgal, P. (2001). Functional imaging of neural responses to expectancy and experience of monetary gains and losses. Neuron, 30, 619-639.

Burgdorf, J., & Panksepp, J. (2006). The neurobiology of positive emotions. Neuroscience and Biobehavioral Reviews, 30, 173-187.

Bush, G. (2010). Attention-deficit/hyperactivity disorder and attention networks. Neuropsychopharmacology, 35, 278-300.

Cannon, C. M., & Palmiter, R. D. (2003). Reward without dopamine. Journal of Neuroscience, 23, 10827-10831.

Chemali, Z. N., Chahine, L. M., & Naasan, G. (2008). On happiness: A minimalist perspective on a complex neural circuitry and its psychosocial constructs. Journal of Happiness Studies, 9, 489-501.

Colzato, L. S., Van Wouwe, N. C., & Hommel, B. (2007). Feature binding and affect: Emotional modulation of visuo-motor integration. Neuropsychologia, 45, 440-446.

Costa, V. D., Lang, P. J., Sabatinelli, D., Bradley, M. M., & Versace, F. (2010). Emotional imagery: Assessing pleasure and arousal in the brain's reward circuitry. Human Brain Mapping, 31, 1446-1457. doi:10.1002/hbm.20948.

Delgado, M. R., Nystrom, L. E., Fissell, C., Noll, D. C., & Fiez, J. A. (2000). Tracking the hemodynamic responses to reward and punishment in the striatum. Journal of Neurophysiology, 84, 3072-3077.

Depue, R. A., & Collins, P. F. (1999). Neurobiology of the structure of personality: Dopamine, facilitation of incentive motivation, and extraversion. Behavioral and Brain Sciences, 22, 491-517. doi:10.1017/S0140525X99002046

Drabant, E. M., Hariri, A. R., Meyer Lindenberg, A., Munoz, K. E., Mattay, V. S., Kolachana, B. S., et al. (2006). Catechol O-methyltransferase Val158Met genotype and neural mechanisms related to affective arousal and regulation. Archives of General Psychiatry, 63, 1396-1406.

Dreisbach, G., & Goschke, T. (2004). How positive affect modulates cognitive control: Reduced perseveration at the cost of increased distractibility. Journal of Experimental Psychology: Learning, Memory, and Cognition, 30, 343-353.

Durstewitz, D., & Seamans, J. K. (2008). The dual-state theory of prefrontal cortex dopamine function with relevance to catechol-O-methyltransferase genotypes and schizophrenia. Biological Psychiatry, 64, 739-749. doi:10.1016/j.biopsych.2008.05.015

Elliott, R., Friston, K. J., & Dolan, R. J. (2000). Dissociable neural responses in human reward systems. Journal of Neuroscience, 20, 6159-6165.

Esch, T., & Stefano, G. B. (2004). The neurobiology of pleasure, reward processes, addiction and their health implications. Neuro Endocrinology Letters, 25, 235-251.

Everitt, B. J., & Robbins, T. W. (2005). Neural systems of reinforcement for drug addiction: From actions to habits to compulsion. Nature Neuroscience, 8, 1481-1489.

Flaherty, A. W. (2005). Frontotemporal and dopaminergic control of idea generation and creative drive. Journal of Comparative Neurology, 493,147-153.

Flanagan, D. P., & Harrison, P. L. (Eds.). (2005). Contemporary intellectual assessment: Theories, tests, and issues (2nd ed.) New York: Guilford Press.

Floresco, S. B., & Phillips, A. G. (2001). Delay-dependent modulation of memory retrieval by infusion of a dopamine D1 agonist into the rat medial prefrontal cortex. Behavioral Neuroscience, 115, 934-939.

Hasler, G., Fromm, S., Carlson, P. J., Luckenbaugh, D. A., Waldeciz, T., Geraci, M., et al. (2008). Neural response to catecholamine depletion in unmedicated subjects with major depressive disorder in remission and healthy subjects. Archives of General Psychiatry, 65, 521-531.

Kelley, A. E., & Berridge, K. C. (2002). The neuroscience of natural rewards: Relevance to addictive drugs. Journal of Neuroscience, 22, 3306-3311.

Kelley, A. E., Will, M. J., Steininger, T. L., Zhang, M., & Haber, S. N. (2003). Restricted daily consumption of highly palatable food (chocolate ensure) alters striatal enkephalin gene expression. The European Journal of Neuroscience, 18, 2592-2598.

Kimberg, D. Y., D'Esposito, M., & Farah, M. J. (1997). Effects of bromocriptine on human subjects depend on working memory capacity. Neuroreport, 8, 3581-3585.

Mantyla, T., Carelli, M. G., & Forman, H. (2007). Time monitoring and executive functioning in children and adults. Journal of Experimental Child Psychology, 96, 1-19.

McClure, S. M., York, M. K., & Montague, P. R. (2004). The neural substrates of reward processing in humans: The modern role of fMRI. Neuroscientist, 10, 260-268.

Menon, V., & Levitin, D. J. (2005). The rewards of music listening: Response and physiological connectivity of the mesolimbic system. NeuroImage, 28, 175-184.

Nicola, S. M., Yun, I. A., Wakabayashi, K. T., & Fields, H. L. (2004). Firing of nucleus accumbens neurons during the consummatory phase of a discriminative stimulus task depends on previous reward predictive cues. Journal of Neurophysiology, 91, 1866-1882.

O'Donnell, P., & Goto, Y. (2001). Synchronous activity in the hippocampus and nucleus accumbens in vivo. Journal of Neuroscience, 21, RC131

Rolls, E. T. (2000). The orbitofrontal cortex and reward. Cerebral Cortex, 10, 284-294.

Sabatinelli, D., Lang, P. J., Bradley, M. M., Costa, V. D., & Versace, F. (2007). Pleasure rather than salience activates human nucleus accumbens and medial prefrontal cortex. Journal of Neurophysiology, 98, 1374-1379.

Salamone, J. D., Correa, M., Farrar, A., & Mingote, S. M. (2007). Effort-related functions of nucleus accumbens dopamine and associated forebrain circuits. Psychopharmacology, 191, 461-482.

Salamone, J. D., Correa, M., Mingote, S. M., & Weber, S. M. (2003). Nucleus accumbens dopamine and the regulation of effort in food-seeking behavior: Implications for studies of natural motivation, psychiatry, and drug abuse. Journal of Pharmacology & Experimental Therapeutics, 305, 1-8.

Salamone, J. D., Correa, M., Mingote, S. M., Weber, S. M., & Farrar, A. M. (2006). Nucleus accumbens dopamine and the forebrain circuitry involved in behavioral activation and effort-related decision making: Implications for understanding anergia and psychomotor slowing in depression. Current Psychiatry Reviews, 2, 267-280.

Salamone, J. D., Cousins, M. S., & Bucher, S. (1994). Anhedonia or anergia? Effects of haloperidol and nucleus accumbens dopamine depletion on instrumental response selection in a T-maze cost/benefit procedure. Behavioural Brain Research, 65, 221-229.

Schultz, W., Tremblay, L., & Hollerman, J. R. (2000). Reward processing in primate orbitofrontal cortex and basal ganglia. Cerebral Cortex, 10, 272-283.

Smith, S. R., Servesco, A. M., Edwards, J. W., Rahban, R., Barazani, S., Nowinski, L. A., Little, J. A., Blazer, A. L., & Green, J. G. (2008). Exploring the validity of the comprehensive trail making test. Clinical Neuropsychologist, 22, 507-518.

Tobler, P. N., Fiorillo, C. D., & Schultz, W. (2005). Adaptive coding of reward value by dopamine neurons. Science, 307, 1642-1645.

Wacker, J., Mueller, E. K., Stemmler, G., & Hennig, J. (2012). How to consistently link extraversion and intelligence to the catechol-O-methyltransferase (COMT) gene: On defining and measuring psychological phenotypes in neurogenetic research. Journal of Personality and Social Psychology, 102, 427-444. doi: 10.1037/a0026544

Waelti, P., Dickinson, A., & Schultz, W. (2001). Dopamine responses comply with basic assumptions of formal learning theory. Nature, 412, 43-48.

Walton, M. E., Kennerley, S. W., Bannerman, D. M., Phillips, P. E. M., & Rushworth, M. F. S. (2006). Weighing up the benefits of work: Behavioral and neural analyses of effort-related decision making. Neural Networks, 19, 1302-1314.

Wu, H. M., Wang, X. L., Chang, C. W., Li, N., Gao, L., Geng, N., Ma, J. H., Zhao, W., & Gao, G. D. (2010). Preliminary findings in ablating the nucleus accumbens using stereotactic surgery for alleviating psychological dependence on alcohol. Neuroscience Letters, 473, 77-81.

Yin, H. H., Zhuang, X., & Balleine, B. W. (2006). Instrumental learning in hyperdopaminergic mice. Neurobiology of Learning and Memory, 3, 238-283.


Last Update: 6/8/2016