Conventional wisdom holds that adolescence is a time of “storm and stress,” characterized by increased risk taking and sensitivity to rewards (Hall, 1904; Casey et al., 2010). Psychobiological models provide important caveats but generally support this conventional wisdom. For example, according to a developmental imbalance model, reward processing matures before self-control; this temporary imbalance, as well as underconnectivity between reward-sensitive and self-regulatory regions of the brain, drives adolescent risk taking (Somerville et al., 2010; Casey, 2015). Other models of adolescent psychobiology, such as the dual systems model (Smith et al., 2013) and the triadic model (in which aversive processing also influences risk taking; Ernst, 2014), also posit opposing roles for reward-sensitive and self-regulatory circuits, and attribute adolescent risk taking to an asymmetry between them. Structural and functional asymmetry, as well as underconnectivity, between reward-sensitive and self-regulatory brain regions is reliably documented in adolescents (Hagmann et al., 2010; Mills et al., 2014; Baker et al., 2015).
Nonetheless, these models have come under scrutiny when it comes to explaining real-world risk taking. As discussed by Bjork and Pardini (2015), although the peak imbalance in brain circuitry occurs between the ages of 14 and 16 years, the peak in binge drinking, risky sex, and unintentional injury, including death, does not occur until the ages of 19–23 years in the United States. Similarly, a recent study found that, although striatal reward regions matured structurally before prefrontal control regions, this asymmetry was not linked to self-reported real-life risk taking (Mills et al., 2014). Despite correlational evidence that both reward sensitivity and laboratory risk taking peak during adolescence, there is little evidence that reward sensitivity causes risk taking in either the laboratory or the real world (cf. Galvan et al., 2007; van Duijvenvoorde et al., 2014). Thus, the link between reward response and risk taking remains elusive.
Conventional wisdom notwithstanding, some have questioned whether the average teenager lives up to the reckless stereotype (Willoughby et al., 2014; Casey, 2015). Instead, some have argued that reckless risk taking in real life is limited to a subset of adolescents, whose behavior reflects a lifelong trait rather than a developmental stage (Bjork and Pardini, 2015). Others have argued that the propensity for risk taking may be present during childhood but may go undetected until opportunities for risk taking emerge during adolescence (for review and discussion, see Reyna and Farley, 2006). Thus, the developmental trajectory of risk taking remains controversial, as does the stereotype of the rebellious adolescent.
A study by Braams et al. (2015), published in The Journal of Neuroscience, provides additional data to inform these debates through the use of a longitudinal design, a large sample (n = 254 at 2-year follow-up), and both child and adult comparison groups. Using fMRI, the researchers measured activation of the nucleus accumbens (NAcc), a region of the ventral striatum that consistently responds to rewards, while participants guessed whether a computer-generated coin toss would land on heads or tails. Participants were notified that they had won money after each correct guess, or lost money after each incorrect guess. Outside of the scanner, participants completed a risky decision-making task, the Balloon Analogue Risk Task (BART), in which they inflated a computerized balloon pump-by-pump. Participants earned a monetary reward each time they chose to pump the balloon but lost all reward if the balloon burst. Thus, larger rewards could be earned by taking more risks.
The results of this study provide insight into the developmental trajectory of reward and risk processing and inform the debate over individual versus developmental differences. Regarding the developmental trajectory, the results showed an adolescent peak in both reward processing and risk taking. Adolescents showed a higher NAcc response to reward outcomes (i.e., money won after correctly guessing heads or tails) than did either children or adults, consistent with other reports of ventral striatum response to reward feedback (Van Leijenhorst et al., 2010). Risk taking on the BART also peaked during adolescence, consistent with previous findings (van Duijvenvoorde et al., 2015). Surprisingly, however, the adolescents who showed the strongest NAcc response to reward were not necessarily those who took the most risks.
Braams et al. (2015) found individual differences not only in the level of reward responsiveness at each developmental time point, but also in the direction of change over time. Thus, some participants were more reward-sensitive than others, and for some, that sensitivity increased with age, whereas for others, it decreased. Other studies have highlighted the prevalence of individual differences among adolescents in both risk and reward sensitivity (Cservenka et al., 2013; van Duijvenvoorde et al., 2015). Bjork and Pardini (2015) note that age typically explains little of the variance in NAcc response, even when there is a significant effect of age, because there is high variability within age groups. In Braams et al. (2015), although the average developmental trajectory of NAcc activation was an inverted U-shape, many participants displayed the opposite pattern (Braams et al., 2015, their Fig. 4A), underscoring the presence of both individual and developmental differences.
The study by Braams et al. (2015) also provides insight into the relation between reward processing and risk taking. Although risk taking on the BART, measured by the number of burst balloons, peaked in adolescence, the adolescents with the highest NAcc response to rewards were not necessarily those who took the most risks on the BART (i.e., model fit did not improve when the BART was added as a predictor of NAcc response; Braams et al., 2015, their Table 5). This dissociation is surprising in light of evidence suggesting that reward processing may drive risk taking, particularly during adolescence (Galvan et al., 2007; Chein et al., 2011). For example, Reyna et al. (2011) found that adolescents, compared with adults, were more willing to take risks for high-magnitude rewards.
These findings may be reconciled by noting that the BART, like most tasks used to assess adolescent risk taking, does not provide the conditions necessary to dissociate reward sensitivity from risk sensitivity. In the BART, risks and rewards are not orthogonally varied: each additional pump simultaneously increases the potential reward and the amount that can be lost. Therefore, although higher risks can be traded off against higher rewards, it is not clear whether a person who pumps the balloon more times is highly sensitive to reward or merely insensitive to risk. This is also a limitation of other tasks commonly used to assess adolescent risk taking. For example, the wheel-of-fortune task, in which participants choose between a smaller, surer reward and a larger, riskier reward, cannot dissociate reward sensitivity from risk tolerance (Cservenka et al., 2013). Tasks that independently vary risk and reward, such as the Columbia Card Task (Figner et al., 2009; van Duijvenvoorde et al., 2015) and the risky choice framing task (Reyna et al., 2011), have consistently found an adolescent peak in risk taking, but have reported inconsistent findings regarding whether reward sensitivity develops linearly (van Duijvenvoorde et al., 2015) or quadratically (Galvan et al., 2006; Steinberg, 2010; Reyna et al., 2011).
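This identifiability problem can be made concrete with a toy expected-value model (our illustration only; neither Braams et al., 2015, nor the other studies cited modeled the BART this way). Suppose an agent keeps pumping as long as a subjectively weighted expected gain from one more pump exceeds the weighted expected loss. Because only the ratio of the two weights determines when pumping stops, an agent with doubled reward sensitivity behaves identically to one with halved risk sensitivity; the parameter names and values below are hypothetical.

```python
def pumps_taken(reward_weight, risk_weight, p_burst=0.1, gain=1.0, max_pumps=64):
    """Toy BART-like agent: pump while weighted expected gain of one more
    pump exceeds the weighted expected loss of bursting."""
    banked = 0.0  # reward accumulated so far on this balloon
    for pump in range(max_pumps):
        expected_gain = reward_weight * (1 - p_burst) * gain
        expected_loss = risk_weight * p_burst * (banked + gain)
        if expected_gain <= expected_loss:
            return pump  # agent stops and cashes out
        banked += gain
    return max_pumps

# Two distinct psychological profiles with the same weight ratio:
high_reward_sensitivity = pumps_taken(reward_weight=2.0, risk_weight=1.0)
low_risk_sensitivity = pumps_taken(reward_weight=1.0, risk_weight=0.5)
# Both agents stop after the same number of pumps, so pump counts
# alone cannot tell reward sensitivity from risk insensitivity.
print(high_reward_sensitivity, low_risk_sensitivity)
```

The point of the sketch is only that behavior on the task identifies the ratio of reward weighting to risk weighting, not either quantity separately, which is why tasks that vary risk and reward orthogonally are needed.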
Some studies have suggested that pubertal hormones underlie increases in both risk taking and reward sensitivity during adolescence (van Duijvenvoorde et al., 2014). However, reports have been inconsistent as to whether pubertal development adds explanatory power beyond that of age (Peper and Dahl, 2013). Braams et al. (2015) found a quadratic relationship between age and bilateral NAcc activation; in contrast, they found a linear relationship between pubertal development and left NAcc activation (Braams et al., 2015, their Table 6). There was no relationship between pubertal development and activation in the right NAcc. Thus, even though age significantly predicted pubertal development (Braams et al., 2015, their Fig. 5), age and pubertal development yielded different models of NAcc development.
Pubertal hormones, such as testosterone, may influence risk and reward processing via an effect on social processing. Gonadal hormone levels are associated with perspective taking and other aspects of social cognition (Peper and Dahl, 2013), and much evidence suggests that social context influences risk taking during adolescence. For example, in a simulated driving task, adolescents took more risks in the presence of peers than alone, and NAcc activation predicted risk taking in the presence of peers (Chein et al., 2011). A study using the same driving task showed that adolescents took fewer risks in the presence of their mothers than they took alone, and ventral striatum activation during risky decisions decreased in the presence of their mothers (Telzer et al., 2015b). Similarly, an earlier study by Braams et al. (2014) reported that adolescent ventral striatum activation increased in response to rewards received by the participant and their best friend, but decreased following rewards received by a disliked peer. Reward processing in adolescence may also be affected by social experience outside of the laboratory. Telzer et al. (2015a) found that more supportive relationships with peers predicted less NAcc activation when adolescents took risks during the BART.
One of the most exciting recent advances in adolescent brain research is work suggesting that adolescent reward sensitivity can be channeled in service of prosocial goals. For example, increased NAcc activation when adolescents chose rewards for their family, compared with rewards for themselves, predicted decreased real-life risk taking at a 1-year follow-up (Telzer et al., 2013). Reward sensitivity per se may not lead to risk taking; instead, its influence may depend on the traits with which it is combined. For example, using a variant of the BART, Humphreys et al. (2013) found that the combination of high sensation seeking and high associative sensitivity (the tendency to find meaningful associations in one's environment) predicted fewer balloon explosions and more points earned in the condition that offered the highest benefit from experience-based learning. In addition, incentives improve performance on a variety of cognitive control tasks (Paulsen et al., 2015), suggesting that reward response may either support or hinder goal-oriented behavior, depending on the context.
In summary, despite recent progress in mapping the neural substrates and developmental trajectory of adolescent reward sensitivity and risk taking, the relation between these functions remains poorly characterized. Tasks that dissociate risk from reward will be important to this endeavor. Future work should explore the possibility that the unique adolescent profile of reward sensitivity and risk sensitivity can be harnessed to support goal-consistent behavior.
Footnotes
Editor's Note: These short, critical reviews of recent papers in the Journal, written exclusively by graduate students or postdoctoral fellows, are intended to summarize the important findings of the paper and provide additional insight and commentary. For more information on the format and purpose of the Journal Club, please see http://www.jneurosci.org/misc/ifa_features.shtml.
- Correspondence should be addressed to Christina F. Chick, Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, 401 Quarry Road, Stanford, CA 94305. cchick{at}stanford.edu