Research Articles, Behavioral/Cognitive

No Effect of Commercial Cognitive Training on Brain Activity, Choice Behavior, or Cognitive Performance

Joseph W. Kable, M. Kathleen Caulfield, Mary Falcone, Mairead McConnell, Leah Bernardo, Trishala Parthasarathi, Nicole Cooper, Rebecca Ashare, Janet Audrain-McGovern, Robert Hornik, Paul Diefenbach, Frank J. Lee and Caryn Lerman
Journal of Neuroscience 2 August 2017, 37 (31) 7390-7402; https://doi.org/10.1523/JNEUROSCI.2832-16.2017
Author affiliations: 1Departments of Psychology and 2Psychiatry, and 3Annenberg School for Communication, University of Pennsylvania, Philadelphia, Pennsylvania 19104, and 4Department of Digital Media, Drexel University, Philadelphia, Pennsylvania 19104

Abstract

Increased preference for immediate over delayed rewards and for risky over certain rewards has been associated with unhealthy behavioral choices. Motivated by evidence that enhanced cognitive control can shift choice behavior away from immediate and risky rewards, we tested whether training executive cognitive function could influence choice behavior and brain responses. In this randomized controlled trial, 128 young adults (71 male, 57 female) participated in 10 weeks of training with either a commercial web-based cognitive training program or web-based video games that do not specifically target executive function or adapt the level of difficulty throughout training. Pretraining and post-training, participants completed cognitive assessments and functional magnetic resonance imaging during performance of the following validated decision-making tasks: delay discounting (choices between smaller rewards now vs larger rewards in the future) and risk sensitivity (choices between larger riskier rewards vs smaller certain rewards). Contrary to our hypothesis, we found no evidence that cognitive training influences neural activity during decision-making; nor did we find effects of cognitive training on measures of delay discounting or risk sensitivity. Participants in the commercial training condition improved with practice on the specific tasks they performed during training, but participants in both conditions showed similar improvement on standardized cognitive measures over time. Moreover, the degree of improvement was comparable to that observed in individuals who were reassessed without any training whatsoever. Commercial adaptive cognitive training appears to have no benefits in healthy young adults above those of standard video games for measures of brain activity, choice behavior, or cognitive performance.

SIGNIFICANCE STATEMENT Engagement of neural regions and circuits important in executive cognitive function can bias behavioral choices away from immediate rewards. Activity in these regions may be enhanced through adaptive cognitive training. Commercial brain training programs claim to improve a broad range of mental processes; however, evidence for transfer beyond trained tasks is mixed. We undertook the first randomized controlled trial of the effects of commercial adaptive cognitive training (Lumosity) on neural activity and decision-making in young adults (N = 128) compared with an active control (playing on-line video games). We found no evidence for relative benefits of cognitive training with respect to changes in decision-making behavior or brain response, or for performance on cognitive tasks beyond those specifically trained.

  • cognitive training
  • delay discounting
  • impulsivity
  • neuroimaging
  • working memory

Introduction

Individuals are often confronted with choices between rewards that vary in value, risk, and timing. Individuals vary in their preference for immediate over future rewards (delay discounting) and for certain versus risky rewards (risk sensitivity; Holt and Laury, 2002; Kable and Glimcher, 2007; Levy et al., 2010), and these preferences affect health, educational, and other life outcomes (Bickel and Marsch, 2001; Duckworth and Seligman, 2005; Kirby et al., 2005; Reimers et al., 2009; MacKillop et al., 2011; Meier and Sprenger, 2012).

Several lines of evidence suggest that executive functions may promote the choice of delayed over immediate rewards. Measures of cognitive ability and working memory are reliably correlated with reduced discounting (Shamosh and Gray, 2008; Burks et al., 2009), and similar dorsolateral prefrontal cortical (dlPFC) regions are engaged during working memory and delay discounting tasks (Wesley and Bickel, 2014). Several neuroimaging studies demonstrate that engaging the dlPFC during decision-making can affect value-related activity in ventromedial prefrontal cortex (vmPFC) and ventral striatum (VS; Hare et al., 2009; Rushworth et al., 2011; Jimura et al., 2013; FitzGerald et al., 2014; Vaidya and Fellows, 2015; Bissonette and Roesch, 2016), biasing choices away from immediate rewards (DelParigi et al., 2007; Hare et al., 2009; Kober et al., 2010; Hare et al., 2011). A similar principle may apply to risky rewards, as risk-averse individuals exhibit higher activity in dlPFC (Christopoulos et al., 2009; Gianotti et al., 2009) and the disruption of dlPFC leads to more risk-seeking choices (Knoch et al., 2006). Based on these findings, interventions that enhance executive function could shift decision-making away from immediate and risky rewards (Bickel et al., 2011; McClure and Bickel, 2014; Wesley and Bickel, 2014).

Recent evidence suggests that executive function may be enhanced through adaptive computerized cognitive training (Ball et al., 2002; Willis et al., 2006; Morrison and Chein, 2011; Nouchi et al., 2013; Au et al., 2015; Hardy et al., 2015), and that cognitive training can alter dlPFC activity in a manner reflecting increased capacity or recruitment of additional neural resources (Olesen et al., 2004; Dahlin et al., 2008; Takeuchi et al., 2011; Jolles et al., 2013). The one study to test the effects of cognitive training on decision-making found reduced discounting in a small sample of stimulant addicts (Bickel et al., 2011). If cognitive training reduces delay discounting, this would have important implications for the prevention and treatment of addiction, obesity, and other disorders related to unhealthy behaviors, but there is reason for skepticism. Some large individual studies, reviews, and meta-analyses have concluded that the benefits of training do not transfer to cognitive outcomes beyond the trained tasks (Owen et al., 2010; Shipstead et al., 2012; Melby-Lervåg and Hulme, 2013; Thompson et al., 2013; Roberts et al., 2016), and no well-powered, well-controlled randomized trial has examined the effects of cognitive training on decision-making and brain activity.

In this first randomized controlled trial of the effects of adaptive cognitive training on choice behavior and neural responses, 128 young adults received 10 weeks of a web-based computerized intervention, consisting of either commercially available adaptive cognitive training or control training using computer games delivered in the same manner. The control training was designed to account not just for nonspecific placebo and social desirability effects, but also for two components believed to be critical to efficacy of adaptive cognitive training (Morrison and Chein, 2011; Shipstead et al., 2012). Unlike cognitive training, control games were not explicitly designed to tax executive functions and were not adaptive (i.e., difficulty levels were not adjusted over the course of training to users' current level of performance). All participants completed cognitive assessments pretraining and post-training, as well as functional magnetic resonance imaging (fMRI) during performance of delay discounting and risk sensitivity tasks. We hypothesized that cognitive training would enhance cognitive control processes and bias decision-making and neural activity away from choices of immediate or risky rewards.

Materials and Methods

All procedures were approved by the University of Pennsylvania Institutional Review Board. This trial was registered at clinicaltrials.gov (reg. no. NCT01252966).

Participants and eligibility

Individuals between 18 and 35 years of age who reported home computer and internet access could participate. Three hundred ninety-five participants provided informed consent and completed an in-person eligibility screen. The in-person eligibility screen included a brief IQ test to identify those with low/borderline intelligence (score of <90 on Shipley Institute of Living Scale, n = 10; Zachary, 1986), an fMRI safety form to assess fMRI contraindications (n = 22), and baseline assessments of delay discounting and risk sensitivity. Participants exhibiting extreme choice behavior were not eligible to be randomized (discount rate, k < 0.0017, n = 34; discount rate, k > 0.077, n = 7; risk sensitivity, α < 0.34, n = 36; or risk sensitivity, α > 1.32, n = 16; both k and α out of range, n = 6; technical error, n = 2). These criteria were chosen based on previous work in our laboratory and were the estimated 10th and 90th percentiles of the normal range in discount rate and the 5th and 95th percentiles of the normal range in risk sensitivity. The purpose of this exclusion was to minimize potential ceiling and floor effects on the behavioral outcomes and to ensure engagement during the scanning tasks. The scanning tasks asked the same questions of every participant and were designed to be sensitive to changes in discount rate or risk sensitivity in a wide range of participants; excluded participants fell outside of this range and would have chosen all or nearly all of one type of option on one of the scanning tasks. Other exclusion criteria were as follows: self-reported history of neurological, psychiatric, or addictive disorders (excluding nicotine), positive breath alcohol reading (>0.01), color blindness, left-handedness, and claustrophobia (n = 11). Eligible participants completed a 1 week “run-up” period to screen for noncompliance. During this week, they were instructed to complete games from the control training 5 times/week for 30 min/d. Those who completed fewer than four sessions were not randomized (n = 54); nor were those who did not complete the pretreatment scan visit (n = 31).

Eligible participants (n = 166) were randomized to condition in blocks of 4 (n = 84 to the cognitive training group and n = 82 to the active control group). Thirty-eight participants (22.9%) were lost to follow-up (20 participants in cognitive training group, 18 participants in active control group); these individuals were younger (mean age, 23 vs 25 years; p = 0.002) and less likely to have completed college (p = 0.02). Thus, the final analyzed sample for this fMRI-based clinical trial included 128 participants (cognitive training group, 64 participants; active control group, 64 participants).

Interventions

Participants in both conditions initiated their assigned training in the week following the baseline fMRI scan (see below). All participants were instructed to complete their assigned web-based training from home 5 times/week for 30 min/session, for a total of 50 sessions over 10 weeks. Participant compliance with training was monitored electronically, and small monetary incentives were provided for completion ($5/session). Adherence was measured as the percentage of assigned sessions that were completed; partial sessions were counted if a participant completed at least 15 min of training. Participants were classified as good adherers if they completed at least 70% of assigned sessions (approximately the top two quartiles) and poor adherers if they completed <70% of assigned sessions.
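The adherence bookkeeping described above (a partial session counts at ≥15 min; good adherence at ≥70% of the 50 assigned sessions) might look like the following sketch. The function name and defaults are illustrative, not taken from the study's code.

```python
def classify_adherence(session_minutes, assigned_sessions=50):
    """Count sessions of at least 15 min as completed and classify the
    participant as a 'good' adherer at >= 70% completion (illustrative
    helper; thresholds follow the rules stated in the text)."""
    completed = sum(1 for m in session_minutes if m >= 15)
    pct = 100.0 * completed / assigned_sessions
    return pct, ("good" if pct >= 70 else "poor")
```

For example, 40 full sessions plus 10 aborted 10 min sessions yields 80% completion and a "good" classification.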

Cognitive training condition.

The cognitive training condition used Lumosity, a commercially available platform (http://www.lumosity.com/). The training program consists of internet-based games that claim to train specific cognitive domains. Many games are based on traditional psychological tasks (such as the flanker task or n-back working memory task), and all are designed to be engaging. All participants were assigned identical games (supplied by Lumosity) in a standardized order that rotated among the following six cognitive domains: working memory (∼27% of games over the 10 week training period); attention (∼13%); flexibility (∼24%); problem solving (∼15%); short-term memory (∼12%); and speed (∼9%). Individual games were ∼2–3 min long (depending on participant response speed), so that a 30 min training session consisted of 10–15 games. A core aspect of cognitive training is that it is adaptive, meaning that difficulty increased progressively across sessions as performance improved. There were a total of 23 possible exercises; examples are provided in Table 1. Standardized feedback on performance was based on the Lumosity performance index (LPI; see below), but participants were not taught specialized cognitive strategies for completing the games.

Table 1.

The Lumosity program has been shown to improve performance on tasks measuring memory, cognitive flexibility, problem solving, and response inhibition to a greater extent than crossword puzzles (Hardy et al., 2015); however, no study has validated the platform against an active condition consisting of nonadaptive video games. In using a wide set of tasks that target different cognitive abilities, Lumosity is similar in approach to several other broad-based cognitive training programs (Owen et al., 2010; Schmiedek et al., 2010; McDougall and House, 2012; Nouchi et al., 2013). As in the literature on training paradigms that specifically target working memory, previous findings regarding broad-based cognitive training are mixed, with some reports of significant improvements (Schmiedek et al., 2010; McDougall and House, 2012; Nouchi et al., 2013; Hardy et al., 2015) and some notable null results (Owen et al., 2010).

Active control condition.

Participants in the active control condition received an active intervention designed to account for the nonspecific effects of cognitive stimulation common to any video game or training program, such as engagement, expectancy, novelty, motivation, and contact (Motter et al., 2016). We used computer video games, which have been used as an active control for cognitive training programs in several previous studies (Kundu et al., 2013; Nouchi et al., 2013). Video games were developed by the Drexel University RePlay Lab (http://replay.drexel.edu/index.html) and included a total of 40 possible games (http://drexelgames.com/); examples are provided in Table 2. Participants were not prompted to complete particular games within each session and could spend as much time on each game as they chose as long as they spent 30 min playing in total. These games were not specifically designed to tax executive functions and therefore were not expected to engage these abilities more than typical computerized games but were designed to be entertaining and engaging. Although these games can become more challenging as one progresses through the game within a session, user performance is not tracked over sessions and game difficulty is not adapted during each session to current user abilities, as in the cognitive training condition (i.e., users start from the beginning of the game each session). Both adaptive testing and the targeting of specific processes are believed to be key components of the efficacy of cognitive training (Morrison and Chein, 2011; Shipstead et al., 2012). At the same time, participants in both groups were given the same information regarding the study purpose (e.g., “we are investigating the effects of certain types of computer games on brain activity and decision-making behavior”), controlling for expectancy effects. The variety of games available in both conditions allowed each to present a novel experience. To control for motivation and contact, participants in both conditions received the same completion incentives and the same weekly phone calls to review study compliance and were blinded to their specific training condition.

Table 2.

Neuroimaging and primary outcomes

Participants completed blood oxygen level-dependent fMRI sessions at baseline and following the 10-week training period. Participants completed the delay discounting and risk sensitivity tasks in the scanner. Task blocks alternated within each session, and order was counterbalanced across participants. All fMRI scans were performed using a Siemens Trio 3 T scanner and a Siemens 32-channel head coil optimized for parallel imaging. High-resolution T1-weighted anatomical images were collected using an MPRAGE sequence (T1 = 1100 ms; 160 axial slices, 0.9375 × 0.9375 × 1.000 mm; 192 × 256 matrix). T2*-weighted functional images were collected using an EPI sequence (voxel size, 3 × 3 × 3 mm; 64 × 64 matrix; 53 axial slices; TR, 3000 ms; TE, 25 ms; 104 volumes). A B0 field map was acquired (TR, 1270 ms; TE 1, 5.0 ms; TE 2, 7.46 ms) to support the off-line estimation of geometric distortion in the functional data.

Delay discounting.

Participants chose between a smaller immediate reward ($20 today) and a larger reward available after a longer delay (e.g., $40 in a month). The immediate reward was fixed, and the magnitude and delay of the larger, later reward varied from trial to trial. Each trial began with the presentation of the later option (amount and delay); the standard immediate option was not displayed. When subjects made their choice, a marker indicating their choice (checkmark if the later option was chosen, “X” if the immediate option was chosen) appeared for 1 s. Subjects had 4 s to make their choice. Subjects made 120 such choices in each session, over four 5 min and 18 s scans.

The primary behavioral outcome was discount rate (k), which was estimated by fitting a logistic regression to choice data. The subjective value (SV) of the choice options was assumed to follow hyperbolic discounting, as follows: SV = A / (1 + kD), where A is the amount of the option, D is the delay until the receipt of the reward (for the immediate option, D = 0), and k is a discount rate parameter that varies across subjects. Higher values of k indicate greater discounting and less tolerance of delay. The proportion of smaller immediate choices was also calculated as a secondary metric of discounting, which does not make assumptions about the parametric form of discounting. A two-parameter quasi-hyperbolic model (Laibson, 1997) was also fit to these data, but as these fits yielded similar conclusions (no change in either condition in either β or δ parameters of the quasi-hyperbolic model), they are not presented in detail here.
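As an illustration, estimating k from binary choices can be sketched as a maximum-likelihood grid search under a logistic choice rule on the subjective-value difference. This is a simplification of the study's logistic-regression fit: the softmax temperature, the grid, and the function names are illustrative choices, not the study's.

```python
import numpy as np

def hyperbolic_sv(amount, delay, k):
    # SV = A / (1 + k*D); the immediate option has D = 0, so its SV = A.
    return amount / (1.0 + k * delay)

def fit_discount_rate(amounts, delays, chose_later, immediate=20.0,
                      k_grid=None, temp=1.0):
    """Grid-search maximum-likelihood estimate of the discount rate k,
    assuming P(choose later) = 1 / (1 + exp(-temp * (SV_later - SV_now))).
    The fixed temperature and grid are illustrative simplifications."""
    if k_grid is None:
        k_grid = np.logspace(-4, 0, 500)
    amounts = np.asarray(amounts, dtype=float)
    delays = np.asarray(delays, dtype=float)
    chose_later = np.asarray(chose_later, dtype=bool)
    best_k, best_ll = k_grid[0], -np.inf
    for k in k_grid:
        dv = hyperbolic_sv(amounts, delays, k) - immediate
        p_later = 1.0 / (1.0 + np.exp(-temp * dv))
        p = np.where(chose_later, p_later, 1.0 - p_later)
        ll = np.log(np.clip(p, 1e-12, 1.0)).sum()
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k
```

Fed a set of choices consistent with, say, k = 0.02, the grid search recovers a value in that neighborhood; in practice the temperature would be estimated jointly rather than fixed.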

Risk sensitivity.

Participants chose between a smaller certain reward (100% chance of $20) and a larger riskier reward (e.g., 50% chance of $40). The certain reward was fixed, and the magnitude and probability of the larger, uncertain reward varied from trial to trial. Each trial began with the presentation of the risky option (amount and probability); the standard certain option was not displayed. When subjects made their choice, a marker indicating that choice (checkmark if the risky option was chosen, “X” if the certain option was chosen) appeared for 1 s. Subjects had 4 s to make their choice. Subjects made 120 such choices in each session, over four 5 min and 18 s scans.

The primary behavioral outcome was the subject's degree of risk sensitivity (α), estimated by fitting a logistic regression to choice data. The SV of the choice options was assumed to follow a power utility function, as follows: SV = p × A^α, where p is the probability of winning amount A and α is a risk sensitivity parameter that varies across subjects. For the risky option, there is always a 1 − p chance of winning nothing. Higher α indicates a larger risk tolerance and lesser degree of risk aversion. The proportion of smaller certain choices was also calculated as a secondary metric of risk sensitivity, which does not make assumptions about the parametric form of risk aversion.
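The power utility rule can be illustrated in the same spirit. The helper names are hypothetical; the $20 certain option matches the task's fixed option, and the certain option's SV is simply 20^α (p = 1).

```python
def risky_sv(p, amount, alpha):
    # SV = p * A**alpha; the 1 - p outcome pays $0 and contributes nothing.
    return p * amount ** alpha

def prefers_risky(p, amount, alpha, certain=20.0):
    # Compare against the fixed certain option (100% chance of $20).
    return risky_sv(p, amount, alpha) > certain ** alpha
```

With α < 1 (risk averse), a 50% chance of $40 is worth less than $20 for sure; with α > 1 (risk seeking), the preference reverses, matching the text's reading of higher α as greater risk tolerance.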

Neuroimaging analysis.

Image processing and analyses were conducted with the FMRIB Software Library (FSL) version 5.0.8. Functional images were motion corrected using MCFLIRT (Motion Correction FMRIB Linear Image Registration Tool), high-pass filtered (cutoff, 104 s), distortion corrected with the B0 map, and spatially smoothed (kernel FWHM, 9 mm). High-resolution anatomical scans were skull stripped with BET (FMRIB Brain Extraction Tool) and coregistered with functional images using boundary-based registration. These were then normalized to the Montreal Neurological Institute (MNI) template via an affine transformation with 12 df.

Subject-level analyses were performed using the FSL tool FEAT (FMRIB fMRI Expert Analysis Tool). Task regressors were time locked to trial onset (event duration, 0.1 s) and convolved with a canonical gamma hemodynamic response function. In one set of GLMs, parametric regressors modeling the subjective value of the variable option (larger delayed or risky option) were generated using the discount rate and risk sensitivity parameters estimated from each subject, and orthogonalized to the task regressor (Kable and Glimcher, 2007; Levy et al., 2010). In a second set of GLMs, categorical regressors modeling whether the variable option (larger delayed or risky option) was chosen were included instead of the parametric value regressors. All GLMs included a regressor that designated missed trials; these trials were excluded from the regressors of interest.

Due to limitations in the single-step variance partitioning of FLAME (FMRIB Local Analysis of Mixed Effects), to approximate a two-group repeated-measures ANOVA, contrasts for the overall mean and the difference between pretreatment and post-treatment sessions were performed at the subject level and then carried up to the group level to analyze potential group, time (scan session), and interaction effects. One-sample t tests were then conducted to test for main effects and effects of time, and two-sample t tests were conducted to test for group and group-by-time interaction effects. Whole-brain analyses were thresholded at p < 0.001 and then corrected at the cluster level for multiple comparisons (p < 0.05) through permutation testing using cluster mass as implemented in the FSL tool randomise (Winkler et al., 2014). Higher-power region of interest (ROI) analyses were also conducted in the dlPFC, vmPFC, and VS. The dlPFC ROI (123 voxels at 2 × 2 × 2 mm; 6.2 mm spherical kernel, centered on MNI coordinates −43, 10, and 29) was based on a meta-analysis identifying overlap between working memory and delay discounting activations (Wesley and Bickel, 2014). The vmPFC and VS ROIs were based on a meta-analysis of value-related neural signals (Bartra et al., 2013). ROI analyses were corrected for multiple comparisons (3 ROIs × 2 tasks × 2 regressors) using Bonferroni's method.
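The Bonferroni correction for the ROI family reduces to dividing the α level by the number of ROI × task × regressor combinations; a minimal sketch of the arithmetic:

```python
def bonferroni_alpha(alpha=0.05, n_rois=3, n_tasks=2, n_regressors=2):
    # One test per ROI x task x regressor combination: 3 * 2 * 2 = 12 here.
    n_tests = n_rois * n_tasks * n_regressors
    return alpha / n_tests
```

So each ROI test is evaluated against a per-comparison threshold of 0.05/12 ≈ 0.0042.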

Fourteen participants were excluded from the neuroimaging analyses due to excessive in-scanner motion (>5% of image-to-image relative mean displacements >0.5 mm, n = 4), excessive missed trials (>10% nonresponses in a single run for more than two runs within a session, n = 6), incomplete or corrupted data (>25% unusable runs within a single session, n = 3), or expressed knowledge of experimental conditions (i.e., active control versus cognitive training, n = 1). Thus, 114 subjects were included in the final analyses of the task fMRI data (mean age, 25.1 years; 51 women overall; cognitive training group, 56 subjects).

Cognitive performance (secondary) outcomes

Cognitive testing was performed 1 week before training, at the mid-training time point (5 weeks into training), and at the end of the 10 week training period. The 1 h cognitive battery included the following assessments: attention (Penn Continuous Performance Task; Kurtz et al., 2001); working memory (visual/spatial n-back; Ragland et al., 2002; Green et al., 2005; Owen et al., 2005; Ehlis et al., 2008); response inhibition (stop signal task; Logan, 1994; Logan et al., 1997); interference control (Stroop test; Stroop, 1935); and cognitive flexibility (color shape task; Miyake et al., 2004; see below for task descriptions). These tasks were selected based on evidence that performance in these domains may improve following cognitive training (Anguera et al., 2013; Ngandu et al., 2015) and may generalize to durable improvements in functioning (Subramaniam et al., 2014). Tasks were also selected to cover a range of distinct facets of executive function, based on behavioral and neural evidence (Wager and Smith, 2003; Laird et al., 2005; Miyake and Friedman, 2012; Aron et al., 2014). Outliers were identified for each cognitive outcome based on pretreatment performance of >3 SDs from the mean.
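The >3 SD outlier rule on pretreatment performance might be implemented as in the sketch below; which SD estimator the study used is not specified, so the sample SD here is an assumption.

```python
import statistics

def flag_outliers(pretreatment_scores, z_cutoff=3.0):
    """Return indices of scores more than z_cutoff SDs from the sample
    mean (sample SD, n - 1 denominator, is an illustrative choice)."""
    mu = statistics.mean(pretreatment_scores)
    sd = statistics.stdev(pretreatment_scores)
    return [i for i, s in enumerate(pretreatment_scores)
            if abs(s - mu) > z_cutoff * sd]
```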

Visual/spatial n-back (working memory).

During the n-back, participants are instructed to remember the location of a stimulus, a gray circle that is ∼5 cm in diameter, as it appears randomly in eight possible locations around the perimeter of a computer screen. The stimulus appears for 200 ms, followed by an interstimulus interval (ISI) of 2800 ms. A crosshair remains visible during the stimulus presentation to cue participants to look at the center of the screen so that all stimuli appearing around the perimeter of the screen can be seen clearly. The n-back task includes four conditions of varying difficulty levels, as follows: the 0-back, 1-back, 2-back, and 3-back. Participants respond only to targets (25% of stimuli) by pressing the SPACEBAR (Green et al., 2005; Owen et al., 2005; Ehlis et al., 2008). The primary outcomes are number correct and correct response time.

Penn continuous performance test (visual attention and vigilance).

This task is based on the Penn continuous performance test (CPT; Kurtz et al., 2001). In this task, a series of red vertical and horizontal lines (seven segment displays) flash in a digital numeric frame (resembling a digital clock). The participant must press the spacebar whenever these lines form complete numbers or complete letters. Stimuli are presented for 300 ms, followed by a fixed 700 ms ISI. The task is divided into two parts, each lasting 3 min, as follows: in the first part, the participant is requested to respond to numbers; and in the second part, the response is to letters. The primary outcomes are number correct and correct response time.

Stop signal task (response inhibition).

In this task, participants are instructed to press labeled keyboard keys as quickly and as accurately as possible to indicate the direction an arrow faces. Audio stop signals are presented on 25% of trials during a 32-trial practice block and during three task blocks of 64 trials each. The initial stop delay in each block is 250 ms and adjusts in 50 ms increments depending on whether the participant successfully inhibits a response (Logan, 1994; Logan et al., 1997). The adjusting stop delay allows the determination of the delay at which inhibition occurs on ∼50% of trials. All trials consist of a 500 ms warning stimulus followed by a 1000 ms go signal (left- and right-facing arrows) and a 1000 ms blank screen intertrial interval. The primary outcome is the stop signal response time, calculated as the difference between the mean response time on successful go trials and the mean stop delay on successful inhibition trials.
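The stop signal response time described above is a simple difference of means; a minimal sketch, with illustrative variable names:

```python
def stop_signal_rt(go_rts_ms, stop_delays_ms):
    """Stop signal response time = mean RT on successful go trials
    minus mean stop delay on successful inhibition trials (ms)."""
    mean_go = sum(go_rts_ms) / len(go_rts_ms)
    mean_delay = sum(stop_delays_ms) / len(stop_delays_ms)
    return mean_go - mean_delay
```

For example, a mean go RT of 500 ms with a mean stop delay of 250 ms yields a stop signal response time of 250 ms.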

Stroop test (resistance to interference).

The Stroop test is a measure of the ability to screen out distracting stimuli (Stroop, 1935). In this task, participants view a series of words on a computer monitor and, using the keyboard, are asked to press the key associated with the color of the word rather than the word itself. Stimuli are presented and remain onscreen until the participant responds or 3.5 s have elapsed (whichever comes first), followed by a fixed 100 ms ISI. Participants are instructed to respond as quickly and accurately as possible. Congruent trials are trials in which the word and color match (e.g., the word “green” appears in the color green). Incongruent trials are trials in which the words are printed in colors that do not match the colors of the words (e.g., the word “red” might appear in green). The primary outcome is the Stroop effect, an interference score calculated as the response time on incongruent trials minus the response time on congruent trials. The Stroop effect measures the ability to suppress a habitual response in favor of an unusual one, taking into account the overall speed of naming.

Color shape task (flexibility).

In each trial of this task (Miyake et al., 2004), a cue letter (C or S) appears above a colored rectangle containing a shape (the outline of a circle or triangle). Participants are instructed to indicate whether the color is red or green when the cue is C, and whether the shape is a circle or triangle when the cue is S. The cue appears 150 ms before the stimulus, and both the cue and the stimulus remain on the screen until the participant responds. The primary outcome is the task switch cost, calculated as the difference in response time between switch trials (cue differs from the previous trial) and stay trials (cue is the same as the previous trial). Smaller switch costs indicate greater cognitive flexibility.

Lumosity performance index.

To track average performance on Lumosity tasks during training, the platform generated a Lumosity Performance Index (LPI), a weighted average of performance across tasks based on percentiles for the user's age group. An exponential smoothing procedure is applied to account for day-to-day fluctuations. The LPI was used to assess improvements on the trained exercises with practice in the cognitive training condition.
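
Lumosity's exact weighting and smoothing parameters are proprietary; the sketch below shows only the generic form of exponential smoothing over a series of daily scores, with a hypothetical smoothing weight `alpha`:

```python
def exponentially_smooth(daily_scores, alpha=0.2):
    """Simple exponential smoothing: each day's index is a weighted blend
    of today's score and the previous smoothed value, damping day-to-day
    fluctuations. alpha is a hypothetical weight, not Lumosity's actual one."""
    smoothed = [daily_scores[0]]
    for score in daily_scores[1:]:
        smoothed.append(alpha * score + (1 - alpha) * smoothed[-1])
    return smoothed

print(exponentially_smooth([100, 200, 200], alpha=0.5))  # [100, 150.0, 175.0]
```

Smaller values of `alpha` weight the running history more heavily, so a single unusually good or bad day moves the index less.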

Follow-up study of test–retest performance on cognitive assessments

After observing improvements on the cognitive assessments for both the active control and cognitive training groups, we performed a follow-up study to examine the effects of repeated testing with these assessments in the absence of any intervention. We recruited 35 participants between 18 and 35 years of age, excluding colorblind individuals and current users of Lumosity on-line training. These participants completed the cognitive testing battery on three occasions, separated by 1 week intervals and with no contact or intervention in the interim. Although this is a shorter delay than the pretraining, mid-training, and post-training assessments in the primary study, our primary concern was the extent of the potential practice effects, and healthy adults show similar practice gains throughout the first 3 months of serial testing (Bartels et al., 2010). Participants who completed fewer than three sessions (n = 5) or showed performance of >3 SDs from the mean on one of the cognitive tasks at the first testing session (n = 1) were excluded from the analysis. The analyzed sample (n = 29) was 69% female and had an average age of 23 years. As the no-contact control group was recruited separately, we were unable to apply methodological procedures (e.g., minimization techniques; Pocock and Simon, 1975; Scott et al., 2002) to reduce the likelihood of baseline differences. Therefore, to better compare the active control and cognitive training groups to this no-contact group, we selected a subset of participants matched on baseline cognitive composite score (see below; n = 25 for all groups). Each participant in the no-contact group was matched with their nearest unmatched neighbors among both the active control and cognitive training participants in ranked baseline performance, excluding match distances beyond a caliper of 0.1 (Stuart, 2010).
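
A minimal sketch of this matching step is below: greedy nearest-neighbor matching without replacement on baseline score, discarding matches farther apart than the 0.1 caliper. The participant IDs and scores are hypothetical, and this is an illustrative simplification rather than the exact procedure used:

```python
def caliper_match(reference, pool, caliper=0.1):
    """Match each reference participant to the nearest unmatched participant
    in `pool` on a baseline score, excluding matches whose distance exceeds
    the caliper. `reference` and `pool` map participant id -> baseline score."""
    available = dict(pool)
    pairs = []
    for ref_id, ref_score in sorted(reference.items(), key=lambda kv: kv[1]):
        if not available:
            break
        best = min(available, key=lambda pid: abs(available[pid] - ref_score))
        if abs(available[best] - ref_score) <= caliper:
            pairs.append((ref_id, best))
            del available[best]  # without replacement
    return pairs

# Hypothetical baseline composite scores
no_contact = {"nc1": -0.50, "nc2": 0.10}
active_control = {"ac1": -0.45, "ac2": 0.90, "ac3": 0.12}
print(caliper_match(no_contact, active_control))  # [('nc1', 'ac1'), ('nc2', 'ac3')]
```

The same routine would be run once against the active control participants and once against the cognitive training participants to yield the three matched subsets.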

Experimental design and statistical analysis

Multiple regression models were estimated for the choice behavior and cognitive outcomes using Stata's xtreg command (StataCorp) with maximum likelihood techniques. Models included terms for main and interacting effects of treatment (active control vs cognitive training) and time point (pretreatment vs post-treatment), including age, sex, and education as covariates. Delay discounting rates (k) were log transformed to normalize the distribution. Cognitive models also included the mid-treatment time point in addition to pretreatment and post-treatment; these models were examined for the full sample and separately within the sample of good adherers (≥70% of sessions completed) to determine whether engagement with the programs affected outcomes. Outliers were excluded based on pretreatment performance of >3 SDs from the mean. To form a composite cognitive performance score, z-scores were calculated separately for each of the five tasks across time points and treatment conditions (tasks for which lower values indicate improved performance were reverse scored), then averaged together within subjects for each time point. For the cognitive training group only, changes in performance on trained tasks (LPI) over time were examined using multiple regression with terms for main and interacting effects of adherence (percentage of assigned sessions completed; continuous measure) and time (day of training period), controlling for age, sex, and education. Pairwise correlation was used to identify baseline correlations between decision-making outcomes and cognitive performance.
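
The composite-score construction can be sketched as follows: z-score each task over all subjects and time points pooled, flip the sign for tasks where lower raw values mean better performance, and average within subject at each time point. Task names and data are hypothetical, and population SD is an assumption of this sketch:

```python
from statistics import mean, pstdev

def cognitive_composite(scores, lower_is_better):
    """scores: {task: {subject: {timepoint: raw value}}}.
    Returns {subject: {timepoint: composite z-score}}."""
    z = {}
    for task, subjects in scores.items():
        # z-score each task over all subjects and time points pooled
        pooled = [v for tps in subjects.values() for v in tps.values()]
        mu, sd = mean(pooled), pstdev(pooled)
        sign = -1.0 if task in lower_is_better else 1.0  # reverse score
        z[task] = {s: {t: sign * (v - mu) / sd for t, v in tps.items()}
                   for s, tps in subjects.items()}
    any_task = next(iter(z.values()))
    return {s: {t: mean(z[task][s][t] for task in z) for t in any_task[s]}
            for s in any_task}

# Hypothetical scores: n-back correct (higher = better),
# Stroop effect in ms (lower = better)
scores = {
    "nback":  {"s1": {"pre": 10, "post": 12}, "s2": {"pre": 8, "post": 10}},
    "stroop": {"s1": {"pre": 100, "post": 90}, "s2": {"pre": 120, "post": 110}},
}
print(cognitive_composite(scores, lower_is_better={"stroop"}))
```

With this construction, a positive composite means above-pooled-average performance, and within-subject increases over time reflect improvement on the battery as a whole.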

Results

Descriptive data

The cognitive training and active control groups did not differ on any baseline variables (p values >0.05; Table 3). Overall, 44% of participants were female, 59% graduated college, and the average age was 25 years. Adherence (percentage of sessions completed) was high across both conditions, as follows: 80% (SD, 19) in the active control group and 74% (SD, 20) in the cognitive training group (F(1,126) = 3.26; p = 0.07). There were no differences between the cognitive training and active control groups in pretreatment delay discounting (cognitive training group: mean logk, −1.82; range, −3.07 to −0.92; active control group: mean logk, −1.79; range, −3.07 to −1.06; F(1,126) = 0.13; p = 0.72) or risk sensitivity (cognitive training group: mean α = 0.68; range, 0.21–1.41; active control group: mean α = 0.65; range, 0.28–1.49; F(1,126) = 0.49; p = 0.49).
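
For intuition about the reported parameters, the sketch below uses the hyperbolic-discounting and power-utility forms commonly paired with these tasks; this is an assumption for illustration, since this section does not define the models. Under these forms, a higher k means steeper discounting of delayed rewards, and α < 1 means risk aversion:

```python
def sv_delayed(amount, delay, k):
    """Hyperbolic discounting (assumed form): SV = A / (1 + k * D)."""
    return amount / (1 + k * delay)

def sv_risky(amount, prob, alpha):
    """Power utility (assumed form): SV = p * A ** alpha."""
    return prob * amount ** alpha

# Higher k -> a delayed reward is worth less today
print(sv_delayed(100, 30, 0.01) > sv_delayed(100, 30, 0.1))   # True
# alpha < 1 -> a 50% chance of $100 is worth less than $50 for sure
print(sv_risky(100, 0.5, 0.68) < sv_risky(50, 1.0, 0.68))     # True
```

The log transform of k reported above (mean logk near −1.8) simply places these steepness parameters on a roughly normal scale for regression.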

Table 3. Baseline variables by condition

Choice (primary outcomes)

There were no effects of training condition on decision-making or changes in decision-making (Fig. 1). There was no main effect of time on discount rates [β = −0.002; 95% confidence interval (CI), −0.03 to 0.03; Wald χ2(1) = 0.02; p = 0.89] or degrees of risk sensitivity (β = 0.008; 95% CI, −0.01 to 0.03; Wald χ2(1) = 0.72; p = 0.40), and no treatment by time interaction effect on discount rates (β = 0.02; 95% CI, −0.09 to 0.13; Wald χ2(1) = 0.11; p = 0.74) or degrees of risk sensitivity (β = −0.006; 95% CI, −0.08 to 0.07; Wald χ2(1) = 0.03; p = 0.87). Similar results were obtained when using the percentage of immediate or certain choices as indexes of delay discounting or risk sensitivity instead of logk or α.

Figure 1.

Decision-making task outcomes. Performance on the delay discounting and risk sensitivity tasks in each group at pretreatment and post-treatment scan sessions. In the multiple regression models, there were no treatment by time interaction effects on decision-making task performance (p values >0.5).

To examine whether participants who were higher discounters or more risk seeking might experience greater benefits from training, we performed an exploratory analysis using multiple regression to examine associations between baseline decision-making and change in decision-making, controlling for age, sex, and education. Although baseline decision-making was significantly associated with change in decision-making (all p values <0.01), these effects did not differ by treatment group (all p values >0.05). To examine the form of this association, we divided participants into tertiles based on their baseline decision-making. The association was clearly driven by regression to the mean, with the lowest discounters exhibiting a trend toward increased discount rates (change in logk, 0.11 ± 0.05; p = 0.054) and the highest discounters exhibiting a trend toward decreased discount rates (change in logk, −0.09 ± 0.05; p = 0.07), and with the most risk-averse individuals becoming more risk tolerant (change in α = 0.06 ± 0.03; p = 0.02) and the most risk-tolerant individuals exhibiting a nonsignificant shift toward more risk aversion (change in α = −0.04 ± 0.04; p = 0.31).

Neural activity (primary outcomes)

There were no effects of condition (cognitive training vs active control) on changes in neural activity during choices (Fig. 2). In a whole-brain analysis, there was robust and widespread choice-related activity (choice vs baseline contrast) that was similar in both tasks and centered in frontal-parietal, cingular-opercular, and sensorimotor regions. There was also robust and widespread value-related activity (parametric subjective value contrast) that was similar in both tasks and centered in previously identified valuation regions (vmPFC, VS, and posterior cingulate) as well as frontal-parietal and cingular-opercular regions activated by the choice task. In the risk sensitivity task, there were increases in choice-related activity from pretreatment to post-treatment in both groups in medial prefrontal, posterior cingulate, and lateral temporal cortex, all regions associated with the “default-mode network” (Raichle et al., 2001). Critically, however, these changes over time did not differ as a function of treatment condition and, therefore, could not be attributed to an effect of cognitive training.

Figure 2.

Whole-brain analyses of neural activity. Mean activation (choice trials vs baseline; A, B) and subjective value effects (C, D) across the whole brain, for both the delay discounting (A, C) and risk sensitivity (B, D) tasks, as well as changes in mean activation from pretreatment to post-treatment in the risk sensitivity task (E), independent of treatment condition. Subjective value effects were determined using parametric regressors based on discount rate and risk sensitivity parameters estimated from each subject and orthogonalized to the task regressor. There were no effects of treatment condition on changes in neural activity over time in either task. All brain images are height thresholded at p < 0.001 to form clusters and are corrected for multiple comparisons using permutation testing on cluster mass at p < 0.05. The 3-D brain images were generated using the surface-rendering tool Surf Ice, developed at the University of South Carolina. Source code for the program is available at www.nitrc.org/projects/surfice/.

To determine whether our whole-brain analysis missed any subtle neural effects in the brain regions we had predicted, we examined choice-related and value-related activity in dlPFC, vmPFC, and VS regions identified in previous meta-analyses (Bartra et al., 2013; Wesley and Bickel, 2014). We had hypothesized that cognitive training would enhance activity in dlPFC in both tasks, leading to enhanced vmPFC/VS activity for delayed rewards and reduced vmPFC/VS activity for risky rewards. However, there were no main effects of testing session or treatment condition, or effects of treatment condition on changes in neural activity in these more sensitive ROI analyses (Fig. 3).

Figure 3.

ROI analyses of neural activity. Mean activation (choice trials vs baseline; top row) and subjective value effects (parametric contrast; bottom row) in dlPFC, vmPFC, and VS ROIs for both the delay discounting and risk sensitivity tasks. There were no effects of treatment condition on changes in neural activity for any ROI in either task. Solid lines, cognitive training; dashed lines, active control. Pre, Pretreatment; Post, post-treatment.

When we examined categorical differences in activity depending on whether the variable (larger delayed or risky) option was selected or not, we again observed robust and widespread increases in frontal-parietal and cingular-opercular regions when the variable option was selected, but there were no changes in these effects from pretreatment to post-treatment, and no treatment by condition interactions, in either the whole-brain or ROI analyses.

Cognitive performance (secondary outcomes)

Baseline working memory accuracy was negatively correlated with delay discounting (i.e., better performance was associated with lower discounting; r(125) = −0.27; p = 0.002; Table 4), similar to findings reported in prior studies (Duckworth and Seligman, 2005; Shamosh and Gray, 2008). Baseline cognitive flexibility was positively correlated with delay discounting: a higher switch cost (i.e., less cognitive flexibility) was associated with lower discounting (r(124) = −0.23; p = 0.008).

Table 4. Baseline correlations between decision-making and cognitive measures

Examining composite cognitive scores, participants in both groups showed improved cognitive performance post-treatment (main effect of time: β = 0.19; 95% CI, 0.14–0.23; Wald χ2(1) = 74.9; p < 0.0001; Fig. 4). However, the degree of improvement was similar in both groups, and there was no significant treatment by time interaction (Wald χ2(2) = 1.17; p = 0.56). A similar pattern was observed when considering each task individually (Table 5): faster stop signal reaction time (response inhibition: β = −12.4; 95% CI, −15.9 to −8.9; Wald χ2(1) = 54.1; p < 0.0001); smaller Stroop effect (resistance to interference: β = −6.5; 95% CI, −12.9 to −0.23; Wald χ2(1) = 4.03; p = 0.04); greater working memory accuracy (visuospatial n-back number correct: β = 1.1; 95% CI, 0.76–1.5; Wald χ2(1) = 38.38; p < 0.0001) and fewer false positives (β = −0.42; 95% CI, −0.75 to −0.09; Wald χ2(1) = 6.06; p = 0.014); greater sustained attention accuracy, faster response time, and fewer false positives (continuous performance task, number correct: β = 1.8; 95% CI, 1.2–2.4; Wald χ2(1) = 35.87; p < 0.0001; correct response time: β = −7.0; 95% CI, −9.3 to −4.7; Wald χ2(1) = 36.2; p < 0.0001; false positives: β = −1.5; 95% CI, −2.0 to −0.95; Wald χ2(1) = 31.77; p < 0.0001); and reduced switch cost on the color/shape task (β = −25.6; 95% CI, −42.3 to −8.6; Wald χ2(1) = 8.72; p = 0.003). Post hoc examination of the color/shape task data indicated that, although the overall switch cost decreased, response times for both switch and stay trials decreased by a similar percentage (∼16.7% decrease on switch trials vs ∼18.0% decrease on stay trials). These results may indicate a general improvement in response times rather than a true improvement in switching ability.
There was a significant treatment by time interaction effect only on working memory accuracy (Wald χ2(2) = 8.8; p = 0.012) and false-positive rates (Wald χ2(2) = 8.19; p = 0.017), which would not survive correction for multiple testing; post hoc t tests showed greater improvements in accuracy and greater reduction in false-positive rate in the cognitive training condition compared with the active control group from baseline to mid-treatment. There were no other treatment by time interactions on individual tasks (p values >0.05). Restricting the analyses to those ≥70% adherent (cognitive training group, n = 42; active control group, n = 50) did not alter the results, save that the treatment by time interaction effect on working memory accuracy was no longer significant (Wald χ2(2) = 2.9; p > 0.2).

Figure 4.

Practice effects on cognitive measures. A, Composite cognitive performance scores (averaged z-scores across all five cognitive tests) by treatment group and testing session. There were significant main effects of treatment (participants in the no-contact control group scored lower than the other two groups at all sessions; p = 0.02) and testing session (participants in all conditions improved over time; p < 0.0001), but there was no treatment by session interaction effect (p = 0.85). B, Matching subsets of participants on baseline performance. There were significant effects of testing session (p < 0.0001), but there were no main effects of treatment (p = 0.64) or a treatment by session interaction (p = 0.86).

Table 5. Changes in cognitive performance by condition

Practice effects on cognitive measures (secondary outcomes)

Participants in the follow-up study were slightly younger (mean age, 23 years vs 25 years in primary study; p = 0.01) and more likely to be female (69% vs 44% in primary study; p = 0.01). Age and sex were included as covariates in the analysis; however, neither was associated with task performance. Composite cognitive scores increased across the three sessions to an extent similar to that observed in the active control and cognitive training groups (Fig. 4). In an analysis comparing this group with the active control and cognitive training groups, there were significant effects of testing session (β = 0.19; 95% CI, 0.15–0.23; Wald χ2(1) = 81.47; p < 0.0001) and treatment condition (β = 0.13; 95% CI, 0.02–0.24; Wald χ2(1) = 5.37; p = 0.02), but there was no treatment by time interaction effect (Wald χ2(4) = 1.38; p = 0.85; Fig. 4, left). Given the significant effect of treatment condition, we further examined subsets of each group matched on baseline cognitive composite. In these matched subsets, there was a significant effect of testing session (Wald χ2(1) = 56.43; p < 0.0001), but there was no effect of treatment condition (Wald χ2(1) = 0.22; p = 0.64) and no treatment by time interaction (Wald χ2(4) = 1.32; p = 0.86; Fig. 4, right).

Performance on trained tasks in the cognitive training group

Performance on the training tasks in the cognitive training condition was measured with the LPI. Over the training period, LPI increased in the cognitive training group by an average of 390.8 points (SD, 222.2). This increase was correlated with adherence, such that participants who completed more sessions continued to improve throughout the training period, whereas participants who completed fewer sessions plateaued over time (Fig. 5; adherence by time interaction effect: β = 0.02; Wald χ2(1) = 19.18; p < 0.0001). A similar analysis could not be completed in the active control condition.

Figure 5.

Performance over time in cognitive training group. Performance on trained tasks over time in the cognitive training group, grouped by adherence to the training schedule. In the multiple regression model, there was a significant adherence (continuous measure) by time interaction effect (β = 0.02, p < 0.001). For simplicity, adherence is graphed by tertile based on the percentage of assigned sessions that were completed (low adherence, <74% completed; moderate adherence, 75–88% completed; high adherence, 89–100% completed).

Discussion

Motivated by findings that adaptive cognitive training alters activity in brain regions associated with cognitive control (Olesen et al., 2004; Dahlin et al., 2008; Takeuchi et al., 2011; Jolles et al., 2013) and that the engagement of these regions can bias choices away from immediate and risky rewards (Knoch et al., 2006; DelParigi et al., 2007; Christopoulos et al., 2009; Gianotti et al., 2009; Hare et al., 2009; Kober et al., 2010; Hare et al., 2011), we hypothesized that cognitive training would alter neural activity during decision-making, reduce delay discounting, and increase risk sensitivity. We conducted a randomized, controlled trial of commercial adaptive cognitive training versus control training involving nonadaptive, nontargeted computer games in healthy young adults. Contrary to our hypotheses, we found no effects of cognitive training on brain activity during decision-making and no effects of cognitive training on delay discounting or risk sensitivity. We did observe a baseline association between working memory and delay discounting. If the effects of cognitive training did transfer beyond the trained tasks, one would therefore expect that improvement on measures of working memory would result in changes in delay discounting. Although participants in the commercial training condition did improve with practice on the specific tasks performed during training, both conditions showed similar improvement on standardized cognitive measures over time, and similar levels of improvement were observed in a follow-up study of practice effects on the cognitive measures in the absence of any intervention. These results do not support the hypothesis that cognitive training results in transfer effects beyond the trained tasks. Commercial adaptive cognitive training in young adults appears to have no effects beyond those of standard video games on neural activity, choice behavior, or cognition.

Does cognitive training alter neural activity and decision-making?

We found no effects of cognitive training on our primary behavioral measures, delay discounting and risk sensitivity. We also found no effects of cognitive training on neural activity during decision-making. This rules out the possibility that cognitive training produces neural changes that are simply insufficient to generate significant behavioral effects. The conclusions that cognitive training does not affect decision-making or brain activity for the most part do not depend on comparison to a control group, as there were largely no changes in these measures after cognitive training. The only changes we observed were increases in choice-related activity in default-mode regions from pretreatment to post-treatment in the risk sensitivity task, but these effects were not specific to cognitive training. These changes could represent effects of cognitive stimulation that were common across both conditions, but they might also merely represent effects of repeated exposure to the task.

Although statistical null results should always be interpreted with caution, our study is relatively high powered to detect neural changes across conditions compared with other brain-imaging studies (Buschkuehl et al., 2012; Penadés et al., 2013; Subramaniam et al., 2014; Conklin et al., 2015). Our sample size of 128 individuals (64 individuals/group) who were included in the analysis of decision-making outcomes provides 80% power to detect a moderate effect (Cohen's d, ∼0.44) with α set to 0.05 (Faul et al., 2007). The slightly smaller sample of 114 individuals with good imaging data provides 80% power to detect an effect size of d = 0.47 for the analysis of neural activity. Although it is possible that cognitive training may provide a benefit that was too small to detect in this study, the data reveal no actual difference between conditions (Fig. 1).
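
As a rough check on the stated sensitivity, a normal-approximation power calculation for a two-sample comparison with 64 participants per group reproduces ~80% power at d ≈ 0.44 under a one-tailed α = 0.05 (the exact G*Power settings are not given in this section, so the one-tailed assumption is ours):

```python
from math import sqrt
from statistics import NormalDist

def approx_power(d, n_per_group, alpha=0.05, one_sided=True):
    """Normal approximation to the power of a two-sample t test
    with equal group sizes and standardized effect size d."""
    nd = NormalDist()
    z_crit = nd.inv_cdf((1 - alpha) if one_sided else (1 - alpha / 2))
    noncentrality = d * sqrt(n_per_group / 2)
    return nd.cdf(noncentrality - z_crit)

print(round(approx_power(0.44, 64), 2))  # 0.8
```

Under a two-tailed test, the same design reaches 80% power at a somewhat larger effect, roughly d ≈ 0.50 by this approximation.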

Our findings are of interest as they differ from a previous study reporting beneficial effects of cognitive training on delay discounting (Bickel et al., 2011). In this prior study, 27 stimulant addicts undergoing treatment for substance abuse were assigned to either working memory training or control training. In the control group, participants viewed the same working memory programs but were provided with the answers so that they did not need to engage working memory systems. The investigators observed a significant decrease in delay discounting among participants in the working memory training group, compared with a nonsignificant increase in delay discounting in the control group. This contrasts with our finding of no changes in discounting in either the cognitive training or active control groups. The difference in outcomes of the two studies could be due to differences in methodology. First, the details of both the training and control conditions differed across the two studies. As discussed below, there may be differences between working memory-specific and broad-based cognitive training programs. Second, the sample size for the prior study (n = 27) is smaller than our study (n = 128). Finally, Bickel et al. (2011) examined the effects of cognitive training in stimulant addicts undergoing treatment, compared with the healthy young adults in our study. It is possible that cognitive training may be more beneficial in substance abuse, especially for addicts who are acutely trying to maintain abstinence (Loughead et al., 2010, 2015; Patterson et al., 2010; Falcone et al., 2014).

Does cognitive training affect cognitive abilities?

Participants in the cognitive training group did improve on the tasks used during training. However, participants in both the active control and cognitive training groups demonstrated similar degrees of improvement on the cognitive assessment battery, which contained measures that were not directly trained but were within the general domain of executive function targeted by the training. The lack of difference between the cognitive training and active control groups is itself of great relevance, as most cognitive training regimens, like Lumosity but unlike our active control training, use tasks inspired by classic measures of executive function and delivered in an adaptive manner. Additionally, though, participants in both the active control and cognitive training groups demonstrated no greater improvement than participants in a follow-up study who were simply retested without any intervention, suggesting that the observed improvements are due to practice with the cognitive assessments rather than a beneficial effect of computer games. Thus, our findings fit with a growing number of studies that demonstrate the effects of cognitive training on measures closely related to the training tasks (near transfer) but no effects on measures that are less closely related (far transfer; Thompson et al., 2013; Cortese et al., 2015; Lawlor-Savage and Goghari, 2016; Melby-Lervåg et al., 2016).

An important consideration in evaluating the effects of cognitive training is the control group. Unlike many previous efforts (Lampit et al., 2014; Noack et al., 2014; Bogg and Lasecki, 2015), we included an active control condition with a similar level of engagement, expectancy, novelty, motivation, and interpersonal interaction (Motter et al., 2016). Any of these factors could account for the effects of cognitive training relative to passive (no-contact) control conditions. In contrast, an active control condition isolates differences of practical or theoretical importance. It is of practical importance whether commercial training programs outperform conventional web-based video games, and it is of theoretical importance whether adaptive training provides any benefit over nonadaptive training.

Limitations

An important caveat is that the efficacy of adaptive cognitive training may vary across populations. The participants in this study were young, healthy individuals without pre-existing cognitive impairments; it is possible that these participants were already functioning at high levels and therefore would not derive much benefit from cognitive training. Participants performed very well on the cognitive tasks at baseline, scoring on average ∼90% correct on the n-back and ∼95% correct on the CPT. However, there was sufficient room for improvement, and we did detect significant improvements over time in all groups. Other studies have found beneficial effects of working memory training on measures of self-control other than delay discounting, including reduced alcohol intake among problem drinkers (Houben et al., 2011) and reduced food intake in overweight individuals (Houben et al., 2016). Therefore, our results leave open the possibility that cognitive training could have stronger effects in children, older adults, or individuals with certain clinical conditions (Rueda et al., 2005; Willis et al., 2006; Vinogradov et al., 2012; Heinzel et al., 2014).

It is also possible that different results would be found if different cognitive domains were targeted. Studies that have focused on training specific cognitive domains have most consistently found transfer effects when training working memory (Au et al., 2015). The Lumosity cognitive training platform targets multiple cognitive domains involved in executive function, an approach used by several other broad-based cognitive training programs (Owen et al., 2010; Schmiedek et al., 2010; McDougall and House, 2012; Nouchi et al., 2013). Of the training exercises assigned, ∼27% specifically targeted working memory. However, we cannot rule out that a different balance of exercises (e.g., a greater “dose” of working memory exercises) might provide different benefits. On the other hand, several studies have demonstrated links between self-control and the other domains targeted by the Lumosity program (e.g., attention and cognitive flexibility; Hofmann et al., 2012; Fleming et al., 2016; Kleiman et al., 2016). The training interval, even considering working memory exercises alone, was also longer than in many previous studies (Ball et al., 2002; Nouchi et al., 2013; Oei and Patterson, 2013; Noack et al., 2014), making it less likely that a null effect was due to an insufficient dose of training.

Conclusion

In view of our negative results regarding adaptive cognitive training, discovering interventions that change decision-making in healthy young adults should remain a priority. Greater discounting of delayed rewards is associated with smoking and substance use (Bickel and Marsch, 2001; Reynolds, 2006; Weller et al., 2008; MacKillop et al., 2011; Story et al., 2014; Grabski et al., 2016), food consumption (Rollins et al., 2010; Appelhans et al., 2011), and obesity (Weller et al., 2008; Davis et al., 2010; Lavagnino et al., 2016). Given these and other links among delay discounting, risk sensitivity, and health outcomes, interventions that target decision-making in healthy young adults could have widespread, important effects on public health, and deserve to be rigorously evaluated.

Footnotes

  • This research was supported by National Cancer Institute Grants R01-CA-170297 (to J.W.K. and C.L.) and R35-CA-197461 (to C.L.). We thank the following individuals for their assistance in data collection for this study: Anne Marie Burke, Gabriel Donnay, Jennifer Jorgensen, Rebecca Kazinka, Sangil Lee, Jeffrey Luery, Rickie Miglin, Dahlia Mukherjee, Sarah Price, Maura Schlussel, Rachel Sharp, Hyoun Ju Sohn, Dominique Spence, and Kalijah Terilli.

  • The authors declare no competing financial interests.

  • Correspondence should be addressed to Dr. Joseph W. Kable, Department of Psychology, University of Pennsylvania, 3720 Walnut Street, Philadelphia, PA 19104. kable{at}psych.upenn.edu

References

  1. Anguera JA, Boccanfuso J, Rintoul JL, Al-Hashimi O, Faraji F, Janowich J, Kong E, Larraburo Y, Rolle C, Johnston E, Gazzaley A (2013) Video game training enhances cognitive control in older adults. Nature 501:97–101. doi:10.1038/nature12486 pmid:24005416
  2. Appelhans BM, Woolf K, Pagoto SL, Schneider KL, Whited MC, Liebman R (2011) Inhibiting food reward: delay discounting, food reward sensitivity, and palatable food intake in overweight and obese women. Obesity (Silver Spring) 19:2175–2182. doi:10.1038/oby.2011.57 pmid:21475139
  3. Aron AR, Robbins TW, Poldrack RA (2014) Inhibition and the right inferior frontal cortex: one decade on. Trends Cogn Sci 18:177–185. doi:10.1016/j.tics.2013.12.003 pmid:24440116
  4. Au J, Sheehan E, Tsai N, Duncan GJ, Buschkuehl M, Jaeggi SM (2015) Improving fluid intelligence with training on working memory: a meta-analysis. Psychon Bull Rev 22:366–377. doi:10.3758/s13423-014-0699-x pmid:25102926
  5. Ball K, Berch DB, Helmers KF, Jobe JB, Leveck MD, Marsiske M, Morris JN, Rebok GW, Smith DM, Tennstedt SL, Unverzagt FW, Willis SL (2002) Effects of cognitive training interventions with older adults: a randomized controlled trial. JAMA 288:2271–2281. doi:10.1001/jama.288.18.2271 pmid:12425704
  6. Bartels C, Wegrzyn M, Wiedl A, Ackermann V, Ehrenreich H (2010) Practice effects in healthy adults: a longitudinal study on frequent repetitive cognitive testing. BMC Neurosci 11:118. doi:10.1186/1471-2202-11-118 pmid:20846444
  7. Bartra O, McGuire JT, Kable JW (2013) The valuation system: a coordinate-based meta-analysis of BOLD fMRI experiments examining neural correlates of subjective value. Neuroimage 76:412–427. doi:10.1016/j.neuroimage.2013.02.063 pmid:23507394
  8. Bickel WK, Marsch LA (2001) Toward a behavioral economic understanding of drug dependence: delay discounting processes. Addiction 96:73–86. doi:10.1046/j.1360-0443.2001.961736.x pmid:11177521
  9. Bickel WK, Yi R, Landes RD, Hill PF, Baxter C (2011) Remember the future: working memory training decreases delay discounting among stimulant addicts. Biol Psychiatry 69:260–265. doi:10.1016/j.biopsych.2010.08.017 pmid:20965498
  10. Bissonette GB, Roesch MR (2016) Neurophysiology of reward-guided behavior: correlates related to predictions, value, motivation, errors, attention, and action. Curr Top Behav Neurosci 27:199–230. doi:10.1007/7854_2015_382 pmid:26276036
  11. Bogg T, Lasecki L (2015) Reliable gains? Evidence for substantially underpowered designs in studies of working memory training transfer to fluid intelligence. Front Psychol 5:1589. doi:10.3389/fpsyg.2014.01589 pmid:25657629
  12. Burks SV, Carpenter JP, Goette L, Rustichini A (2009) Cognitive skills affect economic preferences, strategic behavior, and job attachment. Proc Natl Acad Sci U S A 106:7745–7750. doi:10.1073/pnas.0812360106 pmid:19416865
  13. Buschkuehl M, Jaeggi SM, Jonides J (2012) Neuronal effects following working memory training. Dev Cogn Neurosci 2 [Suppl 1]:S167–S179. doi:10.1016/j.dcn.2011.10.001 pmid:22682905
  14. Christopoulos GI, Tobler PN, Bossaerts P, Dolan RJ, Schultz W (2009) Neural correlates of value, risk, and risk aversion contributing to decision making under risk. J Neurosci 29:12574–12583. doi:10.1523/JNEUROSCI.2614-09.2009 pmid:19812332
  15. Conklin HM, Ogg RJ, Ashford JM, Scoggins MA, Zou P, Clark KN, Martin-Elbahesh K, Hardy KK, Merchant TE, Jeha S, Huang L, Zhang H (2015) Computerized cognitive training for amelioration of cognitive late effects among childhood cancer survivors: a randomized controlled trial. J Clin Oncol 33:3894–3902. doi:10.1200/JCO.2015.61.6672 pmid:26460306
  16. Cortese S, Ferrin M, Brandeis D, Buitelaar J, Daley D, Dittmann RW, Holtmann M, Santosh P, Stevenson J, Stringaris A, Zuddas A, Sonuga-Barke EJ (2015) Cognitive training for attention-deficit/hyperactivity disorder: meta-analysis of clinical and neuropsychological outcomes from randomized controlled trials. J Am Acad Child Adolesc Psychiatry 54:164–174. doi:10.1016/j.jaac.2014.12.010 pmid:25721181
  17. Dahlin E, Neely AS, Larsson A, Bäckman L, Nyberg L (2008) Transfer of learning after updating training mediated by the striatum. Science 320:1510–1512. doi:10.1126/science.1155466 pmid:18556560
  18. Davis C, Patte K, Curtis C, Reid C (2010) Immediate pleasures and future consequences. A neuropsychological study of binge eating and obesity. Appetite 54:208–213. doi:10.1016/j.appet.2009.11.002 pmid:19896515
  19. DelParigi A, Chen K, Salbe AD, Hill JO, Wing RR, Reiman EM, Tataranni PA (2007) Successful dieters have increased neural activity in cortical areas involved in the control of behavior. Int J Obesity 31:440–448. doi:10.1038/sj.ijo.0803431
  20. Duckworth AL, Seligman ME (2005) Self-discipline outdoes IQ in predicting academic performance of adolescents. Psychol Sci 16:939–944. doi:10.1111/j.1467-9280.2005.01641.x pmid:16313657
  21. Ehlis AC, Bähne CG, Jacob CP, Herrmann MJ, Fallgatter AJ (2008) Reduced lateral prefrontal activation in adult patients with attention-deficit/hyperactivity disorder (ADHD) during a working memory task: a functional near-infrared spectroscopy (fNIRS) study. J Psychiatr Res 42:1060–1067. doi:10.1016/j.jpsychires.2007.11.011 pmid:18226818
  22. Falcone M, Wileyto EP, Ruparel K, Gerraty RT, LaPrate L, Detre JA, Gur R, Loughead J, Lerman C (2014) Age-related differences in working memory deficits during nicotine withdrawal. Addict Biol 19:907–917. doi:10.1111/adb.12051 pmid:23496760
  23. Faul F, Erdfelder E, Lang AG, Buchner A (2007) G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods 39:175–191. doi:10.3758/BF03193146 pmid:17695343
  24. FitzGerald TH, Schwartenbeck P, Dolan RJ (2014) Reward-related activity in ventral striatum is action contingent and modulated by behavioral relevance. J Neurosci 34:1271–1279. doi:10.1523/JNEUROSCI.4389-13.2014 pmid:24453318
  25. Fleming KA, Heintzelman SJ, Bartholow BD (2016) Specifying associations between conscientiousness and executive functioning: mental set shifting, not prepotent response inhibition or working memory updating. J Pers 84:348–360. doi:10.1111/jopy.12163 pmid:25564728
  26. Gianotti LR, Knoch D, Faber PL, Lehmann D, Pascual-Marqui RD, Diezi C, Schoch C, Eisenegger C, Fehr E (2009) Tonic activity level in the right prefrontal cortex predicts individuals' risk taking. Psychol Sci 20:33–38. doi:10.1111/j.1467-9280.2008.02260.x pmid:19152538
  27. Grabski M, Curran HV, Nutt DJ, Husbands SM, Freeman TP, Fluharty M, Munafò MR (2016) Behavioural tasks sensitive to acute abstinence and predictive of smoking cessation success: a systematic review and meta-analysis. Addiction 111:2134–2144. doi:10.1111/add.13507 pmid:27338804
  28. Green A, Ellis KA, Ellis J, Bartholomeusz CF, Ilic S, Croft RJ, Phan KL, Nathan PJ (2005) Muscarinic and nicotinic receptor modulation of object and spatial n-back working memory in humans. Pharmacol Biochem Behav 81:575–584. doi:10.1016/j.pbb.2005.04.010 pmid:15936063
  29. Hardy JL, Nelson RA, Thomason ME, Sternberg DA, Katovich K, Farzin F, Scanlon M (2015) Enhancing cognitive abilities with comprehensive training: a large, online, randomized, active-controlled trial. PLoS One 10:e0134467. doi:10.1371/journal.pone.0134467 pmid:26333022
  30. Hare TA, Camerer CF, Rangel A (2009) Self-control in decision-making involves modulation of the vmPFC valuation system. Science 324:646–648. doi:10.1126/science.1168450 pmid:19407204
  31. Hare TA, Malmaud J, Rangel A (2011) Focusing attention on the health aspects of foods changes value signals in vmPFC and improves dietary choice. J Neurosci 31:11077–11087. doi:10.1523/JNEUROSCI.6383-10.2011 pmid:21795556
  32. Heinzel S, Lorenz RC, Brockhaus WR, Wüstenberg T, Kathmann N, Heinz A, Rapp MA (2014) Working memory load-dependent brain response predicts behavioral training gains in older adults. J Neurosci 34:1224–1233. doi:10.1523/JNEUROSCI.2463-13.2014 pmid:24453314
  33. Hofmann W, Schmeichel BJ, Baddeley AD (2012) Executive functions and self-regulation. Trends Cogn Sci 16:174–180. doi:10.1016/j.tics.2012.01.006 pmid:22336729
  34. Holt CA, Laury SK (2002) Risk aversion and incentive effects. Am Econ Rev 92:1644–1655. doi:10.1257/000282802762024700
  35. Houben K, Wiers RW, Jansen A (2011) Getting a grip on drinking behavior: training working memory to reduce alcohol abuse. Psychol Sci 22:968–975. doi:10.1177/0956797611412392 pmid:21685380
  36. Houben K, Dassen FC, Jansen A (2016) Taking control: working memory training in overweight individuals increases self-regulation of food intake. Appetite 105:567–574. doi:10.1016/j.appet.2016.06.029 pmid:27349707
  37. Jimura K, Chushak MS, Braver TS (2013) Impulsivity and self-control during intertemporal decision making linked to the neural dynamics of reward value representation. J Neurosci 33:344–357. doi:10.1523/JNEUROSCI.0919-12.2013 pmid:23283347
  38. Jolles DD, van Buchem MA, Crone EA, Rombouts SA (2013) Functional brain connectivity at rest changes after working memory training. Hum Brain Mapp 34:396–406. doi:10.1002/hbm.21444 pmid:22076823
  39. Kable JW, Glimcher PW (2007) The neural correlates of subjective value during intertemporal choice. Nat Neurosci 10:1625–1633. doi:10.1038/nn2007 pmid:17982449
  40. Kirby KN, Winston GC, Santiesteban M (2005) Impatience and grades: delay-discount rates correlate negatively with college GPA. Learn Individ Differ 15:213–222. doi:10.1016/j.lindif.2005.01.003
  41. Kleiman T, Trope Y, Amodio DM (2016) Cognitive control modulates attention to food cues: support for the control readiness model of self-control. Brain Cogn 110:94–101. doi:10.1016/j.bandc.2016.04.006 pmid:27157690
  42. Knoch D, Gianotti LR, Pascual-Leone A, Treyer V, Regard M, Hohmann M, Brugger P (2006) Disruption of right prefrontal cortex by low-frequency repetitive transcranial magnetic stimulation induces risk-taking behavior. J Neurosci 26:6469–6472. doi:10.1523/JNEUROSCI.0804-06.2006 pmid:16775134
  43. Kober H, Mende-Siedlecki P, Kross EF, Weber J, Mischel W, Hart CL, Ochsner KN (2010) Prefrontal-striatal pathway underlies cognitive regulation of craving. Proc Natl Acad Sci U S A 107:14811–14816. doi:10.1073/pnas.1007779107 pmid:20679212
  44. Kundu B, Sutterer DW, Emrich SM, Postle BR (2013) Strengthened effective connectivity underlies transfer of working memory training to tests of short-term memory and attention. J Neurosci 33:8705–8715. doi:10.1523/JNEUROSCI.5565-12.2013 pmid:23678114
  45. Kurtz MM, Ragland JD, Bilker W, Gur RC, Gur RE (2001) Comparison of the continuous performance test with and without working memory demands in healthy controls and patients with schizophrenia. Schizophr Res 48:307–316. doi:10.1016/S0920-9964(00)00060-8 pmid:11295383
  46. Laibson DI (1997) Golden eggs and hyperbolic discounting. Q J Econ 112:443–477. doi:10.1162/003355397555253
  47. Laird AR, McMillan KM, Lancaster JL, Kochunov P, Turkeltaub PE, Pardo JV, Fox PT (2005) A comparison of label-based review and ALE meta-analysis in the Stroop task. Hum Brain Mapp 25:6–21. doi:10.1002/hbm.20129 pmid:15846823
  48. Lampit A, Hallock H, Valenzuela M (2014) Computerized cognitive training in cognitively healthy older adults: a systematic review and meta-analysis of effect modifiers. PLoS Med 11:e1001756. doi:10.1371/journal.pmed.1001756 pmid:25405755
  49. Lavagnino L, Arnone D, Cao B, Soares JC, Selvaraj S (2016) Inhibitory control in obesity and binge eating disorder: a systematic review and meta-analysis of neurocognitive and neuroimaging studies. Neurosci Biobehav Rev 68:714–726. doi:10.1016/j.neubiorev.2016.06.041 pmid:27381956
  50. Lawlor-Savage L, Goghari VM (2016) Dual n-back working memory training in healthy adults: a randomized comparison to processing speed training. PLoS One 11:e0151817. doi:10.1371/journal.pone.0151817 pmid:27043141
  51. Levy I, Snell J, Nelson AJ, Rustichini A, Glimcher PW (2010) Neural representation of subjective value under risk and ambiguity. J Neurophysiol 103:1036–1047. doi:10.1152/jn.00853.2009 pmid:20032238
  52. Logan GD (1994) On the ability to inhibit thought and action: a user's guide to the stop signal paradigm. In: Inhibitory processes in attention, memory, and language (Dagenbach D, Carr TH, eds), pp 189–238. San Diego: Academic.
  53. Logan GD, Schacher RJ, Tannock R (1997) Impulsivity and inhibitory control. Psychol Sci 8:60–66. doi:10.1111/j.1467-9280.1997.tb00545.x
  54. Loughead J, Ray R, Wileyto EP, Ruparel K, Sanborn P, Siegel S, Gur RC, Lerman C (2010) Effects of the alpha4beta2 partial agonist varenicline on brain activity and working memory in abstinent smokers. Biol Psychiatry 67:715–721. doi:10.1016/j.biopsych.2010.01.016 pmid:20207347
  55. Loughead J, Wileyto EP, Ruparel K, Falcone M, Hopson R, Gur R, Lerman C (2015) Working memory-related neural activity predicts future smoking relapse. Neuropsychopharmacology 40:1311–1320. doi:10.1038/npp.2014.318 pmid:25469682
  56. MacKillop J, Amlung MT, Few LR, Ray LA, Sweet LH, Munafò MR (2011) Delayed reward discounting and addictive behavior: a meta-analysis. Psychopharmacology 216:305–321. doi:10.1007/s00213-011-2229-0 pmid:21373791
  57. McClure SM, Bickel WK (2014) A dual-systems perspective on addiction: contributions from neuroimaging and cognitive training. Ann NY Acad Sci 1327:62–78. doi:10.1111/nyas.12561 pmid:25336389
  58. McDougall S, House B (2012) Brain training in older adults: evidence of transfer to memory span performance and pseudo-Matthew effects. Neuropsychol Dev Cogn B Aging Neuropsychol Cogn 19:195–221. doi:10.1080/13825585.2011.640656 pmid:22248429
  59. Meier S, Sprenger CD (2012) Time discounting predicts creditworthiness. Psychol Sci 23:56–58. doi:10.1177/0956797611425931 pmid:22157518
  60. Melby-Lervåg M, Hulme C (2013) Is working memory training effective? A meta-analytic review. Dev Psychol 49:270–291. doi:10.1037/a0028228 pmid:22612437
  61. Melby-Lervåg M, Redick TS, Hulme C (2016) Working memory training does not improve performance on measures of intelligence or other measures of "far transfer": evidence from a meta-analytic review. Perspect Psychol Sci 11:512–534. doi:10.1177/1745691616635612 pmid:27474138
  62. Miyake A, Friedman NP (2012) The nature and organization of individual differences in executive functions: four general conclusions. Curr Dir Psychol Sci 21:8–14. doi:10.1177/0963721411429458 pmid:22773897
  63. Miyake A, Emerson MJ, Padilla F, Ahn JC (2004) Inner speech as a retrieval aid for task goals: the effects of cue type and articulatory suppression in the random task cuing paradigm. Acta Psychol 115:123–142. doi:10.1016/j.actpsy.2003.12.004
  64. Morrison AB, Chein JM (2011) Does working memory training work? The promise and challenges of enhancing cognition by training working memory. Psychon Bull Rev 18:46–60. doi:10.3758/s13423-010-0034-0 pmid:21327348
  65. Motter JN, Devanand DP, Doraiswamy PM, Sneed JR (2016) Clinical trials to gain FDA approval for computerized cognitive training: what is the ideal control condition? Front Aging Neurosci 8:249. doi:10.3389/fnagi.2016.00249 pmid:27853432
  66. Ngandu T, Lehtisalo J, Solomon A, Levälahti E, Ahtiluoto S, Antikainen R, Bäckman L, Hänninen T, Jula A, Laatikainen T, Lindström J, Mangialasche F, Paajanen T, Pajala S, Peltonen M, Rauramaa R, Stigsdotter-Neely A, Strandberg T, Tuomilehto J, Soininen H, et al. (2015) A 2 year multidomain intervention of diet, exercise, cognitive training, and vascular risk monitoring versus control to prevent cognitive decline in at-risk elderly people (FINGER): a randomised controlled trial. Lancet 385:2255–2263. doi:10.1016/S0140-6736(15)60461-5 pmid:25771249
  67. Noack H, Lövdén M, Schmiedek F (2014) On the validity and generality of transfer effects in cognitive training research. Psychol Res 78:773–789. doi:10.1007/s00426-014-0564-6 pmid:24691586
  68. Nouchi R, Taki Y, Takeuchi H, Hashizume H, Nozawa T, Kambara T, Sekiguchi A, Miyauchi CM, Kotozaki Y, Nouchi H, Kawashima R (2013) Brain training game boosts executive functions, working memory and processing speed in the young adults: a randomized controlled trial. PLoS One 8:e55518. doi:10.1371/journal.pone.0055518 pmid:23405164
  69. Oei AC, Patterson MD (2013) Enhancing cognition with video games: a multiple game training study. PLoS One 8:e58546. doi:10.1371/journal.pone.0058546 pmid:23516504
  70. Olesen PJ, Westerberg H, Klingberg T (2004) Increased prefrontal and parietal activity after training of working memory. Nat Neurosci 7:75–79. doi:10.1038/nn1165 pmid:14699419
  71. Owen AM, McMillan KM, Laird AR, Bullmore E (2005) N-back working memory paradigm: a meta-analysis of normative functional neuroimaging studies. Hum Brain Mapp 25:46–59. doi:10.1002/hbm.20131 pmid:15846822
  72. Owen AM, Hampshire A, Grahn JA, Stenton R, Dajani S, Burns AS, Howard RJ, Ballard CG (2010) Putting brain training to the test. Nature 465:775–778. doi:10.1038/nature09042 pmid:20407435
  73. Patterson F, Jepson C, Loughead J, Perkins K, Strasser AA, Siegel S, Frey J, Gur R, Lerman C (2010) Working memory deficits predict short-term smoking resumption following brief abstinence. Drug Alcohol Depend 106:61–64. doi:10.1016/j.drugalcdep.2009.07.020 pmid:19733449
  74. Penadés R, Pujol N, Catalán R, Massana G, Rametti G, García-Rizo C, Bargalló N, Gastó C, Bernardo M, Junqué C (2013) Brain effects of cognitive remediation therapy in schizophrenia: a structural and functional neuroimaging study. Biol Psychiatry 73:1015–1023. doi:10.1016/j.biopsych.2013.01.017 pmid:23452665
  75. Pocock SJ, Simon R (1975) Sequential treatment assignment with balancing for prognostic factors in the controlled clinical trial. Biometrics 31:103–115. doi:10.2307/2529712 pmid:1100130
  76. Ragland JD, Turetsky BI, Gur RC, Gunning-Dixon F, Turner T, Schroeder L, Chan R, Gur RE (2002) Working memory for complex figures: an fMRI comparison of letter and fractal n-back tasks. Neuropsychology 16:370–379. doi:10.1037/0894-4105.16.3.370 pmid:12146684
  77. Raichle ME, MacLeod AM, Snyder AZ, Powers WJ, Gusnard DA, Shulman GL (2001) A default mode of brain function. Proc Natl Acad Sci U S A 98:676–682. doi:10.1073/pnas.98.2.676 pmid:11209064
  78. Reimers S, Maylor E, Stewart N, Chater N (2009) Associations between a one-shot delay discounting measure and age, income, education, and real-world impulsive behavior. Pers Individ Dif 47:973–978. doi:10.1016/j.paid.2009.07.026
  79. Reynolds B (2006) A review of delay-discounting research with humans: relations to drug use and gambling. Behav Pharmacol 17:651–667. doi:10.1097/FBP.0b013e3280115f99 pmid:17110792
  80. Roberts G, Quach J, Spencer-Smith M, Anderson PJ, Gathercole S, Gold L, Sia KL, Mensah F, Rickards F, Ainley J, Wake M (2016) Academic outcomes 2 years after working memory training for children with low working memory: a randomized clinical trial. JAMA Pediatr 170:e154568. doi:10.1001/jamapediatrics.2015.4568 pmid:26954779
  81. Rollins BY, Dearing KK, Epstein LH (2010) Delay discounting moderates the effect of food reinforcement on energy intake among non-obese women. Appetite 55:420–425. doi:10.1016/j.appet.2010.07.014 pmid:20678532
  82. Rueda MR, Rothbart MK, McCandliss BD, Saccomanno L, Posner MI (2005) Training, maturation, and genetic influences on the development of executive attention. Proc Natl Acad Sci U S A 102:14931–14936. doi:10.1073/pnas.0506897102 pmid:16192352
  83. Rushworth MF, Noonan MP, Boorman ED, Walton ME, Behrens TE (2011) Frontal cortex and reward-guided learning and decision-making. Neuron 70:1054–1069. doi:10.1016/j.neuron.2011.05.014 pmid:21689594
  84. Schmiedek F, Lövdén M, Lindenberger U (2010) Hundred days of cognitive training enhance broad cognitive abilities in adulthood: findings from the COGITO Study. Front Aging Neurosci 2:27. doi:10.3389/fnagi.2010.00027 pmid:20725526
  85. Shamosh N, Gray J (2008) Delay discounting and intelligence: a meta-analysis. Intelligence 4:289–305. doi:10.1016/j.intell.2007.09.004
  86. Shipstead Z, Redick TS, Engle RW (2012) Is working memory training effective? Psychol Bull 138:628–654. doi:10.1037/a0027473 pmid:22409508
  87. Scott NW, McPherson GC, Ramsay CR, Campbell MK (2002) The method of minimization for allocation to clinical trials: a review. Control Clin Trials 23:662–674. doi:10.1016/s0197-2456(02)00242-8 pmid:12505244
  88. Story GW, Vlaev I, Seymour B, Darzi A, Dolan RJ (2014) Does temporal discounting explain unhealthy behavior? A systematic review and reinforcement learning perspective. Front Behav Neurosci 8:76. doi:10.3389/fnbeh.2014.00076 pmid:24659960
  89. Stroop JR (1935) Studies of interference in serial verbal reactions. J Exp Psychol 18:643–662. doi:10.1037/h0054651
  90. Stuart EA (2010) Matching methods for causal inference: a review and a look forward. Stat Sci 25:1–21. doi:10.1214/09-STS313 pmid:20871802
  91. Subramaniam K, Luks TL, Garrett C, Chung C, Fisher M, Nagarajan S, Vinogradov S (2014) Intensive cognitive training in schizophrenia enhances working memory and associated prefrontal cortical efficiency in a manner that drives long-term functional gains. Neuroimage 99:281–292. doi:10.1016/j.neuroimage.2014.05.057 pmid:24867353
  92. Takeuchi H, Taki Y, Hashizume H, Sassa Y, Nagase T, Nouchi R, Kawashima R (2011) Effects of training of processing speed on neural systems. J Neurosci 31:12139–12148. doi:10.1523/JNEUROSCI.2948-11.2011 pmid:21865456
  93. Thompson TW, Waskom ML, Garel KL, Cardenas-Iniguez C, Reynolds GO, Winter R, Chang P, Pollard K, Lala N, Alvarez GA, Gabrieli JD (2013) Failure of working memory training to enhance cognition or intelligence. PLoS One 8:e63614. doi:10.1371/journal.pone.0063614 pmid:23717453
  94. Vaidya AR, Fellows LK (2015) Ventromedial frontal cortex is critical for guiding attention to reward-predictive visual features in humans. J Neurosci 35:12813–12823. doi:10.1523/JNEUROSCI.1607-15.2015 pmid:26377468
  95. Vinogradov S, Fisher M, de Villers-Sidani E (2012) Cognitive training for impaired neural systems in neuropsychiatric illness. Neuropsychopharmacology 37:43–76. doi:10.1038/npp.2011.251 pmid:22048465
  96. Wager TD, Smith EE (2003) Neuroimaging studies of working memory: a meta-analysis. Cogn Affect Behav Neurosci 3:255–274. doi:10.3758/CABN.3.4.255 pmid:15040547
  97. Weller RE, Cook EW 3rd, Avsar KB, Cox JE (2008) Obese women show greater delay discounting than healthy-weight women. Appetite 51:563–569. doi:10.1016/j.appet.2008.04.010 pmid:18513828
  98. Wesley MJ, Bickel WK (2014) Remember the future II: meta-analyses and functional overlap of working memory and delay discounting. Biol Psychiatry 75:435–448. doi:10.1016/j.biopsych.2013.08.008 pmid:24041504
  99. Willis SL, Tennstedt SL, Marsiske M, Ball K, Elias J, Koepke KM, Morris JN, Rebok GW, Unverzagt FW, Stoddard AM, Wright E (2006) Long-term effects of cognitive training on everyday functional outcomes in older adults. JAMA 296:2805–2814. doi:10.1001/jama.296.23.2805 pmid:17179457
  100. Winkler AM, Ridgway GR, Webster MA, Smith SM, Nichols TE (2014) Permutation inference for the general linear model. Neuroimage 92:381–397. doi:10.1016/j.neuroimage.2014.01.060 pmid:24530839
  101. Zachary R (1986) Shipley Institute of Living scale: revised manual. Los Angeles: Western Psychological Services.
Keywords

  • cognitive training
  • delay discounting
  • impulsivity
  • neuroimaging
  • working memory
