Editorial

Analytical Transparency and Reproducibility in Human Neuroimaging Studies

Marina Picciotto
Yale University School of Medicine, New Haven, Connecticut 06510

Journal of Neuroscience 4 April 2018, 38 (14) 3375-3376; DOI: https://doi.org/10.1523/JNEUROSCI.0424-18.2018

The Journal of Neuroscience is committed to editorial transparency and scientific excellence. Consistent with these goals, this editorial is the first in a series highlighting outstanding statistical issues and current recommendations for addressing them. The goal of this initiative is to help the community served by JNeurosci maintain the high quality of the science published in the journal. Some concerns relate to long-standing issues that remain important for the field, such as selective reporting of findings and circular inference (Kriegeskorte et al., 2009). Others relate to analytical transparency and reproducibility, such as demands for internal replication with confirmatory datasets within a single study (Ioannidis et al., 2014). We aim to share methodological guidelines embraced by the editorial board and to reflect the expectations of the field as distilled from reviewers' comments. We want to support initiatives that raise the level of reproducibility and analytical transparency while avoiding rigid, prescriptive checklists that might hamper data exploration and the detection of unforeseen findings.

Following alarms about the reproducibility of findings in biomedical research (Ioannidis, 2005; Button et al., 2013), there has been a recent surge of guidelines detailing best practices in the analysis of neuroimaging data (Gross et al., 2013; Gilmore et al., 2017; Munafò et al., 2017; Nichols et al., 2017; Poldrack et al., 2017). Several contributions have addressed the perception of limited statistical power in neuroscience (Barch and Yarkoni, 2013; Button et al., 2013). This is a particularly relevant issue in human neuroimaging, where a large number of studies are underpowered (Nord et al., 2017; Poldrack et al., 2017). However, it has also become evident that statistical power varies greatly across, as well as within, subfields of neuroscience depending on the effect size (Nord et al., 2017). Our understanding of these issues leads us to caution against the simplistic reaction of blindly demanding extremely large sample sizes regardless of study design. Satisfying demands for ever-larger sample sizes might lead to studies reporting statistically significant, but conceptually or clinically trivial, effects, and to suboptimal use of resources. Some of these issues can be avoided by conducting power analyses wherever possible. However, power analyses are only meaningful when based on knowledge of the size of effects specifically related to the experimental question, and exploratory studies might lack that background knowledge. Studies in this category might benefit from a Bayesian inferential framework, in which the strength of the evidence can be evaluated as data are collected (e.g., via a Bayes factor for a particular hypothesis) without inflating the risk of false positives (Dienes, 2016). In a Bayesian framework, sample size can be determined adaptively by a predefined stopping criterion, for example, reaching a Bayes factor larger than 10, an accepted mark of strong evidence (Kass and Raftery, 1995). These issues led to the JNeurosci policy of requiring authors to report experimental design and statistical analyses fully, one element of which is reporting metrics related to the magnitude of effects.
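To make the stopping rule concrete, the sketch below simulates sequential Bayesian sampling: data arrive in small batches, a Bayes factor for a one-sample effect is recomputed after each batch, and collection stops once BF10 crosses the threshold of 10 noted above (or falls below 1/10, strong evidence for the null). This is a minimal illustration, not any study's actual machinery: the Bayes factor uses a rough BIC approximation, and the true effect size, batch size, and sample cap are all illustrative assumptions.

```python
# Sequential-sampling sketch with a Bayes factor stopping criterion.
# The BF10 below is a crude BIC approximation for a one-sample test
# (an assumption for illustration, not a full Bayesian model comparison).
import numpy as np

rng = np.random.default_rng(0)

def bf10_one_sample(x):
    """BIC-approximate Bayes factor for H1: mean != 0 vs H0: mean == 0."""
    n = len(x)
    sse_null = np.sum(x ** 2)                      # residuals under mean = 0
    sse_alt = np.sum((x - x.mean()) ** 2)          # residuals under fitted mean
    bic_null = n * np.log(sse_null / n)
    bic_alt = n * np.log(sse_alt / n) + np.log(n)  # one extra free parameter
    return np.exp((bic_null - bic_alt) / 2)

data = np.empty(0)
batch_size, max_n, threshold = 10, 200, 10.0       # BF10 > 10: "strong" evidence

while len(data) < max_n:
    # Hypothetical experiment: true effect size d = 0.5 (illustrative).
    data = np.append(data, rng.normal(loc=0.5, scale=1.0, size=batch_size))
    bf = bf10_one_sample(data)
    print(f"n = {len(data):3d}  BF10 = {bf:7.2f}")
    if bf > threshold or bf < 1 / threshold:       # stop on strong evidence either way
        break
```

The key property, in contrast to repeated significance testing, is that the evidence can be monitored as data accumulate and the stopping criterion is fixed in advance.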

Preregistration has also been promoted as an important step toward achieving higher reproducibility in human neuroimaging studies (Poldrack et al., 2017). This approach, standard practice in randomized clinical trials, has the advantage of avoiding "hypothesizing after results are known" (HARKing) and "researcher degrees of freedom" (i.e., selecting analytical procedures according to their study-specific outcome rather than first principles). We encourage authors to consider preregistering their study design when possible. However, mandatory adoption of this approach might impose a sterile straitjacket on the exploratory components of cognitive neuroscience studies. One option is to complement the analytical flexibility of exploratory analyses with an internal replication within a single report: the analytical procedures assessed during the exploratory stage are declared, and a fixed procedure is then tested on an independent dataset. This approach yields a rapid transition between the hypothesis-generating and hypothesis-testing stages of the research cycle.
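As an illustration of this internal-replication workflow, the sketch below splits a dataset in half before any analysis, tunes a procedure freely on the exploratory half, then evaluates the frozen procedure exactly once on the confirmatory half. The dataset, the logistic-regression model, and the regularization grid are hypothetical placeholders, not a prescribed pipeline.

```python
# Internal-replication sketch: explore freely on one half of the data,
# then freeze the chosen procedure and test it once on the held-out half.
# Data, model, and tuning grid are hypothetical stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 20))                     # stand-in imaging features
y = (X[:, 0] + rng.normal(size=200) > 0).astype(int)

# One split, made before any analysis, defines the two stages.
X_explore, X_confirm, y_explore, y_confirm = train_test_split(
    X, y, test_size=0.5, random_state=0)

# Exploratory stage: compare candidate procedures -- only on the exploratory half.
candidate_cs = [0.01, 0.1, 1.0, 10.0]
best_c = max(candidate_cs,
             key=lambda c: cross_val_score(LogisticRegression(C=c),
                                           X_explore, y_explore, cv=5).mean())

# Confirmatory stage: the procedure is now fixed; it touches the held-out
# half exactly once, yielding the internally replicated result.
final_model = LogisticRegression(C=best_c).fit(X_explore, y_explore)
print("confirmatory accuracy:",
      accuracy_score(y_confirm, final_model.predict(X_confirm)))
```

The essential discipline is that the confirmatory half is analyzed only once, after every analytical choice has been declared and frozen.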

Standards of evidence and analytical methodologies in cognitive neuroscience change continuously, as one would expect in a young and dynamic research field. Here, we have highlighted outstanding statistical issues that neuroscience researchers need to consider and offered suggestions for striking a balance between analytical flexibility and reproducibility of findings.

We invite you to contribute to this discussion by e-mailing JNeurosci at JN_EiC@sfn.org or tweeting to @marinap63.

The Editorial Board of The Journal of Neuroscience.

References

  1. Barch DM, Yarkoni T (2013) Introduction to the special issue on reliability and replication in cognitive and affective neuroscience research. Cogn Affect Behav Neurosci 13:687–689. doi:10.3758/s13415-013-0201-7 pmid:23922199
  2. Button KS, Ioannidis JP, Mokrysz C, Nosek BA, Flint J, Robinson ES, Munafò MR (2013) Power failure: why small sample size undermines the reliability of neuroscience. Nat Rev Neurosci 14:365–376. doi:10.1038/nrn3475 pmid:23571845
  3. Dienes Z (2016) How Bayes factors change scientific practice. J Math Psychol 72:78–89. doi:10.1016/j.jmp.2015.10.003
  4. Gilmore RO, Diaz MT, Wyble BA, Yarkoni T (2017) Progress toward openness, transparency, and reproducibility in cognitive neuroscience. Ann N Y Acad Sci 1396:5–18. doi:10.1111/nyas.13325 pmid:28464561
  5. Gross J, Baillet S, Barnes GR, Henson RN, Hillebrand A, Jensen O, Jerbi K, Litvak V, Maess B, Oostenveld R, Parkkonen L, Taylor JR, van Wassenhove V, Wibral M, Schoffelen JM (2013) Good practice for conducting and reporting MEG research. Neuroimage 65:349–363. doi:10.1016/j.neuroimage.2012.10.001 pmid:23046981
  6. Ioannidis JP (2005) Why most published research findings are false. PLOS Med 2:e124. doi:10.1371/journal.pmed.0020124 pmid:16060722
  7. Ioannidis JP, Munafò MR, Fusar-Poli P, Nosek BA, David SP (2014) Publication and other reporting biases in cognitive sciences: detection, prevalence, and prevention. Trends Cogn Sci 18:235–241. doi:10.1016/j.tics.2014.02.010 pmid:24656991
  8. Kass RE, Raftery AE (1995) Bayes factors. J Am Stat Assoc 90:773–795. doi:10.1080/01621459.1995.10476572
  9. Kriegeskorte N, Simmons WK, Bellgowan PS, Baker CI (2009) Circular analysis in systems neuroscience: the dangers of double dipping. Nat Neurosci 12:535–540. doi:10.1038/nn.2303 pmid:19396166
  10. Munafò MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, Percie du Sert N, Simonsohn U, Wagenmakers E-J, Ware JJ, Ioannidis JP (2017) A manifesto for reproducible science. Nat Hum Behav 1:0021. doi:10.1038/s41562-016-0021
  11. Nichols TE, Das S, Eickhoff SB, Evans AC, Glatard T, Hanke M, Kriegeskorte N, Milham MP, Poldrack RA, Poline JB, Proal E, Thirion B, Van Essen DC, White T, Yeo BT (2017) Best practices in data analysis and sharing in neuroimaging using MRI. Nat Neurosci 20:299–303. doi:10.1038/nn.4500 pmid:28230846
  12. Nord CL, Valton V, Wood J, Roiser JP (2017) Power-up: a reanalysis of "Power Failure" in neuroscience using mixture modeling. J Neurosci 37:8051–8061. doi:10.1523/JNEUROSCI.3592-16.2017 pmid:28706080
  13. Poldrack RA, Baker CI, Durnez J, Gorgolewski KJ, Matthews PM, Munafò MR, Nichols TE, Poline JB, Vul E, Yarkoni T (2017) Scanning the horizon: towards transparent and reproducible neuroimaging research. Nat Rev Neurosci 18:115–126. doi:10.1038/nrn.2016.167 pmid:28053326