How to Report Statistics
Ensure appropriateness and rigor, avoid flexibility, and above all never manipulate results
In many fields, a statistical analysis forms the heart of both the methods and results sections of a manuscript. Learn how to report statistical analyses, and what other context is important for publication success and future reproducibility.
A matter of principle
First and foremost, the statistical methods employed in research must always be:
Appropriate for the study design
Rigorously reported in sufficient detail for others to reproduce the analysis
Free of manipulation, selective reporting, or other forms of “spin”
Just as importantly, statistical practices must never be manipulated or misused. Misrepresenting data, selectively reporting results, or searching for patterns that can be presented as statistically significant in an attempt to yield a conclusion believed to be more worthy of attention or publication is a serious ethical violation. Although it may seem harmless, using statistics to “spin” results can prevent publication, undermine a published study, or lead to investigation and retraction.
Supporting public trust in science through transparency and consistency
Along with clear methods and transparent study design, the appropriate use of statistical methods and analyses impacts editorial evaluation and readers’ understanding and trust in science.
In 2011, the paper “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant” showed that “flexibility in data collection, analysis, and reporting dramatically increases actual false-positive rates” and demonstrated “how unacceptably easy it is to accumulate (and report) statistically significant evidence for a false hypothesis”.
Arguably, such problems with flexible analysis have contributed to the “reproducibility crisis” we read about today.
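To see why flexibility matters, consider a minimal simulation (a hypothetical sketch, not taken from the paper above): when there is no true effect, measuring several outcomes and reporting only the most favorable one pushes the false-positive rate well above the nominal 5%.

```python
# Hypothetical illustration: two groups drawn from the SAME distribution,
# so every "significant" difference is a false positive. Comparing a single
# pre-specified outcome keeps false positives near 5%; picking the best of
# several outcomes after the fact inflates the rate substantially.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2011)
n_simulations, n_per_group, n_outcomes = 5_000, 20, 4

false_pos_prespecified = 0
false_pos_flexible = 0

for _ in range(n_simulations):
    group_a = rng.normal(size=(n_per_group, n_outcomes))
    group_b = rng.normal(size=(n_per_group, n_outcomes))
    p_values = [stats.ttest_ind(group_a[:, k], group_b[:, k]).pvalue
                for k in range(n_outcomes)]
    false_pos_prespecified += p_values[0] < 0.05   # outcome chosen in advance
    false_pos_flexible += min(p_values) < 0.05     # "best" outcome chosen afterwards

print(f"Pre-specified outcome: {false_pos_prespecified / n_simulations:.1%}")
print(f"Flexible reporting:    {false_pos_flexible / n_simulations:.1%}")
```

With four independent outcomes and no true effect, the chance of at least one p < 0.05 by luck alone is roughly 1 − 0.95⁴ ≈ 19%, which the simulation should approximately reproduce.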
A constant principle of rigorous science
The appropriate, rigorous, and transparent use of statistics is a constant principle of rigorous, open science. Aim to be thorough, even if a particular journal doesn’t require the same level of detail. Trust in science is everyone’s responsibility, and you cannot create problems by exceeding a minimum standard of information and reporting.
Sound statistical practices
While it is hard to provide statistical guidelines relevant to every discipline, type of research, and analytical technique, adherence to rigorous and appropriate principles remains key. Here are some ways to ensure your statistics are sound.
Define your analytical methodology before you begin
Take the time to consider and develop a thorough study design that defines your line of inquiry, what you plan to do, what data you will collect, and how you will analyze it. (If you applied for research grants or ethical approval, you probably already have a plan in hand!) Refer back to your study design at key moments in the research process, and above all, stick to it.
To avoid flexibility and improve the odds of acceptance, preregister your study design with a journal
Many journals offer the option to submit a study design for peer review before research begins, through a practice known as preregistration. If the editors approve your study design, you’ll receive a provisional acceptance for a future research article reporting the results. Preregistering is a great way to head off any intentional or unintentional flexibility in analysis. By declaring your analytical approach in advance, you’ll increase the credibility and reproducibility of your results and help address publication bias, too. Getting peer review feedback on your study design and analysis plan before the research begins (when you can still make changes!) makes your research even stronger and increases your chances of publication, even if the results are negative or null. Never underestimate how much you can help increase the public’s trust in science by planning your research in this way.
Imagine replicating or extending your own work, years in the future
Imagine that you are describing your approach to statistical analysis for your future self, in exactly the same way as we have described for writing your methods section. What would you need to know to replicate or extend your own work? Bear in mind that you might be at a different institution, working with different colleagues, using different programs, applications, or resources, or perhaps adopting new statistical techniques that have since emerged. That exercise reveals the level of reporting specificity you yourself would need in order to redo or extend your work. Consider:
- Which details would you need to be reminded of?
- What did you do to the raw data before analysis?
- Did the purpose of the analysis change before or during the experiments?
- Which participants did you decide to exclude?
- What processes did you adjust during your work?
Even if a necessary adjustment you made was not ideal, transparency is the key to ensuring it is not regarded as an issue in the future. It is far better to transparently convey any non-optimal techniques or constraints than to conceal them, which could result in reproducibility or ethical issues downstream.
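One lightweight way to capture those details is to record them alongside your results as you work. The sketch below is purely illustrative; every file name, threshold, and exclusion criterion is a hypothetical placeholder to adapt to your own study.

```python
# Hypothetical sketch of an analysis log recording the decisions a future
# reader (or your future self) would need in order to rerun the analysis.
import json
import platform

import numpy as np
import pandas as pd

analysis_log = {
    "raw_data_file": "responses_raw.csv",        # placeholder file name
    "preprocessing": [
        "dropped rows with a missing primary outcome",
        "log-transformed reaction times",
    ],
    "exclusion_criteria": "participants below 80% on attention checks",  # example rule
    "random_seed": 20240101,
    "software_versions": {
        "python": platform.python_version(),
        "numpy": np.__version__,
        "pandas": pd.__version__,
    },
}

# Saving the log next to the results keeps the analysis reconstructable later.
with open("analysis_log.json", "w") as fh:
    json.dump(analysis_log, fh, indent=2)
```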
Existing standards, checklists, and guidelines for specific disciplines
You can apply the Open Science practices outlined above no matter what your area of expertise—but in many cases, you may still need more detailed guidance specific to your own field. Many disciplines, fields, and projects have worked hard to develop guidelines and resources to help with statistics, and to identify and avoid bad statistical practices. Below, you’ll find some of the key materials.
TIP: Do you have a specific journal in mind?
Be sure to read the submission guidelines for the specific journal you are submitting to, in order to discover any journal- or field-specific policies, initiatives, or tools you should use.
| Discipline | Resource | Description |
| --- | --- | --- |
| Biomedical Research | SAMPL guidelines | The “Statistical Analyses and Methods in the Published Literature” (SAMPL) guidelines cover basic statistical reporting for research in biomedical journals. |
| General | PLOS ONE guidelines for statistical reporting | While specific to PLOS ONE, these guidelines should be applicable to most research contexts, since the journal serves many research disciplines. |
| Systematic reviews & Meta-analyses | PRISMA | The “Preferred Reporting Items for Systematic Reviews and Meta-Analyses” (PRISMA) is an evidence-based minimum set of items focusing on the reporting of reviews evaluating randomized trials and other types of research. |
| Life Sciences | MDAR checklist | The “Materials, Design, Analysis, and Reporting” (MDAR) checklist was developed and tested by a cross-publisher group of editors and experts to establish and harmonize reporting standards in the Life Sciences. Authors can use the checklist to compile their methods, and editors and reviewers can use it to check them; it establishes a minimum set of requirements for transparent reporting and is adaptable to any discipline within the Life Sciences, covering a breadth of potentially relevant methodological items and considerations. |
Articles on statistical methods and reporting
Makin, T.R., Orban de Xivry, J. Science Forum: Ten common statistical mistakes to watch out for when writing or reviewing a manuscript. eLife 8, e48175 (2019). https://doi.org/10.7554/eLife.48175
Munafò, M., Nosek, B., Bishop, D. et al. A manifesto for reproducible science. Nat Hum Behav 1, 0021 (2017). https://doi.org/10.1038/s41562-016-0021
Writing tips
Your use of statistics should be rigorous, appropriate, and uncompromising in its avoidance of analytical flexibility. While this can be difficult, do not compromise on the rigorous standards that credibility requires!
Do
- Remember that trust in science is everyone’s responsibility.
- Keep in mind future replicability.
- Consider preregistering your analysis plan (i) to have it reviewed before results are collected, catching problems before they occur, and (ii) to avoid any analytical flexibility.
- Follow principles, but also checklists and field- and journal-specific guidelines.
- Consider a commitment to rigorous and transparent science a personal responsibility, not simply a matter of adhering to journal guidelines.
- Be specific about all decisions made during the experiments that someone reproducing your work would need to know about.
- Consider taking a course in advanced or newly developed statistical methods if you feel you did not focus on statistics enough during your research training.
Don’t
- Misuse statistics to influence significance or other interpretations of results
- Conduct your statistical analyses if you are unsure of what you are doing; seek feedback from a statistical specialist first (e.g., via preregistration).