Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking

@article{Wicherts2016DegreesOF,
  title={Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking},
  author={Jelte M. Wicherts and Coosje Lisabet Sterre Veldkamp and Hilde E. M. Augusteijn and Marjan Bakker and Robbie C. M. van Aert and Marcel A. L. M. van Assen},
  journal={Frontiers in Psychology},
  year={2016},
  volume={7}
}
The designing, collecting, analyzing, and reporting of psychological studies entail many choices that are often arbitrary. The opportunistic use of these so-called researcher degrees of freedom aimed at obtaining statistically significant results is problematic because it enhances the chances of false positive results and may inflate effect size estimates. In this review article, we present an extensive list of 34 degrees of freedom that researchers have in formulating hypotheses, and in… 
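To see why this matters, a back-of-the-envelope simulation helps: when there is no true effect but a researcher can choose among several correlated outcome measures and report whichever test comes out significant, the false-positive rate rises well above the nominal 5%. The sketch below only illustrates that general mechanism; the sample size, number of outcomes, and correlation are assumed values, not figures from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def one_study(n_per_group=20, n_outcomes=3, rho=0.5):
    """Simulate one null study with several correlated outcome measures.

    Returns True if *any* outcome reaches p < .05, i.e. the study would be
    reported as 'significant' under opportunistic outcome selection.
    """
    cov = np.full((n_outcomes, n_outcomes), rho) + (1 - rho) * np.eye(n_outcomes)
    group_a = rng.multivariate_normal(np.zeros(n_outcomes), cov, size=n_per_group)
    group_b = rng.multivariate_normal(np.zeros(n_outcomes), cov, size=n_per_group)
    p_values = [stats.ttest_ind(group_a[:, j], group_b[:, j]).pvalue
                for j in range(n_outcomes)]
    return min(p_values) < 0.05

n_sim = 5000
false_positives = sum(one_study() for _ in range(n_sim))
print(f"False-positive rate with 3 outcomes to pick from: {false_positives / n_sim:.3f}")
# Comes out well above the nominal 0.05, despite no true effect on any outcome.
```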

Making the Black Box Transparent: A Template and Tutorial for Registration of Studies Using Experience-Sampling Methods
A growing interest in understanding complex and dynamic psychological processes as they occur in everyday life has led to an increase in studies using ambulatory assessment techniques, including the
Seven steps toward transparency and replicability in psychological science.
Psychological scientists strive to advance understanding of how and why we animals do and think and feel as we do. This is difficult, in part because flukes of chance and measurement error obscure
Making ERP research more transparent: Guidelines for preregistration.
An overview of the problems associated with undisclosed analytic flexibility is presented, why and how EEG researchers would benefit from adopting preregistration are discussed, and guidelines and examples on how to preregister data preprocessing and analysis steps in typical ERP studies are provided.
Virtual reality check: Statistical power, reported results, and the validity of research on the psychology of virtual reality and immersive environments
Transparency in data analysis, increased statistical power, and more careful reporting of statistical outcomes are suggested to heighten methodological rigor and improve reproducibility in the field of VR research.
Perceived Statistical Knowledge Level and Self-Reported Statistical Practice Among Academic Psychologists
The results showed that, although the use of effect size (ES) estimates is becoming widespread, confidence intervals for effect sizes remain under-reported, and academics with greater knowledge of ES statistics show profiles closer to good statistical practice.
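As a small aside on the reporting practice flagged here, an effect size with a confidence interval takes only a few lines to compute. The sketch below uses a percentile bootstrap for a 95% CI around Cohen's d; the data are simulated for illustration and are not from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
treatment = rng.normal(0.4, 1.0, size=30)   # simulated data, not from the study
control = rng.normal(0.0, 1.0, size=30)

def cohens_d(x, y):
    """Cohen's d using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

# Percentile bootstrap for a 95% CI around the effect size estimate.
boot = [cohens_d(rng.choice(treatment, len(treatment)),
                 rng.choice(control, len(control)))
        for _ in range(5000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"d = {cohens_d(treatment, control):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```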
Researcher degrees of freedom in phonetic research
Timo B. Roettger, Laboratory Phonology: Journal of the Association for Laboratory Phonology, 2019
The results of published research critically depend on methodological decisions that have been made during data analysis. These so-called ‘researcher degrees of freedom’ (Simmons, Nelson, &
Fact or fiction: reducing the proportion and impact of false positives
The concept of researcher ‘degrees of freedom’ is described to explain how many false-positive findings arise, and how the strategies of registration, pre-specification, and reporting standards now being adopted both reduce these findings and make them visible.
p-value Problems? An Examination of Evidential Value in Criminology
This study aims to assess the evidential value of the knowledgebase in criminology after accounting for the presence of potential Type I errors. The present study examines the distribution of 1248
Causes of reporting bias: a theoretical framework.
This work builds upon a classification of determinants of selective reporting that was recently developed in a systematic review of the topic and features four clusters of causes that represent a sufficient cause for reporting bias to occur.

References

Showing 1-10 of 77 references
The (mis)reporting of statistical results in psychology journals
The authors' results indicate that around 18% of statistical results in the psychological literature are incorrectly reported, and that errors were often in line with researchers’ expectations.
An Introduction to Registered Replication Reports at Perspectives on Psychological Science
This issue of Perspectives on Psychological Science includes the first example of a new type of journal article, one designed to provide a more definitive measure of the size and reliability of important effects: the Registered Replication Report (RRR; see Simons & Holcombe, 2014).
Finding the Missing Science: The Fate of Studies Submitted for Review by a Human Subjects Committee
Publication bias, including prejudice against the null hypothesis, and other biasing filters may operate on researchers as well as journal editors and reviewers. A survey asked 33 psychology
The Rules of the Game Called Psychological Science
This paper considers 13 meta-analyses covering 281 primary studies in various fields of psychology and finds indications of biases and/or an excess of significant results in seven of them, highlighting the need for sufficiently powerful replications and changes in journal policies.
Underreporting in Psychology Experiments
Many scholars have raised concerns about the credibility of empirical findings in psychology, arguing that the proportion of false positives reported in the published literature dramatically exceeds
Encourage Playing with Data and Discourage Questionable Reporting Practices
Making research data and materials publicly available, and consulting methodologists or statisticians for a second opinion, will increase the transparency of the data-analysis phase of the empirical research cycle within psychology.
The prevalence of statistical reporting errors in psychology (1985–2013)
It was found that half of all published psychology papers that use NHST contained at least one p-value that was inconsistent with its test statistic and degrees of freedom, and the average prevalence of inconsistent p-values has been stable over the years or has declined.
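Inconsistencies of this kind can be detected mechanically by recomputing the p-value implied by a reported test statistic and its degrees of freedom, which is essentially what tools such as statcheck do for published articles. A minimal sketch of that check, with made-up reported values:

```python
from scipy import stats

def recomputed_p(t_value, df, two_sided=True):
    """p-value implied by a reported t statistic and its degrees of freedom."""
    p = stats.t.sf(abs(t_value), df)
    return 2 * p if two_sided else p

# Hypothetical reported result: "t(28) = 2.20, p < .01"
p = recomputed_p(2.20, 28)
print(f"recomputed p = {p:.3f}")               # about .036
print("consistent with 'p < .01'?", p < 0.01)  # False -> a reporting inconsistency
```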
False-Positive Psychology
It is shown that, despite empirical psychologists' nominal endorsement of a low rate of false-positive findings, flexibility in data collection, analysis, and reporting dramatically increases actual false-positive rates, and a simple, low-cost, and straightforwardly effective disclosure-based solution is suggested.
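One concrete form of that flexibility is optional stopping: testing after every batch of participants and halting as soon as p < .05. The sketch below simulates this under a true null effect; the batch size and number of interim looks are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def optional_stopping(batch=10, max_looks=5):
    """Collect data in batches under a true null, test after each batch,
    and stop as soon as p < .05 (the 'significant' result gets reported)."""
    a = np.empty(0)
    b = np.empty(0)
    for _ in range(max_looks):
        a = np.concatenate([a, rng.normal(size=batch)])
        b = np.concatenate([b, rng.normal(size=batch)])
        if stats.ttest_ind(a, b).pvalue < 0.05:
            return True
    return False

n_sim = 5000
rate = np.mean([optional_stopping() for _ in range(n_sim)])
print(f"False-positive rate with optional stopping: {rate:.3f}")
# Substantially above the nominal 0.05 even though there is no true effect.
```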
Researchers’ Intuitions About Power in Psychological Research
A survey of published research psychologists found large discrepancies between the power researchers say they prefer and the actual power of their studies; the authors recommend that researchers conduct and report formal power analyses for their studies.
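That discrepancy is easy to make concrete, because the power of a standard two-sample t-test follows directly from the noncentral t distribution. The sketch below computes power for an assumed "medium" effect of d = 0.5 with 20 participants per group, and the per-group n needed for 80% power; these are textbook values, not figures from the survey.

```python
import numpy as np
from scipy import stats

def two_sample_power(d, n_per_group, alpha=0.05):
    """Power of a two-sided, two-sample t-test, via the noncentral t distribution."""
    df = 2 * n_per_group - 2
    nc = d * np.sqrt(n_per_group / 2)          # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return stats.nct.sf(t_crit, df, nc) + stats.nct.cdf(-t_crit, df, nc)

print(f"Power with n = 20 per group and d = 0.5: {two_sample_power(0.5, 20):.2f}")  # ~0.34
needed = next(n for n in range(2, 1000) if two_sample_power(0.5, n) >= 0.80)
print(f"n per group needed for 80% power at d = 0.5: {needed}")                     # 64
```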
Opportunistic biases: Their origins, effects, and an integrated solution.
This paper explains how a number of accepted research practices can lead to opportunistic biases, discusses the prevalence of these practices in psychology, considers the different effects that opportunistic biases have on psychological science, evaluates the strategies that methodologists have proposed to prevent or correct for them, and introduces an integrated solution to reduce the prevalence and influence of opportunistic biases.