It Does Not Follow

@article{Simonsohn2012ItDN,
  title={It Does Not Follow},
  author={Uri Simonsohn},
  journal={Perspectives on Psychological Science},
  year={2012},
  volume={7},
  pages={597--599}
}
  • U. Simonsohn
  • Published 1 November 2012
  • Psychology
  • Perspectives on Psychological Science
Francis (2012a, 2012b, 2012c, 2012d, 2012e, in press) attacks individual papers through critiques that apply faulty logic to analyses ironically biased by cherry-picking. However well intentioned, the critiques are probably counterproductive to their stipulated goal and certainly unfair to the targeted authors.

Citations

It really just does not follow, comments on Francis (2013)
Follow the argument where it leads: Simonsohn's criticisms on publication bias critiques are unfounded
Simonsohn (2012) argued that my recent publication bias critiques (Francis, 2012a,b,c,d,e,f, in press) are invalid. We both investigate consistency within a set of reported statistical findings to …
REPLY TO FRANCIS
“In numerous one-off critique-articles, Francis presents evidence that individual psychology papers suffer from publication bias, and concludes that the results from these papers ought to be fully …”
Accounting and Public Policy: The Importance of Credible Research
ABSTRACT: Accounting as a professional practice plays a profound, unavoidable, and often unnoticed role in the lives of all citizens. As members of the Public Interest Section of the American Accou...
Let’s Put Our Money Where Our Mouth Is
  • J. Maner
  • Psychology
    Perspectives on Psychological Science: A Journal of the Association for Psychological Science
  • 2014
TLDR
Recommendations include changing the way reviewers respond to imperfections in empirical data, focusing less on individual tests of statistical significance and more on meta-analyses, and attending carefully to the theoretical contribution of a manuscript in addition to its methodological rigor.
Psychology's Renaissance
TLDR
It is shown that the scientific practices of experimental psychologists have improved dramatically, and it is argued that meta-analytical thinking increases the prevalence of false positives.
We should focus on the biases that matter: A reply to commentaries
Direct replication of Gervais & Norenzayan (2012): No evidence that analytic thinking decreases religious belief
TLDR
A precise, large, multi-site, pre-registered replication of one of the original experiments, in which a manipulation intended to foster analytic thinking had decreased religious belief, observed little to no effect of that manipulation on religious belief.
...

References

Showing 1-10 of 20 references
Evidence that publication bias contaminated studies relating social class and unethical behavior
  • G. Francis
  • Psychology
    Proceedings of the National Academy of Sciences
  • 2012
TLDR
The multiple replications might appear to provide strong evidence for the claim that people of a higher social class were more likely to engage in unethical behavior, but the analysis shows that the findings are unbelievable.
Negative results are disappearing from most disciplines and countries
TLDR
The overall frequency of positive results grew by over 22% between 1990 and 2007, with significant differences between disciplines and countries, supporting the hypotheses that research is becoming less pioneering and/or that the objectivity with which results are produced and published is decreasing.
The Handbook of Research Synthesis and Meta-Analysis
The chapter on stochastically dependent effect sizes by Gleser and Olkin (2009) appears in The Handbook of Research Synthesis and Meta-Analysis (2nd ed.).
False-Positive Psychology
TLDR
It is shown that despite empirical psychologists’ nominal endorsement of a low rate of false-positive findings, flexibility in data collection, analysis, and reporting dramatically increases actual false-positive rates, and a simple, low-cost, and straightforwardly effective disclosure-based solution is suggested.
The earth is round (p < .05)
After 4 decades of severe criticism, the ritual of null hypothesis significance testing (mechanical dichotomous decisions around a sacred .05 criterion) still persists. This article reviews the …
Publication decisions revisited: The effect of the outcome of statistical tests on the decision to publish and vice versa
TLDR
Evidence is presented that the published results of scientific investigations are not a representative sample of the results of all scientific studies, and that the practices leading to publication bias have not changed over a period of 30 years.
Publication bias in "Red, rank, and romance in women viewing men," by Elliot et al. (2010).
  • G. Francis
  • Psychology
    Journal of experimental psychology. General
  • 2013
TLDR
Because of the presence of publication bias, the findings in Elliot et al. (2010) should be considered nonscientific or anecdotal and it remains an open question whether the color red influences women's ratings of men's attributes.
The same old New Look: Publication bias in a study of wishful seeing
TLDR
A statistical analysis of the experimental findings reveals evidence of publication bias in the study, so the existence of wishful seeing remains unproven.
Too good to be true: Publication bias in two prominent studies from experimental psychology
  • G. Francis
  • Psychology, Biology
    Psychonomic bulletin & review
  • 2012
TLDR
Application of this test reveals evidence of publication bias in two prominent investigations from experimental psychology that have purported to reveal evidence of extrasensory perception and to indicate severe limitations of the scientific method.
Publication Bias in Meta-Analysis
Publication bias is the term for what occurs whenever the research that appears in the published literature is systematically unrepresentative of the population of completed studies. Simply put, when …
...
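Several of the Francis critiques listed above apply an excess-significance test in the spirit of Ioannidis and Trikalinos: when all n reported experiments in a paper are statistically significant, the probability of that outcome is at most the product of the experiments' estimated powers, and Francis treats a value below 0.10 as evidence of publication bias. A minimal sketch of that arithmetic, with hypothetical power values (the function name and the example numbers are illustrative assumptions, not taken from the papers):

```python
# Excess-significance check: if every one of n reported experiments is
# significant, the chance of that happening is at most the product of the
# experiments' powers. Values below the 0.10 criterion are read as
# evidence of publication bias.
from math import prod

def excess_significance(powers, threshold=0.10):
    """Return (P(all studies significant), bias suspected?).

    `powers` holds a post-hoc power estimate per study; the probability
    that every study reaches significance is the product of these powers.
    """
    p_all = prod(powers)
    return p_all, p_all < threshold

# Hypothetical example: ten studies, each with power ~0.6, all significant.
p, flagged = excess_significance([0.6] * 10)
print(f"P(all significant) = {p:.4f}, bias suspected: {flagged}")
```

The point of the sketch is that even moderately powered studies should occasionally miss significance; a long run of uniformly significant results is itself statistically improbable, which is the "too good to be true" logic the references above debate.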