The Replication Crisis and Methodological Reform in Psychology and Biomedicine (2010–present)

  1. Bem precognition paper spotlights peer-review concerns

    Labels: Daryl Bem, Journal Peer Review

    A controversial paper in a top psychology journal reported statistical evidence for “precognition,” drawing intense scrutiny of how surprising claims can pass peer review. The debate became a high-profile example in discussions about questionable research practices and the need for better replication and transparency.

  2. Simmons–Nelson–Simonsohn warn about analytic flexibility

    Labels: Simmons–Nelson–Simonsohn, Analytic Flexibility

    A widely cited methods paper argued that common, undisclosed choices in data collection and analysis can greatly increase false-positive results (finding an effect that is not real). It helped clarify why many published findings can look stronger than they are, and it energized calls for clearer reporting and stricter standards in psychology.
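    The false-positive inflation described above can be illustrated with a toy simulation (not taken from the paper; all function names and parameter choices here are illustrative). Under a true null effect, a fixed analysis plan rejects at roughly the nominal 5% rate, while a "flexible" analyst who peeks at the data, collects more participants, and tries a second outcome measure rejects far more often:

```python
import math
import random

def welch_t(a, b):
    """Welch t statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def significant(a, b, crit=1.96):
    # Normal-approximation cutoff for a two-sided test at alpha ~= .05
    return abs(welch_t(a, b)) > crit

def one_experiment(rng, flexible):
    # Both groups drawn from the SAME distribution: any "effect" is a false positive.
    a = [rng.gauss(0, 1) for _ in range(20)]
    b = [rng.gauss(0, 1) for _ in range(20)]
    if not flexible:
        return significant(a, b)
    # Researcher degrees of freedom (the kind the paper warns about):
    # 1) peek at n=20 per group
    if significant(a, b):
        return True
    # 2) add 10 more participants per group and test again (optional stopping)
    a += [rng.gauss(0, 1) for _ in range(10)]
    b += [rng.gauss(0, 1) for _ in range(10)]
    if significant(a, b):
        return True
    # 3) fall back to a second, independent outcome measure
    a2 = [rng.gauss(0, 1) for _ in range(30)]
    b2 = [rng.gauss(0, 1) for _ in range(30)]
    return significant(a2, b2)

n_sims = 4000
rng = random.Random(42)
strict = sum(one_experiment(rng, False) for _ in range(n_sims)) / n_sims
rng = random.Random(42)
flex = sum(one_experiment(rng, True) for _ in range(n_sims)) / n_sims
print(f"fixed analysis plan, false-positive rate:    {strict:.3f}")
print(f"flexible analysis plan, false-positive rate: {flex:.3f}")
```

Each individual choice looks innocuous, but because the analyst only stops when a test "works," the combined false-positive rate roughly doubles here, which is the core argument for disclosing all measures, conditions, and stopping rules.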

  3. Stapel fraud case amplifies distrust in published findings

    Labels: Diederik Stapel, Research Fraud

    Investigations into Dutch social psychologist Diederik Stapel found extensive data fabrication across many studies. Although fraud is different from ordinary error, the case reinforced worries that incentives and weak checks can allow unreliable results into the literature.

  4. Open Science Framework publicly launches

    Labels: Open Science, Preregistration

    The Open Science Framework (OSF) was publicly released as a free, open-source platform to manage projects, share materials, and support preregistration (publicly time-stamping a study plan before results are known). OSF became key infrastructure for transparency and later reform efforts in psychology and biomedicine.

  5. Begley and Ellis call for stronger preclinical cancer standards

    Labels: Begley and Ellis, Preclinical Cancer

    A Nature commentary argued that preclinical cancer research needed higher methodological standards to improve reliability and translation to patient benefit. It helped bring “reproducibility” concerns from psychology into the biomedical spotlight, focusing on how lab methods and reporting affect follow-up work.

  6. Center for Open Science launches to promote reproducibility

    Labels: Center for Open Science, Reproducibility Movement

    The Center for Open Science (COS) was launched to build tools and community norms for openness, integrity, and reproducibility. COS became a major organizer of large-scale replication projects and a driver of new publishing standards.

  7. Psychological Science introduces Open Practice Badges

    Labels: Psychological Science, Open Practice Badges

    The journal Psychological Science began awarding badges for open data, open materials, and preregistration. The badges aimed to change incentives by making transparent practices visible and creditable in publication.

  8. Reproducibility Project: Cancer Biology begins publishing plans

    Labels: Reproducibility Project, Cancer Biology

    An organized effort to replicate key experiments from high-impact cancer biology papers was announced with a “Registered Report/Replication Study” approach. This model emphasized pre-committing to methods and peer review before results, aiming to reduce bias and improve interpretability in biomedicine.

  9. TOP Guidelines published to standardize journal transparency

    Labels: TOP Guidelines, Journal Policy

    The Transparency and Openness Promotion (TOP) Guidelines offered journals a menu of policy standards—such as data, code, and materials sharing; preregistration; and replication—at increasing levels of strictness. They became a common reference point for journals seeking concrete, enforceable reforms.

  10. Large replication study estimates psychology reproducibility

    Labels: Open Science, Replication Study

    The Open Science Collaboration reported results from attempts to replicate 100 psychology studies, finding that fewer than half showed statistically significant effects in the replications. The paper made replication rates a measurable, discussable topic and accelerated reforms around preregistration, sharing, and multi-lab collaboration.

  11. Nature survey finds widespread concern about reproducibility

    Labels: Nature Survey, Researcher Opinion

    A large survey reported that most researchers believed there was a reproducibility “crisis,” and many said they had failed to reproduce others’ results. The survey showed the problem was not limited to one field and helped justify broader institutional and policy action.

  12. ClinicalTrials.gov Final Rule takes effect for results reporting

    Labels: ClinicalTrials.gov, Final Rule

    U.S. regulations implementing FDAAA requirements took effect, strengthening expectations for registration and results reporting on ClinicalTrials.gov for many clinical trials. By increasing public accountability for trial reporting, the rule targeted selective publication and missing results that can distort medical evidence.

  13. ICMJE journals require clinical trial data-sharing statements

    Labels: ICMJE, Data-Sharing Statement

    Major medical journal editors (ICMJE) began requiring submitted clinical trial reports to include a data-sharing statement. Even when data are not shared, the policy increases transparency about availability and sets a clear expectation that datasets should be reusable when possible.

  14. NIH Data Management and Sharing Policy goes into effect

    Labels: NIH, DMS Policy

    NIH’s Data Management and Sharing (DMS) Policy took effect, requiring many NIH-funded projects to submit a plan for how scientific data will be managed and shared. This marked a shift from mostly voluntary norms toward routine, funder-backed expectations for data availability in biomedicine.

