Tuesday, March 5, 2013

Registered Replication Reports at APS journal, Perspectives


I'm excited to announce a brand-new initiative at the APS journal, Perspectives on Psychological Science. The journal will be publishing a new type of article: the registered replication report. Alex Holcombe and I will be acting as editors for these reports. Below is the mission statement explaining the goals and approach. And here is a link to the Perspectives website with more information about the reports, the submission process, etc. Send us your proposals!

Mission Statement

Replicability is a cornerstone of science. Yet replication studies rarely appear in psychology journals. The new Registered Replication Reports article type in Perspectives on Psychological Science fortifies the foundation of psychological science by publishing collections of replications based on a shared and vetted protocol. It is motivated by the following principles:
• Psychological science should emphasize findings that are robust, replicable, and generalizable.
• Direct replications are necessary to estimate the true size of an effect.
• Well-designed replication studies should be published regardless of the size of the effect or statistical significance of the result.

Traditional psychology journals emphasize theoretical and empirical novelty rather than reproducibility. When journals consider a replication attempt, the process can be an uphill battle for authors. Given the challenges associated with publishing replication attempts, researchers have little incentive to conduct such studies in the first place. Yet, only with multiple replication attempts can we adequately estimate the true size of an effect.

A central goal of publishing Registered Replication Reports is to encourage replication studies by modifying the typical submission and review process. Authors submit a detailed description of the method and analysis plan. The submitted plan is then sent to the author(s) of the replicated study for review. Because the proposal review occurs before data collection, reviewers have an incentive to make sure that the planned replication conforms to the methods of the original study. Consequently, the review process is more constructive than combative. Once the replication plan is accepted, it is posted publicly, and other laboratories can follow the same protocol in conducting their own replications of the original result. Those additional replication proposals are vetted by the editors to make sure they conform to the approved protocol.

The results of the replication attempts are then published together in Perspectives on Psychological Science as a Registered Replication Report. Crucially, the results of the replication attempts are published regardless of the outcome, and the protocol is pre-determined and registered in advance. The conclusion of a Registered Replication Report should avoid categorizing each result as a success or failure to replicate. Instead, it should focus on the cumulative estimate of the effect size. Together with the separate results of each replication attempt, the journal will publish a figure illustrating the measured effects from each study and a meta-analytic effect size estimate. The details of the protocol, including any stimuli or code provided by the original authors or replicating laboratories as well as data from each study, will be available on the Open Science Framework (OSF) website and will be linked from the published report and the APS website for further inspection and analysis by other researchers. Once all the replication attempts have been collected into a final report, the author(s) of the original article will be invited to submit a short, peer-reviewed commentary on the collection of replication attempts.
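
For readers curious about the mechanics of that cumulative estimate, here is a minimal sketch of one standard approach, a fixed-effect (inverse-variance-weighted) meta-analysis, written in Python. The effect sizes and standard errors below are hypothetical illustrations, not data from any actual replication report, and a published report might well use a different model (e.g., random-effects) instead:

```python
import math

# Hypothetical effect size estimates (e.g., Cohen's d) and their
# standard errors from several independent replication attempts.
effects = [0.42, 0.18, 0.31, 0.05]
std_errors = [0.15, 0.12, 0.20, 0.10]

# Fixed-effect meta-analysis: weight each study by the inverse of
# its sampling variance, so more precise studies count for more.
weights = [1.0 / se**2 for se in std_errors]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval for the pooled estimate.
ci_low = pooled - 1.96 * pooled_se
ci_high = pooled + 1.96 * pooled_se

print(f"Pooled effect size: {pooled:.3f}")
print(f"95% CI: [{ci_low:.3f}, {ci_high:.3f}]")
```

The fixed-effect model assumes a single true effect underlying all the replication attempts; a random-effects model, which allows the true effect to vary across laboratories, is often preferred when labs differ in populations or procedures. The sketch above is simply the clearest illustration of the weighting idea behind the forest-plot-style figure described here.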

This publication model provides many broader benefits to psychological science:
  1. Because the registered replication attempts are published regardless of outcome, researchers have an incentive to replicate classic findings before beginning a new line of research extending those findings. 
  2. Subtleties of methodology that rarely appear in the method sections of traditional journals will emerge from the constructive review process, because original authors have an incentive to make them known in order to ensure the replications are designed properly.
  3. Multiple labs can attempt direct replications of the same finding, and all such replication attempts will be interlinked, providing a cumulative estimate of the true size of the effect.
  4. The emphasis on estimating effect sizes rather than on the dichotomous characterization of a replication attempt as a success or failure based on statistical significance could lead to greater awareness of the shortcomings of traditional null-hypothesis significance testing.
  5. Authors and journalists will have a source for vetted, robust findings, and a stable estimate of the effect size for controversial findings.
  6. Researchers may hesitate to publish a surprising result from a small-sample study without first verifying that result with an adequately powered design.
