
Recruitment

Item 23b: If relevant, why the trial ended or was stopped

Examples

“At the time of the interim analysis, the total follow-up included an estimated 63% of the total number of patient-years that would have been collected at the end of the study, leading to a threshold value of 0.0095, as determined by the Lan-DeMets alpha-spending function method . . . At the interim analysis, the RR [risk ratio] was 0.37 in the intervention group, as compared with the control group, with a p value of 0.00073, below the threshold value. The Data and Safety Monitoring Board advised the investigators to interrupt the trial and offer circumcision to the control group, who were then asked to come to the investigation centre, where MC (medical circumcision) was advised and proposed . . . Because the study was interrupted, some participants did not have a full follow-up on that date, and their visits that were not yet completed are described as ‘planned’ in this article [433].”
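As an illustration of the mechanics behind this example, an alpha-spending function determines how much of the overall type I error may be "spent" at an interim look, given the fraction of the planned information accrued. The sketch below (Python, standard library only) implements an O'Brien-Fleming-type Lan-DeMets spending function; it is illustrative only, and does not reproduce the trial's specific threshold of 0.0095, which depends on design details (overall alpha, chosen spending function, and schedule of looks) not given in the quotation.

```python
from statistics import NormalDist

def obf_spending(t: float, alpha: float = 0.05) -> float:
    """Cumulative two-sided type I error 'spent' by information fraction t
    (0 < t <= 1), using the Lan-DeMets O'Brien-Fleming-type spending function:
    alpha*(t) = 2 * (1 - Phi(z_{alpha/2} / sqrt(t))).
    """
    norm = NormalDist()
    z = norm.inv_cdf(1 - alpha / 2)           # critical value for the overall alpha
    return 2 * (1 - norm.cdf(z / t ** 0.5))   # alpha spent by fraction t

# At roughly 63% of the planned information (as in the example above),
# an O'Brien-Fleming-type boundary spends only a small part of the 0.05:
print(obf_spending(0.63))   # a nominal threshold well below 0.05
print(obf_spending(1.0))    # by full information, the whole 0.05 is spent
```

The conservative early thresholds produced by such functions are what allow a trial to look at the data before completion without inflating the overall type I error.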

 

“In January 2000, problems with vaccine supply necessitated the temporary nationwide replacement of the whole cell component of the combined DPT/Hib vaccine with acellular pertussis vaccine. As this vaccine has a different local reactogenicity profile, we decided to stop the trial early [434].”

 

Explanation

Trialists who conduct unplanned interim analyses with no statistical guidelines, after very few events have accrued, run a high risk of catching the data at a random extreme, which likely represents a large overestimate of the treatment benefit [435].

 

Readers will likely draw weaker inferences from a trial that was truncated in a data-driven manner than from one that reported its findings after reaching a results-independent goal (box 9). Where relevant, authors should report the reason for stopping the trial before completion as planned (eg, the result of an interim analysis, lack of funding, poor recruitment of participants, an intervention no longer being available, or the question becoming irrelevant after publication of another study). Authors should also disclose factors extrinsic to the trial that affected the decision to stop, and who made that decision, including the role the funding agency played in the deliberations and in the decision to stop the trial [436].

Box start

Box 9: Early stopping of randomised trials

Randomised trials can end when they reach their sample size goal, their event count goal, their length of follow-up goal, or their scheduled date of closure. In these situations, the trial stops in a manner independent of its results, and stopping is unlikely to introduce bias. Alternatively, randomised trials can stop earlier than planned because of the result of an interim analysis showing larger than expected benefit or harm of the experimental intervention. Randomised trials can also stop earlier than planned when investigators find evidence of no important difference between experimental and control interventions (ie, stopping for futility). In addition, a trial may stop early because it becomes unviable: funding vanishes, researchers cannot access eligible patients or study interventions, or the results of other studies make the research question irrelevant.

Full reporting of why a trial ended is important for evidence based decision making (item 23b). Researchers examining why 143 trials stopped early for benefit found that many failed to report key methodological information about how the decision to stop was reached: 28 did not report the planned sample size, 45 did not report the interim analysis after which the trial was stopped, and 48 did not state whether a stopping rule informed the decision [436]. Item 16b of the CONSORT checklist requires reporting the timing of interim analyses, what triggered them, how many took place, whether they were planned or ad hoc, and whether statistical guidelines and stopping rules were in place a priori. Furthermore, it is helpful to know whether an independent data monitoring committee participated in the analyses (and who composed it, with particular attention to the role of the funding source), and who made the decision to stop. Often the data monitoring committee makes recommendations and the funders (sponsors) or the investigators make the decision to stop.

Trials that stop early for reasons apparently independent of trial findings, and trials that reach their planned termination, are unlikely to introduce bias by stopping [437]. In these cases, the authors should report whether interim analyses took place and whether these results were available to the funder.

The push for trials that change the intervention in response to interim results, thus enabling a faster evaluation of promising interventions for rapidly evolving and fatal conditions, will require even more careful reporting of the process and decision to stop trials early [174, 438].

CONSORT=Consolidated Standards of Reporting Trials.

Box end

A systematic review of 143 randomised trials stopped earlier than planned for benefit found that these trials reported stopping after accruing a median of 66 events. The review estimated a median relative risk of 0.47 and a strong association between the number of events accrued and the size of the treatment effect, with trials accruing fewer events yielding the largest effects (odds ratio 31, 95% CI 12 to 82) [436]. Although an increasing number of trials published in high impact medical journals report stopping early, many still do not report how the decision to stop was made. In a systematic review of 110 paediatric trials that reported on the presence of a data monitoring committee, interim analysis, or early stopping, 32 were terminated early. Of these 32 trials, 22 (69%) did not report predefined stopping guidelines and 15 (47%) did not provide information on statistical monitoring methods [439].


The 2025 update of SPIRIT and CONSORT, and this website, are funded by the MRC-NIHR: Better Methods, Better Research [MR/W020483/1]. The views expressed are those of the authors and not necessarily those of the NIHR, the MRC, or the Department of Health and Social Care.
