Metascience fail: Four lessons from inaccurate data on "missing" clinical trial results
Standard methodologies for counting the number and percentage of clinical trials that remain unreported may be overstating the scale of the problem, feedback from two medical research institutions suggests.
In February 2024, TranspariMED and allied groups published a report claiming that 475 clinical trials involving 83,903 patients across five Nordic countries had never made their results public.
The report was widely covered in national media in Denmark, Sweden, and Norway. Crucially, it included a simple list of all unreported trials per institution, allowing universities and hospitals to follow up on them.
How many results are really missing?
The report stated that Copenhagen University Hospital (Rigshospitalet) had left 39 clinical trials unreported, making it the institution with the largest total number of missing trial results. According to our data, 23% of Rigshospitalet trials were missing results.
After checking up on 38 trials, Professor Anders Perner from Rigshospitalet reached out to TranspariMED and sent us the following breakdown:
Professor Perner's data suggest that only 14 of the 38 reviewed trials truly remained unreported (7 unpublished, 4 still "under analysis", 1 "submitted" but not yet public, and 1 "uncertain"). One additional trial was published only as an abstract, which, as per our study protocol, we did not count as reported.
What happened in Denmark?
Let's take a closer look at the data.
We all agree that 14 trials currently remain not fully reported. But what about the 18 trials that supposedly did report results? (13 publications in journals, 4 results on the EudraCT registry, and 1 result published as a PhD.)
Professor Perner did not share his dataset, just a breakdown, so we do not have access to the full picture. But earlier on, he did share some specific trial numbers. In some of those cases, results had only been published after we had completed our searches - so our claim that they remained unreported at the time was accurate. But in the case of at least 3 trials, it seemed that our researchers had overlooked relevant results, so our data were incorrect.
In the case of the remaining 6 trials, the problem may be shoddy registry management practices rather than non-reporting.
For 5 trials, Rigshospitalet apparently failed to ensure that researchers updated the registry entry after the trial had been cancelled. One additional trial had not been run by the institution in the first place, indicating that a researcher from a different institution made a mistake while registering the study.
What happened in Norway?
The worst performer identified in TranspariMED's report was Haukeland University Hospital in Norway.
The report found a research waste rate of 50%, with 15 out of 30 trials remaining unreported. Those data were widely cited in Norwegian media, including in the hospital's local newspaper, causing Haukeland to take a closer look.
Haukeland pushed back hard against the findings.
A statement by the institution claimed that out of the 15 "unreported" trials, 10 studies had never started up. Only 1 trial remained genuinely unreported, the institution said.
(Another 3 trials had published results, but because Haukeland declined to share their data with us, we cannot determine whether these results had been published before or after we ran our searches.)
The director of the research department stated that:
"This is about registration errors, not lack of publication. We will tighten up our procedures regarding registration in ClinicalTrials.gov and similar databases."
Four key lessons
First and foremost, as anyone who has worked with registry data knows, clinical trial registries are a complete mess. They should not be, and it is the responsibility of universities and hospitals to establish effective oversight and ensure that all data on trials in registries is complete, accurate and - crucially - kept up to date.
Second, universities and hospitals should remind their researchers to include the clinical trial registration number in the abstracts and metadata of all related publications so that they can easily be found. Both the WHO and CONSORT guidelines already recommend this. Looking for medical evidence should not be a lengthy treasure hunt. If two researchers independently applying a strong search protocol cannot find a trial result, what are the chances that a busy clinician or patient will find it?
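To illustrate why registration numbers in abstracts and metadata matter, here is a minimal sketch of how a search script can link trials to publications when the ID is present. It builds a PubMed E-utilities `esearch` query using the `[si]` (secondary source ID) field tag, under which PubMed indexes trial registry numbers, and pulls ClinicalTrials.gov-style IDs out of abstract text with a regular expression. The abstract text and function names are illustrative, not from any real trial.

```python
import re
from urllib.parse import urlencode

# ClinicalTrials.gov registration IDs look like "NCT" followed by 8 digits.
NCT_PATTERN = re.compile(r"\bNCT\d{8}\b")

def pubmed_search_url(trial_id: str) -> str:
    """Build a PubMed E-utilities esearch URL that looks for publications
    indexed under a trial registration number ([si] = secondary source ID)."""
    base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
    return base + "?" + urlencode({"db": "pubmed", "term": f"{trial_id}[si]"})

def extract_trial_ids(text: str) -> list[str]:
    """Pull ClinicalTrials.gov-style registration numbers out of free text,
    e.g. a publication abstract."""
    return NCT_PATTERN.findall(text)

# Hypothetical abstract: the lookup only works if authors included the ID.
abstract = "Results of a randomised trial (ClinicalTrials.gov: NCT01234567)."
print(extract_trial_ids(abstract))        # ['NCT01234567']
print(pubmed_search_url("NCT01234567"))
```

The point of the sketch is the asymmetry it exposes: when the registration number is in the metadata, matching a trial to its results is a one-line query; when it is missing, every trial requires a manual search by title, investigator, and keywords.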
Third, meta-researchers should always check back with institutions to verify their findings. I now routinely do this in my own academic research. Searching for hundreds or thousands of trial results requires immense effort. The workload involved in contacting a few dozen institutions, managing their responses and cleaning up the data is not trivial, but it is far less than the original searches require.
Finally, metascience that stays in journals is a waste of time. If you want to drive real change, reaching out to the media and directly engaging with institutions is indispensable. As a first step, publish a simplified dataset to help institutions to follow up on their trials. As a second step, share that dataset with institutions and initiate a dialogue.
Stop doing metascience in a bunker
If we want to help institutions to do the right thing, we need to give them the information and tools required to do so.
Haukeland has already promised to strengthen its procedures. I trust that both Rigshospitalet and Haukeland are now following up on their unreported trials to get those results out there. If this study had remained hidden away in a preprint or in a journal, none of this would ever have happened.
For inspiration on how to use metaresearch to effectively drive positive change, take a look at this classic study and read this excellent blog.
TranspariMED thanks Professor Anders Perner from Rigshospitalet and Professor Marta Ebbing from Haukeland for following up on the clinical trials run by their institutions. TranspariMED invites other institutions to generate similar critical analyses, which we will publish on this blog.