There is some irony in the fact that our paper about issues with data, and how to prevent and correct errors, came out just after our letter to the editor attempting to correct a misanalyzed cluster randomized trial was dismissed by the authors. Perhaps if we had been able to share our article describing these errors and the ways to prevent and correct them, the response would have been different.
Given our experiences with other attempts to correct the literature, though, that is doubtful.
In our letter, we laid out that the authors prespecified the appropriate analysis, ignored that correct analysis for certain comparisons in their paper, and then used the misanalyzed results as the focus of their press releases. Although we offered to work with the authors in private, because we believe collaboration can be more constructive than letters back and forth, the authors turned us down. In turn, their reply to our letter did not acknowledge their error. In fact, they doubled down, claiming that because others had made the same mistake in the past, it was acceptable. Ted Kyle of ConscienHealth describes the episode further.
Our PNAS article describes challenges such as these, and others we have faced when trying to identify errors and correct the literature.
This recent exchange of letters is reminiscent of others, so we should not have been surprised. In another case we experienced, the authors defended a methodology that has been demonstrated to be invalid, and the editor responded by telling readers they could decide for themselves which analysis to believe.
The thing is, the authors' and journals' responses are not atypical. The errors we are trying to correct are violations of sound statistical principles (as close to 'fact' as one can get), not mere differences of opinion about which method we think is better or which color we find most appealing. And yet these methods, and the errors that arise from misusing them, are rarely taught. Add to this that there is not yet a culture of acknowledging mistakes and working to fix them, compounded by journals that are unsure how to proceed with error correction even as members of the Committee on Publication Ethics, and we have major roadblocks to the self-correcting nature of science.
We hope that our new discussion of issues with data analyses makes some of these ideas accessible, and that it encourages putting improvements to the scientific evidence base above other interests and desires. Much of what we laid out is neither groundbreaking nor brand new, but we hope our perspectives reach a new audience, or reach an existing audience in a different way.