What’s in a word?

In the scientific community, the word ‘retraction’ carries a pervasive stigma, often treated as tantamount to an academic death penalty. Retractions, or the pulling of a paper from publication, can tarnish a researcher’s reputation, call into question the legitimacy of a lifetime of work, and dismantle careers.

Beyond the personal realm, retractions also alter public perceptions of science.

Last month, the Retraction Watch blog released the Retraction Watch Database, a comprehensive list of retractions in scientific journals dating back to 1923. Of the 18,000 retractions and notes available in the database, 63 are affiliated with U of T.

What constitutes a retraction?

Science prides itself on being self-correcting, and retractions are a powerful mechanism for that self-correction.

When errors are relatively minor and restricted to a small portion of a publication, a complete withdrawal of the scientific finding is unnecessary and a correction may be issued.

The World Association of Medical Editors defines scientific misconduct as including the falsification, distortion, and omission of data; failure to report misconduct; and the destruction of information relevant to a publication.

Retractions are issued to correct the scientific literature and alert readers to a paper’s unreliable conclusions. According to the Committee on Publication Ethics (COPE), they are “[not] to punish authors who misbehave.”

Yet intent and outcome are not always aligned. The closure of Toronto-based researchers Dr. Sylvia Asa’s and Dr. Shereen Ezzat’s labs, and the termination of their positions within the University Health Network (UHN), is evidence of how retractions can have dire consequences for academics’ careers.

Retraction guidelines are inconsistent and could be misinterpreted

Husband-and-wife duo Asa and Ezzat account for four of the U of T retractions listed in the Retraction Watch Database. Their cases of scientific misconduct made headlines in the Toronto Star in 2015 and 2016.

They were found responsible for scientific misconduct in the form of material non-compliance. As principal investigators, they failed in a number of published works to disclose alterations to images and to provide preliminary data that matched what was published.

Asa lost her position as the head of UHN’s Laboratory Medicine Program, the largest program of its kind in Canada, and the UHN imposed sanctions against Asa and Ezzat.

Regarding her 2002 paper, one of the articles later retracted, Asa told The Varsity that “this was a paper that was almost five years of work. Most of my research starts with a clinical problem, and one of the things I’ve studied is pituitary [tumours].”

“[UHN] claim that two images [of the electrophoresis gels] came from the same one and had been manipulated,” said Asa. “The fact is that we had all the raw data, we had all the original data.”

“Nothing changes anything in that paper, based on the fact that the image was wrong. Patients who have pituitary tumours, for all the people who were involved in the research, all the work that we did is still true,” said Asa. “The results of that paper are no different today.”

The journals in which Asa published her findings were alerted to the irregularities in her research by an outside source.

UHN opened an investigation into Asa’s publications as a result of these allegations, and implicated the pathobiologist in the fabrication and falsification of images. Asa and Ezzat challenged these allegations in court, where it was ultimately found that, based on the evidence, it could not be proven who had tampered with the images.

Asa told The Varsity that she felt targeted by the retraction process.

“The retraction process is interesting. It’s definitely necessary. But it has limitations… There have been mechanisms put in place in a lot of different parts of the world, to be more objective and have more standardized criteria for how an investigation is done,” said Asa.

But in a case almost identical to Asa’s, a Montréal researcher was given the opportunity to issue a correction instead of having to retract the entire article.

Cases like this demonstrate how inconsistently retraction guidelines are implemented across institutions.

An article in Science suggested that this may be because it is ultimately up to the editors and institutions to determine whether the paper is withdrawn, as COPE only provides guidelines to clarify when a paper should be retracted.

In addition, a study in BMJ Open revealed that retraction notices in BioMed Central journals did not adhere to COPE guidelines. In 11 per cent of retracted articles, the reason for retraction was unclear; six per cent did not state who was retracting the article, and four per cent were retracted simply because not all authors were aware of the paper’s submission.

The stigma around retractions

A common misconception is that a retraction is invariably associated with data fabrication or scientific misconduct. Yet, of the 63 U of T-affiliated papers listed in the Retraction Watch Database, only seven are listed for misconduct and eight for fabrication. Fourteen publications have been retracted due to errors in data attributed to honest error.

Dr. Peter Jüni, Director of the Applied Health Research Centre at St. Michael’s Hospital and professor in the Department of Medicine, has co-authored such a paper.

The publication, a network meta-analysis on the effectiveness of nonsteroidal anti-inflammatory drugs for treating osteoarthritis, was published in March 2017 and retracted in July 2017.

A network meta-analysis compares multiple treatments for a condition both directly, using head-to-head comparisons from published trials, and indirectly, by linking treatments through common comparators across different trials.
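
To illustrate the indirect part, here is a minimal sketch of the standard indirect-comparison formula, using hypothetical treatments A, B, and C; this example is not drawn from Jüni’s paper. If $d_{AB}$ and $d_{CB}$ are the trial-estimated effects of A and C relative to a common comparator B, then

$$d_{AC}^{\text{indirect}} = d_{AB} - d_{CB}, \qquad \operatorname{Var}\left(d_{AC}^{\text{indirect}}\right) = \operatorname{Var}(d_{AB}) + \operatorname{Var}(d_{CB}).$$

For instance, if A reduces pain scores by three points relative to B in one set of trials, and C reduces them by two points relative to B in another, the indirect estimate is a one-point advantage for A over C, with the uncertainties of the two inputs adding together.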

According to Jüni, research assistants in his team had unknowingly incorporated a duplicate article that had “slightly different results extracted twice” to build their meta-analysis.

“The authors published twice, but they didn’t make it clear that these are the results describing the same population with light differences,” said Jüni. “My colleagues decided to re-run the analysis… and eliminate the duplicate article and… add two new articles that were brought up by colleagues.”

“Now if you include all of those… in an integrated analysis… your numbers will change very slightly,” explained Jüni. “The conclusions of the paper didn’t change at all.”

Jüni himself recognized the duplicated paper, and the authors were alerted to the two missed trials by colleagues in Ottawa.

Although the error was minimal in nature, The Lancet and the authors agreed it was more feasible to retract and republish the article, as the error ran through the results and several other portions of the paper.

Jüni recognized a flaw in the retraction process that could be exacerbated by the associated stigma of retraction.

“If this is not indexed properly, which was happening at the beginning — the National Library of Medicine just pointed to the retracted article, but it was not clear in PubMed or Medline that this was basically paired with a republication — then it could mean potential questions regarding your reputation,” said Jüni. “The question is then, should we call it differently?”

“Would I prefer to have another label associated with it? Yes, because of the associated stigma — but I don’t think it will happen and I think the important part is that the indexing system changes their way of reporting it. It’s not optimal, but honestly, I can live with it. And obviously I have to live with it,” continued Jüni.

Despite the sting of retractions and the potential fallout, Jüni believes that researchers have an obligation to self-report mistakes.

“You need to live as a leader, in a culture where everybody admits [they don’t] know or [made] a mistake. I need to start with that as the Director of Applied Health Research — if I don’t live it, my people don’t dare admit mistakes. We need that to make research better. That’s part of the quality assurance process.”

It is clear that the retraction process is flawed: it carries too much stigma, its guidelines are applied inconsistently, and it often fails to communicate to the public the reasons for a paper’s withdrawal. However, it is currently the only system we have to correct the literature and protect scientific endeavours.

What implications do retractions have for scientific research?

Trudo Lemmens, professor and chair of the Department of Health Law and Policy at the Faculty of Law, believes that the increase in the number of retractions may stem from a growing concern around scientific integrity, prompted by the growth in scientific publications over the years.

Science reports that an increase in retractions could be attributed to more comprehensive oversight from scientific journals. Though practices differ from journal to journal, the rise in retractions hints at stricter editorial standards.

In 2009, COPE published guidelines suggesting that a publication should be retracted if its findings are unreliable due to scientific misconduct, plagiarism, duplication, or honest error. By 2015, two-thirds of 147 high-impact journals had adopted these guidelines, which have helped standardize the retraction process.

Editor’s Note (November 27): A previous version of this article incorrectly suggested that Jüni’s assistants were the ones to discover the duplicate.