Monday, 8 May 2023

Article that assessed MDPI journals as “predatory” retracted and replaced

Source: https://retractionwatch.com/2023/05/08/article-that-assessed-mdpi-journals-as-predatory-retracted-and-replaced/

A 2021 article that found journals from the open-access publisher MDPI had characteristics of predatory journals has been retracted and replaced with a version that softens its conclusions about the company. MDPI is still not satisfied, however. 

The article, “Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI),” was published in Research Evaluation. It has been cited 20 times, according to Clarivate’s Web of Science. 

María de los Ángeles Oviedo García, a professor of business administration and marketing at the University of Seville in Spain, and the paper’s sole author, analyzed 53 MDPI journals that were included in Clarivate’s 2018 Journal Citation Reports. 

Oviedo García assessed each journal by eight criteria associated with predatory publications, including self-citation. She also compared the MDPI journals to the journals with the highest impact factor in their subject category. The original abstract described her findings like this: 

The formal criteria together with the analysis of the citation patterns of the 53 journals under analysis all singled them out as predatory journals. 

Soon after the paper was published in July 2021, MDPI issued a “comment” about the article that responded to Oviedo García’s analysis point by point. The comment called out “the misrepresentation of MDPI, as well as concerns around the accuracy of the data and validity of the research methodology.”

In September 2021, Research Evaluation published an expression of concern about the article that stated: 

The journal and publisher have been alerted to concerns about this article, and an investigation is in progress. In the interim, we alert readers that these concerns have been raised.

The article was retracted and replaced with a revised version earlier this month. The notice, which is labeled as a “correction,” stated that the replacement addressed:

concerns about conclusions drawn in the article. The conclusions in the updated article are reached based on cited sources.

We asked Oviedo García what led to the retraction and replacement, and she told us: 

In a nutshell, after the publication of the article both the journal’s editors and the publisher received communications raising concerns about it. Then, that original version was revised and has now been published, replacing the old version of the article. 

Oviedo García told us that she did not “have full details” about who raised concerns about the article. “The revision of the article was a joint work between the publisher and myself,” she said. 

Thed van Leeuwen, a senior researcher at the Centre for Science and Technology Studies of Leiden University in the Netherlands, and an editor of Research Evaluation, has not responded to our request for comment. 

Language throughout the article was changed to describe the findings less definitively. (See a comparison we created here.) The sentence in the abstract we quoted above now states that the analysis of the 53 journals “suggest[s] they may be predatory journals.” 

Critical language remains in the new version, such as this discussion of the MDPI journals’ huge and increasing number of special issues:  

The fact that the number of special issues in JCR-indexed MDPI-journals is so much higher than the number of ordinary issues per year coupled with their constant increase since 2018 inevitably awakens suspicions of a lucrative business aim. 

The revision removes some references to MDPI’s temporary inclusion on librarian Jeffrey Beall’s list of “potentially predatory” publishers, along with links to an archived version of the list, which was taken down in 2017.  

The revised version also includes additional caveats about the limitations of the work, such as which analyses Oviedo García did not conduct on the control group of top-ranked journals. 

For instance, rather than the original statement that the uniformly short review times at MDPI journals were “highly questionable,” the paper now states:

As such the question arises whether or not this speed is achieved with a thorough peer review in line with editorial and publishing best practices or if the rigor and quality of the peer review process is compromised in order to achieve these speeds. It is beyond the scope of this research to answer that question based on the analysis conducted, further research is needed to address this key question. 

A new paragraph in the section discussing the article’s limitations calls for further research to compare MDPI journals with other journals with similar impact factors, rather than the top-ranked journal in the subject area, as Oviedo García had done. MDPI’s comment on the article specifically called the comparison of its journals to those with the highest impact factors “flawed,” among many other critiques.

After Research Evaluation published the expression of concern about the paper, MDPI contacted Oxford University Press, the journal’s publisher, to follow up on the status of the article multiple times, said Giulia Stefenelli, chair of MDPI’s Board of Scientific Officers. But, Stefenelli told us, “We did not receive any response from OUP that showed progress on the handling of this paper, nor did we receive an update when this paper was retracted and replaced by a revised version.”

Stefenelli expressed dissatisfaction with the journal’s process, and the republished version of the paper:  

We were expecting more transparency and communication, and to be informed at key stages of the progression of the investigation. As we have demonstrated, the original article contained serious flaws in its methodology, which have yet to be addressed or corrected. This was highlighted in our initial comment on the article (https://www.mdpi.com/about/announcements/2979).

We would expect more details to be made publicly available for the readers so they can understand why this article was retracted and corrected. We found no track record of the original version retracted nor record about the corrections that have been made. In this case, we question OUP’s processing of this retraction/correction, as it would appear to be against COPE guidelines and standard practices.

The article sends a strong message affecting MDPI directly and ultimately begs the question, is this an article the public can trust? It is important to note that a retraction is issued when there is clear evidence that the findings are unreliable, as a result of major error. The fact that this article was retracted raises questions about the details of the significant changes made in order for it to be republished.

