Newswise — A rapid rise in the number of academic articles being published could undermine public trust in science, warns an international study involving the Institute of Environmental Science and Technology of the Universitat Autònoma de Barcelona (ICTA-UAB).

The number of articles published worldwide rose from about 1.9 million per year in 2016 to a stunning 2.8 million in 2022 – an increase of 47% – despite little change in the number of scientists.
This surge has attracted widespread comment, but the new study provides a detailed analysis of the situation, drawing on data on publisher growth, article processing times and “citation behaviours” (how articles reference each other).
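That percentage follows directly from the two totals quoted above; as a quick check:

$$\frac{2.8\ \text{million} - 1.9\ \text{million}}{1.9\ \text{million}} \approx 0.47,\ \text{i.e. a 47\% rise.}$$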

The study, involving ICTA-UAB researcher Dan Brockington, finds that certain publishers, such as the Multidisciplinary Digital Publishing Institute (MDPI) and Elsevier, have “disproportionately hosted” this growth – and sets out ways to address the issue.

“The vital contribution of this paper is that it provides comparative data across multiple publishing houses, which together account for the vast majority of indexed papers and journals. This makes it possible to see whether any publishers are behaving unusually, or whether there are sectoral shifts at play,” says Brockington.

“Public trust in science depends on science being done properly,” says Dr Mark Hanson, from the University of Exeter. “That means articles should be properly peer-reviewed, which takes time. It means some articles will be rejected, then either revised and improved or sent back to the drawing board.”

The study’s findings suggest that, for some publishers, this is not happening. That is bad for public trust in science, because those articles are clearly not all being treated with normal standards of rigour.

“But a crucial finding is that this is not simply a consequence of more open access publishing. There are open access publishers who are not increasing their content so dramatically. Rather, it is about the sort of business model in which open access publishing is embedded,” Brockington adds.

We’ve got issues
One publishing house that features prominently in the work is the Multidisciplinary Digital Publishing Institute (MDPI). MDPI has been behind about 27% of the growth added to the system since 2016, though it is not alone.
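Combining that share with the overall totals gives a rough sense of scale (an illustrative back-of-envelope figure, not one reported in the study):

$$0.27 \times (2.8\ \text{million} - 1.9\ \text{million}) \approx 240{,}000\ \text{extra articles per year.}$$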

Publishers like MDPI and Frontiers have enabled this growth by creating numerous “special issues” which publish articles with reduced turnaround times.

Special issues – also called “topics” or “collections” – focus on a particular topic, and traditionally arise from a conference or a pressing scientific subject.

However, the spike in special issues has been accompanied by changes in what the term means. Certain publishers have kept the label while stripping the word ‘special’ of its meaning.

“Special issues work differently from normal research. Instead of authors submitting their work for peer review, guest editors are chosen to produce a special issue, and they can invite whoever they choose to write an article,” says Dan Brockington.

In itself, that is similar to how special issues have traditionally worked, but in the new model very few articles are rejected, and peer review happens very rapidly.

The study found that MDPI had an average turnaround time of about 37 days, a fraction of that of other publishing groups, and this short turnaround was highly consistent across its journals.
The researchers stress that most complex scientific papers cannot be properly peer-reviewed in 37 days from submission to acceptance.

The remarkable shift in some publishing houses, across multiple journals, raises questions about the independence of journal editors. “How is it possible for editors to remain ultimately responsible for what gets published if so many journals are changing their behaviour in the same way?” asks Brockington.

Impact inflation
The sudden rise in the number of articles published has created what the authors call “impact inflation”.
The “impact” of a journal is based on measures including citations: if a journal’s articles are frequently cited by others, the journal is seen to have high impact. That matters for authors because journal impact is widely used in decisions about grants and funding.
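As an illustration, the widely used two-year journal impact factor (given here as a representative metric, not necessarily the one the study relies on) is computed as:

$$\mathrm{JIF}_Y = \frac{\text{citations received in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{citable items published in years } Y-1 \text{ and } Y-2}$$

Because every new article carries a reference list that feeds other journals’ numerators, a rapid rise in article numbers can inflate such metrics across the whole sector – the “impact inflation” the authors describe.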

The new study also reveals high rates of “self-citation” (papers referencing other papers from the same publisher) in MDPI journals, which has drastically raised those journals’ profiles.
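A publisher-level self-citation rate of the kind described can be expressed, in sketch form (the paper’s precise definition may differ), as:

$$\text{self-citation rate} = \frac{\text{citations pointing to papers from the same publisher}}{\text{all citations made by that publisher's papers}}$$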
Commenting on how the situation might be addressed, Dr Hanson said: “Researchers face pressure to ‘publish or perish’ to be competitive for funding applications. While we highlight some groups, it’s really sector-wide. The funding bodies and regulatory groups will need to step in and define the line, then say who’s gone past it.”

“We need far more transparency about academic publishers if we are ever going to govern their behaviour effectively,” says Dan Brockington. “The current system is dysfunctional. It is not working. But we will not know what will work better without clearer and more readily available data.”