Published in A blog by Ross Mounce

So I read Jeffrey Brainard’s piece in Science magazine on Clarivate’s decision to punish eLife for innovating by stripping it of a proprietary Journal Impact Factor™ number that Clarivate itself awards (sidenote: to be clear, I see no value in Journal Impact Factors; they are statistically illiterate, irreproducible, and easily gameable, among many other issues that have long been documented). With the…

References

Recalibrating the scope of scholarly publishing: A modest step in a vast decolonization process

Published in Quantitative Science Studies

Abstract: By analyzing 25,671 journals largely absent from common journal counts, as well as from Web of Science and Scopus, this study demonstrates that scholarly communication is more of a global endeavor than is commonly credited. These journals, which run on the open-source publishing platform Open Journal Systems (OJS), have published 5.8 million items; they span 136 countries, with 79.9% in the Global South and 84.2% following the OA diamond model (charging neither reader nor author). A substantial proportion of the journals operate in more than one language (48.3%), with research published in 60 languages (led by English, Indonesian, Spanish, and Portuguese). The journals are distributed across the social sciences (45.9%), STEM (40.3%), and the humanities (13.8%). For all their geographic, linguistic, and disciplinary diversity, only 1.2% are indexed in Web of Science and 5.7% in Scopus. On the other hand, only 1.0% are found in Cabell’s Predatory Reports, and 1.4% show up in Beall’s (2021) questionable list. This paper seeks both to contribute to and to historically situate the expanded scale and diversity of scholarly publishing, in the hope that this recognition may assist humankind in taking full advantage of what is increasingly a global research enterprise.

Keywords: Citation indexes, Open Science, journal publishing, Africa, bibliodiversity

Does the African academy need its own citation index?


Why does being indexed in Web of Science and Scopus matter so much for African journals? And why is getting indexed so difficult? In this paper we revisit the history of the first citation index, the logic behind its highly selective coverage, its persistent under-representation of Africa’s journals, and its symbolic importance for many researchers. We ask whether the solution is to create an alternative African citation index, or whether there are other ways to promote the visibility and findability of African journals. Eugene Garfield’s company, the Institute for Scientific Information (ISI), launched the first Science Citation Index in 1963. An analysis of extant ISI documentation shows that 95% of the first 615 science journals indexed were published in Europe and America. This coverage changed author behavior and journal choices and reinforced existing status hierarchies. Garfield later justified journal selection decisions mathematically, defending his decision to prioritize ‘core’ journals and coining what he called ‘Garfield’s law of concentration’. In 1973, the ISI launched a Social Science Citation Index covering 1,000 core social science journals, again with no African representation. In the 1990s the indexes were digitized, allowing their data to be mined, and the creation of university rankings amplified the indexes’ reputational importance and commercial value. Today, Web of Science and Scopus continue to use the ‘citedness’ of candidate journals by journals already within the index to inform selection decisions. As a result, journals published in the global peripheries, in small fields, or in languages other than English struggle to get indexed. In 2023, excluding South Africa, only around 60 of the 30,000-plus journals indexed in Web of Science were published in Sub-Saharan Africa. One response is to create an alternative Africa-focused journal index and database; we end by describing the history of attempts to create such an index, including current initiatives. Another is to promote the international visibility and findability of African journals through the provision of high-quality metadata, the use of DOIs, and hosting on international portals.
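On that last point, metadata quality is something a journal can inspect directly. As a minimal sketch, assuming the journal already registers DOIs with Crossref (the ISSN below is a placeholder, not a real journal), Crossref’s public REST API shows what metadata record discovery services can actually harvest:

```python
import requests

# Placeholder ISSN for illustration only; substitute a real journal's ISSN.
ISSN = "1234-5678"

# Crossref's public journals endpoint reports the metadata on record
# for a DOI-registering journal.
resp = requests.get(f"https://api.crossref.org/journals/{ISSN}", timeout=30)
resp.raise_for_status()
journal = resp.json()["message"]

print(journal.get("title"))
print(journal.get("publisher"))
# The count of registered DOIs hints at how complete the record is.
print(journal.get("counts", {}).get("total-dois"))
```

A journal whose articles lack DOIs, or whose Crossref record is sparse, is hard for indexes and discovery services to find, whatever its editorial quality.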

The Great Curve II: Citation distributions and reverse engineering the JIF

There have been calls for journals to publish the distribution of citations to the papers they publish (1, 2, 3). The idea is to turn the focus away from just one number, the Journal Impact Factor (JIF), and to look at all the data. Some journals have responded by publishing the data that underlie the JIF (EMBO Journal, PeerJ, the Royal Society, Nature Chemistry). It would be great if more journals did this.
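To make the arithmetic concrete: a JIF-style figure is the mean number of citations received in one year by the items a journal published in the previous two, and because citation distributions are heavily skewed, that mean says little about a typical paper. A minimal sketch in Python, using invented citation counts rather than any journal’s real data:

```python
from statistics import median

# Invented per-article citation counts for items published in the two
# preceding years (illustrative only, not real journal data).
citations = [0, 0, 0, 1, 1, 1, 2, 2, 3, 4, 5, 8, 12, 40, 150]

# JIF-style mean: total citations divided by the number of citable items.
jif_like = sum(citations) / len(citations)

print(f"mean (JIF-style): {jif_like:.1f}")       # ~15.3, pulled up by two outliers
print(f"median:           {median(citations)}")  # 2, closer to a typical paper
print(f"papers below the mean: {sum(c < jif_like for c in citations)}/{len(citations)}")
```

Here two highly cited papers lift the mean to roughly 15 while most papers sit at or below 2 citations, which is exactly the pattern the published distributions mentioned above reveal.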