Thoughts for a meeting later this afternoon…
With journal rankings, I find the development of body mass index (BMI) and its adoption by insurers a relevant reference: the measure spread because it was commercially useful to insurers who applied it, rather than because of its value to health research or to helping humans improve their health.
Issues and points of interest with journal ranking include:
Positive: For many disciplinary journals, at least for academic authors, there is greater kudos in tenure and promotion terms from publishing in one of the disciplinary journals, as these appear in academic ranking tables, whereas ICT4D journals tend not to.
Neutral: Though much eroded by the move from paper publication to online search accessibility, it remains likely that different journals reach somewhat different audiences.
Negative: Rejection rates are likely to be higher in disciplinary journals, and even if the paper is accepted, the time and effort required to achieve publication will be higher than in ICT4D journals, despite refereeing… Disciplinary journals also tend to be subscription-based, which means that (notwithstanding access programmes such as INASP’s PERii) they are less accessible to audiences outside industrialized-country academia; yet such an audience is potentially a prime one for ICT4D writing, particularly for those seeking to influence policy and practice. Very much related to this, the non-citation-based impact of items published in open access journals is likely to be higher than for subscription journal publication.6
Two interesting papers:
One offers an interesting perspective through a Marxist lens; it also reflects on what is being measured and its value, as per the BMI example:
…the ranking list forces the intellectual laborer into the role of the “maximized worker,” i.e., the list represents the model to which the worker must conform in order for the institution to insure a baseline margin of profitability…If the researcher is required to publish her work in a limited number of outlets, her work must conform to, and to some degree be shaped and limited by, the concerns of those outlets…while the exposed ideology of ranking lists runs counter to the professed motivations of science, it functions correctly as a means through which capitalist society successfully reproduces labor power and perpetuates the status quo (Althusser, 2001)9
So, given existing pressures on academic staff to publish research, will using rankings and indices be more beneficial for additional status (staff, department, institution) or for additional value to learning technology as a field? Except that learning technology spans many fields, as per the spreadsheet: psychology, computing, education. How could statisticians and economists wanting adoption of alternative ranking methods positively or negatively influence ‘knowledge’ and its usefulness?
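The h-index cited in reference 1 is simple enough to sketch, and doing so shows how such an index reduces a whole publication record to a single number. A minimal Python illustration (the function name and the sample citation counts are mine, for demonstration only):

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts.

    An author has index h if h of their papers have at least
    h citations each (the largest such h).
    """
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still supports an h of `rank`
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with at least 4 citations each
print(h_index([25, 8, 5, 3, 3]))  # 3: one very highly cited paper barely moves it
```

The second example hints at the concern raised above: the index rewards a particular publication profile, so a single influential paper (or work placed in less-indexed ICT4D outlets) may count for little.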
For staff looking to collaborate with others in developing countries, how much additional time and resource will be needed to publish either open access or in a subscription-based journal, particularly to influence policy or practice in each of the countries? For example, collaborators are not simply researching in a developing country but also learning about how their research study is perceived and evaluated in that country as ‘useful’…
1. Quinn S et al (2011–2018) H-index, Wikipedia, available at https://en.wikipedia.org/wiki/H-index
2. Weinberg J, Brooks T et al (2015) Journal Rankings — Useful? (guest post by Thom Brooks), available at http://dailynous.com/2015/08/03/journal-rankings-useful-guest-post-by-thom-brooks/
3. Stringer et al (2008) Effectiveness of Journal Ranking Schemes as a Tool for Locating Information, PLoS ONE, available at http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0001683; Heeks R (2010) pp5-7, An ICT4D Journal Ranking Table, available at https://www.academia.edu/3107601/An_ICT4D_Journal_Ranking_Table
4. Xia J (2011) p5 Positioning Open Access Journals in a LIS Journal Ranking, College and Research Libraries 73:2, available at https://crl.acrl.org/index.php/crl/article/view/16216
5. Perkins RA, Lowenthal PR (2016) Open Access Journals in Educational Technology: Results of a Survey of Experienced Users, Australasian Journal of Educational Technology 32:3, available at https://ajet.org.au/index.php/AJET/article/view/2578, doi: https://doi.org/10.14742/ajet.2578
6. Heeks R (2010) pp5-7, An ICT4D Journal Ranking Table, available at https://www.academia.edu/3107601/An_ICT4D_Journal_Ranking_Table
7. Theoharakis V, Skordia M (2003) How Do Statisticians Perceive Statistics Journals? American Statistician 57:2, pp115-124, available at https://amstat.tandfonline.com/doi/abs/10.1198/0003130031414
8. Kodrzycki Y, Pingkang Y (2006) New Approaches to Ranking Economics Journals, The B.E. Journal of Economic Analysis & Policy 5:1, available at https://www.degruyter.com/view/j/bejeap.2005.5.issue-1/bejeap.2006.5.1.1520/bejeap.2006.5.1.1520.xml?rskey=1lBxlT&result=1&q=Ranking+Economics+Journals
9. Bales S et al (2011) Journal-Ranking Lists, Ideology, and the Academic Librarian: A Critical Analysis, available at https://www.academia.edu/32067476/Journal-Ranking_Lists_Ideology_and_the_Academic_Librarian_A_Critical_Analysis