learning technology journal rankings

thoughts for a meeting later this afternoon…

With journal rankings, I find the development of body mass index (BMI) and its use by insurers a relevant reference: the measure spread because it was commercially useful for insurers to apply, rather than because of its value to health research or to helping people improve their health.

Issues and points of interest with journal rankings include:

    • the validity of the h-index across different disciplines, where articles and citations from older or more established journals may be more academically 'useful' than others, whether in terms of potential career enhancement for an academic or simply of contributing something new, interesting and useful to a discipline [1]
    • citations can be positive or negative, so the number of citations is not necessarily an indicator of quality in itself, e.g. where citations from books, chapters or conference proceedings are not included [2]
    • differences between, e.g., the ISI journal citation reports and impact factor; databases such as Web of Science, Web of Knowledge and Scopus; SCImago rankings; Google Scholar; and, to some extent, websites like ResearchGate and Academia.edu [3]
    • Web of Science and Google Scholar do not properly handle articles containing non-English characters or diacritics [4]
    • open access is still seen as inferior to subscription models, even though open access journals charge authors publication fees; this is based, for example, on cases of academics submitting flawed papers that were still published without sufficient peer review [5]
    • open access versus subscription journals being included in or excluded from indices such as ISI, and general versus specialist journals: an ICT for Development (ICT4D) study found

Positive: For many disciplinary journals, at least for academic authors, there is greater
kudos in tenure and promotional terms from publishing in one of the disciplinary journals,
as these appear in academic ranking tables, whereas ICT4D journals tend not to.
Neutral: Much eroded since the move from paper publication to online search
accessibility, it is still likely that different journals reach somewhat different audiences.

Negative: It is likely that rejection rates are higher in disciplinary journals. Even if the
paper is accepted, the time and effort required to achieve publication will be higher than
in ICT4D journals, despite refereeing…This means that (notwithstanding access
programmes such as INASP’s PERii) they are less accessible to audiences outside
industrialized country academia; yet such an audience is potentially a prime one for
ICT4D writing, particularly for those seeking to impact policy and practice. Very much
related to this, the non-citation-based impact of items published in open access journals is
likely to be higher than for subscription journal publication. [6]

  • an analysis of statisticians found variability in how journals are perceived according to geographical origin, research interests and employment type [7]
  • an analysis of economics journals found differences in impact and influence when looking beyond the impact factor, comparing economics as a discipline with economics within the social sciences more broadly [8]
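Since the h-index comes up in the first point above, a minimal sketch of how it is computed may help show why it behaves differently across disciplines with different citation volumes. This follows the standard definition of the h-index and is not part of the original notes:

```python
def h_index(citations):
    """h-index: the largest h such that the author has h papers
    with at least h citations each."""
    # Sort citation counts from highest to lowest
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        if count >= rank:
            h = rank  # this paper still meets the threshold
        else:
            break
    return h

# Example: five papers with the given citation counts
print(h_index([10, 8, 5, 4, 3]))  # → 4
print(h_index([25, 8, 5, 3, 3]))  # → 3
```

Note that the second author's single highly cited paper (25 citations) does not raise the score: in a discipline where typical papers attract few citations, h-values stay low regardless of quality, which is part of the cross-discipline validity concern above.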

Two interesting papers:

  1. A review of the effectiveness of journal ranking schemes that proposes a new model, in PLOS ONE at http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0001683
  2. A study specific to learning technology and open access publication: Perkins & Lowenthal, Australasian Journal of Educational Technology, 2016, 32(3), at https://ajet.org.au/index.php/AJET/article/view/2578. Patrick Lowenthal has done a lot of related work in this area, available through his website.

An interesting perspective comes through a Marxist lens. It also reflects the question of what is being measured and what that measure is valued for, as per the BMI example:

…the ranking list forces the intellectual laborer into the role of the “maximized worker,” i.e., the list represents the model to which the worker must conform to in order for the institution to insure a baseline margin of profitability…If the researcher is required to publish her work in a limited number of outlets, her work must conform to, and to some degree be shaped and limited by, the concerns of those outlets…while the exposed ideology of ranking lists runs counter to the professed motivations of science, it functions correctly as a means through which capitalist society successfully reproduces labor power and perpetuates the status quo (Althusser, 2001) [9]

So, given existing pressures on academic staff to publish research, will using rankings and indices be more beneficial for additional status (for staff, department or institution), or for additional value to learning technology as a field? Except that learning technology spans many fields, as per the spreadsheet: psychology, computing, education. And how might statisticians and economists who want alternative ranking methods adopted influence 'knowledge' and its usefulness, positively or negatively?

For staff looking to collaborate with others in developing countries, how much additional time and resource will be needed to publish either open access or in a subscription-based journal, particularly to influence policy or practice in each of those countries? For example, collaborators are not simply researching in a developing country but also learning how their research study is perceived and evaluated there as 'useful'…

1. H-index, Wikipedia, available at https://en.wikipedia.org/wiki/H-index
2. Brooks T (2015) Journal Rankings — Useful? (guest post at Daily Nous, ed. Weinberg J), available at http://dailynous.com/2015/08/03/journal-rankings-useful-guest-post-by-thom-brooks/
3. Stringer M et al (2008) Effectiveness of Journal Ranking Schemes as a Tool for Locating Information, PLOS ONE, available at http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0001683; and Heeks R (2010) An ICT4D Journal Ranking Table, pp. 5-7, available at https://www.academia.edu/3107601/An_ICT4D_Journal_Ranking_Table
4. Xia J (2011) Positioning Open Access Journals in a LIS Journal Ranking, College & Research Libraries 73(2), p. 5, available at https://crl.acrl.org/index.php/crl/article/view/16216
5. Perkins RA & Lowenthal PR (2016) Open access journals in educational technology: Results of a survey of experienced users, Australasian Journal of Educational Technology 32(3), ISSN 1449-5554, available at https://ajet.org.au/index.php/AJET/article/view/2578, doi: https://doi.org/10.14742/ajet.2578
6. Heeks R (2010) An ICT4D Journal Ranking Table, pp. 5-7, available at https://www.academia.edu/3107601/An_ICT4D_Journal_Ranking_Table
7. Theoharakis V & Skordia M (2003) How do statisticians perceive statistics journals? The American Statistician 57(2), pp. 115-124, available at https://amstat.tandfonline.com/doi/abs/10.1198/0003130031414
8. Kodrzycki Y & Yu P (2006) New Approaches to Ranking Economics Journals, The B.E. Journal of Economic Analysis & Policy 5(1), available at https://www.degruyter.com/view/j/bejeap.2005.5.issue-1/bejeap.2006.5.1.1520/bejeap.2006.5.1.1520.xml?rskey=1lBxlT&result=1&q=Ranking+Economics+Journals
9. Bales S et al (2011) Journal-Ranking Lists, Ideology, and the Academic Librarian: A Critical Analysis, available at https://www.academia.edu/32067476/Journal-Ranking_Lists_Ideology_and_the_Academic_Librarian_A_Critical_Analysis
