How to Compile a List of Standard Journals

How should a university department that is genuinely interested in improving the state of research prepare the list of standard journals?

By Ashok R. Chandran
Media (Kochi)

Last month, this column brought under the scanner the window-dressing approach and low-grade tactics that university academics adopt to survive (or thrive!) without seriously doing research.1 We saw how the absence of good research was papered over by polluting the ‘list of standard journals’ used in research assessment.

But what if a university department is genuinely interested in improving the state of research? How should it prepare the list of standard journals? This month’s column constructively proposes a method that journalism departments in Kerala’s universities can consider adopting.

Basics

Improving the research culture takes time and requires the department to implement a research plan systematically across 5–10 years. Once the faculty decide to strengthen the research orientation of the department, an early step would be to compile a list of topics currently researched in the department and those likely to be studied in the next 3–5 years by the faculty themselves and research scholars. With such a list of topics in hand, the list of standard journals can be prepared in a month or two, in the following manner.

Method

Step 1

Pick two or three top journals in the Communication category in the latest Journal Citation Reports (JCR), a widely used resource for evaluating journals.

The selected ones need not be the top three journals in the Communication category; they could be any journals among the top 20 in that category in the JCR that publish on topics in the department’s research plan for the next 3–5 years. Even if it is not immediately possible to get published in these top journals, Step 1 is important to keep the department’s goal in focus. The selected journals would serve as ‘aspirational journals’ and signal the department’s goal: where the faculty desire and hope to publish in 5–10 years.

Step 2

Add 3–4 highly respected academic journals from India in Communication or Journalism.

Since hardly any Communication or Journalism journal from India figures in the JCR or the Social Sciences Citation Index (SSCI), Step 2 would call for going beyond those lists of journals. So long as the selected journals are truly respected nationally, a department would be justified in including them in the list of standard journals.

How can the faculty go about this task? Consultation or survey among peers can be one initiative. For integrity, however, it must be supplemented by a more objective evaluation of each journal by the department’s faculty. Such an evaluation exercise would also raise awareness among faculty and serve as an excellent orientation into the world of contemporary scholarship.

For this, one can adapt the criteria used to select journals for the SSCI. Broadly, these would include adherence to basic publishing standards (timeliness, editorial conventions, peer review), assessment of editorial content (whether or not the journal publishes topics of research interest to the department), a test of international or national diversity (among contributing authors, editors, and advisory board members), and citation analysis (how often a journal gets cited in other international or national journals).2 If a regional journal with significant impact (i.e., one that gets repeatedly cited in national and international journals) emerges from India, that too can be included.
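To make the evaluation exercise concrete, the four criteria above can be recorded as a simple checklist and applied uniformly to every candidate journal. The Python sketch below is purely illustrative: the field names, the pass/fail treatment of each criterion, and the citation threshold are assumptions of this sketch, not part of the SSCI’s actual selection process or of any official tool.

```python
from dataclasses import dataclass

@dataclass
class Journal:
    """One candidate journal, scored against the four criteria."""
    name: str
    publishes_on_time: bool         # basic publishing standards: timeliness
    peer_reviewed: bool             # basic publishing standards: peer review
    covers_department_topics: bool  # editorial content matches research plan
    diverse_contributors: bool      # diversity of authors, editors, board
    citations_per_year: int         # citation analysis (hypothetical count)

def meets_criteria(journal: Journal, min_citations: int = 10) -> bool:
    """Return True only if the journal passes every check.

    The threshold of 10 citations per year is an arbitrary
    placeholder; a department would set its own benchmark.
    """
    return (
        journal.publishes_on_time
        and journal.peer_reviewed
        and journal.covers_department_topics
        and journal.diverse_contributors
        and journal.citations_per_year >= min_citations
    )

# Example: filter a candidate pool down to journals that qualify.
candidates = [
    Journal("Journal A", True, True, True, True, 25),
    Journal("Journal B", True, False, True, True, 40),  # not peer reviewed
]
shortlist = [j.name for j in candidates if meets_criteria(j)]
```

Recording each criterion explicitly, rather than relying on reputation alone, is what keeps the evaluation of home journals honest: every journal, home or not, faces the same checklist.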

A department must resist the temptation to include its home journals simply because they are home journals. The evaluation of a home journal should use the same criteria, and be just as rigorous, as the evaluation of other national journals in Communication or Journalism.

Step 3

Identify 5–10 good journals from India or abroad in the humanities and social sciences—in history, political science, or sociology, or interdisciplinary areas like development studies.

The selection process could make use of the JCR and SSCI, or be similar to the one used in Step 2, while, as always, keeping the current and future research topics of the department firmly in sight.

This would help faculty to publish in good journals and win peer recognition in their specialised or allied areas of research. For example, if one faculty member plans to build her research profile in the next decade by studying various aspects of women in Kerala journalism, the department can include the Indian Journal of Gender Studies, which figures in the JCR and SSCI. Other Indian social science journals in the latest JCR are: Contributions to Indian Sociology; Indian Economic and Social History Review; Journal of South Asian Development; and Science, Technology and Society.

Strengths and Limitations

The three-step method’s major advantage is that the resulting list will contain only good, peer-reviewed journals.

Moreover, it will reflect a healthy mix of idealism and realism. At one end, the top-rung journals in the list will reflect the laudable direction in which research in the department is headed (what should be achieved in the long term); journals at the other end would signal the minimum benchmarks held by the faculty, and the current or near-future level of research in the department (what can be achieved in the medium term). In the first two years, it is quite likely that no faculty member will get published in any of the standard journals, but that would be the price paid for decades of neglect.

The method outlined here is flexible in that it can be used with different tools if a department so wishes. For example, in place of Thomson Reuters’ JCR, a department can opt to use the more easily accessible SCImago journal rankings (which are based on Elsevier’s Scopus database). Down the road, a combination of metrics can be used to offset the weaknesses of any particular set of journal rankings.

This is not to suggest that the proposed method is perfect. It is possible for an article to win peer recognition (i.e., get heavily cited nationally or internationally) after being published in an unknown journal. In such a case, the list of standard journals compiled using the method here would disappoint the scholar concerned. But such events are rare, and if one does occur, the method can be modified at that juncture to accommodate the one-off case.

A few may criticise the method outlined here for privileging peer review even though the peer review system itself is not foolproof. After all, it was only two months ago that leading journal publisher SAGE retracted 60 articles from a peer-reviewed journal after the publisher investigated and busted a peer-review ring that used fake identities. Indeed, the peer review system is not perfect, but much of the academic world continues to prefer peer review partly because a superior solution is yet to emerge. Here is what folks at Nature, one of the most prestigious journals in the world, say about this: ‘We are continually impressed with peer review’s positive impact on almost every paper we publish. Even papers that are misunderstood by reviewers are usually rewritten and improved before resubmission. Mistakes are made, but peer review, through conscientious effort on the part of referees, helps to protect the literature, promote good science and select the best. Until a truly viable alternative is provided, we wouldn’t have it any other way.’3

That said, we all know that any single method or measure is unlikely to satisfy all situations. The method outlined here is for preparing a list of standard journals for incentivising good research in a department that wishes to upgrade itself on the research front. The method outlined and the tools mentioned are not for evaluating the research contributions of a scholar. Research assessment is a wider topic for which other criteria too have to be framed, and tools like the H-Index or a combination of such metrics might have to be used.

Web extra

Links updated January 2023.


Footnotes

  1. Ashok R. Chandran, ‘When a University Bored of Studies Sets the Standards,’ Media, August 2014, pp 36–38.
  2. For more information on each criterion, see Jim Testa, ‘The Thomson Reuters Journal Selection Process,’ May 2012.
  3. ‘Peer Review Policy,’ Nature.