Academic journal

From Citizendium, the Citizens' Compendium
In relation to the natural sciences, see Scientific journal
The cover of an academic journal, depicting its title and the essential bibliographic metadata of this particular issue (volume, number, year of publication, publisher), as well as some further information (e.g. that this journal is published Open Access).
(CC) Image: Bollen et al., 2009
A two-dimensional map of relations between the subjects of academic journals, based on clickstream data.

An academic journal is a regular peer-reviewed periodical that publishes scholarship relating to an academic discipline. It provides a place for the introduction and scrutiny of new research, and is often a forum for the critique of existing research. This is most often manifested in the publication of original research articles (research findings), review articles, and book reviews. Scientific journals and journals in the quantitative social sciences vary somewhat in form and function from journals in the humanities and qualitative social sciences. American and British systems of academic publishing are similar; other regions have somewhat different practices.

Scholarly articles

In academia, manuscript submissions are generally unsolicited. Professional scholars generally submit an article to a journal; then, the editor (or co-editors) determines whether to reject the submission outright (often on grounds of not being appropriate to the subject of the journal) or to send out the article for peer review. Peer review is often a double-blind system: the author does not know who the reviewers are and the reviewers do not know who the author is. In order to facilitate this double-blind system, authors eschew self-references in their articles, and their names never appear on any page headings of the manuscript.

The journal editor chooses the reviewers. There are usually two reviewers; a third is sometimes asked if the two disagree. In some fields, three reviewers is the norm. The opinions of these outside reviewers are used in the determination to publish the article, to return it to the author for revision, or to reject the article. (There are many variations on this process, discussed in the article on peer review). Even accepted articles are subject to further (often considerable) editing by the journal before publication. Because of this lengthy process, an accepted article will typically not appear in print until several months, at the very least, after its initial submission—several years is not unknown.

The process of peer review is generally considered critical to establishing a reliable body of research and knowledge. Scholars can only be expert in a limited area; they rely on peer-reviewed journals to provide reliable and credible research which they can build upon for subsequent or related research. As a result, significant scandal ensues when an author is found to have falsified the research included in a published article, as many other scholars, and more generally the field of study itself, have relied upon that research.

Review articles

For more information, see: review article.

Review articles, often called "reviews of progress," serve as a check on the research published in the journals. Unlike research articles, review articles are usually solicited from long-standing experts in the field. Some journals are devoted entirely to review articles, some contain a few in each issue, and most publish none at all. Such reviews often cover the research of the preceding year, though some cover longer or shorter periods; some are devoted to very specific topics, others to general surveys. Some are enumerative, aiming to list all significant articles in a subject; others are selective, including only what the authors judge worth including; yet others are evaluative, aiming to give a judgment of the state of progress in the field. Some are published in series, covering a complete subject field each year, or covering a number of specific fields over several years.

Unlike original research articles, review articles tend to be solicited, and are sometimes planned years in advance. Authors are often paid a few hundred dollars for such reviews. Because of this, the standard definitions of open access do not require review articles to be open access, although many are. They are typically relied on by students beginning a study in a field, or for current awareness by those already in the field.

Due to concerns about the inconsistent quality of review articles,[1][2] the systematic review article, which uses techniques from meta-analysis, has been developed. The systematic review forms the core of evidence-based medicine.

Book reviews

Book reviews of scholarly books serve as a check on the research published in book form. Unlike articles, book reviews tend to be solicited. Journals typically have a separate book review editor who determines which new books should be reviewed and by whom. If an outside scholar accepts the book review editor's request to review a book, he or she generally receives a free copy of that book from the journal in exchange for a timely and publishable review. Publishers or authors send books to book review editors in the hope that their books will be reviewed. The length and depth of reviews vary considerably from journal to journal. The extent to which textbooks and other non-scholarly books are covered also varies from journal to journal.


Prestige

The prestige of an academic journal is established over time. It can reflect many factors, some but not all expressible quantitatively. Prestige is usually expressed in terms of the impact factor (a measure of popularity), but various alternatives are emerging.[3][4] The many alternatives to the impact factor have been compared.[5][6]

Impact factor

For more information, see: impact factor.

In the sciences and the quantitative social sciences, the impact factor is a convenient numerical measure, reflecting the number of later articles citing those articles already published in the journal. There are other possible quantitative factors, and there is a question whether the number of citations is the best quantitative measure of prestige (see the discussion of impact factor), as well as whether any quantitative factor can reflect true prestige at all; an excellent review of the pitfalls of the impact factor is available [1]. In the Anglo-American humanities, there has not yet been a tradition (as currently exists in the sciences) of giving numerical prestige "values" to journals in schemes to quantify the relative importance of research (based on the number of references made to an article in other academic articles). Perhaps a key reason for this is the relative unimportance of academic journals in these fields, as contrasted with the importance of academic monographs. Very recently, there has been some preliminary work towards determining the validity of such measurement [2], and services such as Faculty of 1000 [3] have emerged.
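As a concrete illustration, the standard two-year impact factor for a year Y divides the citations received in Y by items the journal published in the two preceding years by the number of citable items it published in those years. A minimal sketch (all journal data below is invented for illustration; real values are computed from a citation database such as the Web of Science):

```python
# Simplified sketch of the two-year impact factor calculation.
# The citation and article counts below are invented.

def impact_factor(citations_in_year, citable_items, year):
    """citations_in_year[y]: citations received in `year` to items
    the journal published in year y.
    citable_items[y]: number of citable items (articles, reviews)
    the journal published in year y."""
    cites = citations_in_year[year - 1] + citations_in_year[year - 2]
    items = citable_items[year - 1] + citable_items[year - 2]
    return cites / items

# Hypothetical journal: 150 citable items over the two preceding
# years, which together received 600 citations this year.
citations = {2021: 250, 2022: 350}
items = {2021: 80, 2022: 70}
print(impact_factor(citations, items, 2023))  # 600 / 150 = 4.0
```

Note that the measure is a per-article average: a journal publishing few but heavily cited reviews can outrank a journal publishing many original articles, which is one reason review journals dominate impact-factor rankings.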

Google Web-URL citations

Google Web-URL citations have been compared to the impact factor.[7]

Faculty of 1000

For Biology articles, Faculty of 1000 is another ranking index for individual papers.


H-index

The H-index, also called the Hirsch index, is another alternative measure.[8] It is available online; per the website, it is "an index that quantifies both the scientific productivity and the scientific impact of a journal (it is also applicable to scientists, countries...). The index is based on the set of the journal's most quoted papers and the number of citations that they have received in other publications."[9]
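The definition can be made concrete: the H-index is the largest number h such that h of the entity's publications each have at least h citations. A minimal sketch, with invented citation counts:

```python
def h_index(citation_counts):
    """Largest h such that h papers each have >= h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

# Nine papers with these citation counts: four papers have at
# least 4 citations each, but there are not five with at least 5.
print(h_index([10, 8, 5, 4, 3, 2, 1, 0, 0]))  # 4
```

Because the index depends only on the top of the citation distribution, a single extremely cited paper cannot raise it by more than one, which is the property usually cited in its favor over raw citation totals.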


Average quality of papers

Ranking journals by the average quality of their papers has been proposed.[10]

PageRank and its variations

Bollen et al. proposed using the PageRank algorithm used by Google to distinguish the "quality" of citations, and hence to improve the impact factor calculation.[11][12][13]

     ISI Impact Factor            PageRank                 Combined
 1   52.28 ANNU REV IMMUNOL       16.78 NATURE             51.97 NATURE
 2   37.65 ANNU REV BIOCHEM       16.39 J BIOL CHEM        48.78 SCIENCE
 3   36.83 PHYSIOL REV            16.38 SCIENCE            19.84 NEW ENGL J MED
 4   35.04 NAT REV MOL CELL BIO   14.49 PNAS               15.34 CELL
 5   34.83 NEW ENGL J MED          8.41 PHYS REV LETT      14.88 PNAS
 6   30.98 NATURE                  5.76 CELL               10.62 J BIOL CHEM
 7   30.55 NAT MED                 5.70 NEW ENGL J MED      8.49 JAMA
 8   29.78 SCIENCE                 4.67 J AM CHEM SOC       7.78 LANCET
 9   28.18 NAT IMMUNOL             4.46 J IMMUNOL           7.56 NAT GENET
10   28.17 REV MOD PHYS            4.28 APPL PHYS LETT      6.53 NAT MED

The table shows the top 10 journals by ISI Impact Factor, PageRank, and a modified system that combines the two (based on 2003 data). Nature and Science are generally regarded as the most prestigious journals, and in the combined system they come out on top. That the New England Journal of Medicine is cited even more than Nature or Science might reflect the mix of review articles and original articles that it publishes. It is necessary to analyze the data for a journal in the light of a detailed knowledge of the journal literature.
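The weighted-PageRank approach behind the table can be sketched on a toy citation graph. The three journals and the citation counts below are invented for illustration; Bollen et al. worked from the full ISI citation network, and the damping factor of 0.85 assumed here is the conventional PageRank value:

```python
# Toy weighted PageRank over a journal-to-journal citation graph.
# Edge weight = number of citations from the citing journal to the
# cited journal. Journals and counts are invented.

cites = {  # citing journal -> {cited journal: citation count}
    "A": {"B": 30, "C": 10},
    "B": {"A": 20, "C": 20},
    "C": {"A": 5},
}
journals = sorted(cites)
d = 0.85  # damping factor, as in the standard PageRank formulation

rank = {j: 1.0 / len(journals) for j in journals}
for _ in range(100):  # power iteration until (approximately) converged
    new = {j: (1 - d) / len(journals) for j in journals}
    for src, out in cites.items():
        total = sum(out.values())  # outgoing citations from src
        for dst, w in out.items():
            # src passes its rank on, split in proportion to weights
            new[dst] += d * rank[src] * w / total
    rank = new

print({j: round(r, 3) for j, r in rank.items()})
```

The ranks sum to 1, and a citation from a highly ranked journal contributes more than one from a poorly ranked journal, which is exactly the "quality of citations" distinction that a raw impact factor ignores.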


Eigenfactor

The Eigenfactor has been proposed as an alternative to the impact factor.[14][15]

Y factor

The Y factor is similar to the Eigenfactor.[16]

Online usage

In the health sciences, alternatives have been developed for predicting the impact of an article soon after publication, without waiting the two years needed to calculate an impact factor. These alternatives have included predicting the impact factor from web hits[17] and from other factors such as "indexing in numerous databases; number of authors; abstraction in synoptic journals; clinical relevance scores; number of cited references" and the nature of the article.[18]


Tweets

Frequency of tweeting may predict citation counts;[19] however, this study is controversial.[20]

Total citations / total articles

Scopus, a product of Elsevier B.V., has introduced its Journal Analyzer, which provides interactive charting of the total citations divided by the total number of articles for multiple journals and publication years.[21]

SCImago Journal Rank (SJR)

SCImago Journal Rank (SJR), also based on data from Scopus, borrows from the PageRank concept to weight citations by the prestige of the citing journals.[22] According to its website, SJR "is an indicator that expresses the number of connections that a journal receives through the citation of its documents divided between the total of documents published in the year selected by the publication, weighted according to the amount of incoming and outgoing connections of the sources."[23]

Prior work had explored using citation data to enhance information retrieval from biomedical journals.[24] SJR is available online.

Index Copernicus

Index Copernicus is similar to the impact factor.

Financial operation

Academic journals in the humanities and social sciences are usually subsidized by universities or professional organizations, and do not exist to make a profit. However, they often accept advertisements (usually from academic book publishers) as a way of offsetting production costs. It is standard practice for academic journals to charge libraries much higher subscription rates than individual subscribers pay. Editors of journals tend to have other professional responsibilities, most often as teaching professors. In the case of the very largest journals, there is sometimes paid staff to assist in the editing. The production of the journals is almost always done by paid staff of the publisher. Publishers in these fields are often university presses; some, such as Oxford University Press, specialize in such journals.

New developments

In recent years, the Internet has revolutionized the production of, and access to, academic journals. Journal content is often available online via services subscribed to by academic libraries, and individual articles are indexed by subject and can increasingly be found in databases such as Google Scholar. Other specialized databases, such as JSTOR and ScienceDirect, serve as platforms for disseminating journals. Some of the smallest and most specialized journals are prepared in-house by an academic department and published only on the Internet; recently such publication has sometimes taken the form of a blog.

Open access

For more information, see: Open access.

There is currently a movement in higher education encouraging open access, either by self-archiving, in which the author deposits a paper in a repository where it can be searched for and read, or by publishing in an open access journal, which does not charge for subscriptions and is either subsidized or financed through author page charges. To date, open access has had a much greater effect on science journals than on those in the humanities.

References

  1. Antman EM, Lau J, Kupelnick B, Mosteller F, Chalmers TC (1992). "A comparison of results of meta-analyses of randomized control trials and recommendations of clinical experts. Treatments for myocardial infarction". JAMA 268 (2): 240–8. PMID 1535110.
  2. Tatsioni A, Bonitsis NG, Ioannidis JP (2007). "Persistence of contradicted claims in the literature". JAMA 298 (21): 2517–26. DOI:10.1001/jama.298.21.2517. PMID 18056905.
  3. Citation metrics.
  4. Anderson, Kent (2008). Citations: Incitement or Excitement? The Scholarly Kitchen. Retrieved on 2008-04-09.
  5. Davis P (2009). Scientific Impact Measures Compared.
  6. Bollen J, Van de Sompel H, Hagberg A, Chute R (2009). "A principal component analysis of 39 scientific impact measures". PLoS One 4 (6): e6022. DOI:10.1371/journal.pone.0006022. PMID 19562078. PMC 2699100.
  7. Kousha, Kayvan; Mike Thelwall (2007). "Google Scholar citations and Google Web-URL citations: A multi-discipline exploratory analysis". J. Am. Soc. Inf. Sci. Technol. 58 (7): 1055–1065. DOI:10.1002/asi.v58:7. Retrieved on 2008-09-26.
  8. Anderson K. The “h-index”: An Objective Mismeasure? Retrieved on 2008-06-30.
  10. Stringer MJ; M. Sales-Pardo; L.A.N. Amaral (2008). "Effectiveness of Journal Ranking Schemes as a Tool for Locating Information". PLoS ONE 3 (2): e1683. DOI:10.1371/journal.pone.0001683.
  11. Bollen, Johan; Marko A. Rodriguez; Herbert Van de Sompel (2006). "Journal status". Scientometrics 69 (3): 669–687. DOI:10.1007/s11192-006-0176-z. Retrieved on 2010-05-05.
  12. Journal Status, Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel, May 17, 2006.
  13. Dellavalle RP, Schilling LM, Rodriguez MA, Van de Sompel H, Bollen J (2007). "Refining dermatology journal impact factors using PageRank". J. Am. Acad. Dermatol. 57 (1): 116–9. DOI:10.1016/j.jaad.2007.03.005. PMID 17499388.
  14. Bergstrom CT (2007). "Eigenfactor: Measuring the value and prestige of scholarly journals". C&RL News 68 (5).
  15. Davis, Philip (2008-07-23). Eigenfactor. The Scholarly Kitchen.
  16. Bollen, Johan; Marko A. Rodriguez; Herbert Van de Sompel (2006). "Journal status". Scientometrics 69 (3): 669–687. DOI:10.1007/s11192-006-0176-z.
  17. Perneger TV (2004). "Relation between online "hit counts" and subsequent citations: prospective study of research papers in the BMJ". BMJ 329 (7465): 546–7. DOI:10.1136/bmj.329.7465.546. PMID 15345629.
  18. Lokker C, McKibbon KA, McKinlay RJ, Wilczynski NL, Haynes RB (2008). "Prediction of citation counts for clinical articles at two years using data available within three weeks of publication: retrospective cohort study". BMJ. DOI:10.1136/bmj.39482.526713.BE. PMID 18292132.
  19. Eysenbach G (2011). "Can tweets predict citations? Metrics of social impact based on twitter and correlation with traditional metrics of scientific impact". J Med Internet Res 13 (4): e123. DOI:10.2196/jmir.2012. PMID 22173204.
  20. Tweets, and Our Obsession with Alt Metrics. The Scholarly Kitchen. Retrieved on 2012-01-04.
  21. Scopus Journal Analyzer. Elsevier B.V. Retrieved on 2008-07-21.
  22. Falagas ME, Kouranos VD, Arencibia-Jorge R, Karageorgopoulos DE (2008). "Comparison of SCImago journal rank indicator with journal impact factor". FASEB J. PMID 18408168.
  24. Bernstam EV, Herskovic JR, Aphinyanaphongs Y, Aliferis CF, Sriram MG, Hersh WR (2006). "Using citation data to improve retrieval from MEDLINE". J Am Med Inform Assoc 13 (1): 96–105. PMID 16221938.