A journal's impact indicator (often called its "impact factor") is a metric used to evaluate the relative importance of a journal in its field, based on the number of citations made to that journal's articles over time. There are many different journal impact indicators, each with its own methods, strengths, and weaknesses.
All the indicators discussed in this article use this methodology as their base:

citations in a given year to items the journal published over some number of past years, divided by the number of citable items the journal published in those same years

The number of past years used in the denominator, as well as what counts as a "citable item," varies by impact indicator. Some indicators also adjust the equation by weighting citations to allow for cross-disciplinary comparison. Below is a quick comparison chart of four common impact indicators.
| Indicator Name | Data Source | Years Covered | Comparable Across Disciplines? |
| --- | --- | --- | --- |
| Journal Citation Reports | Web of Science | 2 years | No |
| CiteScore | Scopus | 4 years | No |
| SCImago Journal & Country Rank | Scopus | 3 years | Yes |
| Eigenfactor | Web of Science | 5 years | Yes |
For guidance on using each impact indicator and further information on its methodology, see the sections below.
Journal Citation Reports' Journal Impact Factor
Although the term "journal impact factor" is often used generically to describe any impact indicator, only the metric provided by Clarivate Analytics can truly lay claim to that name. To find the Journal Impact Factor (JIF) using Web of Science, you can search the Journal Citation Reports by journal name or browse by journal. Additionally, there is an option to browse by topic, which is useful for identifying top journals in a specific field. Clarivate, the company that owns Web of Science, has provided a LibGuide that contains tutorials, videos, and live trainings, as well as this explanation for how the JIF is calculated.
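In outline, the JIF for a given year Y is computed over a two-year window. This is a simplified restatement; Clarivate's documentation is the authoritative source for details such as what counts as a citable item:

```latex
\mathrm{JIF}_Y = \frac{\text{citations in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{citable items published in years } Y-1 \text{ and } Y-2}
```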
This particular metric is not suited for comparing journals across disciplines, as it does not adjust for differences in citation rates between fields.
CiteScore
In Elsevier's Scopus database, search for the CiteScore by subject area, title, publisher, or ISSN. When you click on a journal, both the CiteScore value and the SCImago Journal Rank are listed, as both share the same data source, Scopus. Scopus provides useful answers to frequently asked questions about the CiteScore, including the methodology used for calculating the score.
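Simplified, the current CiteScore methodology uses the same four-year window in both the numerator and the denominator; again, Scopus's FAQ is the authoritative reference for which document types are counted:

```latex
\mathrm{CiteScore}_Y = \frac{\text{citations in years } Y-3 \text{ through } Y \text{ to documents published in years } Y-3 \text{ through } Y}{\text{documents published in years } Y-3 \text{ through } Y}
```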
Like the JIF, this metric is not suited for comparing journals across disciplines, as it also does not adjust for differences in citation rates between fields.
SCImago Journal and Country Rank
Also based on journals indexed in the Scopus database, the SCImago Journal and Country Rank (SJR) site allows you to search by journal title, ISSN, or publisher name and to filter by subject area, subject category, region or country, and publication type (e.g., journals, conference proceedings). SCImago provides extensive help in navigating their site, including information about how the indicators are calculated.
The SCImago Journal and Country Rank considers the prestige of citing journals when calculating its rankings. By using a weighted calculation, the SCImago Journal Rank indicator attempts to enable cross-disciplinary comparisons; the sketch below gives a feel for how such weighting works.
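The following is a highly simplified, PageRank-style sketch of prestige propagation over a toy citation matrix. It is not SCImago's actual algorithm (the real SJR adds a damping factor, normalization by article counts, and other refinements), and the journal names and citation counts here are invented purely for illustration.

```python
import numpy as np

# Toy citation matrix: C[i, j] = citations from journal j's articles
# to journal i's articles. All names and numbers are invented.
journals = ["Journal A", "Journal B", "Journal C"]
C = np.array([
    [ 0.0, 30.0,  5.0],
    [20.0,  0.0, 10.0],
    [10.0, 15.0,  0.0],
])

# Column-normalize so each journal distributes one unit of "prestige"
# across the journals it cites.
P = C / C.sum(axis=0, keepdims=True)

# Power iteration: prestige flows along citation links, so a citation
# from a prestigious journal ends up counting for more.
prestige = np.full(len(journals), 1.0 / len(journals))
for _ in range(100):
    updated = P @ prestige
    updated /= updated.sum()
    if np.allclose(updated, prestige, atol=1e-12):
        break
    prestige = updated

for name, score in zip(journals, prestige):
    print(f"{name}: {score:.3f}")
```

The same basic idea, an eigenvector computed over a weighted citation network, also underlies Eigenfactor, although the two projects implement it differently.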
Eigenfactor
Eigenfactor was developed by two researchers, biologist Carl Bergstrom and information scientist Jevin West, to quantify journal influence using citation data from Clarivate's Journal Citation Reports. Eigenfactor can be searched by journal name, ISSN, publisher, year, and category. A Chrome plug-in called the EigenFactorizer color-codes search results in PubMed according to the Article Influence score of the journals in which those results appear. The project's About page provides much more information, including an explanation of how the Eigenfactor score and the Article Influence score are calculated.
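Roughly, per the project's documentation, the Article Influence score rescales the Eigenfactor score by the journal's size over the five-year window. This is a simplified restatement; see the About page for the precise definitions:

```latex
\text{Article Influence} = \frac{0.01 \times \text{Eigenfactor score}}{X}, \qquad X = \frac{\text{articles the journal published over five years}}{\text{articles all indexed journals published over five years}}
```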
Like SCImago, Eigenfactor gives more weight to citations from highly cited journals and adjusts for differences in citation rates across disciplines, making cross-disciplinary comparison easier. SCImago and Eigenfactor, however, use different methodologies to create their rankings.
These four examples are just a few of the available sources of journal impact indicators that may offer some insight into the quality of a particular journal. Be aware, however, that many companies purport to offer journal metrics, often with names and products that sound similar to those described above, but lack transparency in their methodologies. Stay wary of potentially predatory journals that boast an impact factor based on such misleading metrics.
Finally, remember that journal impact indicators do not determine the importance of an individual paper. Excellent papers can appear in less-well-cited journals, and mediocre papers can sometimes be published in highly cited ones. Journal impact indicators look at the journal as a whole and should not be used to determine the worth of a single paper; always use critical thinking to individually evaluate each article.