
Bibliometrics and Citation Analysis

Definitions for bibliometrics and citation analysis, and information on bibliometric tools and resources within CSIRO.

Bibliometrics overview

Bibliometrics are quantitative measures that indicate influence or interest in academic research. They can help to identify key authors and journals within a particular area, and they can also guide searching within the academic literature.

Note: if your interest relates to citation analysis for an Organisation or Flagship, or at any level higher than individual researchers, please visit the CSIRO Science Health and Excellence reporting page.

How do bibliometrics work?

Bibliometrics are calculated from bibliographic databases, such as Web of Science (Thomson Reuters) and Scopus (Elsevier).

There are a couple of things to watch out for:

  • Bibliometrics with the same name can vary depending on the database they were calculated from. For example, citation counts can be calculated from both Web of Science and Scopus, but the results may differ, because each database covers a slightly different set of publications.
  • Some bibliometrics are specific to a particular database. For example, Impact Factors are produced by Thomson Reuters, who compile Web of Science.

Web of Science and Scopus both publish a list of publications and sources their databases cover.

Limitations of Google Scholar

A common question when explaining bibliometrics is: how does Google Scholar fit in? CSIRO has selected Web of Science for measuring citation metrics. Google Scholar can be useful in certain circumstances, but it is important to be aware of its limitations.

Unlike Web of Science, Google Scholar does not disclose how it generates its search results, so you need to judge the validity of the sources for yourself. This also means you cannot tell whether the sources behind Google Scholar's bibliometrics (e.g. its citation counts) are of sufficient quality. As a result, the same bibliometric is often higher when generated from Google Scholar than from Web of Science or Scopus.

Bibliometrics aside, while Google Scholar searches some of the resources that the Library subscribes to, it does not cover them all, so relying on Google Scholar alone could mean you miss things. Google Scholar also offers fewer options for refining your search than Web of Science.

 

Publication bibliometrics

  • Impact Factors (IF): Allow you to judge the relative importance or impact of a journal within its subject category. They show the frequency with which the journal's papers are cited. IFs are generated from data in the Web of Science database and can be found in the Journal Citation Reports (JCR), and sometimes on the journal's web page. You cannot use IFs to compare journals across different subject areas, and individual articles should not be evaluated using Impact Factors, because a single article's citation count rarely matches the journal-level average that the IF represents.
  • SCImago Journal Rank (SJR): "SCImago Journal Rank (SJR) is a prestige metric based on the idea that 'all citations are not created equal'." SJR is an average citation rate in which citations are weighted according to the citation strength of the citing journal. It is calculated from Scopus data (CSIRO Libraries provides a subscription to Scopus), and SJRs are freely available from the SCImago Journal Rank (SJR) and Country Rankings site, which allows comparisons. As with IFs, you cannot use SJRs to compare journals across different subject areas.
  • Source-Normalized Impact per Paper (SNIP): "Source Normalized Impact per Paper (SNIP) measures contextual citation impact by weighting citations based on the total number of citations in a subject field." (Find out more about SNIP.) SNIPs are generated from Scopus data (CSIRO Libraries provides a subscription to Scopus) and are freely available. SNIP differs from the metrics above in that every journal gets its own "subject area", consisting of the journals that cite it, and weighting is applied based on the propensity to cite those journals.
  • Eigenfactor metrics: These use a similar methodology to SJR (above), but they are based on Web of Science data and are available from the Journal Citation Reports (JCR). Citations are weighted recursively, and the metric represents a share of total weighted citations rather than an average citation count.
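To make the journal-level averaging concrete, here is a minimal sketch of the standard two-year Impact Factor calculation. The real figures come from Journal Citation Reports using Web of Science's curated counts; the numbers below are invented for illustration.

```python
# Illustrative sketch of the two-year Impact Factor calculation.
# Real IFs are published in Journal Citation Reports; these numbers are invented.

def impact_factor(citations_to_prev_two_years: int, citable_items_prev_two_years: int) -> float:
    """IF for year Y = (citations received in Y by items the journal
    published in Y-1 and Y-2) / (number of citable items it published
    in Y-1 and Y-2)."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# e.g. 300 citations this year to the 120 citable items the journal
# published over the previous two years:
print(impact_factor(300, 120))  # 2.5
```

Because the IF is a journal-wide average like this, it says little about any individual article, whose own citation count may sit far above or below the mean.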


Article bibliometrics

  • Citation count: The number of times an article has been cited by other research. You can find citation counts in both Web of Science and Scopus. Google Scholar also gives a citation count, though it is important to be aware of the limitations outlined above.


Author bibliometrics

H-Index: An author's h-index is the largest number n such that they have n papers with n or more citations each. For example, an author's h-index is 7 if they have 7 papers that have each been cited 7 or more times.

Things to be aware of when interpreting h-indexes:

  • H-indexes cannot be used to compare between subject areas. This is because they are based on the number of citations an article receives, and conventions differ considerably between subject areas. 
  • H-indexes are derived from bibliographic databases, such as Web of Science or Scopus. There is much overlap between these databases, but the publications they cover do differ to some degree (see above). This means your h-index can vary depending on which database it was derived from, so when quoting your h-index it is important to specify both the database AND the date on which the h-index was compiled.

Finding your h-index:

  • Using Scopus: Log into Scopus and search for your name. After clicking on your name, you should see your h-index.
  • Using Web of Science: Search for your name, restricting the results to the institutions you have worked at; click the Create Citation Report button; manually remove any publications that you did not write; then read the h-index score from the report. (Alternatively, if you have a ResearcherID or an ORCID, you can search for that instead of your name, which has the advantage of excluding publications that do not belong to you.)

Altmetrics

Whereas bibliometrics look at citation counts, Impact Factors, h-indexes and so on, altmetrics look at article-level measures such as views, downloads, and mentions in news stories, Twitter, blogs, and other social media. "Altmetrics" is the collective name for this group of metrics. Companies offering altmetrics tools include Altmetric.com, ImpactStory, and Plum Analytics; Altmetric.com offers article-level metrics, whereas ImpactStory offers metrics for a researcher's whole profile.

This demo page on the PLOS mashup site shows altmetrics in action: click on the coloured circle icon next to the title of a paper. Altmetrics can be used to broaden the picture of research impact, but like citation counts they should be treated with caution. They measure the amount of 'attention' a piece of research is receiving, but not whether that attention is positive or negative. Altmetrics should be considered additional to bibliometrics, not a replacement.


Bibliometrics for the Individual