Bibliometrics include a range of things, some of which you may already be familiar with. For example, the number of times a journal article has been cited, a journal's Impact Factor or the h-index of a particular academic.
Bibliometrics are important because they allow you to see the amount of influence that a particular journal, article, university, research group or even an individual academic has had. This is very valuable to know when, for example, deciding which articles to cite, where to publish, or identifying key universities and academics to follow or collaborate with. Bibliometrics were considered in the 2014 REF, and future REF exercises plan to take them into consideration.
The non-academic 'cousin' of bibliometrics, namely altmetrics, provides a complementary measure, showing how research has been shared across social media and news sites and has influenced policy.
- What are the bibliometric measures that I can use?
- Where do bibliometrics come from and what should I be aware of?
- How do I decide which metrics to use?
Metrics about me
- How can I look up my own metrics or another academic's metrics (e.g. h-index)?
- How can I ensure my metrics are accurate?
Journal and article metrics
- How can I look up the metrics for a journal? (Useful when deciding where to publish. e.g. CiteScore, SJR and Impact Factors)
- How can I look up the metrics for an article? (e.g. citations)
Looking at the whole University
- How can I get an overview of the whole University?
- How can I compare Portsmouth to other universities?
- How can I compare my department or research group with an equivalent research group or department at another university?
- How can I find out who Portsmouth is collaborating with?
- How can I find experts working in a particular area?
- How can I investigate the effect of adding a particular new academic to my research group or department?
Other bibliometrics FAQs
- Are metrics recorded in Pure?
- How does Pure relate to the bibliometrics databases, such as Scopus and Web of Science?
- What is the problem with using Google Scholar?
- How can I find out the mainstream 'attention' that an article has received? (e.g. on social media, in mainstream news, blogs etc.)
- How can I find out if an article's been referenced in a policy (e.g. government) document?
If you need further help, please contact the Research Outputs team (email@example.com) who are based in the Library. Or you may like to come along to the bibliometrics workshop on the Researcher Development Programme.
There are many bibliometric measures, some of which you may already be aware of, for example the number of citations that an article has received. All other metrics are ultimately derived from the number of citations that articles receive. Different metrics are typically used in different situations. These are a few examples of the most common ones -
- Bibliometrics used to assess articles: citation count
- Bibliometrics used to assess academic authors: the author's h-index
- Bibliometrics used to assess academic journals: the journal's Impact Factor or SNIP value.
- Bibliometrics used to assess research groups or whole universities: the Field-Weighted Citation Impact, or the number of publications or citations.
The quick reference guide explains the key metrics in more detail. Plus, the page below also covers many of the main metrics that you will need to be aware of and when you would use them.
- Bibliometrics with the same name can vary depending on the database they were calculated from. E.g. citation counts can be calculated from both Web of Science and Scopus, but results may differ. This is because different databases cover slightly different sets of publications. Web of Science and Scopus both publish a list of publications and sources their databases cover: Web of Science database coverage, Scopus database coverage.
- Some bibliometrics are specific to a particular database. E.g. Impact Factors are produced by Clarivate Analytics who make Web of Science.
Two 'golden rules' are to always use more than one metric (e.g. do not use Impact Factors alone to assess the quality of journals) and to use the context in which the metrics are generated to judge their relevance (e.g. metrics are less meaningful in some subject areas, such as the arts).
The Snowball guide explains when each metric can (and should) be used. Plus, this page gives more details about common scenarios.
Metrics about an individual academic include the number of articles they have published, their total number of citations, their number of co-authors and so on. Each author has a profile on Scopus, which displays these metrics (above). To view yours, log into Scopus and search for your name.
A metric that is often discussed is an academic's h-index. Their h-index is the largest number of papers (n) they have published that each have n or more citations. For example, an academic's h-index is 7 if they have published 7 papers that have 7 or more citations each.
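The definition above can be sketched in a few lines of code. This is only a minimal illustration of the calculation itself, not how Scopus or Web of Science compute it internally:

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts.

    The h-index is the largest n such that the author has published
    n papers with at least n citations each.
    """
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # at least `rank` papers have `rank`+ citations
        else:
            break
    return h

# An author whose papers have been cited [10, 8, 7, 7, 4, 3, 1] times
# has an h-index of 4: four papers have at least 4 citations each,
# but there are not five papers with at least 5.
```

Note that the h-index never decreases over time, and that a single very highly cited paper cannot raise it on its own, which is part of why it should be read alongside other metrics.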
Finding my own (or another academic's) h-index:-
- Using Scopus: Log into Scopus and search for their name. After clicking on their name, you should see their h-index (circled above).
- Using Web of Science: Log into Web of Science and search for their name, while restricting the results to the universities they have worked at. Then click the Create Citation Report button in the top-right, manually remove any publications they did not write, and finally look at the h-index score in the top-right. (Or if an author has a ResearcherID or an ORCID you can search for this instead of their name. This has the advantage that it will not include publications that do not belong to them.)
Things to be aware of:-
- h-indexes cannot be used to compare authors working in different subject areas. This is because they are based on the number of citations an article receives, and citation conventions differ considerably between subject areas.
- h-indexes are derived from bibliographical databases, such as Web of Science or Scopus. There is much overlap between these databases, however the publications they cover do differ to some degree (see above). This means that an author's h-index can vary depending on which database it has been derived from, and so when quoting an h-index it's important to also say which database it was derived from.
The metrics in Scopus, Web of Science and SciVal are derived from the details you add to your publications. These databases run a complex 'matching algorithm' to identify which articles 'belong' to you. There are a number of things you should do to ensure this algorithm is accurate.
- Please ensure that you always put the full affiliation on articles that you publish, i.e. please use "University of Portsmouth", as opposed to only the department in which you are based.
- Please get an ORCID iD, as this greatly helps to link your publications to you correctly.
Please check your details on Web of Science and Scopus to ensure they are correct. If you notice errors, then use the contact forms on the Web of Science and Scopus websites to request that they make any necessary corrections. Click here for further instructions.
Please be aware that if you are working in a subject area that the main databases (i.e. Web of Science and Scopus) do not fully cover, then the metrics about you on these systems will not be accurate. For example, academics working in the arts and some humanities subject areas may find that these databases do not list all of their publications. To see if this applies to you, details of the coverage can be found here: Web of Science database coverage and Scopus database coverage.
Finally, it is worth mentioning that having a profile on Google Scholar can also be a valuable way of promoting your work. However, please be sure to either set up your profile using a personal email address, or if you leave Portsmouth make sure that you change the email address to your new institution before your UoP computing account is removed.
These metrics give an indication of the 'quality' of a particular journal. This may be useful when deciding where to publish. There are a number of these metrics, which are calculated in slightly different ways. A question that's often asked is 'which metric should I use?' There are pros and cons to each.
As mentioned above, the two databases that you can use to look up bibliometrics are Scopus and Web of Science. They both offer slightly different journal metrics. A key difference to look out for is whether or not they are subject-normalised; only metrics that are subject-normalised can be used to compare journals from different subject areas.
While in the past the Impact Factor (from Web of Science) has been seen as the 'baseline' standard metric for journals, the Scopus-based metrics are now well respected. In fact the SNIP (from Scopus) has an advantage over the Impact Factor because it is subject-normalised.
Journal metrics derived from the Scopus database:-
- Source-Normalized Impact per Paper (SNIP): "Source Normalized Impact per Paper (SNIP) measures actual citations received relative to citations expected for the serial's [journal's] subject field."
- CiteScore:"CiteScore measures average citations received per document published in the serial [journal]." This metric is not subject-normalised, so you cannot use this metric to compare journals across different subject areas.
- SCImago Journal Rank (SJR): "SCImago Journal Rank (SJR) measures weighted citations received by the serial [journal]. Citation weighting depends on subject field and prestige (SJR) of the citing serial."
- You can look these metrics up on the Journal Metrics site, which takes data from Scopus and displays it in an easy-to-use format. Also, this guide gives further help - How to look up journal metrics in Scopus and use them to compare journals.
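The 'average citations per document' idea behind CiteScore can be illustrated with a simplified sketch. Note this is only an illustration of the averaging principle: the real CiteScore calculation (from Scopus) uses a specific multi-year citation window and only counts certain document types.

```python
def citations_per_document(total_citations, total_documents):
    """Simplified 'citations per document' average, in the spirit of
    CiteScore. The actual CiteScore uses a multi-year citation window
    and restricts which document types count; this is only a sketch.
    """
    if total_documents == 0:
        return 0.0
    return total_citations / total_documents

# A journal whose 200 documents collected 600 citations between them
# averages 3.0 citations per document.
```

Because this kind of average is not subject-normalised, a journal in a high-citation field will score higher than an equally good journal in a low-citation field, which is exactly why CiteScore should not be used to compare journals across subject areas.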
Journal metrics derived from the Web of Science database:-
- Journal impact factors (IF): Allow you to judge the relative importance or impact of a journal. They show the frequency with which the journal's papers are cited. This metric is not subject-normalised, so you cannot use this metric to compare journals across different subject areas.
- Eigenfactor metrics: These use a similar methodology to SJR (above), but they are based on Web of Science data. You cannot use this metric to compare journals across different subject areas.
- This guide gives further help - How to look up journal metrics In Web of Science and use them to compare journals.
In addition to these metrics, researchers in the area of business use the ABS Academic Journal Guide, which provides a ranked list of journals.
You may also like to look at the info on how to tell which journals to avoid.
You can also see the SNIP and SJR metric in Pure, using the Metric tab as per the screenshot below.
The citation count is the number of times an article has been cited by other academic research. To find this in Web of Science please see the Web of Science library guide. In Scopus the citation count is shown on the right of the article (below). Google Scholar also gives a citation count, though it's important to be aware of the limitations outlined below.
You can also see the citations in Pure, using the Metric tab as per the screenshot of Pure above.
You can do this using SciVal. SciVal draws data from Scopus and presents it in a way that allows you to easily get an overview of Portsmouth, and also compare Portsmouth to other universities (explained below).
All staff and students can access SciVal - login instructions.
When you login, you'll see three main tabs: Overview, Benchmarking and Collaboration.
The Overview tab gives a summary of a particular university, department or research group (see above). You can look at Portsmouth or any other university.
The Benchmarking tab allows you to compare universities, departments and research groups (see above).
The Collaboration tab allows you to explore who Portsmouth (or any other university or business) has published with (see above).
The Quick Guide to SciVal gives more detailed information about what SciVal can do.
You can do this using the Benchmarking tab in SciVal (see above). To access the Benchmarking tab -
- Click on the Benchmarking tab.
- Select the universities you want to compare Portsmouth to from the left-hand menu.
- Then select the metrics you want to make the comparisons on by clicking the x and y axis buttons if you are in the Chart view, or by clicking the metric 1 and metric 2 etc buttons if you are in the 'table' view.
- Use the Export button (top-right) if you need to download the data into an Excel spreadsheet etc.
How can I compare my department or research group with an equivalent research group or department at another university?
To do this, you will need to go into the My SciVal tab (top-right of the screen) and create groups containing the relevant researchers from Portsmouth and the researchers from the other university.
You can then use the Benchmarking tab to make comparisons.
For more information, please contact firstname.lastname@example.org
A powerful feature of SciVal is the way in which its Collaboration tab (see above) allows you to identify and analyse existing and potential collaboration opportunities based on publication output and citation impact. You can do this using the Collaboration tab -
- Click on the Collaboration tab.
- Choose between either the Map or the Table view.
- In the left-hand menu, select the main university you want to focus on (e.g. Portsmouth).
- Under where it says "Institutions collaborating with [for example] the University of Portsmouth", use the drop-down menu to select the country you want to see the collaborators from. (Or if you are in the map view, just click on the map).
- Use the Sectors drop-down menu to limit results to academic, government, corporate or medical collaborators.
You can do this in SciVal using the Research Areas facility.
How can I investigate the effect of adding a particular new academic to my research group or department?
To do this, you will need to create a Research Group containing your existing academics and then add the new academic to that group. You can do this in the My SciVal tab (above).
Once you have done this, you can then see the effect adding this individual has on the metrics using the Benchmarking tab.
For more information, please contact email@example.com
Yes - Pure automatically pulls in metrics from Scopus and attaches them to articles in Pure, for example the number of citations received and the metrics for the journal.
To view these metrics in Pure, open the output record and click on metrics in the left-hand menu (below).
These systems have different purposes. Pure is an internal University of Portsmouth system, which holds details (and increasingly the full-text) of the publications produced by Portsmouth academics, along with detailed information about other aspects of their research, such as funding, impact, press coverage etc. The purpose of Pure is to manage and promote the research activities taking place at Portsmouth.
Conversely, the main bibliometric databases (i.e. Web of Science, Scopus and SciVal) are international databases, which hold data about publications produced by academics across the world. Unlike Pure, they do not cover the other aspects of the research life-cycle, nor do they hold a copy of the full-text.
Therefore, each university now has its own Pure (or equivalent system) and also subscriptions to Web of Science, Scopus and SciVal. However, to integrate the systems, we now pull some metrics into Pure from Scopus and the Web of Science (see above).
This raises the question of whether we actually need to look at Scopus, Web of Science and SciVal directly or whether we can just look at the metrics via Pure? The answer is that if you just want a quick look at specific metrics on a particular article then you can look at them via Pure (see above), but if you want to explore the metrics in any depth (e.g. to answer the questions covered on this page) then you do need to go into Scopus/Web of Science/SciVal themselves.
Whereas bibliometrics look at citation counts, Impact Factors, h-indexes etc., altmetrics look at things like mentions in the news, on Twitter and blogs, and how research is shared across social media around the world. Altmetrics are increasingly being used as an indicator of research impact. However, they should be treated with some caution. They really measure the amount of 'attention' being paid to a piece of research, but they do not show whether this attention is positive or negative. Altmetrics are nonetheless interesting and they should be considered a useful addition to bibliometrics, as opposed to a replacement.
You can see the altmetrics for an article by installing the free altmetrics 'bookmarklet' tool. It only takes a few seconds to install - go to the bookmarklet page, then scroll down until you see a blue 'Altmetric it!' button, which you need to drag and drop on your browser toolbar. This adds a small button to your browser. You then need to -
- Go to the web page containing the article.
- Click the bookmarklet button on your browser. A little coloured donut will appear in the top-right which gives you the altmetrics (see below).
A useful feature of altmetrics is that they can be used to see if an article has been referenced by a policy document, for example one written by government. This is useful when exploring whether research has had a 'real world' impact. To do this, install the altmetrics bookmarklet tool, as described above. Once you've clicked on the coloured donut, you will see a screen that looks like this -
When explaining bibliometrics, a common question is how does Google Scholar fit in? There is nothing wrong with using Google Scholar to find research, but it's useful to know its limitations.
Unlike Web of Science, you do not know how Google Scholar is generating its search results, and so you need to judge the validity of the sources for yourself. This also means that you don't know whether the sources included in Google Scholar's bibliometrics (e.g. the citation count) are of a high enough quality. As a result, the same bibliometric is often higher when generated from Google Scholar than from Web of Science or Scopus.
Also, bibliometrics aside, while Google Scholar searches some of the resources that the Library subscribes to, it does not cover them all. So relying on Google Scholar alone could mean you miss out on things. Google Scholar also does not offer as many options for refining your search as Web of Science.