Research metrics, also called bibliometrics or citation metrics, are quantitative measurements designed to help you or others evaluate research outputs. They might include measurements such as the number of times an article is cited, journal impact factor measurements, social media mentions, the h-index, news media mentions, and more. Some of these metrics are more traditional than others, but all of them can be used to demonstrate the impact of your work. Altmetrics are useful supplementary measures of impact, best used in tandem with traditional measures like citation counts. Together, the two types of metrics can illustrate the full impact of your work.
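To illustrate how one of these metrics is derived, the h-index can be calculated directly from a list of citation counts: a researcher has an h-index of h if h of their outputs each have at least h citations. A minimal sketch (the citation counts below are hypothetical, for illustration only):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    each have at least h citations."""
    # Sort citation counts from highest to lowest
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(sorted_counts, start=1):
        if count >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Example: five papers with hypothetical citation counts
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

Here the h-index is 4 because four papers have at least four citations each, but not five papers with at least five. Note that, as discussed below, such a number should never be used in isolation.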
The University has signed DORA (the San Francisco Declaration on Research Assessment) and does not use journal-based metrics (impact factors, rankings) to measure the quality of an output, nor the h-index or citation counts alone, without the context of the article or other alternative metrics.
Research metrics, when used responsibly, can help you tell a story about your research, and can help you understand, describe and enhance your impact in your field and beyond. Examples of metrics, and the tools used to generate them to assess the impact of your research beyond citation counts, are below.
When using metrics, it is important to understand the context and limitations of each metric so that it is used responsibly. Responsible metrics can be understood in terms of:
- Robustness: basing metrics on the best possible data in terms of accuracy and scope;
- Humility: recognising that quantitative evaluation should support, but not supplant, qualitative, expert assessment;
- Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
- Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research and researcher career paths;
- Reflexivity: recognising the potential and systemic effects of indicators, and updating them in response.
The Metrics Toolkit is a resource for researchers and evaluators that provides guidance for demonstrating and evaluating claims of research impact. With the Toolkit you can quickly understand what a metric means, how it is calculated, and whether it is a good match for your impact question.
The Altmetric Bookmarklet is a free plug-in for Chrome, Firefox and Safari that allows you to view the online shares and mentions of an article with a single click. This can provide information on social media shares, citations in Dimensions, readers on Mendeley, geographic demographics for Twitter and Mendeley, and the attention score in context.
Note: The Bookmarklet only works on PubMed, arXiv, or pages containing a DOI with Google Scholar-friendly citation metadata. Twitter mentions are only available for articles published since July 2011.
Repository statistics. If your work is deposited in a repository (publication or data), you can usually access statistics about the views and downloads of the records/files. The University repository shows views and file downloads on individual output records, and a summary of all the data is on the RMS stats page. Other repositories, such as Figshare, also provide similar data.
Impactstory is an open-source, web-based tool that provides altmetrics to help measure the impact of your research outputs, including journal articles, blog posts, datasets, and software. Impactstory can link to your ORCID and Twitter accounts, and will provide you with details of your achievements: how open access your work is, Wikipedia mentions, your greatest hits, and the global reach of your research.